Yes, here’s my understanding:
it’s essentially a massive collection of forum posts – all text.
Files/binaries are encoded into text and split into multiple posts if they exceed the maximum size for a single post. The names of posts and the relationships between posts can be obfuscated, too.
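The encode-and-split idea can be sketched in a few lines of Python. This is purely illustrative: real Usenet posters use yEnc rather than base64, and the post size here is made up.

```python
import base64

POST_SIZE = 5000  # hypothetical max characters per post

def encode_to_posts(data: bytes, post_size: int = POST_SIZE) -> list[str]:
    """Encode a binary blob as text, then split it into post-sized chunks."""
    text = base64.b64encode(data).decode("ascii")
    return [text[i:i + post_size] for i in range(0, len(text), post_size)]

def decode_from_posts(posts: list[str]) -> bytes:
    """Rejoin the chunks in order and decode back to the original bytes."""
    return base64.b64decode("".join(posts))

payload = bytes(range(256)) * 100  # 25,600 bytes of sample "binary" data
posts = encode_to_posts(payload)
assert decode_from_posts(posts) == payload  # round-trips cleanly
```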
Indexers provide .nzb files, which are kinda like .torrent files: they indicate where on Usenet all the posts needed for a complete download are located.
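Under the hood an .nzb file is just XML listing, for each file, the newsgroups it was posted to and the message ID of every segment. A simplified example (poster, group names, and message IDs all made up):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<nzb xmlns="http://www.newzbin.com/DTD/2003/nzb">
  <file poster="someone@example.com" subject="example.mkv (1/3)">
    <groups>
      <group>alt.binaries.example</group>
    </groups>
    <segments>
      <segment bytes="500000" number="1">abc123@news.example.com</segment>
      <segment bytes="500000" number="2">def456@news.example.com</segment>
      <segment bytes="250000" number="3">ghi789@news.example.com</segment>
    </segments>
  </file>
</nzb>
```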
You give the .nzb file to an nzb downloader, which finds the posts, downloads them, merges and decodes them back into the original binary form, and runs a hash check to make sure everything is intact.
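That fetch → merge → decode → verify loop can be sketched like this. Everything here is hypothetical: `fetch_segment` stands in for a real NNTP client, base64 stands in for yEnc, and the expected checksum would normally come from accompanying repair files rather than being passed in directly.

```python
import base64
import hashlib

def fetch_segment(message_id: str) -> str:
    # Hypothetical stand-in for an NNTP article fetch; here we just
    # look segments up in a fake in-memory "server".
    return FAKE_SERVER[message_id]

def download(segment_ids: list[str], expected_sha256: str) -> bytes:
    # Fetch every post, merge them in order, then decode text -> binary.
    merged = "".join(fetch_segment(mid) for mid in segment_ids)
    data = base64.b64decode(merged)
    # Hash check to make sure nothing was lost or corrupted in transit.
    if hashlib.sha256(data).hexdigest() != expected_sha256:
        raise ValueError("hash mismatch - download incomplete or corrupt")
    return data

# Fake "Usenet": one binary payload, encoded as text, split into two posts.
original = b"some binary payload" * 1000
encoded = base64.b64encode(original).decode("ascii")
FAKE_SERVER = {"part1@example": encoded[:10000], "part2@example": encoded[10000:]}

result = download(["part1@example", "part2@example"],
                  hashlib.sha256(original).hexdigest())
assert result == original
```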
There’s some open-source software like Radarr, for example, which can automate the entire process from start to finish (in Radarr’s case, for movies specifically).
With Radarr it goes like this: Add movie -> Radarr searches via indexer(s) for a .nzb matching the criteria -> .nzb gets sent to nzb downloader -> downloaded from usenet server(s) -> completed download is moved (and optionally renamed) by Radarr to desired location
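That pipeline can be sketched as a chain of plain functions. All the names below are hypothetical stubs simulating the indexer and downloader APIs; Radarr itself is a full application, not this five-liner.

```python
# Stubs standing in for external services (all hypothetical).
def search_indexers(title):
    return {"title": title, "nzb": f"{title}.nzb"}  # pretend one indexer matched

def send_to_downloader(nzb):
    return {"id": 1, "nzb": nzb}  # pretend the downloader accepted the job

def wait_for_completion(job):
    return f"/downloads/{job['nzb']['title']}.mkv"  # pretend it finished

LIBRARY = {}
def move_and_rename(path, title):
    LIBRARY[title] = f"/movies/{title}/{title}.mkv"  # import into the library

def add_movie(title):
    nzb = search_indexers(title)       # search indexer(s) for a matching .nzb
    if nzb is None:
        return                         # nothing found yet; retry later
    job = send_to_downloader(nzb)      # hand the .nzb to the nzb downloader
    path = wait_for_completion(job)    # downloader fetches, merges, decodes, checks
    move_and_rename(path, title)       # move (and rename) to the desired location

add_movie("Example Movie")
```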
Thank you for your hard work in keeping this instance alive.
Just wanted to let you know I just set up a $10/mo recurring donation for the foreseeable future!
Correct me if I’m wrong but that puts us at 50% community funded now 🔥