Now that the web is rich in garbage, I keep thinking about how the UX could be made tolerable again. Consider this scenario:

Bob sees an article of interest and decides to share it. I have no idea why Bob’s experience was decent enough that he felt the article was worth sharing, because when I follow his link I get Tor hostility, CAPTCHAs, popups, dysfunction that demands JavaScript permissions, etc. In short, Bob’s link goes to a shithole.

So how can we fix this?

What if Bob copies the full text of the article and creates an archive of sorts in the fediverse? That solves the enshittification problem, but it risks harassment from the copyright police. Or does it? The fair use doctrine specifically permits quoting a work for the purpose of commentary. It’s also easy to justify because the web has become so exclusionary (e.g. Tor blocking) that a case can be made for including a copy of the article alongside Bob’s commentary. Because what happens now? Alice the Tor user gets blocked from the page and can only read people’s comments, which have no context because the web is broken. By copying the original text, Bob enables Alice to appreciate his work (the commentary).

I also wonder whether bilingual people can go a step further in mitigating copyright harassment. Suppose Juan reads the English article, machine-translates it into Spanish, corrects the flaws because he’s fluent in Spanish, and then posts the Spanish version. Do copyrights survive translation? If Juan comments in Spanish, then surely the Spanish translation is critical for non-English speakers to understand Juan’s post.

I think this idea would benefit the permacomputing movement, because routing around web enshittification is a way to access content with fewer resources. The original poster may have to run a shit ton of heavy JavaScript to reach the text, but then everyone reading his thread can get by with a simple text client.

#askFedi #lawFedi