I should have been more precise, but this is all in the context of news about a cutting-edge LLM built at a fraction of the cost of ChatGPT, and comments calling it all “reactionary autocorrect” and “literally reactionary by design”.
I disagree that it’s “reactionary by design”. I agree that its usage is 90% reactionary. Many companies are effectively trying to use it in a way that attempts to reinforce their deteriorating status quo. I work in software, so I constantly see people calling this shit a magic wand for the problems of the falling rate of profit and the falling rate of production. I’ll give you an extremely common example that I’ve seen across multiple companies and industries.
Problem: Modern companies do not want to be responsible for the development and education of their employees. They do not want to pay for the development of well-functioning, specialized tools for the problems their company faces. They see it as a money and time sink. This often presents itself as:
- missing, incomplete, or incorrect documentation
- horrible, time-wasting meeting practices
I’ve seen the following pitched as AI Band-Aids:
Proposal: Push all your documentation into a RAG LLM so that users can simply ask the robot and get what they want.
Reality: The robot hallucinates steps in technical processes that aren’t there. Attempts to get the robot to correct this end with it sticking to marketing-style vagaries that aren’t grounded in how the company actually works (things as simple as the robot assuming how a process/team/division is organized rather than the reality). Attempts to simply use it as a semantic search index end up linking back to the real documentation, which is garbage to begin with and doesn’t actually solve anyone’s real problems.
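For what it’s worth, the “just RAG it” pitch is usually some variant of the sketch below: embed the doc chunks, pull the nearest ones for a question, and stuff them into a prompt. This is a hypothetical illustration, not any specific vendor’s product; the embedding model name is just an example, and call_llm() is a made-up placeholder for whatever hosted chat API is being sold.

```python
# Minimal sketch of the "push the docs into a RAG LLM" pitch.
# Assumes the sentence-transformers package; call_llm() is a hypothetical
# stand-in for a paid chat-completion endpoint, not a real API.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # example embedding model

# The "knowledge base": whatever documentation chunks actually exist, garbage or not.
doc_chunks = [
    "To request prod access, file a ticket with the Platform team.",
    "Deployments happen Tuesdays. See the release checklist (TODO: write one).",
]
doc_vectors = model.encode(doc_chunks)

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k chunks most similar to the question (cosine similarity)."""
    q = model.encode([question])[0]
    scores = doc_vectors @ q / (
        np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q)
    )
    return [doc_chunks[i] for i in np.argsort(scores)[::-1][:k]]

def call_llm(prompt: str) -> str:
    # Placeholder: in the real pitch this is the vendor's hosted model.
    raise NotImplementedError

def answer(question: str) -> str:
    # Retrieve context, then ask the model to answer only from that context.
    context = "\n".join(retrieve(question))
    prompt = (
        "Answer using ONLY the context below. If it isn't there, say so.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)
```

Even when the retrieval part works, the answer is bounded by whatever is in doc_chunks, which is exactly the problem: the documentation was the missing, incomplete, incorrect part to begin with.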
Proposal: We have too many meetings and spend ~4 hours a day on Zoom. Nobody remembers what happens in the meetings, nobody takes notes; it’s almost like we didn’t have them at all. We are simply not good at working meetings, and they’re just chat sessions where the topic is the project. We should use AI features to generate summaries of our meetings.
Reality: The AI summaries cannot capture action items correctly, if at all. The AI summaries are vague and mainly produce metadata rather than notes on important decisions and plans. We are still in meetings for 4 hours a day, but now we copypasta useless AI summaries all over the place.
Don’t even get me started on Copilot and code-generation garbage, or on making “developers productive”. It all boils down to a million-monkeys problem.
These are very common scenarios I’ve seen, and they ground the use of this technology in inherently reactionary patterns of social reproduction.

By the way, I do think DeepSeek and Doubao are an extremely important and necessary step, because they destroy the status quo of Western AI development. AI in the West is made to be inefficient on purpose because it limits competition. The fact that you cannot run models locally, due to their incredible size and compute demands, is a vendor lock-in feature that ensures monetization channels for Western companies. The pay-as-you-go model bootstraps itself.
I think the need to have a shared monoculture is a deeply reactionary way of thinking that prevents us from developing human empathy. You don’t need to say “Bazinga” at the same time as another person in order to relate to, care for, and understand strangers. I think the yearning for monoculture in people aged 25-40 mirrors boomers who complain that they cannot relate to kids anymore because nobody really believes in the Pledge of Allegiance or some other “things r different” nonsense. Yeah, I haven’t played Hoop and Stick 3; we don’t need to play the same video games to relate to each other.
It’s a crutch for a brutal culture where you are too scared to show a modicum of openness or vulnerability with other humans, because deep down you need to be reassured that they won’t scam/harm you simply because they believe in the magic words of Burgerstan. People are uncomfortable with change and with things they don’t know, because we’ve built a society where change often begets material vulnerability, and where information and even cultural media have become weapons to be used against others.
Monoculture was never good; it simply was. Also, despite this being a real aesthetic trend, you should remember that the vast majority of consumer technology produced at the same time was not clear plastic tech. If anything, the monoculture of tech products of that era was that gross beige that yellows in a year or two. It’s just not aesthetic enough to remember, and within 10 years everything just defaulted to black. I’ve actually never seen a clear plastic Dreamcast or Dreamcast controller IRL. I’ve been a tech guy forever, and despite knowing about it, I only know of one person who actually experienced the Dreamcast internet. This is mostly nostalgia bait versus how things actually were.
To put it into perspective: for every one of those clear plastic phones, there were 1000 of these