Alt Text: an image of Agent Smith from The Matrix with the following text superimposed, “1999 was described as being the peak of human civilization in ‘The Matrix’ and I laughed because that obviously wouldn’t age well and then the next 25 years happened and I realized that yeah maybe the machines had a point.”
I work in the gaming industry, and every week I receive emails about how AI is gonna revolutionize my job and get sent to time-wasting training about how to use Figma AI or other shit like that, because it’s the best thing ever according to HR… and it never is, obviously.
At best, it’s gonna make middle-management jobs easier, but for devs like me, as long as the “AI” stays out of our engines and stays in the equivalent of cooperative vision boards, it does nothing for me. Not once has trying to use it turned out to be actually useful. It’s mediocre at best, and I can’t believe there are game devs who actually try to code with it. Can’t wait to see these hot garbage products come on the market.
Gawd, me too. They’ve started scraping my LinkedIn recommenders to try to bait me in.
For context, I work at a university. The subject line was something like “xxxxxx recommends you for a company like us”, implying my contact had actually been behind it, but obviously they hadn’t.
And obviously it reads like it was written by one of the GPTs.
Had they seen our profiles, they’d actually know what it is we do and how ridiculous recommending a chat AI is. That’s sooooo beneath our knowledge and expertise. Like a random suggesting Ivermectin to Dr Fauci.
From their example, it seems like all they’ve “innovated” is a new, less reliable way to write database queries!
Yep. And writing queries is one of the quickest things an analyst can do, with 100% knowledge of the data and of whatever wrangling/conditions need to be done to ensure accurate results.
A bot would never be able to accurately answer these questions off my data unless I thoroughly trained and tested it. But if it’s GPT-based, I’d always have to double-check its output, so it’d just be a hindrance to my workflow. There’s no way money would be paid to a third party for a setup like that.
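To give a rough idea of what I mean (made-up table and column names, just a sketch, not my actual data): the queries I write have every filter and aggregation spelled out explicitly, so I already know the result is right before it runs. Anything a chatbot guessed at would still need the same line-by-line check.

```python
# Sketch only: hypothetical "orders" table in a local SQLite copy of the data.
# The point is that every condition is explicit, so correctness is knowable up front.
import sqlite3

conn = sqlite3.connect("warehouse.db")  # hypothetical database file

query = """
    SELECT region,
           COUNT(*)                  AS order_count,
           SUM(total_cents) / 100.0  AS revenue
    FROM   orders
    WHERE  status = 'completed'       -- exclude refunds and cancellations
      AND  created_at >= '2024-01-01' -- current year only
    GROUP  BY region
    ORDER  BY revenue DESC;
"""

for region, order_count, revenue in conn.execute(query):
    print(f"{region}: {order_count} orders, ${revenue:,.2f}")

conn.close()
```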
Since there’s a mathematical proof that LLMs without hallucinations are impossible, I think this kind of usage is a lost cause.
I’ve been enjoying Copilot quite a bit while developing, particularly for languages that I’m not familiar with. I’m not worried about it replacing me, because I very clearly use my experience and knowledge to guide it and to coax answers out of it. But when you tell it exactly what you want, it’s really nice to get answers back in the development language without needing to look up syntax.
“Give me some nice warning message css” was an easy, useful one.
It’s effectively a better Google search.
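The pattern that works for me, roughly: describe exactly what you want in a comment, take the boilerplate it suggests, then review it like any other code. A made-up illustration of that workflow (not a real project, and not the CSS case above):

```python
from datetime import datetime, timezone

# Prompt-as-comment: parse an ISO-8601 timestamp string and return it
# normalized to UTC, or None if the string is malformed.
def parse_iso_utc(value: str) -> datetime | None:
    try:
        parsed = datetime.fromisoformat(value)
    except ValueError:
        return None
    if parsed.tzinfo is None:
        # Assume naive timestamps are already UTC.
        return parsed.replace(tzinfo=timezone.utc)
    return parsed.astimezone(timezone.utc)

print(parse_iso_utc("2024-03-01T12:30:00+02:00"))  # 2024-03-01 10:30:00+00:00
print(parse_iso_utc("not a date"))                 # None
```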