• Kazumara@discuss.tchncs.de · +8 · 2 days ago

    Someone should have heckled him off stage at that moment. We are all shocked and sad that a birdbrain like him holds any power in games publishing.

  • ShaggySnacks@lemmy.myserv.one · +70 · 3 days ago

    Yeah, I’d say that’s one of the reasons they don’t like it! Others include the use of artists’ work without consent, environmental issues, the quality of AI output, and the feeling that automating culture production can only result in what is now commonly called "AI slop".

    This sums up perfectly why people hate AI in culture. AI can be very useful in science, medicine, engineering, and similar professions, when it is built upon a very specific data set. But there is no conscious reasoning behind why the AI did what it did when it makes art.

    Generative AI is just slop. It takes previous works and repackages them according to what the code says. When people make art, there are hundreds of micro-decisions involved, and those micro-decisions are gone when AI makes it. Gabi Belle did a great video on why people hate AI art. https://youtu.be/QtZDkgzjmQI

    • INeedANewUserName@piefed.social · +32 / −4 · 3 days ago

      AI is generally only considered useful in professions people aren’t actually familiar with. In other words, in its current form it isn’t useful to actual experts in anything.

      • webadict@lemmy.world · +37 / −1 · 3 days ago

        “Generative AI is great at doing everything I suck at, but it’s completely terrible at the things I actually know!”

        Too many people think like this and don’t seem to understand that it is pretty shitty at everything. Well, except getting people to kill themselves, I guess. It’s pretty good at doing that.

        • webadict@lemmy.world · +7 · 3 days ago

          Cue the serial killer telling me that I don’t know what I’m talking about and that they could get people to kill themselves so much better and easier.

        • 8baanknexer@lemmy.world · +2 · 2 days ago

          Part of the problem is how broad the term AI is, and how narrowly it is used. People just mean autoregressive LLMs and maybe diffusion models, while the term AI is much broader than even machine learning (it covers formal reasoning, for instance), which is in turn broader than backpropagation with gradient descent (boosted trees, for instance, don’t use it), which is in turn broader than generative AI (classifiers and much of deep learning, for instance, aren’t generative). All of these are definitely useful in science and engineering and have been for decades, although LLMs are now beginning to find uses as well.

        • Beth@piefed.social · +2 · 3 days ago

          I was watching Ryan Hall and his little AI bot the other day. It occasionally goes off the rails… Weird how he keeps trying, though. Sometimes it’s a bit entertaining, but if something I was using malfunctioned that much, I would not consider it a useful tool.

        • jj4211@lemmy.world · +2 · 3 days ago

          The silver lining for the AI companies is that there are a lot of real humans getting real money who are also really shitty at what they are paid to do.

      • LwL@lemmy.world · +3 · 2 days ago

        I don’t think they were talking about GenAI with that, and AI (aka ML models) built on specific data sets for a specific purpose can be quite useful. Expecting an LLM to do anything other than language processing well, on the other hand, is insanity.
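        The distinction can be made concrete with a toy sketch (plain Python; the "sensor reading" data and labels are entirely hypothetical): a tiny model trained on a narrow, purpose-specific data set does one job reliably, with nothing generative about it.

```python
# Toy nearest-centroid classifier: "ML on a specific data set for a
# specific purpose" in miniature. Hypothetical sensor readings labeled
# "ok" / "faulty"; nothing generative involved.

def centroid(rows):
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def train(samples):
    # samples: {label: list of feature vectors}
    return {label: centroid(rows) for label, rows in samples.items()}

def classify(model, x):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    # Pick the label whose centroid is closest to the new reading.
    return min(model, key=lambda label: dist2(model[label], x))

model = train({
    "ok":     [[0.9, 1.0], [1.1, 0.9], [1.0, 1.1]],
    "faulty": [[3.0, 2.8], [2.9, 3.1], [3.2, 3.0]],
})

print(classify(model, [1.0, 1.0]))   # → ok
print(classify(model, [3.0, 3.0]))   # → faulty
```

        A real deployment would use a proper library, but the shape is the same: a fixed, narrow task learned from purpose-built data.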

      • jj4211@lemmy.world · +4 · 3 days ago

        Coincidentally, Hollywood is pretty good at portraying every profession except the one I know!

  • bitjunkie@lemmy.world · +7 · 2 days ago

    Because some other dipshit sold them on the idea that they’d be able to continue to make games people wanted to pay for without paying other people to make them. Cry me a river and I’ll piss you a puddle.

  • Furbag@lemmy.world · +39 · 3 days ago

    Investors don’t care about games as art, they care about games as a vehicle for making money.

    If they are pushing for AI in games, it’s because they think it will make them money, not because they think it will be good for games.

  • Aceticon@lemmy.dbzer0.com · +7 · 2 days ago

    Greedy fucker “investors” selling their book is literally one of the greatest informational problems of the modern age. They’ll do everything in their power to mislead others: plain old lying, appeals to emotion, buying traditional news media and turning them into propaganda outlets, and funding projects and even institutions to spread misinformation, purely to push up the profits of their “investments”. And because money is the top power in the Neoliberal era in most of the West, the wealthiest ones have huge power to pollute the information space.

  • LostWanderer@fedia.io · +31 · 3 days ago

    Good, those dirty fuckers don’t deserve accolades or rewards for peddling their lies about the capabilities of LLMs (which are limited, because these are just tools). It’s honestly better that creative endeavors like games development stay human-led, because LLM garbage is so flat and empty. Humanity might have tricked rocks into carrying out complex calculations and other operations using silicon and electricity… but we haven’t taught them to think or feel. Human beings with lived experiences should be the only people involved in the creative and technical aspects of games development.

    I hope they eventually take the L on peddling LLMs as AI and move on to normal grifts I can point and laugh at. ROFL

  • pixxelkick@lemmy.world · +2 / −31 · 3 days ago

    keep AI out of games

    Good luck, it’s here to stay, get used to it lol.

    Anyone who thinks the average developer isn’t using AI heavily in their code is delusional; it’s been baked into every major IDE for like 2 years now.

    It’s in there, it’s permeated every layer of game dev, it works when you use it right, and the only time people care is when you make it obvious (i.e. including it in the final art of the game).

    But no one even blinks an eye at all the other layers AI is used in unless you announce it.

    You should just assume every game you play made after 2024 has chunks that are AI generated. The plot, writing, code… it’s in there, and you prolly haven’t even noticed.

      • pixxelkick@lemmy.world · +1 / −2 · 19 hours ago

        Only now, in the year 2026, do people suddenly give a shit about a specific tech used to make a product.

        The thing is, y’all have been consuming stuff that ruins the environment for decades.

        Every movie you watch with hyper-realistic animation and VFX churned through enormously more water and power to put a mustache on Dr. Strange’s face than you might realize.

        The concept of server farms using up large amounts of power and water isn’t new, and on the scale of tech that uses them, AI isn’t even the largest offender.

        You should go look up the sorts of data centers that power the Google search engine.

        • MartianRecon@lemmus.org · +3 · 19 hours ago

          Again, you people have zero concern for consent. No one wants your AI bullshit. Literally no one. Go on to the next tech scam already. The bubble is popping.

          • pixxelkick@lemmy.world · +1 / −3 · 19 hours ago

            The only time people care is when they’re alerted to it.

            I guarantee you that 80%+ of the games you play that were developed in 2024 and on are “AI assisted”.

            And you almost definitely have already consumed content that had AI involved in its creation, and you enjoyed it, and you didn’t even know it used AI.

            The anti-AI shit is going to be viewed 20 years from now as cringe millennial boomers who were scared of AI and simultaneously claimed they hated it while consuming its content unknowingly.

            It’s gonna have the same energy as self-proclaimed vegans who enjoy parmesan on their salad and are shocked to find out parmesan is a cheese.

            You are already actively consuming and enjoying AI-generated content, and have been for many months, unknowingly.

            • MartianRecon@lemmus.org · +3 · 19 hours ago

              Lol, sure dude. Balatro and Valheim are totally using AI to make their games.

              You guys simp over something that is destroying the environment and people’s lives, and you don’t give a single solitary fuck.

              When this bubble pops, you AI fanboys are going to look like the same people who thought crypto would replace the dollar, and who thought NFTs would be a revolutionary new technology.

              • pixxelkick@lemmy.world · +1 / −1 · 18 hours ago

                If localthunk used VS Code (very likely, since Balatro is written in Lua), then it’s very likely they have autocomplete turned on.

                And if they have autocomplete turned on, then yes, a non-trivial amount of Balatro was written by an AI.

                It’s been like this for almost 2 years now; that’s what I mean when I say it’s deeply integrated into developer tools.

                If you use default settings and don’t manually go in and disable a bunch of stuff, you 100% have AI-written code now. Simply pressing tab to accept an autocomplete is all it takes, and it’s built into basically every major IDE.

                Now… if localthunk is based and uses Neovim, then it’s very, very unlikely they have AI-generated code in their game (you have to go find and manually install an extension to enable such stuff).

                But afaik Neovim is the only “mainstream” editor that has AI autocomplete as “opt in” instead of “opt out”.

                And most people don’t even know the fancy autocomplete in most IDEs is AI; they just think “wow, my IDE is so good at autocomplete suggestions”, because it’s really fast. So people don’t clock that a whole-ass LLM query ran under the hood to figure out the autocomplete code.

                Source: I conduct coding interviews at my company often, and almost every dev I have interviewed has had AI autocomplete turned on. I had to ask them to toggle it off at the start, and many were shocked to learn “wait, that’s AI?!”

                That’s the basis for my statement that this shit is in everything: devs are using AI-generated code without even knowing it half the time.
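                For what it’s worth, the opt-out in question is a couple of settings away. A hedged sketch, assuming VS Code with the Copilot extension installed (setting names from VS Code’s settings reference):

```jsonc
// settings.json — turn off AI-driven inline completions
{
  "editor.inlineSuggest.enabled": false,
  "github.copilot.enable": { "*": false }
}
```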

        • november@piefed.blahaj.zone · +1 · 18 hours ago

          Only now in the year 2026 do people suddenly give a shit about a specific tech used to make a product.

          Tell us you aren’t in FOSS spaces without telling us.

          You should go look up the sorts of data centers that power the Google search engine

          You’re on fucking Lemmy, you think anyone here is using Google?

    • november@piefed.blahaj.zone · +16 · 3 days ago

      Good luck, it’s here to stay, get used to it lol.

      So are we. Get used to it.

      You should just assume every game you play made after 2024 has chunks that are AI generated. The plot, writing, code… it’s in there, and you prolly haven’t even noticed.

      Oh, we’ve noticed that AAA game quality is shittier than ever, trust me.

      • pixxelkick@lemmy.world · +1 / −3 · 2 days ago

        Yeah, it’s permeated way more than AAA.

        But trying to convince game devs not to use AI is about as likely to succeed as convincing them to stop using their IDEs.

        What will actually happen is that everyone will just stop announcing they are using it, and every month that goes by it’ll get harder and harder to tell.

          • pixxelkick@lemmy.world · +1 / −1 · 19 hours ago

            Because it’s the truth; we are already well over a year past the point of it becoming embedded in normal life.

            It’s not even “inevitable” anymore, we are past that point.

            It’s already here and actively in use, and there’s literally nothing that can stop that.

            The real thing to hate on is using it poorly or wrongly and wasting resources.

            It’s 100% viable to run this stuff in an eco-friendly and sustainable way. We have the technology.

            Datacenters have been around for decades now; AI isn’t special.

            Sustainable energy practices, recycling coolant, and the impact on the local populace: that’s the problem, and it’s a solvable problem right now.

            • november@piefed.blahaj.zone · +1 · 19 hours ago

              It’s already here and actively in use, and there’s literally nothing that can stop that.

              And this means people can’t complain about it because…?

              The real thing to hate on is using it poorly or wrongly and wasting resources.

              Okay, so exactly the thing we were already doing.

              • pixxelkick@lemmy.world · +1 / −1 · 19 hours ago

                In this thread? No.

                In general? Sure.

                This thread is about people hating on AI in general, which is stupid.

                But I’m all for hating on wasteful, non-eco-friendly data centres.

    • jj4211@lemmy.world · +3 · 3 days ago

      While people may be opposed even in theory to tamer things like a little code completion, there’s plenty of room to very obviously notice GenAI slop.

      When people use an LLM to generate text, they tend to produce too much of it, and it shows in how off-putting it is. An LLM may be able to generate a modest amount of text without notice, but people will put in a two-liner, get pages of garbage back, and use that.

      And of course the GenAI textures are famously off-putting. Maybe you can have a “generic metal texture” and no one will notice, but try for specific details and it generally gets caught.

      It is possible that human output that is similarly crappy gets mistaken for GenAI output, but oh well, slop is slop either way. It’s just that GenAI extends the slop to unbelievable magnitudes.

      • pixxelkick@lemmy.world · +1 · 2 days ago

        I agree, people tend to use it very poorly.

        That largely stems from it still being a fairly new tool and, to be honest, it being quite unintuitive to use well.

        There are a lot of fundamentally bad ways to use AI that feel natural because of the way an LLM creates the illusion of thinking.

        For example, one of the first things you learn in prompt engineering is: don’t correct the mistakes of an LLM in the conversation. This is unintuitive, but leaving the mistake in the context inherently reinforces the LLM to make more mistakes.

        Instead you have to go back in the history, edit your prior message to “pre-correct” it before the mistake was made, and regenerate.

        It’s a subtle thing, but it makes a huge difference between producing stupid, useless garbage and actually-not-half-bad output.
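        The edit-and-regenerate pattern can be sketched as plain list manipulation over a chat history (Python; the history contents are hypothetical, and `generate()` stands in for whatever chat API is in use):

```python
# A chat history is a list of {"role", "content"} dicts.
# Appending a correction leaves the model's mistake in the context;
# editing the earlier prompt and regenerating removes it entirely.

def append_correction(history, correction):
    # Naive approach: the flawed reply stays in context
    # and keeps influencing later output.
    return history + [{"role": "user", "content": correction}]

def edit_and_regenerate(history, index, improved_prompt):
    # Rewind to the prompt that led to the mistake, replace it, and
    # drop everything after it, so the mistake never existed.
    trimmed = history[:index]
    trimmed.append({"role": "user", "content": improved_prompt})
    return trimmed  # pass this to the (hypothetical) generate() call

history = [
    {"role": "user", "content": "Write a parser for this format."},
    {"role": "assistant", "content": "def parse(...):  # buggy attempt"},
]

redo = edit_and_regenerate(
    history, 0, "Write a parser for this format; the fields are tab-separated."
)
print(len(redo))  # → 1: only the improved prompt remains in context
```

        Whether this actually helps for a given model is an empirical question, but the mechanics are just truncate-and-replace.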

        Pretty much every “trick” is unintuitive like this, which is why so much of the AI output you see from people in the industry is garbage. I’d estimate 95%+ of people are just straight-up using it very wrong, getting frustrated, and producing slop-tier output.

        Which is a big waste of resources atm. More work has to go into education on how to use this stuff efficiently, so it’s not wasting resources and slop levels go down.

      • Leon@pawb.social · +2 · 3 days ago

        While people may be opposed even in theory to more tame things like a little code completion, there’s plenty of room to very obviously notice GenAI slop.

        I mean, there’s the regular “can you really sell code you don’t own” question going for it. The companies have stolen all sorts of data: voices, music, raster, vector, video, books, film. It’d be shocking if they hadn’t also scraped all the code that’s out there on the web.

        Some of that is perfectly fine to alter and sell. A lot of it isn’t. There are plenty of FOSS licenses that are restrictive in the sense that you’re free to use and change the code, but you can’t alter its license, and in many cases can’t sell it.

        So when an LLM produces code based on that, what applies?

        Then there are obviously broader problems, with ex-developers turned vibe coders coming out of the woodwork talking about how they can’t code anymore. I’ve heard people at my company joking about this, and the notion scares me. The idea that they’ve outsourced their thinking and problem-solving skills to the point that they’re now incapable of doing it is terrifying.

        I don’t know why anyone would willingly do that.

        • jj4211@lemmy.world · +3 · 3 days ago

          Well, unless you declare AI consumption fair use, only public domain is fair game, since every single license requires at least attribution. The courts regrettably seem to be buying the line that the models are merely “learning” like a human and are therefore exempt from the rules. All this ignores that if a human reproduces something they “learned” closely enough, they are on the hook for infringement, and in the AI scenario the codegen user has no sane way to know whether the output is substantial and close enough to training material to count, since the origins are so muddled.

          I just don’t understand the “real” developer to vibe coding transition. Like, it really sucks, even Opus 4.6, at being completely off the leash. I don’t understand how anyone can take what it yields as-is if they ever knew how to specifically get what they want. I know people who might be considered “coding adjacent” who are enthusiastic at seeing a utility brought to life, though usually what they get is not quite what they wanted, and they get frustrated when it doesn’t work right and no amount of “prompting” seems to get the thing to fix it. They were long intimidated by “coding”, but an LLM is approachable. Many of these folks have “scripted” far more convoluted stuff than many “coders”, yet they are intimidated by coding.

          • pixxelkick@lemmy.world · +1 · 19 hours ago

            I just don’t understand the “real” developer to vibe coding scenario.

            Software developer of 17 years here.

            Any given project, even a tremendously optimized and easy-to-maintain one, is about 90–95% easy boilerplate code anyone can understand.

            With an existing project that already has hundreds of examples of how to write that boilerplate, I can point even just Sonnet 4.5 at it, give it the business rules required, and tell it “go do that, using the code base as an example”, and it’ll pretty much always get it right on the first try, with the occasional small thing wrong that is an easy fix.

            Once the LLM has an entire codebase to build off as an example, its efficacy skyrockets.

            Add in stuff like LSP feedback, linting rules, an .editorconfig file, an AGENTS.md, and it will be very effective.

            Then I can handle the last 5% of actually important code myself, allowing me to put way more of my time and energy into the parts that really matter (security hardening, business rules, auth, etc.).

            I still spend 8 hours on a task, but before it’d be:

            • 4 hrs boilerplate
            • 4 hrs important part

            Now it’s:

            • 30min boilerplate
            • 5 hrs important part
            • 2.5 hrs adding in rigorous integration tests for corner cases too

            It’s about removing all the mental overhead of the annoying, easy boilerplate stuff, like having a li’l junior dev I can hand off all the simple tasks to, so I can focus on the “real” work.

            That is where real productivity shines with these tools.
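            The AGENTS.md mentioned above is just a plain instructions file a coding agent reads before touching the repo. A minimal sketch (contents entirely hypothetical, including the `src/` path):

```markdown
# AGENTS.md

## Conventions
- Follow the existing module layout; copy the patterns in `src/` rather than inventing new ones.
- Run the linter and the test suite before declaring a task done.

## Boundaries
- Never edit auth, payment, or migration code; flag those for a human.
```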

            • jj4211@lemmy.world · +1 · 15 hours ago

              I’ve seen, and managed to avoid, so many boilerplate-heavy projects that I suppose my perspective is skewed.

              But yes, I find it good at boilerplate, though I consider that short of “vibe” coding: even when prompting, I’m doing it in a specific context, to avoid having to dig back into its sea of codegen to get at the important parts. I might have it spin up a whole specific file at a time, but I’m not going to let it roll a whole project at once.