• KnilAdlez [none/use name]@hexbear.net · ↑19 · 18 hours ago

    It’s strange to me to call out the climate costs, especially since this would have far less climate impact than having to call out to Google’s servers to use the full-sized models. LLMs aren’t magically worse for the environment; it’s the hardware they run on for the full-sized models that is incredibly power-hungry. A 4 GB LLM would probably use less power than a modern video game on a person’s computer, and run for less time.
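A rough back-of-envelope sketch of that comparison (every number below is an illustrative assumption, not a measurement):

```python
# Back-of-envelope: energy of occasional local LLM inference vs. a
# gaming session on the same consumer GPU. All figures are assumed.

LLM_DRAW_W = 150        # assumed extra GPU draw while a ~4 GB model generates
SECONDS_PER_PROMPT = 5  # assumed generation time per prompt
PROMPTS_PER_DAY = 100   # assumed heavy local use

GAME_DRAW_W = 300       # assumed sustained GPU draw in a modern game
GAME_HOURS = 2          # assumed session length

llm_wh = LLM_DRAW_W * SECONDS_PER_PROMPT * PROMPTS_PER_DAY / 3600
game_wh = GAME_DRAW_W * GAME_HOURS

print(f"local LLM: {llm_wh:.1f} Wh/day")   # ~21 Wh/day
print(f"gaming:    {game_wh:.1f} Wh/day")  # 600 Wh/day
```

Under these assumptions even heavy local use comes out more than an order of magnitude below a couple of hours of gaming, though the real numbers depend entirely on the hardware and usage pattern.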


    • AstroStelar [he/him]@hexbear.net · ↑7 · edited · 11 hours ago

      I was thinking the same thing. The article claims 30,000 to 60,000 tons of CO2e emissions from sending 4 GB of data to hundreds of millions of phones. For reference, estimates for the US AI industry’s emissions are between 30 and 80 million tons per year, and global total emissions are 80 billion. How often this gets repeated for new versions is unclear.
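For what it’s worth, the article’s range is at least self-consistent with commonly cited (and heavily contested) per-GB figures. A sanity check, where the transfer-energy estimate, phone count, and grid intensity are all assumptions:

```python
# Sanity check of the article's 30,000-60,000 t CO2e claim.
# The kWh-per-GB figure is a commonly cited but debated estimate;
# the phone count and grid carbon intensity are assumptions.

GB_PER_PHONE = 4
PHONES = 300e6            # "hundreds of millions"
KWH_PER_GB = 0.1          # assumed network + device transfer energy
KG_CO2E_PER_KWH = 0.4     # assumed average grid carbon intensity

kwh = GB_PER_PHONE * PHONES * KWH_PER_GB
tonnes = kwh * KG_CO2E_PER_KWH / 1000

print(f"{tonnes:,.0f} t CO2e")  # 48,000 t, inside the claimed range
```

That still rounds to roughly a thousandth of the US AI industry’s annual emissions under the same rough math.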

      As for inference, Chrome won’t even use it for its biggest use case: everything done via the search bar and “AI mode” is still sent to high-parameter models on Google’s servers, likely because user data is Alphabet’s cash cow. The local model is only used in very niche cases:

      the features that do use the local model (Help-Me-Write in <textarea>, tab-group AI suggestions, smart paste, page summary) are buried in textarea-context menus and tab-group right-click menus that the average user will discover, on average, never.

    • chgxvjh [he/him, comrade/them]@hexbear.net · ↑10 · 13 hours ago

      LLMs aren’t magically worse for the environment; it’s the hardware they run on for the full-sized models that is incredibly power-hungry.

      Not magically, just technically. Within a couple of years we’ve made computers use significantly more energy for no good reason. This shit being put into everything is incredibly unpopular.

      • KnilAdlez [none/use name]@hexbear.net · ↑5 · 12 hours ago

        I’m not sure I understand you; it sounds like you think running a small LLM locally on your computer will suddenly make it use something like 10x more power. That’s not how it works. It’s the servers used to run the full-sized models that use that much power, as each one has tens of thousands of processors running at once. And local LLMs do have uses, especially for accessibility. I use a local LLM for my Home Assistant instance so I can use voice commands, which is very helpful as a disabled person.


            • NewOldGuard@lemmy.ml · ↑2 · 4 hours ago

              I think their point is that regular web browsing will use less power than web browsing with local LLM calls. Your PC running an LLM is likely gonna hit its TDP limits, while browsing will use a fraction of that. Yes, it’s less power than used by a trillion-parameter model, but I think their point is that it’s vastly more than your standard non-LLM browsing would be.

              • KnilAdlez [none/use name]@hexbear.net · ↑1 · 4 hours ago

                Your PC running an LLM is likely gonna hit its TDP limits

                Debatable for a 4 GB model, depending on the hardware. It’s also (most likely) not running constantly, so while it will use more power than not having it, whether that’s a significant change in the long run depends on many factors.
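One way to frame “depends on many factors” is duty cycle: short bursts at higher draw against hours of baseline browsing. A sketch where every figure is an assumption for illustration:

```python
# Duty-cycle sketch: intermittent local-model bursts vs. baseline
# browsing power. Every figure here is an illustrative assumption.

EXTRA_DRAW_W = 60     # assumed extra draw during a local-model burst
BURST_SECONDS = 5     # assumed length of one inference burst
BURSTS_PER_DAY = 50   # e.g. summaries, smart paste, tab suggestions

BROWSE_DRAW_W = 30    # assumed average draw while browsing
BROWSE_HOURS = 3      # assumed daily browsing time

llm_wh = EXTRA_DRAW_W * BURST_SECONDS * BURSTS_PER_DAY / 3600
browse_wh = BROWSE_DRAW_W * BROWSE_HOURS

print(f"LLM bursts: {llm_wh:.1f} Wh/day, browsing: {browse_wh:.0f} Wh/day")
```

With these assumptions the bursts add a few percent on top of the browsing baseline; with constant background inference or weaker hardware running at its limit, the picture flips, which is exactly the “many factors” point.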


    • GaveUp [she/her]@hexbear.net · ↑4 · 11 hours ago

      It’s an easy angle to win over the unaware general public by hitting points like climate and water usage.

      Dishonest, and a little silly to people who understand the math, but it is an angle.

  • blobjim [he/him]@hexbear.net · ↑11 ↓3 · 20 hours ago

    It’s a really weird article. They’re trying to say that some sleazy thing Anthropic did is the same as Google Chrome having an on-device machine learning model? And they make it seem extra nefarious that the file gets re-downloaded when you delete it, which seems like a completely normal self-healing mechanism. And finally they mention that you can just turn it off in the AI section of the Chrome settings.

    They could have just written an article about how Chrome is now 4 GB larger if you have AI stuff turned on.

    Firefox has the exact same thing, and uses it for machine translation.

      • spectre [he/him]@hexbear.net · ↑4 · 5 hours ago

        Why are you starting a comment with “fuck you”?

        Are you able to make a point without being a complete asshole about it? Is that how you want to be spoken to online? Please never talk to me that way, to be clear.

        • supdawg813 [comrade/them]@hexbear.net · ↑1 · 48 minutes ago

          Please never talk to me that way, to be clear.

          To be clear, they quite literally weren’t, lol. Why are you tone-policing a reply to someone else’s comment on another person’s post? How incredibly weird and self-righteous…

          • spectre [he/him]@hexbear.net · ↑1 · 19 minutes ago

            It’s not weird at all. The tone of that comment is shit and uncalled for; this sort of hostility has decayed the community [possibly] past the point of no return.

            This isn’t some sort of unchecked fascism; we are talking about AI features in Firefox. I think it is poor behavior to lead into that conversation with an insult, and I’m going to call it out.

            I have been a part of this community for almost a decade, and I’m invested in its conduct. I want to be a part of a community that is above le Reddit and X the everything app^TM; if you aren’t interested in elevating the level of discussion, I guess I wonder why you wouldn’t just be over there anyway.

    • filt · ↑22 · 18 hours ago

      Re-read the article. It’s not remotely like Firefox.

      The point is that Chrome, without any authorization from or notification to the user, is going to use 4 GB of bandwidth as well as disk space. That is dishonest and a bad practice.

      You can’t turn it off in settings; you have to set flags to turn it off. That keeps normal everyday users from knowing how or why this is happening.

      Your comments are disingenuous, and it seems like you’re just here to be a Google and “AI” apologist.

      This behavior by Google is gross, and any company or person that thinks it’s acceptable is equally gross. The author thoughtfully cites pieces of the GDPR and other acceptable-computing policies and laws that this behavior contravenes. They also discuss the environmental impacts of this behavior.