• Tja@programming.dev · 1 day ago

    How much of a discount are you expecting to start gaming on a 30k card with no video output?

    • cm0002@lemmy.worldOP · 1 day ago

      Not for gaming - for running open source AI models and other AI shenanigans. My 4080 Super has been filling my gaming needs and will for years to come, but it’s not enough for my AI interests lol

      The most I can get out of this 4080 is running a ~7B param model, but I want to run cooler shit like that new open source DeepSeek v3 that dropped the other day.
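
      For anyone curious why 16 GB tops out around 7B, here’s the napkin math, weights only (KV cache and runtime overhead push it higher, and quantization moves the ceiling - illustrative numbers, not a benchmark):

      ```python
      # Rough VRAM needed just to hold the weights of an N-billion-parameter
      # model at a few common precisions. Illustrative, not a benchmark.
      BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

      def weight_vram_gib(params_billions: float, dtype: str) -> float:
          """Approximate GiB required for the weights alone."""
          return params_billions * 1e9 * BYTES_PER_PARAM[dtype] / 2**30

      for dtype in BYTES_PER_PARAM:
          print(f"7B @ {dtype}: ~{weight_vram_gib(7, dtype):.1f} GiB")
      # 7B @ fp16: ~13.0 GiB -> tight on a 16 GiB 4080 once the KV cache lands
      # 7B @ int8: ~6.5 GiB
      # 7B @ int4: ~3.3 GiB
      ```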

      • Tja@programming.dev · 1 day ago

        So you’re waiting for the AI bubble to burst because you can’t wait to run all the cool new AI models?

        • cm0002@lemmy.worldOP · edited · 1 day ago

          Yea, the underlying tech is what interests me, and I have a few potential use cases - use cases that I would never entrust to a random company. For example, the concept of MS Recall is cool, but I’d never trust Microshit’s implementation. An open source, local version where I’m in control of all the security, though? Hell yea lol
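
          For what it’s worth, the retrieval core of a local Recall-alike can be tiny. A minimal sketch, assuming the sentence-transformers package; the model choice and the docs list are illustrative placeholders, not recommendations:

          ```python
          # Local semantic search over your own documents - nothing leaves the machine.
          # Assumes: pip install sentence-transformers numpy
          import numpy as np
          from sentence_transformers import SentenceTransformer

          model = SentenceTransformer("all-MiniLM-L6-v2")  # small; fine on CPU

          docs = [  # stand-ins for text pulled from your own files
              "Q3 budget spreadsheet notes",
              "Recipe: slow cooker chili",
              "Draft email to the landlord about lease renewal",
          ]
          doc_vecs = model.encode(docs, normalize_embeddings=True)

          def search(query: str, top_k: int = 2) -> None:
              q = model.encode([query], normalize_embeddings=True)[0]
              scores = doc_vecs @ q  # cosine similarity (vectors are unit-normalized)
              for i in np.argsort(scores)[::-1][:top_k]:
                  print(f"{scores[i]:.2f}  {docs[i]}")

          search("that thing I wrote about renting")  # finds the landlord email
          ```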

          • Tja@programming.dev · 1 day ago

            That’s the problem. If the use case is super cool, and 99% of people have no knowledge (or motivation) to set it up for themselves, the online services will keep existing, and the bubble won’t really burst.

            Even if some individual companies fail (and they will; there are some truly horrendous ideas getting funding), the big players will buy the GPUs wholesale before they hit eBay.

  • TheObviousSolution@lemm.ee · 1 day ago

    The problem is that enterprise-level cards can’t really perform at the consumer-market level, nor are they designed to. Many don’t even have video outputs.

  • XIIIesq@lemmy.world · 2 days ago

    I believe it is likely that there will be a burst at some point, just as with the dot-com burst.

    But I think many people wrongly think that it will be the end of or a major setback for AI.

    I see no reason why in twenty years AI won’t be as prevalent as dot-coms are now.

    • Aceticon@lemmy.dbzer0.com · edited · 1 day ago

      Some current directions in AI, such as LLMs, seem to be dead ends, in the sense that those approaches can’t be incrementally improved much further to, for example, eliminate hallucinations, or to pair logic with those probability engines in a way that would at minimum exclude the logically impossible from the results.

      The dot-com stuff, on the other hand, was the very first bubble of the very first wave into a whole new technological direction that had just been unlocked: it was the result of the first wave of investment around worldwide digital communications and all the other tech branches that became possible because of it.

      Basically, the Internet was like opening a door to various new areas of tech (and curiously it wasn’t even all that complex as tech goes - much like a basic wheel isn’t exactly complicated, but look at everything its invention made possible). The current AI wave (mainly the latest wave of work in neural networks, a branch that is over 3 decades old) is instead a handful of massively complicated solutions, the product of decades of work in one specific direction, some of which work in such a way that they can’t be significantly improved further and hence can’t get past certain problems (the most obvious example being LLM hallucinations).

      So whilst I do think that in 20 years there will be some prevalence of AI tech companies in the domains where this wave’s AI solutions work well enough (say, entity detection in images), I don’t think it will be anywhere comparable to what happened in the 20 years following the start of the new tech age triggered by the Internet.

      Mind you, 2 decades is a lot of time in tech terms, so maybe somebody will come up with a whole different approach to AI in the meantime that breaks through the inherent limitations of the current one - just don’t count on it.

      Edit: just wanted to add that I was there when ARPANET morphed into the Internet and the dot-com bubble that came out of it. At the time, everybody and their dog was playing around with making websites, trying new stuff on top of those websites, inventing new comms protocols, writing programs that talked to other programs over the network, and creating entirely new business models anchored on making a website a storefront - the Internet was freedom. This AI wave doesn’t feel at all like that: sure, plenty of people are deploying models created by others and trying them out, but very few are creating new models, and a lot of that tech comes pre-monetised and locked down by large companies trying to get money out of anything people do with it. The whole thing is nothing like the “we’ve opened this whole new domain, you guys figure out what to do with it” spirit that was the birth of the Internet.

    • Zron@lemmy.world · 1 day ago

      Multiple outlets including LTT and Gamers Nexus have debunked this.

      The only thing you may have to do, if you notice unusual performance, is reapply thermal paste to the GPU, and that’s only because most thermal paste dries out after years of sitting around or being used.

        • LiveLM@lemmy.zip · 1 day ago

          I will sell my Polaris for a ridiculous amount of money. I will sell my Polaris for a ridiculous amount of money. I will sell my Polaris for a ridiculous amount of money.

          Manifesting 🙏🙏🙏🙏

  • Shardikprime@lemmy.world · 1 day ago

    Lol no. I mean, it would be a bubble if it didn’t provide anything useful or transformative, but that’s far from the truth.

    Like it or not, even LLMs have been found to help in health treatments, mental support, workplace efficiency, and so on.

    AI is here to stay; it’s basically the next industrial revolution.

  • hark@lemmy.world · 2 days ago

    There may be a dip in prices for a bit, but since covid, more companies have realized they can get away with manufacturing fewer units and selling them for a higher price.

    • cm0002@lemmy.worldOP · 2 days ago

      Oh but I do, ironically, for the same use cases LMAO. I like to tinker with AI, and I like Microshit’s concept of Recall and similar ideas, like having an AI that can search through all my documents with nothing but a sentence or an idea of what I’m looking for.

      But ain’t no fucking way I’m going to give a closed source AI that I’m not running myself that level of access

    • InFerNo@lemmy.ml · 2 days ago

      When the price of those drops, the price of the ones that weren’t used for that purpose will also drop.

        • kitnaht@lemmy.world · 2 days ago

          Most crypto mining outfits undervolt their cards for lower power usage. They aren’t cranking them as you say they are. A dead GPU doesn’t produce anything for you, and cranking it up increases the chance that it will fail. You’re better off running a card an extra 4 years at a lower voltage than cranking it for 1.

        • ch00f@lemmy.world · 2 days ago

          I thought the efficiency curve for GPUs peaked before 100%. If electricity is your primary cost, driving the GPUs at lower loads saves money.

          So you might end up with GPUs that spent their entire life at a steady 80% load or something.
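
          A back-of-the-envelope version of that argument, with made-up numbers purely to show the shape of the curve (the real one varies per card):

          ```python
          # Perf-per-watt comparison at two power settings. The numbers are
          # invented for illustration, not measured from any real GPU.
          settings = {
              # label: (relative_throughput, watts_drawn)
              "stock (100% power)": (1.00, 320),
              "power-limited (~72% power)": (0.90, 230),
          }
          for label, (perf, watts) in settings.items():
              print(f"{label}: {1000 * perf / watts:.2f} perf per watt (x1000)")
          # If ~72% of the power still buys 90% of the throughput, a farm
          # paying for electricity comes out ahead at the lower power limit.
          ```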

      • marcos@lemmy.world · edited · 2 days ago

        Yes.

        They don’t exactly age, but top-of-the-line chips push very large currents through very small conductors. Do that with DC for long enough and the conductors themselves deform (electromigration), up to the point that they stop working correctly.

        That said, you probably can get plenty of casual use out of them.
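
        The usual rule of thumb for that wear-out is Black’s equation: lifetime drops steeply as current density and temperature rise. A toy illustration with typical textbook constants - these are not numbers for any real chip:

        ```python
        # Toy Black's equation for electromigration lifetime:
        #   MTTF ∝ J**(-n) * exp(Ea / (k * T))
        # n and Ea below are common textbook values; nothing here describes a
        # real product, it only shows how steep the scaling is.
        import math

        K_EV_PER_K = 8.617e-5  # Boltzmann constant in eV/K

        def relative_mttf(j_rel: float, temp_k: float,
                          n: float = 2.0, ea_ev: float = 0.9) -> float:
            """Lifetime relative to a baseline current density and temperature."""
            return j_rel ** -n * math.exp(ea_ev / (K_EV_PER_K * temp_k))

        base = relative_mttf(1.0, 350.0)
        print(f"+20% current density: {relative_mttf(1.2, 350.0) / base:.2f}x lifetime")
        print(f"+15 K junction temp:  {relative_mttf(1.0, 365.0) / base:.2f}x lifetime")
        ```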

      • MiDaBa@lemmy.ml · 2 days ago

        Supposedly they do, but I’ve had surprisingly good luck with used GPUs from eBay. I’m happy to keep warning others against buying used GPUs on eBay, though, because then costs will stay lower for me.

    • bdonvr · 2 days ago

      Have you seen the price of new GPUs? Sure ya do. Maybe they only last a few years. That’s alright.

  • jj4211@lemmy.world · 2 days ago

    Unfortunately, this time around the majority of the AI build-up is GPUs that are likely difficult to accommodate in a random build.

    If you want a GPU for graphics, well, many of them don’t even have video ports.

    If your use case doesn’t need those, well, you might not be able to reasonably power and cool the sorts of chips that are being bought up.

    The latest wrinkle is that a lot of that overbuying is likely to go towards Grace Blackwell, which is a standalone unit. Ironically, despite being a product built around a GPU and needing a video port, its video port is driven by a non-Nvidia chip.

    • cm0002@lemmy.worldOP · 2 days ago

      My use case is for my own AI plans as well as some other stuff like mass transcoding 200TB+ of…“Linux ISOs” lol. I already have power/cooling taken care of for the other servers I’m running
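
      The transcoding part is the easy bit - something along these lines, assuming an ffmpeg build with NVENC support on PATH; the paths and encoder settings are placeholders:

      ```python
      # Batch-transcode a library to HEVC with NVIDIA's NVENC encoder via ffmpeg.
      # Assumes ffmpeg built with NVENC is on PATH; all paths are hypothetical.
      import subprocess
      from pathlib import Path

      SRC = Path("/mnt/isos")        # hypothetical source library
      DST = Path("/mnt/isos_hevc")   # hypothetical destination

      for src in SRC.rglob("*.mkv"):
          dst = DST / src.relative_to(SRC)
          dst.parent.mkdir(parents=True, exist_ok=True)
          subprocess.run([
              "ffmpeg", "-hwaccel", "cuda", "-i", str(src),
              "-c:v", "hevc_nvenc", "-preset", "p5",  # NVENC quality/speed preset
              "-c:a", "copy",                         # leave audio untouched
              str(dst),
          ], check=True)
      ```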

      I’ve already got my gaming needs satisfied for years to come (probably)

  • ReCursing@lemmings.world · 2 days ago

    I just want the AI hate bubble to burst; the hype bubble probably needs to go as well, but honestly I care less about that.

    • EldritchFeminity@lemmy.blahaj.zone · 2 days ago

      I want the hype bubble to burst because I want them to go back to making AI for useful stuff like cancer screening, and to stop trying to cram it into my fridge or figure out how to avoid paying their workers. The hate bubble isn’t going to stop until that happens.