• Domi@lemmy.secnd.me
    1 day ago

    Hosting a model of that size requires ~800GB of VRAM. Even if they release their models, it wouldn’t make them obsolete since most people and many companies couldn’t host it either way.
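For scale, the ~800GB figure is consistent with the weights alone of a model in the ~400B-parameter range at 16-bit precision. A rough back-of-envelope sketch (the exact parameter count is an assumption, not stated in the thread):

```python
def weights_vram_gb(params_billion, bytes_per_param=2):
    """Minimum VRAM just to hold the weights.

    Assumes FP16/BF16 (2 bytes per parameter); KV cache and
    activations need additional memory on top of this.
    """
    return params_billion * bytes_per_param

# A hypothetical ~400B-parameter model in 16-bit precision:
print(weights_vram_gb(400))  # 800 (GB), before any inference overhead
```

Quantizing to 8-bit or 4-bit halves or quarters that number, but it still lands far beyond consumer hardware.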

    • rcbrk@lemmy.ml
      17 hours ago

      Anyone can now provide that service. Why pay OpenAI when you can pay a different provider that is cheaper, or one better aligned with your needs, ethics, or legal requirements?

      • Domi@lemmy.secnd.me
        7 hours ago

        Anyone who has $300,000 per instance, the know-how to set it up, the means to support it, and can outbid OpenAI, yes.

        I don’t see that happening on a large scale, just like I don’t see tons of DeepSeek instances being hosted cheaper than the original any time soon.

        If they really are afraid of that, they can always license it in a way that forbids reselling.