Honestly, an AI firm being salty that someone has potentially taken their work, “distilled” it, and is selling it on feels hilariously hypocritical.

Not like they’ve taken the writings, pictures, edits and videos of others, “distilled” them and created something new from them.

  • HobbitFoot · 1 day ago

    How does this get used to create a better AI? Is it just that combining distillations together gets you a better AI? Is there a selection process?

    • brucethemoose@lemmy.world · 1 day ago

      Chains of distillation are mostly uncharted territory! There aren’t a lot of distillations because each one is still very expensive (at least tens of thousands of dollars, maybe millions for big models).

      Usually a distillation is used to make a smaller model out of a bigger one.

      But the idea of distilling from multiple models is to “add” the knowledge and strengths of each model together. There’s no formal selection process; it’s just whatever the researchers happen to try. You can read about another example here: https://huggingface.co/arcee-ai/SuperNova-Medius
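
      To make that concrete, here’s a minimal sketch of plain logit distillation, assuming PyTorch and Hugging Face transformers with placeholder model names (this is not Arcee’s or any specific lab’s pipeline, and it assumes the teacher and student share a tokenizer/vocabulary):

      ```python
      import torch
      import torch.nn.functional as F
      from transformers import AutoModelForCausalLM, AutoTokenizer

      # Hypothetical teacher/student pair; swap in real model IDs.
      teacher = AutoModelForCausalLM.from_pretrained("big-teacher-model").eval()
      student = AutoModelForCausalLM.from_pretrained("small-student-model")
      tokenizer = AutoTokenizer.from_pretrained("big-teacher-model")
      if tokenizer.pad_token is None:
          tokenizer.pad_token = tokenizer.eos_token

      optimizer = torch.optim.AdamW(student.parameters(), lr=1e-5)
      temperature = 2.0  # softens the teacher's output distribution

      def distill_step(batch_texts):
          inputs = tokenizer(batch_texts, return_tensors="pt", padding=True, truncation=True)
          with torch.no_grad():
              teacher_logits = teacher(**inputs).logits
          student_logits = student(**inputs).logits

          # Train the student to match the teacher's (softened) token distribution.
          loss = F.kl_div(
              F.log_softmax(student_logits / temperature, dim=-1),
              F.softmax(teacher_logits / temperature, dim=-1),
              reduction="batchmean",
          ) * temperature**2

          optimizer.zero_grad()
          loss.backward()
          optimizer.step()
          return loss.item()
      ```

      Distilling from several teachers would, in the simplest case, just average a loss like this across multiple teacher models on the same data; that’s the “adding their knowledge together” part.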