• deleted@lemmy.world · 1 year ago

    Of course, they must understand every mouse click and keystroke you make in Windows.

    Migrate to Linux.

  • SARGEx117@lemmy.world · 1 year ago

    Hey, they can have my old GPU if they give me a new blank laptop.

    I’ve always wanted to try Linux.

  • AutoTL;DR@lemmings.world (bot) · 1 year ago

    This is the best summary I could come up with:


    Demand for Microsoft’s AI services is apparently so great – or Redmond’s resources so tight – that the software giant plans to offload some of the machine-learning models used by Bing Search to Oracle’s GPU supercluster as part of a multi-year agreement announced Tuesday.

    The partnership essentially boils down to: Microsoft needs more compute resources to keep up with the alleged “explosive growth” of its AI services, and Oracle just happens to have tens of thousands of Nvidia A100s and H100 GPUs available for rent.

    Microsoft was among the first to integrate a generative AI chatbot into its search engine with the launch of Bing Chat back in February.

    You all know the drill by now: you can feed prompts, requests, or queries into Bing Chat, and it will try to look up information, write bad poetry, generate pictures and other content, and so on.

    In this case, Microsoft is using the system alongside its Azure Kubernetes Service to orchestrate Oracle’s GPU nodes to keep up with what’s said to be demand for Bing’s AI features.

    Oracle claims its cloud super-clusters, which presumably Bing will use, can each scale to 32,768 Nvidia A100s or 16,384 H100 GPUs using an ultra-low-latency Remote Direct Memory Access (RDMA) network.


    The original article contains 580 words, the summary contains 207 words. Saved 64%. I’m a bot and I’m open source!
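
    For a rough idea of what orchestrating rented GPU nodes with Azure Kubernetes Service could look like at the workload level, here is a minimal, hypothetical sketch using the Kubernetes Python client. The pod name, container image, namespace, and GPU count are made-up placeholders, not anything Microsoft or Oracle has published.

    ```python
    # Hypothetical sketch only: requesting GPUs for one inference pod via the
    # Kubernetes Python client. Names, image, and GPU count are placeholders.
    from kubernetes import client, config

    config.load_kube_config()  # assumes a kubeconfig pointing at the cluster

    container = client.V1Container(
        name="inference-worker",                            # placeholder name
        image="registry.example.com/llm-inference:latest",  # placeholder image
        resources=client.V1ResourceRequirements(
            # GPUs are exposed as an extended resource by the NVIDIA device plugin
            limits={"nvidia.com/gpu": "8"},
        ),
    )

    pod = client.V1Pod(
        api_version="v1",
        kind="Pod",
        metadata=client.V1ObjectMeta(name="inference-worker-0"),
        spec=client.V1PodSpec(containers=[container], restart_policy="Never"),
    )

    client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
    ```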

  • Fedizen@lemmy.world · 1 year ago

    I heard rumors that Azure ran on Oracle; this is probably why. Microsoft’s pursuit of advanced chatbot technology will surely be a loser in the long run.

  • Damage@slrpnk.net · 1 year ago

    So when will CPUs integrate the hardware necessary to compete with GPUs on these tasks? This situation is ridiculous: the device designed for this work can’t keep up with a device designed for something else entirely.

    • jmcs@discuss.tchncs.de · 1 year ago

      You are looking at it wrong by taking the names too literally. GPUs are simply processing units optimized for parallel computation, and CPUs are processing units optimized for general-purpose sequential computation. Those optimizations require architectural trade-offs, so to be efficient at both kinds of work you need both a CPU and a GPU.

      So think of it this way: a CPU is really a General-purpose Sequential Processing Unit and a GPU is a Parallel Processing Unit, but renaming them would only add to the confusion.
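
      As a toy illustration of that trade-off (nothing to do with Bing or Oracle specifically), here is the same element-wise addition written both ways in Python: once as a sequential loop, once as a single vectorized NumPy call that hands the whole batch to optimized native code. The array size is arbitrary.

      ```python
      # Toy sketch: the same element-wise addition, sequential vs. data-parallel.
      # The array size is arbitrary; timings are only illustrative.
      import time
      import numpy as np

      n = 10_000_000
      a = np.random.rand(n)
      b = np.random.rand(n)

      # "Sequential" style: one element at a time, in order.
      start = time.perf_counter()
      out = np.empty(n)
      for i in range(n):
          out[i] = a[i] + b[i]
      print(f"sequential loop: {time.perf_counter() - start:.2f}s")

      # "Parallel" style: one operation applied to every element at once;
      # NumPy dispatches it to vectorized native code, and a GPU would fan
      # the same pattern out across thousands of cores.
      start = time.perf_counter()
      out = a + b
      print(f"vectorized add:  {time.perf_counter() - start:.4f}s")
      ```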

    • Deebster@programming.dev · 1 year ago

      GPUs are a lot closer to AI processors (tensor cores and the like) than CPUs are. Graphics processing is about doing lots of simple computations simultaneously, which is exactly what AI workloads need: lots and lots of matrix maths. CPUs are more general-purpose but can’t compete on raw speed because of that generality (and some of the hacks used to squeeze out more speed have caused security problems).
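
      To make the “lots and lots of matrix maths” point concrete, here is a minimal sketch of a single dense neural-network layer; the shapes are arbitrary examples, but the whole layer boils down to one big matrix multiply, which is exactly the kind of uniform, highly parallel work GPUs and tensor cores are built for.

      ```python
      # Minimal sketch: one dense neural-network layer is just a matrix multiply
      # plus a bias and a nonlinearity. The shapes below are arbitrary examples.
      import numpy as np

      batch, d_in, d_out = 32, 1024, 4096

      x = np.random.randn(batch, d_in)   # a batch of input activations
      W = np.random.randn(d_in, d_out)   # the layer's weight matrix
      b = np.zeros(d_out)                # the layer's bias vector

      # The expensive part: (32 x 1024) @ (1024 x 4096) is roughly 134 million
      # multiply-adds, all independent of one another, hence easy to parallelize.
      y = np.maximum(x @ W + b, 0)       # ReLU(x @ W + b)
      print(y.shape)                     # (32, 4096)
      ```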