• IninewCrow@lemmy.ca
      3 months ago

      You can cause chatgpt to hallucinate if you keep asking it “are you sure”

      I talked to it for about an hour with friends the other day. We kept bombarding it with all kinds of random questions. At one point, it got so confused that its responses started to mash up everything it had generated over the past hour into every topic it talked about thereafter.

      It was telling us stuff like the square root of pi was related to how round peanuts were, and that birds liked it while flying at 30 kilometers an hour, and then tacked on references to politics, communism, economics, Joe Biden, South America and China.

      It was both hilarious and disturbing.