Another day, another preprint paper shocked that it’s trivial to make a chatbot spew out undesirable and horrible content. [arXiv] How do you break LLM security with “prompt injection”?…
Only the word “theoretical” is outdated. The Beeping Busy Beaver problem is hard even with a Halting oracle, and we have a corresponding Beeping Busy Beaver Game.
Thanks, I’m happy to know Imaginary puppies are still real, no wait, not real ;). (The BBB is cool, wasn’t aware of it, I don’t keep up sadly. “Thus BBB is even more uncomputable than BB.” always like that kind of stuff, like the different classes of infinity).
Only the word “theoretical” is outdated. The Beeping Busy Beaver problem is hard even with a Halting oracle, and we have a corresponding Beeping Busy Beaver Game.
Love this
Thanks, I’m happy to know Imaginary puppies are still real, no wait, not real ;). (The BBB is cool, wasn’t aware of it, I don’t keep up sadly. “Thus BBB is even more uncomputable than BB.” always like that kind of stuff, like the different classes of infinity).