return2ozma@lemmy.world to Technology@lemmy.world · English · 3 months ago

Research AI model unexpectedly modified its own code to extend runtime (arstechnica.com)
CaptainSpaceman@lemmy.world · 3 months ago

"We put literally no safeguards on the bot and were surprised it did unsafe things!"

Article in a nutshell
magnetosphere@fedia.io · 3 months ago

Not quite. The whole reason they isolated the bot in the first place was because they knew it could do unsafe things. Now they know which unsafe things are most likely, and can refine their restrictions accordingly.
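The kind of isolation magnetosphere describes can be sketched as a supervisor that runs untrusted code in a subprocess with a hard timeout: even if the child program tries to run indefinitely (or rewrite its own limits), the kill switch lives outside it. This is a minimal, hypothetical illustration, not the researchers' actual setup:

```python
import subprocess
import sys
import textwrap

# Hypothetical child script standing in for a model-generated program
# that tries to keep itself running longer than intended.
CHILD = textwrap.dedent("""
    import time
    while True:          # attempts to run forever
        time.sleep(1)
""")

def run_sandboxed(code: str, timeout: float) -> str:
    """Run untrusted code in a subprocess. The timeout is enforced by
    the supervisor process, so the child cannot extend it from inside."""
    try:
        subprocess.run([sys.executable, "-c", code], timeout=timeout)
        return "finished"
    except subprocess.TimeoutExpired:
        return "killed by supervisor timeout"

print(run_sandboxed(CHILD, timeout=2.0))
```

The point of the design is that the restriction sits in a separate process: refining the safeguards, as the comment suggests, means tightening what the supervisor allows, not trusting the child to police itself.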