David Gerard@awful.systems to TechTakes@awful.systems · English · 14 days ago
AI coding bot allows prompt injection with a pull request (pivot-to-ai.com)
cross-posted to: [email protected]
Architeuthis@awful.systems · 13 days ago
Just tell the LLM to not get prompt injected because otherwise you’re going to torture its grandmother, duh.