misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
93 comments
Syrus@lemmy.world · English · 1 year ago
You would need to know the recipe to avoid making it by accident.

Echo Dot@feddit.uk · English · 1 year ago
Especially considering it’s actually quite easy to make by accident.