Technology

How 'many-shot jailbreaking' can be used to fool AI

April 3, 2024

The jailbreaking technique can fool AI into teaching users how to build a bomb.