How ‘many-shot jailbreaking’ can be used to fool AI

The jailbreaking technique can trick AI models into answering harmful requests, such as teaching users how to build a bomb.
