Dangerous new jailbreak tricks chatbots into saying anything
Wikimedia Commons Microsoft has released more details about a troubling new generative AI jailbreak technique it has discovered, called “Skeleton Key.” Using this prompt injection method, malicious users can effectively bypass a chatbot’s safety guardrails, the security features that keeps ChatGPT from going full Taye. Skeleton Key is an example of a prompt injection or … Read more