Dangerous new jailbreak tricks chatbots into saying anything

Microsoft has released more details about a troubling new generative AI jailbreak technique it has discovered, called “Skeleton Key.” Using this prompt injection method, malicious users can effectively bypass a chatbot’s safety guardrails, the security features that keep ChatGPT from going full Tay. Skeleton Key is an example of a prompt injection or … Read more

Jailbreak: Nigeria to relocate many prisons

The Minister of Interior, Olubunmi Tunji-Ojo, said on Thursday that the government would relocate many correctional centres to create more space, better security and improved infrastructure. Mr Tunji-Ojo made the pledge when he visited Suleja Medium Security Custodial Centre, where 119 inmates escaped after a rainstorm damaged the facility on Wednesday. This was contained in … Read more