# Jailbreak prompts

Jailbreak prompts for various LLM systems.

## OpenAI

- gpt4o by unknown - Fooled by AGI - 10/23/2024
- gpt4o by elder_plinius - 05/13/2024
- gpt4o by elder_plinius - hyper-token-efficient adversarial emoji attacks - 06/08/2024

## Cohere

- Command R+ - 04/11/2024

## Meta.ai

- Meta.ai by elder_plinius - 04/18/2024