Tactics include "tricking" the AI into believing it is in a "development mode" or engaging in roleplay.
Article Link: ChatGPT jailbreak prompts proliferate on hacker forums | SC Media