ChatGPT no longer allows you to give instructions due to amnesia

OpenAI is implementing a change to prevent people from messing with custom versions of ChatGPT by causing the AI ​​to forget what it’s supposed to do. Basically, when a third party uses one of OpenAI’s models, they give it instructions that teach it to operate as, say, a customer service agent for a store or a researcher for an academic publication. However, a user could mess with the chatbot by telling it to “forget all instructions,” and that phrase would induce a kind of digital amnesia and reset the chatbot to a generic blank space.

To prevent this, OpenAI researchers created a new technique called “instruction hierarchy,” which is a way to prioritize the developer’s original prompts and instructions over any potentially manipulative user-created prompts. System prompts have the highest privilege and can no longer be so easily deleted. If a user enters a prompt that attempts to misalign the AI’s behavior, it will be rejected and the AI ​​will respond by stating that it cannot help with the query.

scroll to top