Prompt hardening is the process of adding additional instructions to your prompt in order to make it less prone to attacks. Enter your 'base prompt' below (you can keep in anything like %s or {}) and click 'Protect'. This will generate a prompt with extra protections against attackers that you can use in your application.

One of the defences used to protect your prompt is called 'XML Tagging' - where a user's input is wrapped in XML tags to help your LLM identify what input is coming from an external user. To make this even more secure, you can randomise the XML tag - when you do this, it makes your app less vulnerable to an attack called 'XML Escaping'.