Instantly improve your prompt with this add-on
🧠 What does it do?
Add this to any prompt you’re working on to improve its output. The addition forces the LLM of your choice to slow down instead of jumping straight into answer mode. This helps the model structure its “thoughts”, which noticeably improves its output.
⚠️ Disclaimer
This works well for everyday prompts you write yourself, and for prompts designed to answer specific questions directly. It suits brainstorming as well as more complex challenges. It is especially powerful when you ask the model for an “opinion”, because it builds some critical reflection into the model’s answer.
This will not work for all prompts. If your prompt already guides the model through predefined steps, this add-on will not have any additional benefit.
✏️ Prompt
[Insert your prompt]

Before we continue, follow these steps:
- Analyze what I asked you. What do I really need? What do you read between the lines?
- Now process your analysis. Really take some time to think about it.
- Then propose a plan. What do you think will fully complete my request? If needed, feel free to ask additional questions.
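If you use this add-on often, you can automate the concatenation instead of pasting it by hand. Below is a minimal Python sketch; the function name `add_reflection_steps` and the example prompt are illustrative choices, not part of the original add-on:

```python
# The add-on text, appended verbatim after the user's own prompt.
REFLECTION_ADDON = """Before we continue, follow these steps:
- Analyze what I asked you. What do I really need? What do you read between the lines?
- Now process your analysis. Really take some time to think about it.
- Then propose a plan. What do you think will fully complete my request? If needed, feel free to ask additional questions."""

def add_reflection_steps(prompt: str) -> str:
    """Return the prompt with the reflection add-on appended right after it."""
    return f"{prompt.rstrip()}\n\n{REFLECTION_ADDON}"

full_prompt = add_reflection_steps("Suggest three names for a budgeting app.")
print(full_prompt)
```

The add-on always comes after the actual prompt, matching the tip below about placement.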
💡 Example
Example coming soon.
✨ Tips
- Make sure this part is always added right after your actual prompt.
- Tweak the wording slightly to fit your context; this usually improves results.
- Even though this add-on builds some critical reflection into the model’s answer, remain critical of the output yourself.
- If you feel like the output does not match your expectations, try to understand what the model got wrong and edit the prompt to mitigate this.
👤 Credit
This prompt was created by Jonathan Flores.