1.8 Meta Prompting

Reusable prompts

You want a prompt you can use again and again across projects, not just once.

Quality matters

The task is important enough to invest in a well-crafted prompt: a grant section, a data analysis plan, a systematic review query.


The AI has seen millions of examples of effective instructions. When you ask it to improve a prompt, it applies patterns you might not think of: adding specificity, clarifying the role, defining the output format, removing ambiguity.

Over time, observing how the AI rewrites your prompts builds your own prompting intuition.


There are three practical ways to use meta prompting, from simple to thorough:

  1. Direct improvement - Paste your draft prompt and ask the AI to make it better. Quick and easy.

  2. Critique, then rewrite - Ask the AI to first analyse what is weak about your prompt, then produce an improved version. The critique step forces it to reason about quality before rewriting.

  3. Guided questioning - Give the AI your rough idea and ask it to ask you clarifying questions before writing the prompt. This is the most powerful approach because it surfaces assumptions and gaps you did not realise existed.


Direct improvement is the simplest form. Paste any draft prompt and let the AI refine it:

Prompt: Improve my prompt
You are an expert prompt engineer. Improve the following prompt
for clarity, specificity, and effectiveness. Rewrite it using
prompt engineering best practices.
Original prompt:
[PASTE YOUR PROMPT HERE]
Provide:
1. The rewritten prompt (clean, ready to copy-paste)
2. A brief explanation of what you changed and why
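The critique-then-rewrite approach works with a similar template. A possible version (our sketch, not taken from the source materials):

Prompt: Critique, then improve my prompt
You are an expert prompt engineer. First, analyse the weaknesses
of the following prompt: ambiguity, missing context, unclear role,
undefined output format, or anything else that reduces its
effectiveness.
Original prompt:
[PASTE YOUR PROMPT HERE]
Then provide:
1. Your critique as a short bulleted list
2. A rewritten prompt that addresses each weakness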

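If you reuse the same meta prompt often, you can wrap it in a small helper so the scaffold stays consistent across projects. A minimal sketch in Python (the function name `build_meta_prompt` is ours, not from the source materials):

```python
def build_meta_prompt(draft_prompt: str) -> str:
    """Wrap a draft prompt in the direct-improvement template."""
    return (
        "You are an expert prompt engineer. Improve the following prompt "
        "for clarity, specificity, and effectiveness. Rewrite it using "
        "prompt engineering best practices.\n\n"
        f"Original prompt:\n{draft_prompt}\n\n"
        "Provide:\n"
        "1. The rewritten prompt (clean, ready to copy-paste)\n"
        "2. A brief explanation of what you changed and why"
    )

# The returned string can be pasted into a chat interface or sent
# as the user message in an API call.
print(build_meta_prompt("Summarise this paper for a general audience."))
```

The same pattern extends to the critique-then-rewrite variant: keep each template as a function and swap in whichever scaffold the task warrants.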
Example: from rough idea to research prompt

Here is a real example of how guided questioning transforms a vague idea into a specific, effective prompt.

Your rough input:

I want to compare my field data with satellite imagery for
my bird migration study.

What you ask the AI:

Prompt: Help me write a better prompt
I want to write a prompt that helps me compare my field
observation data with satellite-derived NDVI data for a study on
bird migration timing in Dutch wetlands.
Ask me clarifying questions before writing the prompt.

After answering the AI’s questions, it might produce:

You are an ecological data analyst with expertise in remote sensing
and avian ecology. I have two datasets:
1. Field observations of migratory bird arrival dates at three
NIOO wetland monitoring sites (2019-2024), recorded as
species, date, and site ID.
2. MODIS-derived NDVI time series for the same sites and years,
at 16-day intervals.
Compare the phenological trends: is there a significant
correlation between spring green-up (NDVI inflection point) and
first-arrival dates for early migrants (e.g., barn swallow,
common cuckoo)?
Present your analysis plan as numbered steps. For each step,
specify the statistical method and any R packages you recommend.
Flag assumptions I should verify before running the analysis.

Meta prompting adds an extra step. It is overkill for:

  • Simple questions - “What is the boiling point of water?” does not need a refined prompt.
  • One-off tasks - If you will never reuse the prompt, just write it directly.
  • Time pressure - When you need an answer in seconds, skip the meta step and prompt directly.

Save meta prompting for tasks where quality and reusability justify the investment: grant proposals, data analysis workflows, systematic review queries, or any prompt you plan to share with colleagues.



Based on materials from Prompt Engineering Guide, OpenAI Cookbook, Anthropic Prompt Generator, and Suzgun & Kalai (2024).

Have a meta prompting workflow that works well for your research? Share it with RSO so we can add it to the library.