The 1-3-5 method for research prompts
A model for conducting better research with AI by factoring in what we already know about a topic.
Have you ever tried asking AI chat questions like:
“Can you teach me about [topic]?”
“Please provide a detailed explanation of [topic].”
“I need an overview of [topic].”
If so, good. But you might be selling yourself short. Researching with AI tools has enormous potential, but their responses can be about as helpful as a hyper-caffeinated intern if we don’t frame our questions the right way.
What is the right way? For starters, research works best when it takes into account what we already know. When we already have some knowledge, for example, overly broad research questions produce responses bloated with rehashed information.
Key Takeaways
The 1-3-5 method helps you tailor research prompts to match your knowledge level
It breaks prompts into layers—main topic, sub-topics, and key details—for more structured knowledge collection
Starting simple often leads to the best insights, even for experienced researchers
Why your research prompts aren’t working
The symptoms of bad research prompts include:
Vague generalities: The AI gives you surface-level responses (because it wasn’t guided to dig deeper).
Information overload: You get flooded with information unimportant to you.
Contradictory results: Without clear context, the AI might collapse multiple angles and give you conflicting ideas.
Unusable insights: The response may be accurate but utterly impractical because it goes into detail beyond your knowledge level.
The root of the issue is this: we often approach research like a one-size-fits-all exercise. But vague prompts beget vague results. Imagine asking a librarian for "a book on building houses." You’ll get suggestions, some of them irrelevant simply because they’re a mismatch for your knowledge level. The remedy is a more structured question. AI chat is a lot like a librarian in that way.
Which is why I developed the 1-3-5 method to improve the effectiveness of my research prompts and save me from all manner of fluffy, scattershot results.
What is the 1-3-5 method?
The 1-3-5 method helps you structure research prompts based on how much you already know about a topic, striking a balance between breadth and depth. The numbers in the name are dual-purpose, in that they represent two things:
A scale to self-assess how much knowledge you have on a topic
The level of detail your prompt requires (given your knowledge level)
Step 1: Self-assess how much knowledge you have on a topic
We’ll answer the question “how much do I know?” on a scale of 1 (least knowledge), 3, or 5 (most). Let’s pick a topic to illustrate how to answer it. In this case, let’s talk about “clean energy”.
1: I have almost no knowledge
You’ve heard of the term or topic, but that’s about it.
Example: You know clean energy can mean “good for the environment,” but you’re unclear on which technologies count or how they work.
3: I have a basic understanding
You have some breadth but lack depth. You’re unclear on some of the what and a lot of the how.
Example: You know the basics of clean energy. You understand that solar, wind, and hydropower are key sources, and you’ve heard of things like carbon neutrality.
5: I have advanced knowledge (and I’m filling gaps)
You have a strong grasp of the concepts and can dive deep into at least a few. You may even follow the industry or its trends.
Example: You’re familiar with technologies like geothermal energy or green hydrogen and understand debates around efficiency and scalability.
Step 2: Construct a research prompt based on your existing knowledge
Whether we answer 1, 3, or 5 for our knowledge level dictates the level of detail of our prompt. As a general (and perhaps counterintuitive) rule, the less you know about a topic, the shorter your research prompt should be. Conversely, a more detailed prompt proactively guards against responses full of information we already know.
But what do we mean by “level of detail”? It comes down to three subsets of information for our prompt: one main topic, three sub-topics, and five deeper details or limitations. Which of these we include depends on our answer to the self-assessment:
1: Ask for a broad, high-level view. Your prompt only needs the main topic.
3: Include the main topic and three sub-topics to go for more depth.
5: Along with the main topic and sub-topics, add up to five extra details you want covered (or limitations you don’t want covered).
And here’s the outline for a research prompt where you can add a topic, sub-topics, and a set of limitations. The latter two (sub-topics and limitations) are optional, depending on your knowledge level. Keep reading for a more practical example around our topic of clean energy.
Please tell me about: {TOPIC}.
TOPIC = "[add the topic you want to learn about here]"
In particular, please tell me about:
- [Sub-topic 1: ...]
- [Sub-topic 2: ...]
- [Sub-topic 3: ...]
And in your research, please don't cover the following:
- [Detail 1: ...]
- [Detail 2: ...]
- [Detail 3: ...]
- [Detail 4: ...]
- [Detail 5: ...]
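If you find yourself reusing this template often, it’s easy to generate programmatically. Here’s a minimal sketch in Python (the `build_prompt` helper and its parameter names are my own invention, not part of any library — adapt freely):

```python
def build_prompt(topic, sub_topics=None, limitations=None):
    """Assemble a 1-3-5 research prompt.

    Knowledge level 1: pass only `topic`.
    Knowledge level 3: also pass up to three `sub_topics`.
    Knowledge level 5: also pass two to five `limitations` to exclude.
    """
    lines = [f'Please tell me about: "{topic}".']
    if sub_topics:
        lines.append("In particular, please tell me about:")
        lines += [f"- {s}" for s in sub_topics[:3]]
    if limitations:
        lines.append("And in your research, please don't cover the following:")
        lines += [f"- {d}" for d in limitations[:5]]
    return "\n".join(lines)


# A level-1 prompt is just the topic:
print(build_prompt("clean energy"))
```

Omitting `sub_topics` and `limitations` reproduces the level-1 prompt exactly; passing them layers in the 3 and 5 portions of the template.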
Walkthrough: Using the 1-3-5 method
So, we’re learning about clean energy. Let’s walk through what our research prompts would look like. They’re all about clean energy, but each reflects a different level of incoming knowledge.
If I answered ‘1’ for my knowledge level (very little), I’d only include the main topic.
Please tell me about: {TOPIC}.
TOPIC = "clean energy"
If I answered ‘3’ for my knowledge level (some), I’d also include three sub-topics: three more specific areas I want my research to go into.
Please tell me about: {TOPIC}.
TOPIC = "clean energy"
In particular, please tell me about:
- The most effective sources of clean energy
- How the clean energy market has evolved in the last two decades
- What research shows us about the environmental impacts of clean energy over fossil fuels
And if I answered ‘5’ for my knowledge level (knowledgeable), I’d also include up to five details or limitations: much deeper components, such as key facts, terms, or things to exclude. As a caveat, anywhere between two and five key details works.
Please tell me about: {TOPIC}.
TOPIC = "clean energy"
In particular, please tell me about:
- The most effective sources of clean energy
- How the clean energy market has evolved in the last two decades
- What research shows us about the environmental impacts of clean energy over fossil fuels
And in your research, please don't cover the following:
- Anything relating to politics around energy
- Trends in clean energy consumption
- How clean energy is produced
More practical examples
Here are other variations of research prompts you might already be asking:
“Help me understand [topic] better.”
“Give me an introduction to [topic].”
“What are the basics of [topic]?”
“Can you summarize key points about [topic]?”
“Provide a beginner’s guide to [topic].”
“What are the most important aspects of [topic]?”
“Explain [topic] step-by-step.”
Because, you know, it’s helpful to recognize what you’re looking for. And my hunch is you’ve asked some of these questions already.
The value of pretending you know less
Sometimes, starting small is valuable even when we have deeper knowledge. We grasp new concepts best when they connect to what we already know. Jumping straight into complex sub-topics leaves us prone to the problem of “what we don’t know we don’t know”. Tread carefully; that’s the kind of mud where wheels spin.
As an example, say you’re deeply familiar with a topic like marketing funnels. Beginning with a broad ‘1-level’ question still creates a scaffold for learning exploration. Not only that, but broad questions and broad responses prime our brains to spot patterns. In other words, going broad (even as experts) lets us make connections we might otherwise miss.
So if you're tired of AI responses that feel like they were written by someone copying and pasting Wikipedia at 3 a.m., give the 1-3-5 method a shot.