Structure your user research study from start to finish so you recruit the right participants, capture the right insights, and walk away with clear, actionable findings.
Template components
Research Goals & Questions
Write the primary objective of your study and the specific questions you need answered to make a product decision.
Example: Goal: Understand why new users abandon the onboarding flow before completing their first project. Key questions: What steps feel confusing or unnecessary? What do users expect to happen that doesn't? What would make them more confident to continue?
Methodology & Study Design
Describe the research method you will use, the format of the sessions, and how you will capture data.
Example: Method: Moderated usability testing via Zoom. Format: 45-minute sessions with think-aloud protocol. Data capture: Session recordings, observer notes, and a post-task SUS questionnaire. Sessions will be conducted over two weeks with a minimum of 5 participants per segment.
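If you collect a post-task SUS questionnaire as in the example above, the scoring is standardized and easy to automate. A minimal sketch of the standard SUS formula (ten 1 to 5 Likert items; odd-numbered items contribute the response minus 1, even-numbered items contribute 5 minus the response; the sum is multiplied by 2.5 for a 0 to 100 score):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Standard SUS scoring: odd-numbered items contribute (response - 1),
    even-numbered items contribute (5 - response); the sum is scaled by
    2.5 to give a 0-100 score.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1, 3, 5, ... are positively worded
        for i, r in enumerate(responses)
    )
    return total * 2.5

# A participant answering 4 on every positive item and 2 on every negative item:
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

Scoring each session as it completes lets you spot an unusually low score while the participant's session is still fresh for the observers.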
Participant Criteria & Recruitment
Define who qualifies for the study, how many participants you need, and how you will recruit and screen them.
Example: Target: SaaS product managers at companies with 50–500 employees who signed up in the last 90 days but have not created a project. Sample size: 10 participants (5 from SMB segment, 5 from mid-market). Recruitment: In-app intercept survey + UserTesting panel. Screener will exclude anyone in a technical role or with prior product research experience.
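Criteria like the example above can be expressed as a pass/fail filter when screening responses from an in-app survey export. A minimal sketch; every field name here is hypothetical and would need to match your actual export:

```python
# Illustrative screener filter for the example criteria. All field names
# (role, company_size, signup_days_ago, etc.) are hypothetical.
def qualifies(respondent: dict) -> bool:
    return (
        respondent["role"] == "product_manager"          # target role
        and 50 <= respondent["company_size"] <= 500      # company size band
        and respondent["signup_days_ago"] <= 90          # signed up in last 90 days
        and respondent["projects_created"] == 0          # has not created a project
        and not respondent["technical_role"]             # exclusion: technical roles
        and not respondent["prior_research_experience"]  # exclusion: research-savvy
    )

print(qualifies({
    "role": "product_manager", "company_size": 120, "signup_days_ago": 30,
    "projects_created": 0, "technical_role": False,
    "prior_research_experience": False,
}))  # True
```

Writing the criteria as code forces you to make every threshold explicit, which is exactly the discipline a good screener needs.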
Success Criteria & Deliverables
State how you will know the research was successful and what outputs you will deliver to stakeholders when it is complete.
Example: Success criteria: Identify at least 3 distinct friction points supported by observations from 60% or more of participants. Deliverables: Affinity map of themes, top 5 findings with supporting quotes, prioritized list of design recommendations, and a 10-slide readout deck for the product team — due within one week of final session.
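A threshold like "observations from 60% or more of participants" is easy to check mechanically once session notes are tagged. A minimal sketch, assuming observations have already been coded with friction-point labels (the labels and participant ids below are illustrative):

```python
from collections import Counter

def supported_findings(observations, n_participants, threshold=0.6):
    """Return friction points observed by at least `threshold` of participants.

    `observations` maps participant id -> set of friction-point labels.
    """
    counts = Counter(label for labels in observations.values() for label in set(labels))
    return sorted(
        label for label, n in counts.items() if n / n_participants >= threshold
    )

# Hypothetical coded observations from five sessions:
obs = {
    "p1": {"unclear_workspace", "no_save_feedback"},
    "p2": {"unclear_workspace", "too_many_options"},
    "p3": {"unclear_workspace", "no_save_feedback", "too_many_options"},
    "p4": {"no_save_feedback"},
    "p5": {"unclear_workspace", "too_many_options"},
}
print(supported_findings(obs, n_participants=5))
```

Here all three friction points clear the 60% bar, so the example success criterion (at least 3 well-supported findings) would be met.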
Full Copyable Template
<div class="tmpl-full-doc">
<div class="tmpl-full-header">
<div class="tmpl-full-title">Research Plan Template</div>
<div class="tmpl-full-meta">Prepared by: [Researcher Name] | Team: [Product / Design / Research] | Study type: [Moderated interviews / Usability test / Mixed-method] | Version: [v1.0] | Date: [Month DD, YYYY]</div>
</div>
<hr class="tmpl-full-divider">
<div class="tmpl-full-section">
<div class="tmpl-full-section-title">1. Research Goal & Hypothesis</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Research objective</div>
<div class="tmpl-full-field-value">Understand why new trial users of a B2B SaaS product fail to complete setup and do not convert to paid within the first 14 days.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Primary research question</div>
<div class="tmpl-full-field-value">What moments of confusion, friction, or unmet expectations cause trial users to abandon onboarding before reaching their first meaningful success milestone?</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Hypothesis</div>
<div class="tmpl-full-field-value">We believe trial users drop off because the initial setup asks for too much configuration before they see value, and because the language used in onboarding assumes a level of technical familiarity many users do not have.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Decisions this research should inform</div>
<div class="tmpl-full-field-value">Prioritize onboarding redesign opportunities, revise setup copy, identify where guided assistance is needed, and determine whether activation metrics should shift from “account created” to “first workflow completed.”</div>
</div>
<div class="tmpl-full-tip">💡 Tip: Tie the goal to a real product decision. A strong research plan makes it obvious what stakeholders will do differently after the study.</div>
</div>
<hr class="tmpl-full-divider">
<div class="tmpl-full-section">
<div class="tmpl-full-section-title">2. Method & Format</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Research method</div>
<div class="tmpl-full-field-value">Moderated remote usability interviews with a brief contextual interview followed by task-based walkthroughs in the live product or prototype.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Why this method fits</div>
<div class="tmpl-full-field-value">This format allows the team to observe where users hesitate during onboarding, ask follow-up questions in the moment, and distinguish between usability issues, motivation issues, and expectation mismatch.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Session format</div>
<div class="tmpl-full-field-value">45-minute Zoom sessions: 5 minutes intro and consent, 10 minutes context interview, 20 minutes task walkthrough, 10 minutes reflection and wrap-up.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Stimuli / materials</div>
<div class="tmpl-full-field-value">Current onboarding flow, welcome email, setup checklist, and a prototype of a simplified “first success” path for comparison if time allows.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Sample size</div>
<div class="tmpl-full-field-value">8–10 participants across two segments: recent trial users who did not convert, and users who converted after initial hesitation.</div>
</div>
<div class="tmpl-full-tip">💡 Tip: If the study is evaluative, define the exact experience participants will see. Avoid vague wording like “review the product” when the true focus is a specific flow.</div>
</div>
<hr class="tmpl-full-divider">
<div class="tmpl-full-section">
<div class="tmpl-full-section-title">3. Participant Criteria</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Target participant profile</div>
<div class="tmpl-full-field-value">People responsible for evaluating or implementing a new software tool for their team, such as operations managers, customer success leads, or startup founders at companies with 5–200 employees.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Must-have criteria</div>
<div class="tmpl-full-field-value">Started a product trial within the last 30 days; personally completed at least part of onboarding; involved in software evaluation or setup decisions; comfortable sharing their screen and speaking in English.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Segment quotas</div>
<div class="tmpl-full-field-value">4–5 participants who abandoned the trial before completing setup, and 4–5 participants who completed setup and reached initial value before deciding whether to convert.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Exclusion criteria</div>
<div class="tmpl-full-field-value">Current employees, agency consultants who did not directly use the product themselves, participants who work for competitors, and anyone who has not used the onboarding flow in the last 30 days.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Incentive</div>
<div class="tmpl-full-field-value">$75 digital gift card for a 45-minute session, sent within 3 business days of participation.</div>
</div>
<div class="tmpl-full-tip">💡 Tip: Define recency. “Used recently” is too loose; “started a trial in the last 30 days” recruits participants with fresher memory and better detail.</div>
</div>
<hr class="tmpl-full-divider">
<div class="tmpl-full-section">
<div class="tmpl-full-section-title">4. Screener Questions</div>
<div class="tmpl-full-q">
<div class="tmpl-full-q-text">1. In the past 30 days, have you signed up for a free trial of a software product for work and personally gone through at least part of the setup or onboarding process?</div>
</div>
<div class="tmpl-full-q">
<div class="tmpl-full-q-text">2. Which of the following best describes your role in choosing or setting up software tools for your team? [I make the final decision / I strongly influence the decision / I evaluate tools and make recommendations / I am not involved in tool selection]</div>
</div>
<div class="tmpl-full-q">
<div class="tmpl-full-q-text">3. What happened after you started the trial? [I stopped before finishing setup / I completed setup but did not continue using it / I completed setup and kept using it / I converted to a paid plan]</div>
</div>
<div class="tmpl-full-q">
<div class="tmpl-full-q-text">4. Approximately how many employees work at your company? [1–4 / 5–24 / 25–99 / 100–200 / 201+]</div>
</div>
<div class="tmpl-full-q">
<div class="tmpl-full-q-text">5. Are you willing to join a 45-minute recorded video call, share your screen, and talk through your experience signing up for and trying a software product?</div>
</div>
<div class="tmpl-full-tip">💡 Tip: Include at least one behavioral screener question about what the participant actually did, not just who they are. Behavioral screeners produce better-qualified recruits.</div>
</div>
<hr class="tmpl-full-divider">
<div class="tmpl-full-section">
<div class="tmpl-full-section-title">5. Session Plan & Schedule</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Discussion guide structure</div>
<div class="tmpl-full-field-value">Start with participant context and what prompted them to try the product, then walk through how they approached onboarding, where they paused or felt uncertain, what they expected to happen next, and what ultimately influenced continuation or drop-off.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Core tasks</div>
<div class="tmpl-full-field-value">Ask participants to sign in, review the first-run experience, complete the initial setup checklist, connect one required data source or integration, and explain what they believe the product wants them to do before they can get value.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Moderator prompts</div>
<div class="tmpl-full-field-value">“What were you expecting to happen here?”, “What would you do next if you were alone?”, “Is anything unclear or harder than expected?”, and “At what point would you decide this tool is or isn’t worth continuing?”</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Research timeline</div>
<div class="tmpl-full-field-value">Recruiting: May 6–10; sessions: May 13–17; synthesis: May 20–22; readout and recommendations: May 24.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Observer plan</div>
<div class="tmpl-full-field-value">The product manager, designer, and onboarding lifecycle owner may observe sessions live as silent observers; questions and notes are collected in a shared doc and discussed after each session, not during.</div>
</div>
<div class="tmpl-full-tip">💡 Tip: Build in 10–15 minutes between sessions for note cleanup and an immediate debrief. Teams capture stronger insights when they synthesize while observations are still fresh.</div>
</div>
<hr class="tmpl-full-divider">
<div class="tmpl-full-section">
<div class="tmpl-full-section-title">6. Success Metrics</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Research success criteria</div>
<div class="tmpl-full-field-value">Identify the top 3–5 onboarding friction points with clear evidence, understand which friction points are most likely to affect activation or conversion, and produce actionable recommendations that the product team can prioritize in the next release cycle.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Behavioral signals to observe</div>
<div class="tmpl-full-field-value">Time spent hesitating on setup steps, requests for clarification, signs of mismatch between user expectations and system language, failed attempts to complete key actions, and abandonment cues such as “I’d probably stop here.”</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Output deliverables</div>
<div class="tmpl-full-field-value">A synthesis report with key themes, severity-ranked issues, representative quotes, annotated journey breakdown, and recommended next actions for onboarding copy, flow design, and lifecycle messaging.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Downstream product metrics to monitor</div>
<div class="tmpl-full-field-value">Setup completion rate, time to first meaningful action, trial-to-activation rate, activation-to-paid conversion rate, and support ticket volume related to onboarding confusion.</div>
</div>
<div class="tmpl-full-tip">💡 Tip: Separate research success from product success. The study succeeds when it produces decision-ready evidence, even before product metrics improve.</div>
</div>
<hr class="tmpl-full-divider">
<div class="tmpl-full-section">
<div class="tmpl-full-section-title">7. Stakeholder Sign-Off</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Primary stakeholders</div>
<div class="tmpl-full-field-value">Product Manager: [Name], Product Designer: [Name], Head of Customer Success: [Name], Growth Lead: [Name], Research Owner: [Name]</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Scope agreement</div>
<div class="tmpl-full-field-value">Stakeholders agree that this study focuses on onboarding and early activation only, not broader pricing, feature requests, or long-term retention drivers unless they directly affect setup and first-use success.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Decisions enabled by sign-off</div>
<div class="tmpl-full-field-value">Approval to recruit participants, schedule sessions, record calls for internal analysis, and use findings to prioritize onboarding improvements in the next planning cycle.</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Sign-off status</div>
<div class="tmpl-full-field-value">Research lead approved: [Yes / No] | Product approved: [Yes / No] | Design approved: [Yes / No] | Legal/privacy reviewed if needed: [Yes / No]</div>
</div>
<div class="tmpl-full-field">
<div class="tmpl-full-field-label">Open questions before launch</div>
<div class="tmpl-full-field-value">Do we need separate quotas for self-serve versus sales-assisted trial users? Should we include users who invited teammates during the trial, or keep the study limited to solo evaluators?</div>
</div>
<div class="tmpl-full-tip">💡 Tip: Get explicit agreement on scope before recruiting. This prevents “while you’re at it” requests that dilute the study and make findings harder to act on.</div>
</div>
</div>
How to use it
Define your research goal first
Before filling in any other section, write one clear sentence that describes the decision this research will help your team make.
Choose your method based on your question type
Use generative methods like interviews when you need to discover problems, and evaluative methods like usability tests when you need to assess a specific solution.
Recruit before you finalize your discussion guide
Start the screener and outreach process early so participant scheduling does not delay your study timeline while you finalize your questions.
Share the plan with stakeholders before fielding
Get sign-off on goals and success criteria upfront so that findings land with credibility and recommendations move directly into the product roadmap.
What it looks like filled in
Unclear value at the setup step
"I didn't know what a 'workspace' was supposed to be — I thought I was setting up my profile, not the whole company account."
→ Rewrite the workspace setup screen headline to explain who this configuration affects and why it matters before users proceed.
Premature feature exposure overwhelming new users
"There were like eight things I could do and I just wanted to create one thing and see if it worked — I didn't know where to start."
→ Design a focused first-run experience that surfaces only the single next action needed to reach the first value moment, hiding advanced features until session two.
Missing progress feedback causes drop-off anxiety
"I saved it and nothing happened — no confirmation, nothing. I wasn't sure if it actually worked or if I had to do something else."
→ Add inline save confirmation and a visible progress indicator throughout onboarding so users always know their current step and what comes next.
Why teams skip the template
Manually reviewing hours of session recordings
Watching and re-watching recordings to find recurring themes takes days of researcher time and is highly dependent on who is doing the reviewing.
Synthesizing notes into themes by hand
Sorting sticky notes or spreadsheet rows into affinity clusters is time-consuming and introduces bias based on which quotes you happened to notice or remember.
Writing up findings without missing critical patterns
When analysis is done manually across 10 or more sessions, low-frequency but high-impact signals are routinely overlooked because they do not cluster visually the way dominant themes do.
Analyze your user research plans and study designs automatically — no template needed