How to Know if Your Idea’s the Right One — A Founder’s Guide for Successful Early-Stage Customer Discovery

Jeanette Mellinger, BetterUp's Head of UXR and former Head of UXR for Uber Eats, unpacks her approachable three-step playbook for early-stage customer discovery so founders can build something users really want.

So you’ve got a startup idea swirling around in your head. As an initial step, you might grab a coffee with a trusted friend and ask, “So what do you think?” and poke around for some early research into the competitive landscape. But as “The Mom Test” has taught us, this sort of casual idea validation with friends and family is not nearly enough to build an enduring business that can go the distance — more rigorous customer discovery is required.

Yet all too often, founders and product leaders fail to make this leap. Some neglect the research process entirely in the early days, instead waiting until they’ve spun up an MVP for folks to test out. Others do start talking to potential customers when the idea is still squishy and moldable, but without processes and structure in place to ensure this is time well spent.

Jeanette Mellinger, Head of UX Research at BetterUp and former Head of UX Research for Uber Eats, has spent her career helping folks do this right. She also spent years as First Round's User Research Expert in Residence, teaming up with early-stage startup leaders as they formulate and validate their ideas.

“When I started working with founders as a user research expert, the first few founders that I spoke with, one after another, said: ‘I care deeply about my customers, I’ve spoken with dozens of them. But wow, I’m excited to work with you, because now I want to add a bit of rigor to this. I didn’t get much out of these conversations,’” says Mellinger.

These are the folks who go out without a plan, fail to check biases and often don’t think carefully about who they’re interviewing. Even after potentially hundreds of conversations, these founders emerge from interviews with little substance. Sound familiar? 

“If you’re going to put time and effort into having these conversations, you might as well not waste them. It’ll probably take more time than you'd like, but doing it right saves you from having to course correct down the road.” Enter her playbook for early-stage idea validation.

She gears her advice specifically toward founders — because they’re likely the folks embarking on this early research. That’s not to say she’s advocating for bringing in a UX researcher from the very beginning. But you can (and should) establish healthy habits long before spinning up a dedicated UX team.

Often, by the time companies bring in a more dedicated UX research practice, it’s at the stage where they’re asking: Did we build this well? But the real power in this research is doing it as early as possible — to actually help shape where you’re going. 

In this exclusive interview, Mellinger walks through each step of her customer discovery process in depth, starting by laying the groundwork for all that’s to come with a thorough research plan that narrows the scope of who you’ll talk to and what you’re trying to learn.

She then moves on to the user interview process itself, offering tactical advice on how to ask better questions, avoid bias and open up space for unexpected answers. And in step three, Mellinger walks us through her tested process for reflecting and analyzing all the different bits of insights you’ve gathered over the course of each interview. Along the way, she flags plenty of common mistakes to look out for in each phase. 

While Mellinger’s approach brings more structure to the customer discovery process, it’s quite beginner-friendly (and can even save you time) with tips like conducting “mini research sprints” or her template for 5-minute debriefs. 

Whether founders are toying around with a new startup idea or years into building a multi-product business, understanding user needs is essential for success. Let’s dive into Mellinger’s tactical guide. 

Step 1: Lay the groundwork with a thorough research plan


You’re bursting with excitement about your product idea and can’t wait to tell other people about it. But before charging full-speed ahead into reaching out to potential customers and setting up meetings, Mellinger suggests taking a pause. “Planning your research takes more time than you’d like, but when done well, it will save you ample time later,” she says. Otherwise, you’ll end up with a scattershot of insights, without any clear throughline tying them into actionable data. 

Even just a bit of forethought and upfront planning can completely transform your approach. To start, she encourages folks to approach their user research by chunking it down into sprints — focusing on just 5-8 narrower conversations at a time. Why? Because when you’re starting to dip your toe into user research (particularly for products that don’t even exist yet) there’s an incredibly long laundry list of things you’d like to learn. But you don’t have unlimited time with your interview subjects. 

This sprint approach to research enables you to narrow your focus — like on a particular ICP or a single hypothesis you want to validate — for the entirety of a sprint, before moving on to the next sprint with a new piece to focus on validating.  

It’s so tempting to want to ask about 10 things at once, but that’s only going to yield shallow, inch-deep responses. This is your opportunity to go deep. What are the one or two things you must learn now?

She boils it down to a couple of easy steps (and to put these ideas into practice, Mellinger sketches out a hypothetical example of a founder who has an idea for a brand-new kitchen tool: a frying pan that heats in half the time of a traditional pan).

Jeanette Mellinger, Head of UXR at BetterUp

Narrow it down to: What decision do I want to make next? 

“This is often the question that we skip over, and it’s actually a lot harder to define than teams expect,” says Mellinger. “What is the next decision I want to make or the next action that I want to take with what I’m building?” 

The key here is to get incredibly narrow. For our hypothetical pan example, the ultimate decision the founder wants to make might be whether they actually should build the fast-heating pan. But for a research sprint, this is far too broad. Zooming in, a few key questions to define first are who the right chef is for the product, and what the main value proposition is for that particular chef. Or, what are the most important features to build first? 

A common mistake Mellinger sees here is trying to ask too many different things at once. “When you try and ask 10 different things, you lose the opportunity to go deep, and those variables can start to conflict with one another,” she says.

One of Mellinger’s top strategies for narrowing down the decision at the center of a research sprint is to mentally place yourself in the debrief meeting a few weeks later. What do you hope to have gleaned from the research to take into that meeting?

Prioritize quality over quantity with your interviewees 

While talking to as many people as possible may sound like a strategy for success, Mellinger has a contrarian take here: Less is more.

“If you talk with too many different types of people — even with the same set of questions — you’re going to probably get some conflicting feedback. Of course, you want diversity of opinion, but talking to people who have such disparate needs unfortunately means that only some of them are who you’re going to want to serve.”

To avoid drowning in all the noise, the mini research sprint approach comes in handy. Start with just 5-6 people. If you’ve narrowed down a target customer type enough — and paired that with a strong central decision and some learning goals — you can glean more qualitative data points and patterns from five deep conversations than from 50 unfocused ones.

“We get nervous when we hear, ‘Oh gosh, I only spoke with five people.’ But that’s my user research magic number. If you bring that focus into it, you can get something that’s much deeper than you’d ever expect,” she says.

To find that customer type for your research sprint, Mellinger recommends focusing on two dimensions: 

  • Psychographics: This is how people behave and think. In the pan example, Mellinger’s counsel would be to target chefs who value innovation and speed — as these are the types of end users that would likely invest in a pan that does the job quicker. 
  • Demographics: These are the easier details to pinpoint (think age, geography, years of experience) and make up the second step of the evaluation process. Thinking back to the fast-paced, creative chefs, it would follow to target major hubs like New York City — where kitchens are crowded and the restaurant industry is competitive. 

If there’s more than one target group for your product, try five interviews with each. Narrowing your sample as methodically as possible ensures a product that’s not only good, but good for those who need it most. 

Step 2: Make the most of each customer interview


Armed with an in-depth plan, founders and startup leaders are ready to start sitting down for interviews. But this eager mindset can get plenty of well-meaning folks into trouble — this isn’t just an opportunity to hear how great your idea might be. It’s an opportunity to hear the daunting points, too.

Remember, you want to learn the scary things when you’re conducting research — that’s how you avoid spending all of your time developing a product, only to find out that no one wants it. 

Mellinger shares her tactics for how to make the most of each conversation, while artfully dodging common biases that crop up along the way. She also shares the exact interview framework structure she prescribes to the founders she advises.

Collect better data by checking your bias

Bias is pesky and persistent, especially in the interview process. While it may be impossible to completely shed inadvertent feelings and opinions, placing them behind a mask of well-structured questions is a must if you want real feedback. 

“It’s not that people are trying to deceive you,” says Mellinger. “But bias from both ends can unfortunately create an environment of false signals. All of a sudden you’re building something that people don’t want as much as they indicated.”

Mellinger notes three types of bias to avoid when conducting user research.  

Confirmation bias: Put earmuffs on your “happy ears”

The more you have at stake, the more that you’re probably listening for what you want to hear.

When you’re asking for feedback, don’t only listen for the things that sound good. Make space for the things you don’t want to hear — those are the ones that will make your product better.

With the right framing, everything can be interpreted as a green flag. You think making a faster-heating pan is a great idea, so no matter what happens over the course of the interview, you’re predisposed to leave thinking the other person agrees with you. 

The primary culprit that allows confirmation bias to flourish is how you phrase the questions. To avoid falling into the positivity trap, cut out leading questions.

“A leading question is one that we already have an answer to in our minds,” says Mellinger. “We’re not so much asking it to get new information as we are to validate our own thinking.” 

Certain nuances of language have the power to compel people to answer a certain way. Take a seemingly innocuous question like: “Do you think this is easy?” or “Do you find that difficult?” “We’re implying that it should, in fact, be easy or that we do think that it’s a struggle,” says Mellinger. Even if the interviewee doesn’t feel that way at all, the question’s tone can steer them away from their true experience. 

When structuring questions to avoid confirmation bias, Mellinger uses a few key strategies:

  • Avoid ‘yes or no’ questions. If your question can be answered with a simple yes or no, that’s a big red flag that you’re asking a leading question. In user research interviews, questions like these almost always start with, “Do you…” On top of their leading nature, they also yield a less rich answer than when you ask, “Tell me about X, Y or Z.”
  • Be careful with adjectives. Another hint that you’re headed towards a leading question? Sprinkling in any sort of adjective (like “easy”). If there’s an adjective in your questions, there’s an implied opinion. Instead, try rephrasing to a more neutral position: “Tell me about this experience…” 
  • Course-correct with follow-ups. While it’s best to avoid adjective-laden questions altogether, there may be times when you can’t escape asking something a bit more pointed. In that case, Mellinger has two tips. First, give the interviewee an out: “To what extent was this easy if at all?” Next, follow up with the counter: “To what extent was this difficult?” to balance the scales. 
Let’s say you start to hear something that sounds really good. That’s your alarm right there. The moment you hear, “Oh yeah, this product is great,” that’s a sign to ask the counter. What is something you can push on here? 
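If it helps to make these checks concrete, the bullet points above can be sketched as a rough heuristic. Here is a minimal Python version; the opener and adjective lists are illustrative assumptions of mine, not Mellinger’s:

```python
# Heuristic check for leading interview questions, based on the tips above:
# yes/no openers ("Do you...") and opinionated adjectives both signal bias.
LEADING_OPENERS = ("do you", "don't you", "would you", "isn't it")
OPINION_ADJECTIVES = {"easy", "hard", "difficult", "great", "simple", "confusing"}

def leading_question_flags(question: str) -> list[str]:
    """Return reasons a question may be leading (empty list if none found)."""
    q = question.lower().strip()
    flags = []
    if q.startswith(LEADING_OPENERS):
        flags.append("yes/no opener: rephrase as 'Tell me about...'")
    words = {w.strip("?,.!") for w in q.split()}
    if words & OPINION_ADJECTIVES:
        flags.append("opinionated adjective: implies an expected answer")
    return flags

print(leading_question_flags("Do you think this is easy?"))   # two flags
print(leading_question_flags("Tell me about the last time you cooked."))  # → []
```

A lint pass like this over your interview script is no substitute for judgment, but it catches the two most common slips before you sit down with a customer.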

Acquiescence bias: Dodge attempts at people-pleasing

This particular strain of bias starts from a pretty lovely sentiment (though not all that helpful in a research setting): we as humans are inclined to be nice to one another, to try to make other people feel good, and to hope that people will feel good about us. So when conducting customer research, the less you reveal your hand, the better.

The more we signal the ways in which we want someone to answer, the more they want to make us feel validated.

Mellinger’s solution is straightforward: hold off on talking about the product. This allows you to get a better read on what’s most important to the person before they inadvertently jump into the mode of being helpful to you.

Introducing acquiescence bias into your user research can start as early as the subject line of your recruitment email. “Hold off on talking about your product. You want to share at least a little bit of information about who you are and whom you want to speak with, but as little as you can about what your product is,” says Mellinger.

“You can say: ‘I’d like to speak with chefs who are working in a busy kitchen and want to learn how to maximize their time in the kitchen.’ But you don’t need to say, ‘I’m trying to build the fastest-heating pan,’” she says. 

Otherwise, your conversations will center around frying pans — even if the topic isn’t all that important to them. “Once they figure out you’re interested in pans, they’re going to want to talk to you about pans — even if they couldn’t care less about pans. But what if there was something else that was really concerning to them outside of pans?”

And once you sit down with folks, it can be equally tempting to ground the conversation in the context of your product idea. Think starting the interview off with: “I would like to talk to you today about pans.” But instead, Mellinger suggests a more open-ended opener. “Most of the talking I do as a user researcher is at the beginning of the interview, and a lot less later on. So in the beginning, I’d say, ‘Hey, I’m Jeanette, I want to learn more about how you operate in the kitchen. I’m going to eventually show you an early idea, but for now, I want to better understand you and your world. So anything that I ask you, feel free to elaborate on what’s most true for you. And you don’t have to answer any questions that you’re not comfortable with.’”

Here, Mellinger also has one unexpected tip. “It sounds a little silly, but I start these conversations by putting my hand into a fist on top of my head and tell them that I want them to talk as freely as if they have a microphone on their brain,” she says. “It's so out of the ordinary for people that it makes them pause. And it will come up later in the interview — they often repeat back to me, ‘I guess if I had a microphone on my brain, I would tell you X, Y, Z.’ It’s something they wouldn’t have otherwise felt like they should share.” 

Hindsight bias: Ditch the crystal balls

As founders know (often a little too well), predicting the future is all but impossible. Yet, in user interviews, it can be tempting to ask questions like, “Would you use my product?” or, “How much would you pay for this?” While you may get some educated guesses, they’re impossible to bank on. So Mellinger underscores that it’s best to rely on the strongest predictor of human behavior: past behavior.

“In the pan example, I’d ask questions to find out how much they’re investing in kitchen tools now, how they've thought about that and how that’s changed over time,” says Mellinger. That’s a much more resonant indicator of how they might weigh pricing decisions in the future.

Another tip if you really want to home in on pricing: “Come up with an obviously very high number, say $1,000. Watch that person’s reaction, and you might just get an interesting off-the-cuff answer like, ‘No way, I’d never spend more than $150 on a pan.’ And boom, there’s your price point.”

Structure the time: Discover the unexpected by zooming out, in, then out again

Even expert user researchers who are questioning pros would never wing it when sitting down for an interview. Taking a little bit of time to game out the structure in advance can mean the difference between a meh conversation and unlocking unexpected insights.

Across the board, whether you’re conducting research for a brand-new idea or gathering feedback on an existing product, Mellinger prescribes the zoom out, zoom in, zoom out framework. 

Practically speaking, you begin the interview by zooming out to the bigger context of the space and the person you’re speaking with. Next, zoom in to more specifics about your idea and problem space. Finally, wrap up by zooming out once more with some higher-level questions.

To put these ideas into practice, Mellinger shares an example from a real-life startup she teamed up with. “I was working with a team that wanted to serve small businesses by building their online presence, specifically through Instagram. For them, the ‘zoom out’ looked like getting an understanding of how people ran their businesses in general. Zooming in a little bit more, I had them ask about digital strategy more broadly, including how they’ve built the online presence for their business. What social apps did they have a profile on? It took most of the interviewees a while to even get to Instagram, but that allowed us to learn some things about YouTube and its problems, and that gave us a completely different — and probably better — window to explore.”  

The final ‘zoom out’ is an opportunity to contextualize all your findings. Asking the customer, “If you had a magic wand to fix X problem, what would you ask for?” allows them to revisit what’s most important to them. It could be something completely different from the zoom in, or it could confirm everything they said before.

This is a marked difference from how that same team had initially been approaching user interviews. “Before we were working together, they often led in their recruitment emails with, ‘We’re working on making Instagram easier for you.’ They’d start their interviews with, ‘We want to make Instagram easier for you,’ and then dive right into the interviewee’s feedback on Instagram.” This pointed approach didn’t leave any room for the interview subject to talk about other platforms that might be giving them trouble and that were a deeper pain point for them.

Practical tips for asking better questions

With that framing in mind on avoiding bias and structuring your time with the interview subject, Mellinger shares her punch list of two tips for asking stronger questions.

  • Ask one short question at a time. Humans tend to offer a lot of preamble. When asking questions, we jumble two or three together in hopes of getting more out of the recipient. In reality, this kind of structure often lends itself to the exact opposite. Give people the space to answer in a nuanced way instead of forcing them to try and remember a handful of different queries. “This is something you can even start practicing at the dinner table tonight,” says Mellinger.  
  • Ask each question two to three different ways. This matters most for the questions you most need answered. Even if you feel like you got good information out of the initial ask, try reframing the question to see if you can discover any kind of new insight. As Mellinger sees it, probing at it from different angles can help uncover all sorts of data points.

Step 3: Reflect on and analyze what you’ve gathered


The end of customer conversations is far from the end of the research process. You’ve now got dozens of (possibly conflicting) bits of insights and data from your user interviews. It’s time to piece it all together. While all the information may be collected, the real findings come from taking a look back. 

Here, Mellinger prescribes a three-pronged approach: short analysis after each call, hour-long weekly syncs, and longer comprehensive analysis. 

Jeanette Mellinger's 3-step approach to research analysis

But before diving in, the first thing to nail down is your documentation hygiene. 

It’s impossible to store every bit of insight in your head — that’s why Mellinger recommends recording every single interview. That way, even if the feedback isn’t relevant right now, you’ll always have a transcript to refer back to. But a raw transcript alone isn’t enough, because it’s unlikely that you’ll have the time to listen back to every conversation. So she also recommends a note-taking template (and ideally, bringing in another person to act as your note-taker). 

“As I take notes, I’m jotting down any quotes that either confirm or disprove my hypothesis, as well as any strong emotional reactions. I might put an ‘H’ next to something related to my hypothesis, or an ‘E’ next to a big emotional response. You’re going to get hundreds of pieces of qualitative data in these interviews, and this practice will make sorting through everything much easier later on,” says Mellinger.
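To show how those markers pay off at synthesis time, here is a minimal sketch that sorts tagged notes into buckets. The ‘H’/‘E’ prefixes follow Mellinger’s convention; the one-note-per-line format is an assumption for illustration:

```python
# Sort tagged interview notes into buckets using Mellinger's "H" (hypothesis)
# and "E" (emotion) markers. Assumes one note per line, marker as a prefix.
from collections import defaultdict

def bucket_notes(raw_notes: str) -> dict[str, list[str]]:
    buckets: dict[str, list[str]] = defaultdict(list)
    for line in raw_notes.strip().splitlines():
        line = line.strip()
        if line.startswith("H "):
            buckets["hypothesis"].append(line[2:])
        elif line.startswith("E "):
            buckets["emotion"].append(line[2:])
        else:
            buckets["other"].append(line)
    return dict(buckets)

notes = """
H Said they'd pay more for anything that shaves minutes off prep.
E Big sigh when describing the dinner rush.
Mentioned they replace their pans about once a year.
"""
print(bucket_notes(notes))
```

Even this much structure means your later debriefs start from pre-sorted evidence rather than a wall of raw transcript.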

With your notes in hand, here’s how to proceed with the three-pronged analysis approach.

The short version: have a post-game 15-minute huddle 

After every interview you conduct, Mellinger recommends putting 15 minutes on the calendar with your co-interviewer (like your co-founder, if they also sat in on the call) or notetaker. “This practice gets people a lot further than they think — it will make your later synthesis so much faster. It’s tempting to schedule back-to-back calls, but it’s so much better to debrief when it’s fresh,” says Mellinger.

This is a space to share initial thoughts and reactions, particularly related to three key topics:

  • Data: During this huddle, write down any piece of data that stood out. Mellinger suggests specifically looking for things that both supported and countered your original hypothesis. Successful interviewers are able to recall both what a customer said (direct quotes) and what they showed (think body language, tone, a big audible sigh).
  • Insights: Insights are the early interpretations of what’s going on. Look at the data you’ve collected and translate that into a general sentiment. Ask: “What patterns am I starting to see? Is that my opinion, or is that the story the data is telling?” 
  • Action: To wrap up this short debrief, address at least one actionable item that can serve as a next step. What’s one bit of insight you want to follow up on in your next interview? Do you need to change any of the questions to better avoid bias? Whatever it is, move forward with not only reflections but concrete to-do’s. 
You don’t want to draw any conclusions after only one call, but it’s okay if some high-level takeaways start forming.
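For those who like a template, the three topics above can be captured in a tiny structure. This is a hypothetical sketch; the field names are mine, not Mellinger’s:

```python
# A minimal template for the post-interview debrief, mirroring the
# Data / Insights / Action structure above. Field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Debrief:
    interviewee: str
    data: list[str] = field(default_factory=list)      # quotes and observed reactions
    insights: list[str] = field(default_factory=list)  # early interpretations
    actions: list[str] = field(default_factory=list)   # concrete next steps

debrief = Debrief(
    interviewee="Chef #3",
    data=['Said: "I never have counter space for a second pan."'],
    insights=["Space, not speed, may be the sharper pain point."],
    actions=["Add a question about kitchen layout to the next interview."],
)
print(debrief.interviewee, len(debrief.actions))
```

Filling in the same three fields after every call keeps the weekly synthesis honest: the actions list is what forces each debrief to end with a concrete to-do.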

The medium version: block out an hour-long weekly meeting

The next step from the huddle is a more formalized, recurring meeting to synthesize your findings across multiple conversations — like at the end of a 5-interview research sprint. Mellinger finds these medium debriefs particularly useful for sharing your findings more broadly with the team — who may not be sitting down with you for each of these interviews.

“On Friday mornings, we put an hour on the calendar to synthesize what we learned that week,” says Mellinger. “This is where all of those shorter debriefs become helpful. You don’t have to go back and reread all the transcripts — you just need to look at the follow-up notes and pull out the highlights.”

This is also an open space to start throwing around the ‘whys.’ Why does a customer do X? Why is that important to them? These types of questions start to bring out the patterns across different user answers. 

The longer half-day version: systematically look back at all of your interviews 

This version of analysis is not dissimilar to the evidence boards seen in crime shows and murder mysteries — with pins and strings interconnecting to form a complete picture of the case. 

“The goal of this stage is to look back holistically and see if there’s anything that wasn’t captured in your shorter debriefs,” says Mellinger. “After you have everything laid out properly, start to clump those pieces of data together and see which ones relate to one another. That’s where you begin to see the patterns that emerge.”

She recommends the affinity diagramming method, where you put a bunch of post-its on the wall, each representing an insight from a user interview, and start clumping them together to see which ideas relate to one another. 
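A digital stand-in for the post-it wall might group notes that share a keyword. This is a deliberately crude sketch (real affinity mapping relies on human judgment, and the stop-word list here is illustrative):

```python
# Crude digital affinity grouping: cluster insight notes that share a
# non-trivial keyword, as a stand-in for physically clumping post-its.
STOP_WORDS = {"the", "a", "an", "to", "of", "in", "for", "and", "is", "they"}

def keywords(note: str) -> set[str]:
    return {w.strip(".,!?").lower() for w in note.split()} - STOP_WORDS

def affinity_groups(notes: list[str]) -> list[list[str]]:
    groups: list[tuple[set[str], list[str]]] = []
    for note in notes:
        kws = keywords(note)
        for group_kws, group_notes in groups:
            if group_kws & kws:  # shares a keyword with an existing cluster
                group_notes.append(note)
                group_kws |= kws
                break
        else:
            groups.append((kws, [note]))
    return [g for _, g in groups]

insights = [
    "Wants faster heat-up during the dinner rush",
    "Dinner rush is the most stressful time of day",
    "Buys new pans about once a year",
]
for group in affinity_groups(insights):
    print(group)
```

Here the two “dinner rush” notes clump together while the purchasing note stands alone — exactly the kind of pattern the wall of post-its is meant to surface.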

If you’re not quite ready to level up to affinity diagramming, that’s okay, says Mellinger. “At least try incorporating the small and medium analysis into your process,” she says. 

So often teams will want to jump right into solutioning and action mode. But the more you ground yourself in the data, the more conviction you’ll have in your idea.


Successful user research strikes the balance between speed and patience. While it’s important not to start the process too late, taking the time to get a strong understanding of the findings is equally crucial. 

You have to think about user research from the very beginning — you can’t be slow to get there. If there’s a lack of rigor around investigating customer experience at any point, you’re doing your product a real disservice.

“Moving too fast makes you miss the root cause of the problem,” she says — pointing to the famous (if disputed) Henry Ford line. “That means you don't need to take everything that customers say at exact face value. Ask yourself: ‘Why are they asking for a faster horse? What’s going on here?’ The more you ground yourself in that data and synthesize, the more conviction you’ll have in what customers are really telling you.”

To make something that customers love (rather than a product that quickly fizzles out and fails to find product-market fit), set out early to deeply understand the customer ask and where it’s coming from — even if that requires a little more time and nuance up front.