The early user research playbook for founders — Jeanette Mellinger’s expert advice for validating your idea with high-quality interviews
Episode 90


Our guest today is Jeanette Mellinger, Head of UX Research at BetterUp and our User Research Expert in Residence at First Round. Jeanette unspools her tested playbook for high-quality customer interviews, with particular advice for founders in the very early days of validating an idea.




In today’s conversation, Jeanette unspools her tested playbook for high-quality customer interviews, with particular advice for founders in the very early days of validating an idea.

You can follow Jeanette on Twitter at @jnetmell 


You can email us questions directly at [email protected] or follow us on Twitter @firstround and @brettberson

Brett: Well, thank you so much for joining us.

Jeanette: Yeah, it's my pleasure. Thanks for having me.

Brett: To kick off the conversation, maybe we could start really broadly on the topic of user research. I think it's one of those things that everyone knows about and knows they should be doing, but often either doesn't do or doesn't do well.

And I was hoping to spend a lot of our time together exploring the nuances of what excellent customer research looks like and sounds like. So maybe with that context, we could start with some of the things that you see as common mistakes or traps that founders tend to fall into when they're doing this important work of talking to customers.

Jeanette: Of course. Yeah. And of course it's hard to do something well when we sometimes have a hard time defining it ourselves as user researchers. Do you want to call my job user research or UX research? Is this customer feedback? I think that speaks to the nuance and the many different ways that one could do this.

But I'd say that actually one of the first mistakes is that we don't call it user research, especially if I'm thinking of, let's say, an early-stage founder. They've gotten the memo, which is great, that they should be getting to know their customers and integrating their needs into their initial product and making sure they're solving a real problem.

That's a huge leap, for us to have really adopted that mindset. But then as founders go out and get to know their customers, if they don't recognize that what they're doing is user research, whatever formal title you want to give it, and recognize that there's some rigor that comes with it, you might not actually end up getting out of it as much as you really can.

So I think in a way, the first mistake is not recognizing that there's a bit of a skill and an expertise to that particular part of your initial job, which is understandable. When you're starting a company, there are a million hats that you're wearing, and you're not expected to be an expert at all of them.

And yeah, I'll dive into that in a moment in terms of some of the main skills I think are worth adopting from the practice of UX research. But the first one is to say, yep, I'm gonna wear this as a skill hat. And it's a really different thing than just poking a friend and chatting, right?

What's the leap between those two things, the chatting and the experienced user research that you could be doing? So there's that first mindset piece. I want to first ground you in a story that really stood out to me when I first started working with First Round as a user research expert.

The first few founders that I spoke with, one after the other, said: I do care deeply about my customers. I've spoken with dozens of them, in one case even hundreds of them. But wow, I'm so excited to work with you, because now I want to add a little bit of rigor to this. I didn't get that much out of those conversations.

I sort of have a hint of what I might need to do, but it feels like it's just my intuition. Now is the time for rigor. And it's that moment of saying, wow, you've put in this much time, this much effort, and not gotten very much out of it. This is part of why I'm excited to talk about these tips, because I don't want people to get stuck in that moment.

It's natural that your first few conversations might be a little bit looser, and you do want to add rigor, but stop before you've gone to dozens or hundreds. And the main three things that I've noticed as patterns across working with now dozens of early-stage companies, through First Round and outside of it, is that there are three main steps.

One, when you're planning research, I find that teams don't tend to be nearly focused enough in it; I can dive into that a bit more. Two, when you're in the learning phase and you're asking questions, it's just so easy to inadvertently bring in bias, or honestly to not dig deep enough and find something that's unexpected.

Something that's actual discovery. And three, when you're trying to figure out what to do with your learnings, there's very little analysis done, let alone done at a depth that can get you to some real unexpected insights. But it's nice that it ends up mapping to what I think are the main, very high-level steps of research.

One, you're planning research. Two, you're conducting it, you're learning. Three, you're acting on it. I'm a big fan of frameworks as a researcher, and the one I often tell teams is: one, focus; two, better questions; three, analyze.

Brett: So with that kind of high-level architecture, why don't we take them in turn and maybe start with the first topic, which is planning research. What does that look like when it's done well?

Jeanette: Yeah. It probably takes more time than you'd like, but it saves you a lot of time later. Done well, I like to ask teams to chunk their research into mini research sprints. So let's go back to that example of, let's say, a few dozen conversations you want to have. It's not to say that you shouldn't have those few dozen conversations, but instead chunk them into five to eight very focused conversations, one after the other.

And so then in terms of structuring those and saying, how do I think about this first chunk, the next framework for you, which makes up what I consider to be a bit of a mini research plan, a more informal research plan, is: one, asking yourself what decision you want to make next; two, defining one to two learning goals; and then three, figuring out whom you're going to speak with.

The first piece, that decision to make, is often the one that we skip, and it's actually a lot harder to define than teams expect.

You can also rephrase this for yourself as desired impact, but you want to ask yourself, what is the next decision that I want to make, or action that I want to take, in order to take the next step with what I'm building? And, you know, that I hope customer feedback can inform.

Some examples here: of course we can start with saying, I don't know, I want to define what I'm supposed to build, period. Let's make it smaller than that, though. It's good to think about the higher-level things that matter to you most and that you need to define next.

But it could be something like, I want to refine my target customer definition, or I want to figure out which is the most important first feature to build for them. Think about the meeting you're gonna be in in a couple of weeks that you want this research to have informed, and think about what you're going to be talking about in that meeting and what you want to be able to do.

And then that gets you to question two, the learning goals, and saying, okay, what do I need to know that I don't already know that will help me make that decision? And it's so interesting when I walk teams through this exercise: a lot of times when I ask them about the decision they want to make, they'll end up jumping to learning goals without meaning to.

But I think it's because we naturally move into learning mode. We say, oh, I need to decide and understand what the deep problems of my customers are, let's say. So if you're having trouble, notice that's still a learning goal.

That's still something you want to learn about: what is their in-depth problem? So then you ask yourself, okay, in order to do what? In order to frame some part of my value prop, in order to decide on the first feature to work on, et cetera. I suggest that teams toggle back and forth between those, but clarify that decision, too.

And then the next piece of focus is around those learning goals. There are going to be a million things that you want to learn. What are the one or two things that you need to learn first, that you know the least about, in order to inform that decision? This is, I think, one of the biggest mistakes that teams can make.

And I've worked with companies at a lot of different stages, and this is true for much more mature companies as well. It can be so tempting to want to ask about 10 things at once, but it's unfortunately going to get you some pretty shallow responses. This is your opportunity to go deep.

And also, sometimes when you try to learn about too many things at once, you can have some variables that start to conflict with one another. So really say, what are the one or two things I must learn now? And then three, you want to focus on a particular target customer.

One mistake, and one of the things that can again lead to that moment of saying, wow, I've spoken with dozens of people, what did I get from this, is if we talk with a lot of different types of people, even about the same questions, we're probably going to get some conflicting feedback.

And that's not necessarily because there isn't a clear way forward; it's that we might be talking to people who have such disparate needs that unfortunately some of them might be the people that we want to serve, and some not as much. And then it can create a lot of noise for us. So this is where I'd suggest really getting clear on who you believe is most important, either one, to serve as a business, or two, that is most related to this decision you might want to make.

If you speak with five, six, maybe eight people of a decently narrowed target customer type about a couple of main learning goals, you can get enough qualitative data in those deep conversations to really start to see some patterns and to have a lot more conviction than from even 50 looser conversations that aren't as focused, with a lot of different types of questions and a lot of different types of people.

And we get nervous sometimes when we hear, oh my gosh, I only spoke with five people. But really, if you bring that focus into it, you can get something that is much deeper than you'd expect. It's kind of the magic number in qualitative research: if you're well focused, you can move forward after five interviews.

Brett: With that bit of context, let's say someone is a chef in a kitchen and has this insight that they think there's an opportunity to create a new kind of pan, maybe that heats faster, has some set of properties that are different and valuable to other chefs around the world. And so they've been thinking about this and they're gonna go do customer research for this pan.

How does this map to the set of ideas that you just shared? What would that look like if this person was doing it well?

Jeanette: Yeah, that's great. So again, they first say, all right, I want to build a mini research plan. Let's first think about the decision I want to make. And at this point, the larger decision is probably, do I actually want to build this pan? But then they might say, actually, I first want to further define who the right target chef is, and what the main value prop is for that particular chef, let's say.

And so then the next thing that they want to learn is, what are the top cooking-related issues for this hypothesized top chef profile? And, what do they think of my faster-heating pan idea? And then as they start to define the target customer, it's not necessarily a chef anywhere. Here's where I can get a little bit dorky with the target customer piece, although I hope people aren't intimidated by bringing in this much rigor.

You want to think about what we call psychographics and demographics. Psychographics are how people behave or think, and demographics are some of those often easier-to-pinpoint details: age, or in this case maybe the geography of the chef, the number of years of experience. And I often suggest that people start with psychographics.

And here you're gonna think about, wow, okay, if someone wants a pan that's going to heat faster, that probably means they're a chef in a really fast-moving kitchen. Maybe they have too much demand in their kitchen, or they're trying to innovate the ways that they cook. I'm gonna hypothesize that it's about chefs who are in overloaded kitchens.

And I'll say, okay, if that's probably the person, then maybe the demographic is that they're in a large city, in New York City, with a restaurant that serves this many people. And you wouldn't even necessarily have to go to, let's say, years of experience or a type of cuisine.

But you think about which of these demographics might relate to this particular type of chef that you think a faster-heating pan would serve, and then go from there. And for instance, if you think that, wow, there's a split between chefs who might just be overloaded in a kitchen and need things to heat faster versus chefs who are approaching it in an innovative way, those might be different chef profiles.

So then try five interviews with the first type and see how that goes, and/or do five interviews with similar questions for group B, the innovation group.

Brett: And so if you don't, you know, and you're sort of getting at this, but if you don't know who this is gonna most resonate with, is your perspective that you kind of create three to five of these cohorts of five to seven people who are in a similar psychographic or demographic makeup?

Jeanette: Yes. But you can do a little bit more legwork so that you don't need to go into five interviews with each of those subgroups. This is where some of the great discovery research to be done is even just secondary research or general observation before you get in front of new customers.

So in this case, the chef has a wealth of experience on their own, and they can think about, okay, what am I and my peers interested in? What types of chefs are we? In what scenario do we most want this faster-heating pan? You can also go to chef forums, or go to a chef convention, and find out what people are talking about and what problems they're facing, and see if there's a particular type of chef group that is most facing that. And then just develop a hypothesis for yourself of what you think are a couple of these segments, these groups, that you most want to serve.

And then just talk with the first one or two groups, five people apiece, let's say, that you think most fit it, and go from there. If it's not really gaining traction, then you can revisit the other groups as well. But I wouldn't tire yourself out with doing that; I would just go do some immersion in the space to have more of an educated guess here.

 

Brett: Before we move on to the second phase, is there anything else founders or product folks should keep in mind in phase one, as they're thinking about the who and the goals, I should say?

Jeanette: I think it's just, remember to keep it focused. We're gonna discuss a lot of tips today. It's funny, as I was preparing for this interview, I was looking back at the top takeaways that teams have had whenever we've talked through these different phases, and it's all across the board. And that's because there are so many different steps here for people to toggle in order to get more out of their own research, and which one resonates depends on who you are and probably where you are in your practice of talking to customers. But the simple thing that I would really want to drive home here is focus. If you're wanting to try to learn about 10 things, put some of them on the back burner and say, what are the one or two that I need to dig into now in order to make this decision now?

Same with target customers: yes, of course you want to get to know all of them, but what are the one or two that you most need to get to know now? Focus on them. And then as you see what comes from the research, you can absolutely re-up and go learn more, because you're gonna learn some things you didn't expect, and you'll have a chance to revisit those other goals.

So just don't forget to really focus and chunk those. And this is true at any stage of research, any stage of maturity in the questions that you're asking. But it's particularly true in this discovery mode, when it's otherwise so easy to get overwhelmed by the number of things that you want to cover.

Brett: Right. So that's sort of phase one. Let's dive into phase two and maybe talk about it at the philosophical or framework level, and then maybe we can revisit our fictitious potential pan company. So this is all about the learning phase. And one of the things you said a few minutes ago was, it's easy to inadvertently bring in bias.

I think the number one thing that I've seen is confirmation bias. I want this pan. I think this is a good idea. No matter what happens in these conversations, I'm going to leave thinking the other person thinks it's a good idea as well. Or you do something such that the person doesn't want to offend you, so they actually don't share what they think of this pan or problem or whatever.

So let's dive into this learning phase and how to make the most of it.

Jeanette: Yeah, of course. I'll also break this into chunks, because I think there's a lot that we can cover. And I'll start with this bias thing first, but at a high level, when I think about asking better questions, there are two camps. The first one is getting better data, getting better qualitative data.

This is, when I'm asking for their feedback on the pan, will they, to the extent that they can, truthfully tell me how they feel? And not that I'm implying that anyone is trying to delude you; it's because of these forms of inadvertent bias that we bring in as interviewers, or that we might kick up in our interviewees.

That's how we might unfortunately sometimes get a false signal. You know, someone says, oh yes, I would absolutely do that, and then they actually won't. But now you have enthusiastically decided to build something that perhaps people don't want as much as they had indicated, right? So, number one, getting better data.

That's the one I'll dive into first. But number two, which I want to flag will be coming next, is uncovering insights that we didn't expect, kind of like a level two of up-leveling your question-asking game. That first level is already hard enough to do, to say, I want to get a good read on the questions I know to ask.

But really, number two, that level two, is the real opportunity of an in-depth interview, especially at this early discovery stage: it's when you can find out things you didn't even know to ask about and maybe get signal on some of your blind spots, the problems that you didn't actually realize were there or that might get in the way.

They might tell you, yes, of course I want a faster pan, but you realize their bigger problems are actually related to something entirely outside of their cooking implements, or to a totally different cooking implement, and they just wouldn't have thought to bring it up. So those are the two things that I want to make sure we can talk about.

But I'll first touch on the bias piece. There are three main human tendencies that are good to be aware of. We all know that there are a million forms of bias that we fight. You've just named one of them: confirmation bias is absolutely the one to keep in mind for ourselves and to recognize.

And I think the chef example here is great, in that in a way, the more that you are the user and you've built the product, the more that you have at stake and the more that you want to hear what you want to hear. One of the founders who's a part of our First Round portfolio, whom I asked for his reflections on our time working together, said that one of his biggest tips for himself is to not have happy ears, to not be listening for something that sounds good. We can talk about a few ways to combat this in terms of types of questions to ask; we can get into that in a moment. But this is also about our analysis: it's making sure that we're asking better questions, but also that we're not just listening for the things we want to hear.

The second form of bias, which you alluded to, is acquiescence bias. And this is a sometimes lovely, but not all that helpful, inclination that we have as humans to be nice to one another, to try to make one another feel good, or to hope that people will feel good about us. This is, I think, a nice cousin of confirmation bias, but one that I want to call out on its own for interviewers.

What this can look like is that we will show that we're excited about our idea. We won't even mean to, but it could be as early as your recruitment email saying, will you please speak with me? I have this amazing idea. That means that someone will inadvertently feel like they need to tell you that your idea is good.

They don't want to make you feel bad about it. So again, we'll talk about ways to get around that. And then as the interviewee, sometimes again, you might want to be a little bit aspirational in your answers without even meaning to. The most classic example of this is at the doctor, when they ask you how much water you drink.

Oh, I definitely have eight cups of water a day, when it's really two, but you figure tomorrow you'll start drinking eight. So it's not all that different here: the more that we might signal the ways in which we want someone to answer, the more they might inadvertently feel they need to answer that way, without, again, intending to delude us.

Another way that this can show up is that if people can tell we care about something, they will probably want to spend time talking about it. So if they know that we're interested in pans, they can tell you the 50 things that don't work about their pan. Even if it doesn't actually keep them up at night, they can find things to talk about because they want to be helpful to you.

And so the more that you don't necessarily show your hand in terms of what it is that you want them to be telling you, the better. And the final human tendency to be aware of, which I think is really applicable to this group, is that we are not very good at predicting the future.

Try as we all might. And it can be so tempting for us to say, ooh, will you use my product? What will make you switch to my product? How much will you pay for my product? People can guess, but we can't bank on it. Really, the best predictor of future behavior is our past behavior, our existing behavior.

And related to that, we also don't have great memories. So asking someone how they were doing something five years ago probably won't be that reliable, but a recent, specific example is the best way to get at what they might be doing next. If you'd like, I can dive into some more of the specific tips around how to combat each of those forms of bias.

Brett: Yeah, I think that would be a great scaffolding. So maybe let's go back to the top and talk about confirmation bias, and given that human tendency, what should a founder do?

Jeanette: Well, this is the main tip and the least surprising tip, but I'm going to maybe put a slightly different spin on it, which is that you just have to avoid asking leading questions. A leading question is one in which we, in our minds, probably already have an idea of what the answer might be, and then we want to validate it, right?

So you might say, do you think this is easy? Do you struggle with this? In that case, we're implying that we do think they struggle with it, or that it should be easy. That may or may not be true, but they might feel compelled to say yes or no. So on the nuance of how to notice if you're asking a leading question, I have two tips here.

One, just notice if the question could be answered with yes or no. It often starts with, do you X, Y, Z? And if so, bing, you have a leading question. The second problem with only eliciting yes or no answers is that you might miss a lot of the richness. There are very few people you talk to who will just say yes, period.

They might elaborate a little bit, but it's a lot less rich than asking, tell me about this experience, or, what do you think of X, Y, Z pan? And we can give some more pan examples in a moment. The other thing to look for with leading questions is to notice if you have an adjective in there, an implied opinion. Even if, let's say, you've taken away the yes-or-no piece and you say, how easy was this for you?

That still implies that it was probably easy. So the way that I would rephrase this is, first, tell me about this experience. And then if you must ask about something a little bit more pointed, like, say, easy, then another way to get around it is to always train yourself to also ask about the counter.

So the positive could be, how easy was this? Or, to what extent was this easy, if at all? I love that extra little piece that you can put in there as well. That "if at all" gives them an out, so they can say, eh, it wasn't easy at all. So, to what extent was this easy, if at all? And then always ask the counter: to what extent was this difficult?

What did you find difficult about this? And there, I wouldn't even necessarily give them the out; make them tell you something that is negative. That's the main tip under confirmation bias, but there are a few other pieces here. The first one is mindset. Remember, you're coming in to try to learn things that are scary in order to not spend all of your time developing your product and finding out that no one wants it, or that you missed a key detail that would really have helped adoption.

So first, remember, it is an opportunity, and it's critical to be able to hear the scary things. And then you also want to make sure that as you are in the conversation, let's just say you start to hear something that feels really good, that's almost your alarm right there. The moment you hear, oh yeah, this product is great, that's a sign to say, I should ask the counter.

What is something that I can push on here? Also, as you run those five interviews, you'll probably start to notice a pattern after even three or four of them, which by the way is a sign that you're onto something good. Once you've really hit so many patterns and you feel like you're hearing the same thing over and over again, great.

That probably means you're done with this mini research sprint. But that's also the moment to say, how can I push on what I'm now beginning to assume is true? Can I ask a new question that even pushes against the patterns that I think I'm hearing here? And then finally, there's analysis: I recommend that every team does a 15-minute debrief right after every conversation.

One of the questions you ask yourself in that debrief is, what did I hear that related to my hypothesis, that perhaps supports it? And then also, what challenges it? Again, always ask yourself the counter, and just be kind to yourself. Know that it's not that fun to hear things that are tough, but you'd rather hear it now.

Um, and so again, it starts from your mindset beforehand. It's how you ask questions during, and it's also the pieces of information that you pick up on in your analysis after.

All right, great. Let's move on. Acquiescence bias.

With this one, it starts as early as the subject line in your email to recruit people to speak with, but the main tip here is to generally frame things a bit more neutrally. Hold off on talking about your product, is the simpler way of putting it. And this will allow you to get a better read on what is actually most important to the person talking to you, before they inadvertently jump into the mode of being really helpful to you.

So here's what this can look like with the chef example. Let's say that we're talking about finding other chefs to speak with. As early as that email, you want to be sharing at least a little bit of information about who you are and whom you want to speak with, but minimal information, when you can, about what your product is.

So you can say, I'd like to speak with chefs who are working in a busy kitchen and tap your expertise. And then make it really easy, this is just a tactical thing, but make it really easy for them to schedule with you, and incentivize them for their time, particularly if this is not someone in your network.

But as early as that, you don't need to say, I'm trying to build the fastest-heating pan, because then they will feel like they need to speak to that. So can you go a level up and just say, I want to speak with chefs who are under a lot of strain and want to learn how to maximize their time in the kitchen, something like that?

Even that, I'd say, I actually would correct myself and say that I don't even want to allude too much to which problem I think I'm solving for them. But I think that you can tease out this balance between a little bit of information about the space I'm interested in, and who is the right person for me to speak with, without giving everything away.

This also then continues as soon as you start the interview. It can be tempting to say, I would like to talk to you today about pans. And then people will feel like that's the first thing that they need to jump into. Instead, I would start a bit more open-ended. Most of the talking that I do is at the beginning of an interview, a lot less later on, and at the beginning you say, hey, I'm such and such.

I'm wanting to learn more about X, Y, Z. That gives them a little bit of grounding in the conversation. I'm going to eventually show you an early idea, but for now, I want to better understand you and your world. So for anything that I ask you, feel free to elaborate on what's most true for you.

And you also don't have to answer any questions you're not comfortable with. When I do that at the beginning and let them know what's coming, then they realize that answering my questions before they know what the idea is, is useful to me. Because otherwise they might just feel as though they're waiting for you to get to that.

So be very clear: these questions are what's useful to me. And this also gets at preventing acquiescence bias by letting them know from the beginning that they're the expert, right? That you want to hear from them and hear what's true for them, not just what you are interested in. Throughout, again, you'll want to recognize that they are probably going to try to be very helpful to you.

So once they figure out you're interested in pans, they're going to want to talk to you about pans. Even if they're complaining about pans, they're like, oh, we know that this is what's most interesting to Brett. But what if there's something else that was really concerning to them, outside of pans?

Then you have maybe missed that opportunity if you dive into that right away. We'll talk about this in my sort of level two, how to get unexpected insights piece. But I have a framework that I use for structuring all conversations, which is to zoom out first and start with the bigger context, then zoom in to the thing you know you want to talk about and your idea, and then zoom out with some final, higher-level question that helps wrap up the conversation. Those zoom-out sections really help with acquiescence bias, because they help you see what people really care about when they're not just digging into trying to help you with your specific idea and topic. And then I'll quickly move on to the future-facing hiccups that we have. There's a very simple tactical point here, which is to ask about specific recent examples and behavior, as we'll talk about in the zoom out, in, and out framework.

I'd suggest, let's say you know that you want to have a section in which you talk to them about how they feel about pans. In your section on pans, first ask them about what they're doing today: walk me through your way of preparing a meal today. It can be tempting at this stage to ask about generalizations, but the more that you can have them literally show you or tell you about a very recent example, the more it helps them really not miss any details.

And then you can have them speak about the most recent pan that they purchased, for instance, and after that you can ask about their attitudes. But you always want to balance attitudes with behaviors, and talking about these specific recent examples will help them show you what they're doing and what they might be willing to do in the future.

Brett: So let's go back to our fictitious pan company that we're considering starting. Can you kind of bring this all together? We went through phase one, which is, we started to think about the exact psychographics or demographics of the different types of chefs we want to talk to. And we maybe came up with three different groups of five folks.

And now I assume we want to plan and sketch out these conversations, and I assume there's some outline that we want to put together with these biases that you talked about in mind. Maybe you can kind of bring it to life with the pan example: we have the groups, and now we're gonna go think about what we're actually gonna ask and how we're gonna spend that time with those different groups.

Jeanette: This starts to get into some of the tips that I think are a big part of uncovering unexpected insights, and the biggest one here is the zoom out, in, and out framework. We can talk about how the pan fits into that, but I will spend a moment first explaining what I mean by this framework and even giving you another example from the First Round community to bring it to life in a couple of ways.

So the zoom out, in, and out framework is one that I use to think about the high-level flow of a conversation. And by the way, this is true whether you're in discovery mode or way down the line in usability mode. I always spend a little bit of time at the beginning zooming out to get a little bit of context on whom I'm speaking with.

What else are they doing? Understand their existing mental models, their existing context outside of and larger than the main area I want to focus on. And again, I'll illustrate this with a couple of examples in a moment, but even, for instance, at the later stages of usability, when you're mainly just wanting to get feedback on something very specific, you still want to understand how they're thinking about that particular topic, at least with a couple of questions that day.

So you'll always want to do this, but you want to spend a lot more time zooming out when you're in discovery mode. This is really helping you catch your blind spots and have context on where their answers are coming from later. The second section in this typical interview flow is the zoom in.

And this is the thing that you know you want to dig into more. This often relates to those learning goals. So back to those: if we think about the first chef questions, we said, okay, one of them is around understanding their top problems, and two, understanding their take on this faster-heating pan.

The larger problems are probably going to be in your zoom out, and then zooming in, we're probably going to want to talk about pans and how they're using them today. Or perhaps, if this is not about the specific pan, it can just be about what they do to save time. Whatever that angle is that you're taking, let's just keep it simple and say pans.

You're gonna want to dig into that. And then probably part of your zoom in is to get their feedback on your potential pan. By the way, for this level of idea feedback, you do not need to have a prototype yet. I'm even talking about very high-level value prop feedback. That could be your, quote, concept to test.

And then the final zoom out at the end is one or two questions that ask about the bigger picture again. So I'm gonna walk you through a couple of examples: one, the pan, and then another one, which I'll start with, that comes from the First Round community. This was working with another team, and they were wanting to serve small businesses who were hoping to build a bigger online presence as part of growing their business, specifically through Instagram. Think of a small restaurant. Think of this chef wanting to get the word out to a wider audience. Oftentimes you'll find that they want to post on a range of channels, and this team believed that Instagram, for instance, was going to be the tool to fix and to make easier for them to post on.

So, back to some of these tips that we were sharing: before we were working together, they'd often led their recruitment emails with, we're working on making Instagram easier for you. They'd start their interviews with, we want to make Instagram easier for you, and then dive right into feedback on Instagram.

When we instead did the zoom out, in, and out structure and started a bit more neutrally, the first piece of zooming out was better understanding what it was like for them to run a business. Just ask a couple of questions: tell me about yourself, and tell me a little bit about your business.

Start there, and then maybe, walk me through last week at work. And that starts to give you a better sense of what their world looks like. Zooming in a little bit more, the second set of questions was around just building an online presence. We said, tell us about what you're trying to do to grow your business today.

Which apps, if any, do you have a presence on? Tell us about the one you use most. In this case, they spent 10 minutes talking to us about YouTube. They didn't mention Instagram at all until the very end of that. Then we said, which other platforms are you on? And they listed 10. So pretty soon we realized that, wait a second, the problem isn't just that Instagram's an issue, but actually that there's this wider array, 10, 12, however many platforms that people need to be on.

Once we got into the Instagram questions, the customer ended up having plenty to say about Instagram and all that was wrong with it. But again, we wouldn't have even known that they were using Instagram until we specifically asked about it. They wanted to talk to us about something else. And then finally we shared the idea.

And a final zoom out question example here is, you know, if we could give you a magic wand and do anything to help you as, um, a small business owner or to improve your online presence, what would it be? And oftentimes you'll see that they'll answer something different than what you just zoomed in on.

And so when you have that final zoom out, it also helps you get some perspective on how important that thing you were just talking about really is. So again, zoom out is getting a bit of higher-level context. In the context of the pan, we want to better understand their larger problems.

We say, okay, tell me about your life as a chef. Walk me through yesterday at work, and what was that flow? And then, depending on what you think the problem is here, if this is about having a more efficient cooking process and handling scale versus, let's say, innovation, that would probably shape your higher-level set of questions.

Let me talk about how they think about efficiency, or let me talk about how they think about innovation. And then you zoom in a little bit more, and that's when you ask about pans and how they're using pans in order to stay efficient as part of their cooking process. Now get feedback on the pan, and then you zoom out and ask, you know, if there's anything that we could do for you to help you become more efficient in your business, what would it be?

So again, start with one or two sections of questions at the beginning to zoom out. Second, zoom in to the main topic you want to learn about and your specific product idea. And then one or two questions to zoom out at the end. And the step-by-step way to get to that is: first, you know that as soon as you're finished writing that research plan, you have a million questions in your mind.

What do they think of this pan idea? What are the competitors that they're interested in? Whatever those questions are, get them out of your head, put them on a piece of paper, and then set it aside. At least you'll capture what you know you're curious about.

Then go back to those learning goals and say, okay, the two things I really must cover are their higher-level problems and pans. Okay, now you're gonna start your high-level outline. So first, ask yourself about your zoom in. What are your one or two zoom-in topics? They're gonna be easy to come up with because they're the thing you're dying to ask about.

But just think about it at the high-level category stage. So, Instagram and my idea; write that down for yourself. And then you ask yourself, okay, what is the level of zoom above that? What is Instagram about? What is the category that it fits into for this customer?

So, oh, that's about building an online presence. Or in this case with pans, we're saying, in the context of what we think we're doing for them, it's about building an efficient process in their kitchen, or it's about innovation. At a higher level, what is this about? And then, once you've identified what that higher-level zoom-out topic is, you can start to consider what your final zoom-out topic or two will be.

So that second step is just to write your outline. And that's, by the way, the thing you need to actually memorize for your conversation. You don't need to memorize every darn question. You say, number one, it's an intro. Number two, I'm going to ask about how they are building an online presence.

Three, Instagram. Four, my idea. Five, wrap it up with a zoom out. And that's the thing you memorize. Underneath each of those, particularly your main zoom-out and your main zoom-in sections, that's where you want to ask a little bit about their behaviors and a little bit about their attitudes.

So the behavior is where you bring in, walk me through what your process is for running the kitchen for the entire evening. Tell me about what it looked like yesterday. Another follow-up question I love is, how has this changed over time? Right? If we're asking, ooh, would they switch?

Well, one way of getting at that is, why have they switched in the past? So, in what ways has this process changed over time? What prompted that? And this could be a good time to think about, again, instead of asking about what they'll pay later on, asking about what resources they put behind some of these things now, as they tell you about their process.

Say, what resources do you use to support this system? And that could be about people, budget, tools, et cetera. Then after you do a bit of that behavior digging, ask about their attitudes. And this is where my second tip for getting unexpected insights comes in, which is to ask about this main topic two or three different ways.

So you might want to know about Instagram, let's say. Well, first of all, even just asking them to walk you through their process is a good way for you to spot an issue they might not even surface. If they say that it took 14 steps to post something, that's already a problem that you're noticing, right?

Let's say it should only take three. And then when you're moving from behaviors into asking about their attitudes, you can say, what works well about Instagram for you today, if anything? Then don't forget to ask the counter: what doesn't work well about Instagram today? And then ideally, here's where you keep going.

And maybe, if this is really important to you, ask about it another way. Say, hey, if someone were wanting to start a new restaurant and post on Instagram, what advice would you give them? That tells you another angle of what they think works or doesn't work about it. Or you can say, if we had a magic wand, everyone's favorite question in user research, what would you want to change about Instagram?

That's another way of getting at their take on what works or doesn't work about something. So if it's really important to you, ask about it two or three different ways, ideally even across how they're behaving and what their attitudes are.

Brett: So there's a ton of stuff to dig into there. I guess one area that you mentioned briefly, but I wanted to talk about a little bit more, is toward the back of the conversation, when you are showing them the thing you're thinking about building, or you're explaining it in a certain way. And we'll get into this maybe a little bit later, but in the case of this specific question, you haven't built anything yet.

You're doing very, very early customer work. How do you do that well? And maybe we could use the quick-heating pan as kind of an example of how you'd actually go about explaining it in the most productive way.

Jeanette: Yeah, it's a great question, and again, I appreciate when we get brave enough to show concepts that are really early. This even goes back to preventing that confirmation bias: the earlier you show something, even at the one-liner value prop level, the less attached you'll be to what exactly it needs to be.

And the more you can get really helpful feedback that points you in the right direction. So I commend anyone who wants to jump in at this stage that feels like it's almost too early to show a concept. It's not too early to show a concept; always get feedback on something.

So let's imagine that you have started the conversation with that framing at the very beginning of the interview, where you say, you know, I'm wanting to better understand you and your world, and then I'll show you an idea; please be open with your feedback. And then you lead through those zoom-out sections, ask about some of those zoom-in questions, and then you get ready to show your concept.

That's the next time that I talk a bit, and that's when I first ground that section in reminding them that this is a really early concept and that I want their honest feedback. So I'll say, all right, thanks so much for telling us more about your world. I'd love to shift gears toward a really early concept that we're working on.

At this stage, your constructive, unfiltered feedback is most helpful. You know, you can help us get this right. And in fact, something I'll usually say, and this usually benefits from a visual, but hopefully you can picture me saying this, is: imagine that you have a microphone on your brain.

And when I say that, I will put my hand into a fist and put it on top of my head as if it's a microphone on my brain. And by the way, it's somewhat disarming for the audience. Judge your audience and see if it's the right set of folks. But it's almost strange to see an adult saying that to you and showing you that.

But it's really an invitation for them to be open with you. Again, back to preventing a bit of acquiescence bias from them, you can say, pretend you have a microphone on your brain; I really want to hear your thoughts as we're talking through this. And this is true whether you're talking to them about a one-liner value prop or you're showing them a more baked prototype.

What I find helpful about that particular framing is that, again, it's so out of the ordinary for people that it makes them pause. And so then, when they're responding to me, often they'll say, oh, well, I guess if I had a microphone on my brain, I would tell you X, Y, Z. And you can see it's something that they wouldn't have otherwise felt like they should share.

People aren't even sure what level of sharing is helpful for you. Again, they want to be helpful, so that is an invitation to it. If that doesn't feel comfortable, ask for unfiltered feedback. So anyway, that's the time to do some of that grounding, and then you can say, we are working on a really early concept for a pan that heats up in half the time of conventional pans.

So let's just say that there is a more multifaceted idea here, and I can share one example with you in a moment. I would still break it into the chunks that I share. And if it's just at a very high level, we have a pan, there isn't much more detail, I would start with just that one sentence or two.

I take out as many adjectives as possible, even if it's supposed to be a value prop, and just say, it's a pan that takes half the time to heat. And then I would pause and give them time to react in an open-ended way. I would direct that by saying, what comes to mind when you hear that?

And pause and see what they say. Notice there were no adjectives in there; it wasn't, what do you like, what do you not like? I can get to that, but I first want to hear what comes from them. Part of the reason you do that is it gives them more space to tell you what they don't like, but it also lets them supply you with the adjectives and words that you should be using in your more developed value prop later on.

And probe them to tell you a bit more: so, what comes to mind when you hear that? And then, if they say, would it work this way, turn the question back on them: how would you want it to work? And then you can start to get into something more directed: what, if anything, sounds valuable about this?

And then remember to flip it to the counter: what, if anything, does not sound valuable about this? That's also a good moment to avoid bias, because they will possibly say, nope, sounds good to me. Frame the question again, probe on that a little bit more, and say, okay, well, what open questions do you have about it?

What concerns do you have about this? Some other optional, but I think really handy, follow-up questions when you're in this mode are, one, ask them, who do you think this would be good for? And then, who do you think this would not be good for? This gives them an opportunity to, sure, maybe help you think of another customer, but also to tell you if it's not for them.

And then another set of questions I think can be quite helpful, which I mentioned before but I'm gonna bring up again: one is, if this were available tomorrow, what would you do? And they might say, oh, well, hmm, I'd need to throw out my pans; I don't know if that really makes sense for me. Or they'd say, yeah, great.

I would just train the people on my team to use it, and we'd get going. Well, then that even helps you see, like, oh, I might need to help them with training materials. Or you say, if this were never available, what would you do? And if they say, oh, well, I think I'd be fine, versus, oh my gosh, I can't even imagine my world without a pan that takes half the time, those are really different reactions for you to play around with. And of course, you can always throw in a magic wand question there. So that's generally how I'd approach sharing a concept. If you have a more multifaceted concept or several to share, there are just some slight builds here.

So a multifaceted example here: one of the teams I've worked with via First Round was wanting to offer credit cards to particular types of small business owners, let's say optometrists and ophthalmologists. And there were a few different features of these particular credit cards, but we broke the value prop into a few different sections when we talked about it. The first one was just to say, we're building a credit card that is specific to optometrists.

Stop the sentence there and then say, what comes to mind when you hear this? And then the question you can ask in that case, too, would be, how might you want this to work? Which is not asking them to invent your product. And people don't always entirely know what to do with that question, but oftentimes they'll find a way, and they'll say, well, actually, I think that if it were to fit optometrists, it would do X, Y, Z.

And then let's say you have an idea, and you're like, ah, well, here is how it would work: it would have this percentage of rewards, et cetera. Well, in that case you come back and say, okay, thank you for that initial feedback, here's a little bit more detail. And then tell them the additional detail.

Get open-ended feedback on that: tell me what comes to mind when you hear this. And then at that point, once you've shared everything, you can ask more detailed questions about what they'd find valuable or not, but break it into those chunks. And this is similar if, let's say, you're presenting two or three similar concepts: I would present them one at a time.

Ask for just some very light feedback: what comes to mind when you see this? How might you want it to work? Or, what do you find valuable or not? And then wait on those more in-depth questions, like which one is your favorite, who is this good for, et cetera, until the end, after you've discussed all three.

That's probably for people who are at a little bit later stage of development, but you can still plug and play this sort of formula at that stage. Also, by the way, one tip that I have missed but have been alluding to is that the first thing you can do to improve your conversations is to ask one short question at a time.

This is probably the biggest starter tip that I suggest teams begin with, and you can try it as early as dinner tonight. We tend to, as humans, want to offer a lot of preamble: hey, you were saying this earlier, and thank you for sharing this or that before, and I was thinking about this, and I know this about your space, et cetera.

And then, followed immediately, there are two or three or more questions: so what did you think about Instagram? What do you like? What would you change? What was easy? It can be really natural to do; that's our own acquiescence bias coming out again, by the way, because we want to make people feel comfortable with our preamble, and we want to make sure that they know we're curious about all these different things.

But the preamble can bias them. Um, and if anything, we don't always wanna allude to what they've already said, because what if they tell us something different at the beginning and the end? That's really interesting data. Let's save room for those contradictions, if that happens to come up.

Or just also give people space to answer it in a more nuanced way than before and not just say, oh, nope, you can go with my previous answer. And then two, if we ask multiple questions at a time, people will normally only answer one or two and not in as much depth. And the bonus is that when you ask one short question at a time, tell me what comes to mind when you see this.

You stop. I can guarantee that someone will wanna jump in and fill the silence, but it also signals that you wanna listen to them. So when I just shared all these sample questions, tell me what comes to mind when you hear this. How would you want this to work? Um, what works well? What doesn't, what concerns do you have?

What do you find valuable or not? Space those out; those are all asked one at a time. And I know that that will sound like it's gonna take forever in the interview, but actually you'll end up getting through a surprising amount because, one, you aren't wasting time asking long questions. And two, it's so open-ended, or it can give so much space for them to talk, that people will honestly often answer about three questions' worth at once.

So you still get through plenty.

Brett: What are your thoughts on figuring out willingness to pay in these types of conversations? Let's say that's one of the couple of goals you have in doing this exploratory work. What are the traps to look out for when you're trying to basically think about how much value this is creating for an individual, and how much of that you can capture in the form of some kind of...

Jeanette: This is such a tricky one, and so tempting. We do want to be able to just ask someone, how much will you pay, and have them answer us. But back to the piece about not predicting the future and people being nice to us: we can't. It's a hard one to ask directly, and I would basically never suggest doing so.

Um, instead, again, this is why we ask about existing behaviors. So find out how much they are investing in kitchen tools now, how they've thought about that, how that has changed over time. And especially, let's just say they might not have your exact tool. You wanna think about the category that it's in.

So, for instance, we might not be thinking just about pan budget, but maybe about all kitchen tools, right? Are we a part of that larger budget, or are we a part of a specific one set aside for pans? That might help you also get a read on what their budget looks like. Um, there is also another sample set of extreme questions that you can ask.

Almost like the ones that I mentioned earlier around, if this were never available. Or actually, even another interesting way of asking that is, if this were the only tool you could use, how would you think about it? But you can also do this related to price: you can come up with an obviously very high number.

Let's say that you think of a pan as being a hundred dollars. But what if you said, we're trying to understand the value of this to you. What if this pan were priced at a thousand dollars? What comes to mind? So go to some number that feels obviously extreme, and then you might get a really interesting off-the-cuff answer like, oh my gosh, no way.

I mean, it's great, but I would never spend more than $150. Boom, there's your anchor about how much higher they might go. 

But I still would rather make sure that I'm asking them about how much they're investing in these areas today. And also, in that category, how they're spending money on stuff that might be related to that same goal. That should get you a sense of the different ranges that they're willing to spend in.

Brett: Before we get into the analysis phase, I'm super interested to learn more about specific types of products or innovations or ideas that you've noticed are particularly difficult to do user research around if there are any.

Jeanette: I think that probably the first one that people will be aware of, that can be tricky to research, is really anything in the B2B space. And I'm sorry to say this cause I know that a lot of people are working in it. The first reminder I wanna offer is that a business is not a person, right?

A restaurant, for instance, isn't a person. So when we say, I wanna serve restaurants, you say, who am I talking about? Is it the business owner, the floor manager, front of house staff, back of house staff? And so part of it is that you need to try to figure out the needs and behaviors of a few different groups.

And you can usually boil this down into a few different groups. There's one that's the decider, the person who says, I would like to bring your HR software into our business, and I'm going to make a pitch for this to the people in group two, who are the ones that

hold the purse strings. Three, there's the end user. In this case, I'm actually referring to someone in First Round's portfolio that I worked with very early on. And again, they were building software for HR admins. The decider there had been the head of HR, but the end user they needed to design for is the HR admin who's actually using the product.

The person in group two, as I alluded to, the person who held the budget here, might be the head of people or even the CEO. And then there's even the end end user, which is the employee at those companies, right? So first, just recognize that it's more complex to understand the needs and to be able to change the behaviors of all of those groups. That is not to say that you don't do it at all.

Of course not. But as you're doing this discovery, make sure to get to know a couple of those groups the most, probably your end user, in this case the HR admin, and the decider, the head of HR. Better understand them and then you'll know more about how to get in there. But that just makes it a bit more complex, of course.

And I don't know, I find something fascinating and worth researching about basically any topic that I dive into. But I would say that it's especially hard when you're thinking about a crowded space; that's hard to build for and hard to research for. This is where you wanna spend a lot more time zooming out, for instance, in order to make sure that you maybe uncover needs that your competitors aren't thinking about.

But I find that, in general, that's just still a lot harder to plan. I can't emphasize enough that if you're going to be going into a crowded space, you have to spend even more time in zooming out and in getting creative with your synthesis to see what you can spot that you think others have not yet spotted.

That's probably a bit farther, um, out of what people have been thinking about so far.

Brett: 

So let's move on to this sort of last important pillar, which is the analysis. You have this set of conversations, you're gathering this information, and then what do you do? And maybe, maybe something you didn't actually mention that we should talk about is how are you gathering and organizing this information as it's going on in real time?

And then let's flip over to the analysis portion.

Jeanette: Yes, of course. And the first stage of analysis is in the middle of the interview, or even before you start it, as part of your prep. What I would highly suggest for people is two things. Um, one, actually three things. One, to have a note-taking template. It can be as simple as just a doc, or sometimes you can have it grouped by, let's just say, your zoom-out-and-in framework or separate questions, so it's really easy for a note taker to plug in how they're answering. Um, I don't want it to feel like it's too much work, so if it feels simplest to just start with a blank piece of paper, that's great.

But underneath that template is where I would also write out a quick debrief formula as well for each interview, which I'll describe in a moment. So get your note taking template in which you have space for each interviewee. Write down any context that you need. You can just have an open note space or you can add a little more structure.

You don't really need to. And then a space for debriefing. The second thing as you're prepping is, I would suggest having a note taker. And the third piece is a recorder. Um, and the reason I suggest a note taker is that you are gonna be building with someone else, and it's always faster and easier for people when you both hear the information live.

Also, even if you are using a recording program, I would not recommend banking on going back and watching a recording or reviewing a full transcript in order to analyze; you're moving faster than that, and that's okay. Um, so instead, I would suggest having someone also take the notes live in that document that I just mentioned, and that's where the first stage of analysis can happen.

And I'll share that framework in, in just a moment. But I'd start with a notes doc, a note taker, um, someone on your team who can help do that. And three, um, I would find a recording program, if that feels comfortable for the people that you're interviewing. And I, and I would get their permission of course, to do so.

The reason I'd suggest recording would be that you don't wanna miss anything, and that's also what you tell them. Say, I would like to record in order to make sure that I don't miss anything; we can't type quite fast enough. Would that be comfortable for you? If you do that, I would suggest using a transcription program as well.

One that auto-transcribes, and if you have one of those programs, you can also, for instance, have your note taker highlight the live transcript as the conversation's going, if it's a program that allows for that. So the first piece of setup for how you gather it begins before you start.

Then in your session, I would suggest that you, as the person running it, can take light notes; if that distracts you, don't. I actually tend to find it a helpful way to ground what I'm doing. And a funny tip is that I will usually handwrite my notes, because I think it really, somehow, in some traditional way, can signal to people that you're really listening to them when you're not typing.

Although you can always just say, hey, I'm typing notes, and that's not a problem; it's just a little bit more my personal style. But it also means that as I look down at my notebook, which they will see as me taking notes, I can also look at my interview guide quietly without making that very obvious, so that it doesn't look like I'm ever reading a script.

But then I will be able to glance at it and just really see if there's anything that I'm missing. Um, and here's where I'll get to the framework for analysis that comes up in the debrief notes and in what you keep track of during the interviews and after. For me, analysis is broken down into three steps.

One, capturing data, in this case the qualitative data. Two, getting to insights. And then three, bridging that to action. So, DIA: data, insights, action. And what I would suggest is that after every single interview, put 15 minutes on the calendar with the person that you led the interview with, and make sure that you write down: what are some pieces of data that stood out to me from this interview?

And it's kind of a mix, basically show and tell from the interview. The tell would be: what did they directly say that was interesting? In this case, I would look for something that confirmed or maybe challenged my hypothesis. Again, always do the counter. So look for a bit of both.

Write down anything that directly, um, spoke to those pieces as well as anything of course related to your learning goals. But I also like to look for things in which they show what's important to 'em. So here my checklist would be something like really strong emotions. Back to that question of, you know, when I ask people, what do you wish you could spend less time on?

And they give some big audible sigh, I'm gonna note something like that. Wow, why did they have such a strong reaction to that? What is that about? What were they talking about? If they went off for 10 minutes about something, I'm gonna pay attention to that. What was that thing that they cared about so much?

Another show that I look for is something called a workaround. If someone has cobbled together a solution, they might not even realize that they're solving a problem. For instance, this might be that you have, again, those 14 steps to make an Instagram post that involve starting it in a separate notes doc, and then moving it over and consulting with this or that person.

You can see that that is a lot more steps than might be necessary, and it signals there might be a problem. So, again, in data, I would capture the three or four things that most came up in that interview while it's fresh: the quotes that confirm or disprove, some of these behaviors, strong emotions or workarounds, and of course anything interesting that you find.

But those usually help me expand my thinking. So, going back to the prep, I'll put those fields into my note-taking template: I'll have data, insights, and action. And then in the interview, ideally you're already paying attention to what data you wanna capture later on, just to save yourself a little bit of time.

So I, as the person often leading the interview, might still, as I'm taking notes, just put a big star by something, or put a big E if it's an emotion, or an H if it's something related to my hypothesis, or something like that. It's called a tag in research. You don't have to come up with some specific system like that.

It'll just make your life easier later on, because you're going to get hundreds of pieces of qualitative data in an interview and you don't wanna have to sort through everything later on. And again, your dedicated note taker would even be able to do that more easily. They can bold quotes as they're putting them in, or put other notes by things as well.
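To make the DIA debrief and tagging idea concrete, here is a minimal sketch of what a structured per-interview record might hold if you kept it in a lightweight script rather than a doc. The field names, tag codes, and example content are illustrative assumptions, not a template Jeanette prescribes.

```python
# A minimal sketch of a DIA (data, insights, action) debrief record,
# assuming one structured entry per interview. The tag codes
# ("E" emotion, "H" hypothesis-related, "W" workaround) and all
# example content below are hypothetical, not prescribed.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DataPoint:
    observation: str                 # what they said or showed
    tag: Optional[str] = None        # e.g. "E", "H", "W", or None

@dataclass
class InterviewDebrief:
    interviewee: str
    context: str                                          # background on this person
    data: List[DataPoint] = field(default_factory=list)   # 3-4 standout pieces of data
    insights: List[str] = field(default_factory=list)     # your early interpretation
    actions: List[str] = field(default_factory=list)      # follow-ups, question changes

# Example of a 15-minute post-call debrief entry (hypothetical content):
debrief = InterviewDebrief(
    interviewee="Optometrist, solo practice",
    context="Runs the business on a personal credit card today",
    data=[
        DataPoint("Big audible sigh when asked about monthly expense reports", tag="E"),
        DataPoint("Keeps a side spreadsheet to track equipment purchases", tag="W"),
    ],
    insights=["Expense tracking, not rewards, may be the sharper pain"],
    actions=["Add a question about how they reconcile expenses today"],
)
```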

So, back to those pieces of what happens after. Once you've set up this note-taking template and kept track of some data during the call, if you can, then I suggest sort of small, medium, and large ways of doing analysis. The main one that I think gets people a lot farther than they'd think is that small version, which is the 15 minutes after every call.

And with your first couple calls, make it 30 minutes cuz you're still grounding in how do I look for data? Um, but it will make your later synthesis so much faster. And it's so much better when it's fresh. I know it could be tempting to do back to back calls, but you already wanna put a buffer between your conversations just in case someone shows up late or there's a technical error, you already want that buffer.

Use it to debrief the call that you just came out of. Um, outside of capturing those three or four main pieces of data, uh, you can start to think about your early insights. Um, an insight is your, your interpretation of what's going on. Why was that important, what patterns are you starting to see? It is your, your opinion on it.

You don't want to make conclusions about your project after only one call, but it's okay if you start to have some high-level takeaways, and it's certainly interesting if you start to notice patterns. And then the form of action that you take in this 15-minute debrief is really as simple as: what are some things I might wanna follow up on?

Do I need to change any questions in here? Did I promise them that I'd follow up over email? That sort of thing. Or, hey, do I feel like I'm hitting on patterns, but I'm worried about confirmation bias here? Is there a better question I can ask next time to get at something that disproves this early idea that I have?

The medium way of doing this I also find to be, for lack of a better term, a game changer for teams if they can really get into the rhythm of it. And this would be to have even just an hour of synthesis every handful of conversations, or once a week. Oftentimes I talk with really early-stage teams who are just constantly talking to customers, and it's just been a real opening for them

if on Friday mornings they put an hour in their calendar to synthesize what they learned that week. And this is where those mini debriefs are also really helpful. You don't even have to go back and reread all the transcripts, although you're certainly welcome to. But go back and look at those debrief notes, the data, and some of the early insights that you had, and start to see: am I noticing patterns across those?

Um, it's also a good time to start to play the five whys game that everyone loves, where you say, okay, so I noticed that they took 14 steps to post on Instagram. Why are they doing that? Why is that important to them? At each level you have an interpretation; now you're starting to interpret, and when you interpret, that's when you get into a deeper sense of why this is important.

What else is going wrong for them? And that's where you get to an understanding that others don't tend to have. I'll give you a quick preview of the large one, and then I think it's probably worth us talking more about how you get into these more unexpected insights. But large is where you might want to more systematically look back at all of your interviews.

Um, I'd usually recommend a method called affinity diagramming. You'll see it as that classic thing that designers do, in which there's a bunch of Post-its on the wall, but really what you're doing is wanting to grab data from each of the interviews that you've led. Go back through and see if there's anything you missed that was not captured in your mini debriefs.

And go through each person and then, ideally, start to put each piece of data on, let's say, a Post-it. You can do this via an online program, like let's say Miro, and move around those pieces of data to clump them together and see which ideas relate to one another. This is the affinity piece of it.

And then you start to see patterns that emerge. You can do it in a top-down and a bottoms-up way, in which the bottoms-up way is to see, wow, here's all the data that came outta my interviews, including things where maybe I don't yet know what's important about it, but it seems they had a big emotion about it.

I'm gonna put it on a Post-it. You can see what naturally emerges from the data as patterns, especially if you put in some of those more unexpected things that you heard or saw. The top-down would be, hey, I know that I'd like to see what the feedback is on this feature, that feature, and that feature.

And you almost already have your clumps of things that you want to notice patterns around. It's good to do a bit of both and to compare and say, what did I hear about these features? But I always like to do the bottoms-up approach as well, in which you see what emerges from the data, which is, you know, starting to get at that question of how you get to some insights that are a bit deeper than you first would have encountered.
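To make the bottoms-up affinity step a little more concrete: if you keep your debrief data in a structured form like the sketch earlier, the clumping step can be roughly mimicked in a few lines of code by assigning each piece of data a theme label as you review it and then grouping by theme. This is only a rough stand-in for moving Post-its around on a wall or in Miro, and the data points and themes below are hypothetical.

```python
# A rough sketch of bottoms-up affinity grouping: label each piece of
# interview data with a theme as you review it, then clump by theme and
# surface the largest clusters. This only mimics the manual Post-it /
# Miro exercise; all data and theme names here are hypothetical.
from collections import defaultdict

# (interviewee, data point, theme assigned while reviewing)
data_points = [
    ("P1", "Sighed about monthly expense reports", "expense tracking pain"),
    ("P2", "Keeps a side spreadsheet for equipment purchases", "expense tracking pain"),
    ("P3", "Excited about rewards on lens orders", "category-specific rewards"),
    ("P1", "14 steps to reconcile a single purchase", "expense tracking pain"),
]

clusters = defaultdict(list)
for person, observation, theme in data_points:
    clusters[theme].append((person, observation))

# Print clumps from largest to smallest to see the strongest patterns first.
for theme, items in sorted(clusters.items(), key=lambda kv: -len(kv[1])):
    print(f"{theme} ({len(items)} data points)")
    for person, observation in items:
        print(f"  - {person}: {observation}")
```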

So just to summarize this a bit, remember to think about data, and then start to jump into insights and think about action, what you do with it. So often teams will want to jump to the solutioning mode of action, or even the insights interpretation piece. But the more you ground yourself in that data, the more unexpected pieces you can pull out and the more conviction you'll have in where it's coming from.

And don't forget to do it in small and medium ways at least. The bonus would be to try Affinity Diagramming, but at least try the small and medium as well.

Brett: So now that you've kind of taken us through these three sort of key pillars of the customer development or research process, I'm curious, what have you noticed in terms of what goes wrong even when you do this well? 

What have you noticed that tends to go wrong? Maybe it's just luck and randomness, or maybe there's a set of things that end up happening when you're translating these insights into product and into early customers. What's going on there?

Jeanette: It's a great question, and I think there's two main topics that I wanna touch on here. One is when you do this research a bit too late to really actually steer the ship. And the second is when perhaps you don't take enough time to really deeply understand what you've learned at the end and enable the team to actually apply it to the product.

So the first piece, around coming in too late, is kind of classic. Often, as I mentioned at the top, we don't always identify that we are doing user research at the beginning; we're just kind of chatting with customers. And that framing, and the lack of rigor that can come with it, can do you the disservice of not bringing in some of this rigor and getting more out of it early on.

But many times when I've come into companies, I'm employee 1,000, and a lot has already been baked by then. It's not that you need to bring in a UX researcher from the very beginning, but a lot of times, by the time that we're either bringing in more dedicated UX research practices or asking ourselves some of these questions, it's already towards the stage where we're saying, did I build it well?

And in fact, the real power of asking these questions and doing this work is as early as possible. This is why I'm so passionate about working with this audience. It should be from the beginning, in order to actually help shape where you're going. There's two dimensions to this.

One: am I taking the time to really get to deep enough insights, and have I figured out how to then apply them to the team? It's this last step of synthesis that I think is the one that we're most likely to shave off and constrain. But then it means that you don't get nearly as much out of the research, and it can have these trickle-down effects to maybe not actually even really impact the product that you're trying to serve.

We know that we need to take a little bit more time to learn, but we really just wanna synthesize today and be done. And so, one, with synthesis, even when we're moving fast, you need a little bit of time for it to percolate, right? It's not that you need to take a week to synthesize instead of an hour, but maybe something in between.

And this is actually one of the benefits of doing that light synthesis along the way: things start to percolate. Unfortunately, you can see that it's very tempting for teams to want to just take what's said at face value. And this is the sort of classic line from Henry Ford saying, if I would've asked customers what they want, they would've told me to build a faster horse.

And by the way, we don't know whether he actually said this, but I think it's an important point to bring up. And back to even the pieces of data that we gather: it's so tempting to hear that someone says they want a faster horse. Maybe in your zoom-in section, you'll end up hearing a bit about people's very specific requests.

And as people who are driven to help people and to act on things, I see that so many teams can really quickly just say, okay, I'm gonna act on that. I'm gonna build a faster horse. Let's do that feature. But what we really need to dig into is: where is that request coming from? What is that about? What is the why?

This is where that five whys game can come in, where you say, why does this seem to be important to them? This is also why you want to have done that zoom-out piece and make sure that that's some of the data you're pulling into your analysis. What else were they talking about when they weren't yet talking about Instagram?

What were their headaches before then? What else are they spending time on? What does that say? What is that helping them achieve? We can so quickly jump to the solutions that we don't ground ourselves really deeply in the problems overall when we're building, or even as we're trying to interpret what they're saying when they ask us for a faster horse.

So ask yourself, what is the faster horse about? What is going on here? Oh, well, maybe their lives are suddenly moving much faster. They have twice the number of demands as before, and there's simply no time and space to move between them efficiently enough. Or it's about needing a renewable resource at home, or whatever the kind of building block of why they're asking for that is.

That's the thing, first of all, that we need to actually take time to understand, and we so often rush through it cuz, again, we're listening to our customers. But that doesn't mean you need to take everything that they say at exact face value. That's not listening to them; understanding them is what you're aiming for here, which is why, again, you get into this zoom-out piece and understand them more deeply.

And then the second piece that I think we move too quickly through, which relates to equipping teams, is that we can then almost be ready to just build the next iteration without even thinking about how to apply all these learnings that we've had to it. So one, take a little more time in synthesis to let it percolate, and two, dig in and see what's actually happening behind the ask.

And then you say, okay, I think it's about helping people move through their lives faster and having a renewable resource at home. And then, step three, brainstorm as a team and say, okay, what does it look like to use these different components? That could be a faster horse, that could be a car, that could be a bicycle, a rocket, whatever that may be.

But then you're being more grounded in the principles of the problem you're really trying to solve. I see teams of all stages doing this, and that's where I think you don't have the same success that you want: you're so earnestly trying to listen to people, but you can't only listen to them at face value.

You have to understand where that ask is coming from and use and interpret that and figure out how to put that into what you're building. And that just takes a little bit more time, a bit more nuance than people often want to devote, especially at early stages.

Brett: Great place to end. Thank you so much for joining us, Jeanette.

Jeanette: You're welcome. You know, we covered a lot today, and I'd urge the audience to take time and just think of one main takeaway that you have from listening, and come back to it. A lot of times what I see from teams is that there's one thing that stands out to them now that you could practice for a month, and then in your next round, after a couple more mini research projects,

try another one. Just try one at a time, and that's plenty to start with. And of course you can always work with a researcher if you want help with the others.

Brett: Awesome. Well, thank you again.

Jeanette: Well, thank you for the time and thoughtful questions yourself.