How and Why to Use Customer Feedback to Build Great Products with Janna Bastow

Aurelius Podcast – Episode 67 highlights with Janna Bastow:

  • Using UX research to inform product roadmaps
  • Stories about how to avoid major code and product rollbacks by simply conducting good UX research
  • Using customer needs to truly innovate and avoid building a better competitive copycat product 
  • How to avoid building a Frankenstein product

In this episode we have Janna Bastow, the CEO and co-founder of ProdPad, a software tool for product ops and product managers. Janna started ProdPad because no existing product management tools worked the way she and her co-founders wanted to work. Our conversation highlights the importance of balancing customer feedback with strategic business objectives, and going beyond “genius design”.

Janna discussed how critical it is to use customer feedback and UX research as you go from early-stage decision making in a new product, where you may even think you’re the user (which, spoiler alert, you never are), to a much more mature business and product.

We also talked about having the right balance between acting on customer research and feedback and doing things the business needs to be successful. Ruthlessly prioritizing, as Janna puts it, between customer needs and strategic objectives often yields the best results and Janna shares some personal stories and advice on how to do exactly that.

Links from this episode:

This podcast is brought to you by Aurelius, the powerful research and insights tool. Collect, analyze, search and share all your research in one place.

Janna Bastow podcast on How and Why to Use Customer Feedback to Build Great Products

Episode Transcript

(this transcript was automatically created using our very own transcription feature in Aurelius and has been minimally edited 😀 )

Hey, Janna. Hi there. How are you today? Yeah, great. Thanks for having me here.

We’ve been waiting to chat with you for a while, and this is actually a little bit later than we wanted to. We had to do some rescheduling. So I’m really glad that you were kind enough to take the time and join us. We’re definitely excited to chat with you.

Yeah, excited to be here. Thanks for having me on, for sure. So with every show that we do, before we get really into any of the topics, we like to have folks talk about their background and introduce themselves, so that if somebody doesn’t know who you are or hasn’t followed your work yet, they have an idea of where you’re coming from, as a means to understand your perspective and the conversation we’ll have.

Yeah, absolutely. So folks might know me as one of the founders of ProdPad, which is the software for product people. So it’s a tool that helps you figure out what you should be building based on feedback from your customers or ideas coming from within your team or what the overall goal for your company is. So you can outline your vision and then map out the steps in your strategy into your roadmap and make sure that everyone’s aligned around that roadmap and you’re building the right stuff and sending the right stuff to your development team. So it’s a tool that helps with that.

My background originally was as a product manager. So basically I needed tools to do my own job, and they didn’t exist. So I got together with another product manager and we started hacking away at this problem. You might also know me and Simon, my co-founder, as a couple of the founders behind Mind the Product, which is the big community of product people. And again, this was sort of started based on a need.

We didn’t start it because we knew everything about product management. We started it because we wanted to know other product people and bend their ear and figure out whether what we were doing was the right thing or not. And by surrounding ourselves with lots and lots of product people, we learned so much about best practice. And a lot of that best practice is now being applied back into what we’re building here at ProdPad. And it’s done a lot to accelerate my career in the product management space.

Yeah, I love that. And thank you for sharing that, because sometimes people may not know that about you. I think a lot of folks probably know you from ProdPad, and even some of your speaking now, but that background, to me, you know, I’m really biased, because that’s very similar to how we have worked at Aurelius. My background is in UX and UX research. And I was like, we don’t have anything that helps with that specifically, so we should do that.

Right. And I just love that, scratching your own itch, right? And I think that a lot of the most successful tools and stuff are built with that same approach. Right.

It’s somebody who has the domain expertise, like yourself, and they saw an opportunity to do this because they just wanted the problem solved, not necessarily because they saw a business opportunity or a cool idea, but because this would actually help me in my life, and I’m pretty sure the people I know too. And I love the fact that you just went and built it. I mean, I think you’re underselling it.

I mean, the product is pretty huge. There are thousands and thousands and thousands of people from across the world. That’s no small feat, as you were building a company and a product as well, which is definitely impressive. And there are lots of people very happy with what ProdPad is doing. And so I would say I think you’ve done a pretty good job at that.

Yeah, of course. And all of that’s really helpful background, too, for what I was hoping to chat with you about today. And I think it’s a really awesome meta topic for you, which is how do you figure out what feedback you’re getting from customers should be implemented? How do you figure out how to prioritize that?

And I would just want to add a lens to that as like a high level. A lot of folks listening to this are likely going to be in UX or going to be in UX research. There’s probably a lot of product people too, and I think everybody’s got sort of different levels of understanding how you ought to do that. Right. So maybe first let’s just talk about what sort of feedback should we be collecting and trying to get a sense of, to even inform product to begin with.

Yeah, absolutely. And you sort of touched on something that I think is important to address, which is that a lot of tools, like ProdPad, like Aurelius, like a lot of other companies out there, the inception is from professionals who try to go solve their own problems. But one of the lessons that I learned really early on is that even if you are your user, you are not your user. We made some really big mistakes that involved having to roll back literally months of work.

Right. Code that I had hacked away on in my bedroom for months and months and months thinking, this is the way to do it, because this is the way that I do it. And then when I started getting it in front of actual customers, actual other product managers, I realized that that’s not the way that everyone else does it. That was just the way that myself and my co-founder did it. And so even if you are your user and you’re quite representative of that space, you’ve still got to get out there and listen to customer feedback.

And so that was an early lesson. It was a bit painful throwing out months of work and having to rejig what we were doing, but I’m glad I learned it that early on, because some people don’t learn it early enough, or at all. That was still when we weren’t paying developers, it was just our own time, and we hadn’t quit our day jobs at that point in time either. But even still, it was an expensive lesson learned. And so to your point of which feedback to listen to: when you’re super early, it’s whatever feedback comes your way, right?

You’ve got to just get it in front of people and hear them out, right? It doesn’t mean do something with it immediately, but just hear them out, capture it, listen to it, and then ask them to get more information about why, where it’s coming from. Some people might look at your thing and go, absolutely love this. Cool. Why?

What resonates with you? What are you doing today that this is replacing? What is good about this, or what’s not good about this, right? If somebody hates it, dig in, really find out why. If somebody tries it and then sort of, like, falls off, they don’t use it.

It doesn’t resonate, but neither do they hate it. Dig in, find out why, right? What is it that they don’t love? Because obviously you love this thing. This is solving a problem for you.

You thought this was going to be absolutely glorious, and you show it to somebody and apparently it’s not this thing that they go, whoa, love it. And that’s going to be the reality. Your first version of something is going to be pretty crap. Like the first version of ProdPad. Myself and my co-founder hacked it together.

I had to learn jQuery to do it. It was junk, right? I mean, I’m really glad to say that most of it, no, all of it, sorry, has been completely thrown out and redone since then. Only whispers of the original concept now still exist within the current code.

But we had to be humble, right? You had to get it out there and just expect that people were going to bash it apart. And that hurts, right? People are going to look at it and go, this isn’t what I expected out of a roadmap, or, I want to do this, this and this. And had we listened to early customers and what they said they needed the most, we almost went in the direction of just building a different Jira.

Right? I mean, we had enough feedback where people were like, yeah, cool. I mean, roadmap tool. But I don’t like the way that Jira does this. So could you make a bug tracking tool that does this instead?

And we nearly went that direction, and maybe that would have been a path for us, or maybe that would have made us just another dead task management tool along the path of all the other task management tools, as opposed to a category creator in the product roadmapping space. Who knows? Maybe it could have made us super successful and we would have been Trello and acquired for the billions. Who knows? But it’s hard to say who to listen to.

Yeah, you touched on a couple of things there that I think are really important to highlight. So the first one is that it very much matters what kind of feedback you’re getting and how you’re going to interpret or apply it, depending on where you are. Let’s just call it chronologically: is this a completely net-new product? Now, in your case, you were bootstrapping a company, and that’s next level, right? You’re not talking about a big company with a bunch of funding and resources making a new product.

I mean, in your case, literally, you have to get these decisions right. We know that because we were bootstrapped or we still are mostly bootstrapped technically. And so you have to prioritize these decisions very carefully, whereas later on, you’re talking about feedback that you’re likely getting about a specific feature and perhaps the interactions about it. That’s different and that’s a lot more literal. Right?

You’re almost talking, you’re zooming out a massive level just to say, hey, is anybody else upset about this at that early stage? That’s one thing that I think is really important to highlight that you mentioned there. And it does change, right? So when you first launch a product, your first product is going to be crap.

And so the feedback that you get is quite often like, hey, I wish that you had a way to reset passwords. You’re like, yeah, we probably should have built that. Cool. We’re not even at minimum viable product yet. Right?

I wish I had a way to tag my ideas. Cool. We’ll build that in, right? These are things that sort of make sense, and you’ll hear this feedback, and a lot of it will be pretty obvious stuff that you’re like, yeah, this definitely makes sense, right? But then over time, this first cohort of users, some of them will drop off and you’ll never hear from them again.

Some of them will stick with you, and they’ll continue to give you feedback. Now, our first ever paying customer is still an advocate for us, right? Still uses ProdPad and gives us feedback. This person has become really advanced at using ProdPad. We have a bunch of people in those early cohorts who still use ProdPad.

They’re super advanced users. They use ProdPad in more advanced ways than our team uses ProdPad, because they’ve got bigger teams and they’ve got more advanced use cases. Now, they didn’t when they first started using ProdPad ten-something years ago, but over time, they’ve become more and more advanced. So imagine your company at three years in versus the first 30 days. In the first 30 days, all you have are customers who are 30 days old.

They’re all in that first 30-day cohort. So all their feedback pertains to what happens in those first 30 days. But add a year to that and you’ve got people who have a year’s worth of experience. And so they’ve got problems that come with having used your product for a year-long period. So in the B2B world, if you’ve got some sort of collaboration tool or a tool that accumulates stuff, comments, items, objects, whatever it is, now they’ve got this bunch of stuff in there, right?

They might want to have a history, or they might want to be able to look back on things or export things or report on things in various ways, right? These become problems that weren’t problems when they were first trying to log in. But you still have people who just signed up yesterday in their first 30-day cohort. And now fast forward another couple of years. You’ve got people who’ve been with you for three years and people who’ve been with you for a year, and people who have been with you for 30 seconds, and you now have to figure out, well, who are you listening to, right?

Are you building for people who have been with you for a while? And you need to build a tool that manages the complexity and allows this cohort to grow with you and expand and turn into the sweet enterprise accounts. And now you’ve got competitors who are coming in and copying all those features. So the things that were nice to haves way back then are now must haves because your competitors built reporting in two months flat. So you’ve got to do it too.

These are all real stories. And so you’ve also got people who just started 30 seconds ago or 30 days ago or 30 minutes ago, and you’ve got to listen to their feedback too, because they’re running into lumps and bumps. You wish you’d smoothed them all out in those first 30 days when you built the product, but you didn’t. And you’re going to run into new lumps and bumps too, because as you’ve been adding new functionality to your app, here’s what happens: your original piece of functionality in the app was pretty simple. Maybe you had five core functions, five things that it did, right?

Over time, you added things, right? You added the reporting, you added the search and the filter and you added the whatever. Right. Now you’re onboarding people to use all these things, and hopefully you’re not just giving them this onslaught. You need to come up with a good onboarding flow. So you need to make sure that people understand how to use these things.

And so that first 30 days, that first 30 seconds experience needs to be adjusted. So do you spend your effort on onboarding? I mean, one of the tenets that we’re constantly repeating here at ProdPad is always be onboarding. Always be thinking about how to get people through that onboarding flow. Because if you can get them onboarded well and using the existing functionality, they’re way less likely to be offboarded later, to stop using it and to not get engaged with it.

It’s way easier to get them engaged in that first period than it is to find that somebody jumped on board, didn’t quite get it, and then kind of lagged along for a little while. So you’ve got to listen to that early feedback, but you still have your power users who’ve been paying you for five years and who are saying, hey, love this, but how do I bulk invite 100 people? And you’re like, oh, we never built that because we didn’t think about what would happen when somebody wanted to do that.

Exactly. Yeah. How do I integrate with Okta or some other tool that helps them become more enterprise? And you’re like, oh, we should definitely build that. And all of these things go into your backlog, all the things you could do, and it becomes increasingly complex and you’ve got to prioritize ruthlessly, right?

You’ve got to decide what goes in. Yeah. So “prioritize ruthlessly” is an extremely well-put way to say that, especially for companies in that position.

Again, at least in the stories you’re telling, we’re not talking about some massive company with a ton of resources; this is a newer product, or even an established product, but a much smaller company in size and scale. And I think that “prioritize ruthlessly” should still be practiced, even at those really big companies, because it reduces a ton of waste. And even in big companies, they do have to. Right. Even if you have 100 developers: those big companies have these big bloated development teams because they have 10,000 things they could do, because they’ve got these giant products and portfolios of products and they’ve got so many bells and whistles and they’ve got so many customers.

There’s so many options of things they can do, they still have to prioritize ruthlessly. It’s just that their prioritization is like, of all the things you could do, where are we focusing? And ultimately, when you’ve got all of these developers, they don’t actually move all that fast. So you’ve really got to prioritize to make sure that you’re focusing them on the right things because it’s going to take them three months to solve any one problem. Yeah, again, really well said.

I want to ask about that though, because that, to me, just rings a bell, right? Prioritize ruthlessly. And I would argue that people in product, people in UX and anybody sort of in that periphery would agree with that statement.

The question then becomes how. How do you prioritize? Now, a lot of this we’re talking about using feedback to inform the choices we make. And I love that one of the things you said very early on too, was how do we check and balance that against what our company goals are? The goals for this product specifically, that I might work on, if it’s a company that has multiple products, that kind of thing.

So when we get back to this idea of prioritize ruthlessly and we use feedback, how do we do that together? How do you typically coach teams? How do you do that yourself? How have you seen it done successfully? Yeah.

So that’s actually a really key point that you’ve touched on there, which is taking a look at the company-level goals. There’s a temptation to be customer driven, to listen to the customer feedback and build what the customers want, because the customers are always right or the customers will tell you what you need. And unfortunately, that’s not true, right? I mean, one proof point is if you just built what the customers wanted, if you just listened to the customers, you’d end up with, first of all, a Frankenstein of a product with no cohesion. And also, they want it to be free.

So why not just make it free? Because that’s not good for business. Right. You need to think about what is actually good for the business. And what’s good for the business is what aligns with the strategic steps you should be taking in order to win.

And by win, I mean beat out the competition and take home the biggest slice of the pie, take home the revenue that you need and the profitability that you need. And these aren’t necessarily aligned with what your customers are asking for. Most of the time they are. If you’re in a monopoly, chances are it doesn’t really matter what the customers want. But most of us don’t operate in a monopoly.

Right? You do have to listen to what the customers are asking for, and it creates a good sense of alignment, because if you don’t do what the customers want, if you annoy them all, then they will take the business elsewhere because they can. But ultimately, you’ve got to think about what the business needs and what the business wants. And as a product person, as a UX person, you are working for a business, you’re not working for the customer. So you can’t forget that alignment.

And so when we’re talking about the company level goals, it’s like, well, why does your company exist? Is it there to, unless you’re nonprofit, is it there to make revenue? Cool. Well, how does it make revenue? Is it trying to beat a particular competitor?

Is it trying to make space for itself in a particular space? What is the context of this space? Are they a new entrant? Do they have some sort of competitive advantage? Are there particular constraints on it or particular things that could allow it to win out over other competitors out there?

So take a look at what those different strengths and weaknesses are. Right. A good old SWOT analysis can help here, and then prioritize the different opportunities and challenges and problems that could be solved, almost like stepping stones. I like to think of them as the stepping stones, the initiatives that you could tackle to take advantage of those opportunities, to solve those problems, to tackle any challenges.

And those stepping stones are represented as steps on your roadmap. I like to use the now/next/later roadmap. Right. The things that are right in front of you, the things that are coming up beyond that, and the things that are coming up further on the horizon beyond that. Right.

And these should be like the stepping stones you’re going to take in order to meet the vision for the business. So your company is saying, hey, we want to be the X of Y. We want to be, in a few years’ time, atop that mountain over there. So in order to get there, we need to take these kinds of steps. If we go this way, we’ll get that far that quickly.

And if we go this way, we could see it falling off that cliff. And if we go this way, we think it’s going to work the best. So you sort of outline the steps that you think you’re going to take, and you prioritize these bigger steps by getting feedback from your stakeholders. And some of these stakeholders might be customers, but a lot of this will come from internal checks as well. And what you’re going to do is you’re going to size these up against your customer feedback. So it’s not customer driven, but it’s customer informed.

And so some of these things might be on the roadmap as opportunities or problems to be solved because you’ve got a whole bunch of feedback of people saying, hey, you know what? We absolutely need to solve this sort of problem. And you’re saying, hey, if we solve this problem, there’s a whole pile of customers over here. And if we get a whole pile of customers, that drives the bottom line and that helps us get to here. If we get the revenue from here, we can then do this.

And that’s the step we’re going to take. That’s how you’re justifying it. You’re not just saying we’re doing it to shut these customers up or to make these customers happy. You’re doing it because it drives a step, a stepping stone for the business to get to the next stepping stone. And usually it is revenue driven.

Right? Or at least preventing revenue loss or preventing a competitor from overtaking you. Because it’s not just about getting to the top of the mountain, it’s getting to the top of the mountain faster than your competitors do. Yeah. Really critical that you’re touching on this in the context of saying this is meant to serve some larger business need.

Regardless of whether you work in product or UX research or design or engineering, with all of this stuff, I think it’s really important for all of us to keep in mind that the decisions that are being made and the things we prioritize are all sort of in service of that. And so if you start there, I forget the exact word you used, but you can sort of reflect that back against the work that you’re doing. So what I’d like to do is take most of what you just said and sort of interpret that in a different way for folks in UX research. And I’ve talked about it this way in the past as well, where it’s not just, to your point, being totally customer driven, because even in your own story of ProdPad, you could have become another task management tool, and that actually may have been successful for you, but it wasn’t in service of the vision of what you and the team said ProdPad was meant to be. So what you did is you had a goal in mind.

You said, we want ProdPad, or we have this vision for ProdPad to be X, Y and Z. In order to do that, these things have to happen. So as a UX researcher, it’s very easy to craft questions around, well, how do we understand how we can make those things happen? And then if we’re asking questions and setting up research in that way, getting customer feedback that helps elicit ideas and suggestions and changes and new things in the product that help us meet that, it makes it a lot easier to prioritize, because things that sort of fall outside of those goals or that vision, to use your language, maybe just go into the later category or the total backlog, and the other ones fall into some relative priority between now and next.

I’m using your specific terminology, and maybe you can clarify that for folks, because I don’t know that everybody is familiar with that, but the now/next/later framework is pretty clever. Not everybody does it, but maybe you can touch on that too. Yeah, I can clarify that. So the now/next/later framework actually came out of some really early discovery work that we did at ProdPad, and it came out of the fact that the first version of ProdPad was a pretty junk roadmap, right? It was the old-school way of doing roadmapping, where we recreated the roadmap that I’d been doing at my job previously, which is a timeline roadmap, very much a Gantt chart, where I would take my features and I would line them up on a timeline and say, this feature is going to be delivered here, and this feature is going to be delivered here, and this feature is going to be delivered here, all stacked up in a line of due dates and deadlines, and which features are going to be delivered when.

And I digitized that for ProdPad, and I assumed that other product managers wanted to use this too. And actually, it wasn’t a bad assumption, because if you did a Google search for product roadmaps and looked at the images tab, this is what all product roadmaps seemed to look like. And so we were creating a digital version of this that was just easier to keep tabs on, and you could manage the specs of those ideas and the background as well within ProdPad. But what we actually did was we started sharing this with other product people, who loved it at first, but then after a few weeks, about a month or so later, we started getting feedback back, and people said, this is cool, but I want to be able to take everything in the front part of the roadmap and move it over by a month. And we’re like, oh, that’s interesting feedback.

Now, had we just listened to the feedback, we had a lot of feedback around this, like a good chunk. A good portion of the people who were using this just wanted to multi-select, drag and drop everything over. One of the constraints was that that was difficult to build in a performant way with jQuery.

I wasn’t good at jQuery, so I was like, oh, it’s kind of tricky. But we also sort of said, well, why is everyone wanting to move everything over? And what would happen if we weren’t able to move it over? Because that timeline would sort of pass by. If people didn’t actively move it over, everything would fall off the back of the roadmap, and that would be kind of messy.

This whole thing has a tendency or could turn into a kind of a messy backlog. And so we started asking the five whys and got down to the bottom and realized that no product manager seemed to be actually finishing the roadmap in that time. We’re like, oh, see, I thought it was just me who wasn’t doing my roadmap when I committed to what I was going to do that next month. Turns out much better product managers than me weren’t delivering the roadmap. I was like, oh, that’s a revelation.

So if no one’s doing the roadmap, what is the point of a roadmap? And these timelines and these due dates and all this sort of stuff, what even is a roadmap? And so we sat down and we brainstormed what it might look like if we didn’t have this constraint of the timeline at the top, and instead thought more in terms of the order of things that you needed to tackle and the confidence you had that you were going to tackle them. And it turned into these time horizons, so things that were close to you, things that were further away and things that were further away still, and then being able to sort the problems within those, but without having a strict timeline, so that it didn’t elapse every week or every month. If something fell behind, it was still the first, most important thing to work on until you decided it wasn’t the most important thing to work on, so you didn’t constantly have to shuffle things around. And people loved this new version when we put it out in front of them.

They loved the freedom that it gave them. And this freedom that the new now/next/later format of the roadmap gave them was actually a really interesting tipping point in the early growth of ProdPad. So that was the genesis of it. And it was based on looking at user feedback and user behavior. And it was one of the first times that we realized that we are not our user.

We’ve got to listen to feedback, but also do things that break the mold a bit. Right? Because had we just listened to user feedback at its core, we would have built a multi-select drag and drop. But instead we asked the five whys and got down to the problem and came up with something unique and, frankly, groundbreaking. It became a new framework that’s now being used way more widely than ProdPad itself.
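
For readers who want to picture the now/next/later roadmap Janna describes, here is a minimal sketch in Python. The class names, fields, and example initiatives are illustrative assumptions for this post, not ProdPad’s actual data model: the point is simply that items live in time horizons and an order of importance, not on a dated timeline.

```python
# A minimal, illustrative sketch (not ProdPad's actual data model) of a
# now/next/later roadmap: initiatives grouped by time horizon, with no dates.
from dataclasses import dataclass
from enum import Enum


class Horizon(Enum):
    NOW = "now"      # what you're tackling right now
    NEXT = "next"    # what comes once the current work lands
    LATER = "later"  # further out on the horizon, lowest certainty


@dataclass
class Initiative:
    title: str
    problem: str        # the problem or opportunity, not a feature spec
    confidence: float   # 0.0-1.0: how sure you are this is the right step
    horizon: Horizon


def roadmap_view(initiatives):
    """Group initiatives by horizon, most confident first within each column."""
    view = {h: [] for h in Horizon}
    for item in initiatives:
        view[item.horizon].append(item)
    for items in view.values():
        items.sort(key=lambda i: i.confidence, reverse=True)
    return view


if __name__ == "__main__":
    roadmap = [
        Initiative("Smoother onboarding", "New signups drop off in the first 30 days", 0.8, Horizon.NOW),
        Initiative("Bulk invites", "Large accounts can't add 100 people at once", 0.5, Horizon.NEXT),
        Initiative("Reporting & export", "Long-lived accounts want to look back on history", 0.3, Horizon.LATER),
    ]
    for horizon, items in roadmap_view(roadmap).items():
        print(horizon.value, "->", [i.title for i in items])
```

Because nothing expires off a timeline, an item that slips simply stays at the top of its column until you deliberately decide it is no longer the most important thing, which is the freedom Janna describes.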

Yeah, that’s awesome. And I can relate to that a lot, too, because especially in our early days, we got a lot of stuff which was just that sort of interaction-level feedback. You have this tool or this feature, I would like to do this thing with it. Now, if you took that at face value, to your point, you would have just built that thing and it would have worked in that way. Again, maybe successful, but it wouldn’t have turned into what, in your case, ProdPad is today, and the value that it can actually provide.

And so that’s where things really change a lot. And this, to me, is very much where good customer research, feedback, whatever you want to call it, comes in to inform that. One of the things that you said that I think is very appropriate to link to this is the confidence that we should do that thing. And to me, good research, customer research, feedback, any of this, however we want to describe it, gives us the confidence to do that. Yeah. The question I wanted to ask you is, especially as a product person, if you’re working with UX and UX researchers, talk to me about the best-case scenario where they can come to you as a product person and say, this is what we’ve learned.

I’m giving this to you. How do we make a decision on what to do in the product based off of that? Yeah, it’s a really good point. I think confidence is a really good way of putting it. A lot of times ideas exist within the backlog or feedback comes in and it might resonate with you because you also have some similar feedback or something like that.

Right. And within ProdPad, the way we deal with that is that you can add your sentiment to it, you can give it a thumbs up or a thumbs down or give your thoughts on it and that sort of thing. But really what it comes down to is how confident you are in it. And that’s a different setting, that’s a different thing that you can track for it. Is this something that you’ve got very little evidence behind?

Is it just the CEO’s idea? Because I’ll be honest, as the CEO now, as opposed to the product person, I have ideas and I throw them out at the team going, hey, what if we were to do something like this? And they’re like, cool, we’ll put it in the backlog. But this is just a shower idea. It’s not solid enough to go straight to development.

Sometimes it’s an idea that comes from looking at a competitor: so-and-so has done this, so maybe we should do this. Maybe it’s something that you’re doing because a client has asked for it. But only one client, and everyone knows that if one client has asked for something, it’s not likely going to close that one deal, and it’s certainly not likely to go close 1,000 other deals like it. So you’ve got to take these things with a grain of salt before you go and build something.

Don’t go build something just because somebody whispers a feature idea at you; that is a lesson learned as well. You’ve got to look at something and say, okay, how confident are we that this is going to solve the problem that we have? Is it something that’s highly impactful? Is it something that is going to take a lot of effort? We’ve sort of made these estimates up for now. And to build confidence, you gather evidence.

Can we do some user discovery? Can we do a round of surveys? Can we put together a prototype and watch some user behavior around it and then do some surveys following that? Can we get a prototype and see how the usability tests score on it? As you start doing more and more of these tests, you start proving further and further that this is in fact the right thing to work on versus not the right thing to work on.

One of the best confidence scores that I’ve seen, the confidence meter, is by Itamar Gilad. And he has a really neatly outlined and sort of marked score from zero to ten, or I think it’s zero to one. But basically it’s how you score something as to whether it’s high confidence or not, and it’s really brutal as well. It’s basically like a CEO’s idea gets like 0.1 and you sort of go like, cool idea. It’s still worth nothing, right.

And it only starts to gain real confidence as you gain real evidence that it’s in fact the right thing to do. You’re really, truly derisking this thing. And that’s really what lean product development and all this experimentation is about: derisking it so that when you actually do the build, the expensive build part with development, you’re not building something that is the wrong thing to build. Because if you’ve got 100 things you could build, or in a lot of product teams’ cases, like 1,000 things you could build, which one should you build?

You can’t make a misstep. We once worked it out that it costs like minimum 25,000 to build the wrong thing. And that’s like a small feature, right? Imagine several months worth of work building something slightly bigger that needs to be rolled back or ends up eating away at the user experience enough that you lose revenue over it. These things can cost companies millions if they’re implemented badly.
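
To make the “confidence grows with evidence” idea concrete, here is a small illustrative sketch. The evidence categories, weights, and threshold below are assumptions made up for this example; they are not Itamar Gilad’s actual confidence meter, only the general shape of it: opinions score near zero, and confidence only rises as harder evidence comes in.

```python
# Illustrative only: evidence types and weights are assumptions, not Itamar
# Gilad's actual scale. Opinions score near zero; real tests score much higher.
EVIDENCE_WEIGHTS = {
    "ceo_shower_idea": 0.01,
    "single_customer_request": 0.05,
    "competitor_has_it": 0.05,
    "recurring_support_theme": 0.2,
    "survey_results": 0.3,
    "prototype_usability_test": 0.5,
    "live_experiment": 0.8,
}


def confidence(evidence):
    """Return a 0-1 confidence score; the strongest evidence gathered dominates."""
    return max((EVIDENCE_WEIGHTS.get(kind, 0.0) for kind in evidence), default=0.0)


def ready_to_build(evidence, threshold=0.5):
    """Only send an idea to the expensive build stage once it's been de-risked."""
    return confidence(evidence) >= threshold


if __name__ == "__main__":
    idea = ["ceo_shower_idea", "single_customer_request"]
    print(confidence(idea), ready_to_build(idea))   # 0.05 False
    idea += ["survey_results", "prototype_usability_test"]
    print(confidence(idea), ready_to_build(idea))   # 0.5 True
```

The exact numbers matter far less than the discipline: an idea with nothing behind it but opinion never clears the bar, no matter whose opinion it is.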

And how many times do companies build the wrong thing and don’t check it? And they haven’t gone through this set of stages where they check the confidence and get evidence as to whether they’re working on the right thing or not. Yeah, 100%. And I’m really, really glad to hear you talk about it in terms of risk, because one of the things that I have been speaking about specifically recently is how UX and good UX research, customer research, is actually risk mitigation. I think there’s a misconception, certainly outside of the UX circle.

Maybe not necessarily people who work in UX, but they have this idea of, well, UX makes it look good. That’s true. But it’s actually, when you boil it down to business impact, it’s risk mitigation. Just as making good product decisions is risk mitigation. It’s not just prioritizing the things that have come in and having maybe the nice Gantt chart that makes everybody feel good, that is actually just marching towards the wrong thing, perhaps. Right.

And so all of this is, I feel like it’s all actually talking about the same thing, and it’s just good handshakes between disciplines trying to do the exact same thing, just from different perspectives. Exactly that. A good UX researcher, a good UX team should be able to help remove the chance, derisk you building the wrong thing. Derisk the chance that it’s going to be rejected by users.

Derisk the chance that it’s going to degrade the experience to the point that people churn more often because of it or don’t react to it or derisk it to the point that when you’ve got marketing yelling about something that their stuff doesn’t fall flat. Right. And sometimes the end result is that, yeah, it looks better, but that’s just the surface level. Right. It only looks better because it feels better and it only feels better because it was the right thing to build because it’s solving a real problem for us.

Right. The “looks better”, whether that’s nicer pixels or whatever, is just the lipstick on top. But really a good user experience, as we know, goes down to its core: whether it’s solving a problem for us and feels cohesive through the entire set of touch points that the customer goes through. Yeah. And one of the things we do, just to talk about Aurelius for a second: for us, even in the tool, we quite literally call it a key insight.

And that is a statement of something we learned from customers. And I also appreciate that you touched on the fact of evidence, because then you can attach actual evidence, right? Like quotes from interviews, clips, documents, any of that stuff, to say, here’s the thing that we learned, and here’s what backs that up to say that we learned it. But then the next step too, even inside of Aurelius, is, we call them recommendations. But a lot of, you know, UX and research teams are saying, okay, here’s the thing we learned, here’s what we might recommend you do about it.

You can link all of that together, even inside of Aurelius. Right. And whether you’re using a tool like that or not is irrelevant. But I think it’s really important because everything that we’re talking about right now is saying, learn something, make a statement about that. Make sure you’re providing suggestions or recommendations or ideas that all link back to that stuff.
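
As a rough picture of the insight-to-evidence-to-recommendation linking described above, here is a hypothetical sketch. The structure and names are illustrative for this post, not Aurelius’s actual data model; it just shows the shape of “a statement, the evidence behind it, and the recommendations that follow from it” living together.

```python
# Hypothetical sketch (not Aurelius's actual data model): a key insight links
# to the evidence that backs it up and the recommendations that follow from it.
from dataclasses import dataclass, field


@dataclass
class Evidence:
    source: str    # e.g. an interview, clip, or document
    excerpt: str   # the quote or snippet that backs up the insight


@dataclass
class Recommendation:
    action: str
    rationale: str


@dataclass
class KeyInsight:
    statement: str
    evidence: list = field(default_factory=list)
    recommendations: list = field(default_factory=list)


insight = KeyInsight(
    statement="Product managers rarely deliver timeline roadmaps on schedule",
    evidence=[Evidence("Customer interview", "I want to drag everything over by a month")],
    recommendations=[Recommendation(
        action="Replace the timeline with now/next/later horizons",
        rationale="Removes the constant need to shuffle dates",
    )],
)
print(insight.statement, "| evidence:", len(insight.evidence), "| recs:", len(insight.recommendations))
```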

It also sounds like maybe that’s what feeds that confidence score that you were talking about too. Yeah. And there’s a really interesting point to be made here, which is people talk about tools as just being this thing. Take them or leave them, you don’t really need them. We talk about this stuff as processes, right?

And these processes are pretty common sense, right? I mean, these processes we talk about are the same things we were talking about 20 years ago. Ask people the right questions, check whether you are building the right thing before you go build it. Test whether you built the right thing. Repeat.

Right. There’s nothing that’s rocket science about this. But where tools really help is that they pave the path. They help lock in better habits, by creating space where you can more easily capture those insights and capture that evidence, and by reminding you to go back and check how things turned out. Like in ProdPad, we have target outcomes and actual outcomes for every idea and every initiative at the roadmap level.

So you can say, we said we were going to set out to solve this problem. Did we solve it? Did it work? These are all things that you could do anywhere, but teams don’t tend to do them. You could capture evidence on post-it notes and then read it later, but you don’t, right?

Having tools helps you do so and keeps it organized in a way that the rest of your team can consume it and you can actually do something useful with it. Yeah, definitely. And that is, I mean, the post-it notes one is a bit poignant because that’s exactly how UX research was done. Even myself, when I was still working in-house many years ago, that’s the way we did it. There weren’t even tools really available to do that stuff.

And so that’s definitely the case. Another thing that I thought of as you were describing that, too, that I think is really important to highlight here, is that when we’re talking about feedback, I also think that there is a level of interpretation. In your case, you were doing the five whys. But really, don’t take this at surface level when somebody says, well, using your story, I just really need to be able to take these items and slide them over to the left. Great, we could have built that.

But understanding why is that the case? Because what it sounds like you learned is that there was this whole other intention because, well, we never actually finish it in the time. And so why is that? Well, because these things happen. We need to prioritize it in a different way.

And that was awesome because you tugged at that thread and you learned a whole new way to provide value to people doing that work in your platform. The opposite is true, where you can just react to that stuff and say, oh well, customers said this, and so these are things that we need. And again, I think it’s possible to find success with that, but it’s far less likely than if you were to understand the underlying challenge that that person has, because people are not often very good at expressing that; they’re looking at the thing in front of them. One of the things I often say is, if something exists, somebody has an opinion about it.

You can locally optimize that thing until it’s perfect, but it still might not solve the underlying issue. And so really good research, and really interpreting feedback, I think is an art and a science. Spend some time with that and get deeper than the first thing that you hear. Yeah, that also reminds me of a book that I always tell my team about. I had this book as a kid, and I’m sure it might be one of the things that got me started in product management thinking, which is If You Give a Mouse a Cookie. And the book sort of goes around: the mouse wants a cookie, and once you give the mouse the cookie, it wants a glass of milk to go with it.

And once you give it the glass of milk, it also wants a napkin to wipe its mouth. And once you give it the napkin, it wants a crayon to draw with. And it kind of goes down a sliding slope here, a slippery slope, until it owns the house, basically. Right. And so the lesson here is, don’t give a mouse a cookie.

And I always use this to warn our team from getting into too many rabbit holes. And it’s like, well, and this is not a great example, but it’s like, okay, if we’re going to build tags, cool. Well, people aren’t just going to want to add tags, they’re going to want to remove tags and search and sort and filter by tags. Cool. Once you have that, do you have and/or filters?

And once you have that, do you need to be able to save filters? And nowadays these are all kind of expected features, fine. But every piece of functionality that you add, you’ve got to think about what rabbit hole is that going to get you into? You can add one piece of thing here, but as soon as you have that, someone’s going to have an opinion. And actually quite a few people are going to have an opinion about that and are going to say, hey, can you go do all this other stuff around it?

And the thing is that people will give you feedback around the things that they see and have feedback around, and they won’t give you feedback around stuff that they can’t see. So you’ve got to be able to be more well-rounded, right? So people will give us lots of feedback around how we manage user stories, for example, which is kind of like a small piece within ProdPad, a small but functional piece. And it’s not a huge indicator of success within ProdPad, whereas there are other areas where no one really gave us feedback saying we should have them. OKRs, for example.

But once we built it, it opened up this whole floodgate of people who wanted this. So we had to take a bet. No one told us, no one gave us feedback on this. We had to say, this is what’s good for the business based on we think there’s a problem to be solved here, not based on feedback, but based on where we think things are going. So we had to ignore feedback and go our own direction.

And now that we have it, there’s lots of feedback about it, right. But you’ve got to sort of set your own direction sometimes and not just go after the thing that people are giving you feedback on. Otherwise you just end up in these smaller and smaller little rabbit holes, right? You give the mouse the cookie and they just sort of dive you down into that thing and you get distracted and you never really get the chance to step back and say, whoa, what does our portfolio look like? Where is it we’re going with this whole product?

Yeah, definitely. And that book I’ve read to my kids several times. Brilliant. I’ve got it up there. Yeah, it definitely is.

It just becomes this unraveling thing that eventually comes back full circle, where the mouse takes a nap and he wakes up and he’s going to be hungry and he’s going to ask you for a cookie. And the thing is, that’s the point: around and around you go until you actually get to the core of the problem there. And for me, again, I’m trying to interpret this from the perspective of folks listening who are in UX or UX research. To me, it sounds like working with product to do this really well is being able to say, we did research and we asked questions that help us meet this overall goal that we all agreed to. So we’re not just asking, what do you think of the product?

We’re not just asking how can we make this feature better? We’re trying to learn things that are very targeted so that we can work with product and business and everybody else to say these things we learned are important for that, not just making the widget better necessarily, but that might be part of it. Right. But knowing that context, I think, is really key to focus the conversation and then by extension, prioritize ruthlessly around the decisions we make. Right.

Yeah, absolutely. I totally agree with that. Nice.

One of the things that was really interesting as you talked about this, too: before you and I sort of officially started recording, we were chatting a bit about stuff that we were working on, and both of our companies have been working on stuff AI-related, and that didn’t come from direct customer requests for us. I don’t know if that was the case for you, but in our case, a few weeks ago, let’s just call it about a month ago, we launched something we call AI assist, which essentially just helps you take an entire, let’s say, customer interview, summarize it, and turn it into key themes. Now, this is work that researchers are doing already. They didn’t ask for an AI tool to do that, but it’s been easily one of our most successful launches ever in Aurelius. And to your point, exactly what you were saying is we didn’t hear anybody say, I want an AI-driven feature to help me do this specific thing.

What we learned a lot about by being in contact with our customers was that time, from data to analysis, data to insight is a pain point. Everybody’s always trying to do that faster. They’re trying to get to this more accurately without spending as much time. So we saw an opportunity where there’s technology that you can apply to this thing and do that. Right.

I’m curious, is that how it came up for you as well in ProdPad, or were people specifically asking about this, and then you were trying to sort of marry this technology with it as well? So this is the AI side of things, you mean? Yeah. So with AI, it’s one of those things that people don’t necessarily know to ask for, right? Because it was so new, the interfaces hadn’t been developed yet.

Right. When we first got our access to GPT-4, the pattern of a purple sparkly button hadn’t been developed by the industry. Right. And actually we’d been playing with GPT and had parts of it, early versions of it, installed in ProdPad long before ChatGPT was a thing.

So we’d actually been using it for various things, like deduplication and merging and a tag suggestion bot, which we now look at and go, well, that’s junk compared to what we can do. But one of the really interesting things when technology is moving that fast is that you don’t actually know what people are going to want to use it for. You’ve just got to sort of throw things at the wall and see what sticks and then get feedback from that. And so when we got access to the GPT-4 API, we got it on something like the Tuesday, and the following Tuesday, as part of our release, we got the first version of our generative AI out the door. And the first version was basically like, you know how you’ve got an idea, maybe something on a post-it note somewhere?

That’s the gist of the idea. Well, this would help you write the rest of the idea, and it would help you think and brainstorm around what problem it might solve and why you might want to solve that problem and target outcomes you might want to measure for that. But also we gave it a bit of an opinion, so it would help you think about what risks and challenges might come up. So are there any ethical or privacy concerns that might come up if you were to go with this idea?

And we’d keep the human in the loop. So as it generated this idea for you, you could review it, edit it, take out stuff that wasn’t right or was scope creep or whatever, and then save that to your idea. And then once we started seeing people use this thing, we could get feedback on it. We were like, this is pretty impressive, right? Let’s see how other people think about it. Maybe they’re going to hate it.

Maybe they’re going to be like, whoa, don’t replace me, or, I’d like to write my own specs, thank you very much. Turns out they do not. They love having AI write their specs. And what we even realized was people would happily wait, because it takes about 30 seconds, and it took a little bit longer when we first launched this thing, to have it write your spec. And we saw, I mean, there’s one user who spent what must have been hours doing this, pressed the button 300-something times, went through each idea and used this thing.

So that’s a really good indication of product-feature fit, when you’ve got that much friction and yet he was still able to just push through and use this thing over and over and over again. But what we were then able to see was not only did people like this idea, but more people were pushing these ideas to Jira. More people were pushing the finished generated user stories to Jira or to Azure DevOps, or to Trello, or to Pivotal Tracker or whatever dev tool they were using than the ones that they had written by hand themselves the week before. So we could see that people actually appreciated these things and liked them enough to send them on to the rest of their team to go do something with.

And that was a really interesting vote of confidence in this AI tool. But as we started getting that feedback in, once you have it, then you can start saying, great, what else could we use this generator for? And that started pulling people’s feedback out, right? We could get inspiration from that. And since then we’ve been able to add almost another ten or so implementations of it.

So generating key results from your objectives, because key results can be difficult. And we sort of primed it to come up with outcome-focused leading metrics to measure, as opposed to the output-focused lagging metrics a lot of people come up with. And we’ve turned the thing around to help you come up with easier ones, but also turned it into things that help you judge whether the idea you created was actually any good. So it doesn’t just generate, but it comes up with feedback saying, well, this idea is well aligned with your vision, or isn’t very well aligned with your vision, and here’s why, and gives you feedback on that. So we did a bunch of different things like that, tested what worked, and then doubled down on improving the UI to get people using it more, making it more visible and that sort of thing.
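
For readers curious what a flow like the one Janna describes might look like in code, here is a hedged sketch: expand a rough idea into a draft spec with a language model, then keep a human in the loop before anything is saved. It assumes the OpenAI Python SDK’s chat completions interface; the model name, prompt wording, and function names are illustrative, not ProdPad’s implementation.

```python
# A hedged sketch of "generate a draft spec, keep the human in the loop".
# Assumes the OpenAI Python SDK (v1) chat completions API; prompt wording,
# model choice, and function names are illustrative, not ProdPad's code.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = """You are helping a product manager flesh out a rough idea.
Idea: {idea}
Draft: the problem it might solve, why it's worth solving, target outcomes
to measure, and any risks, ethical, or privacy concerns."""


def draft_spec(idea: str) -> str:
    """Ask the model to expand a one-line idea into a fuller draft spec."""
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative; any chat-capable model works
        messages=[{"role": "user", "content": PROMPT.format(idea=idea)}],
    )
    return response.choices[0].message.content


def expand_with_review(idea: str):
    """Generate a draft, then let a human edit or reject it before saving."""
    draft = draft_spec(idea)
    print(draft)
    decision = input("Save this draft? (y/edit/n): ").strip().lower()
    if decision == "n":
        return None  # rejected: nothing gets saved without human sign-off
    if decision == "edit":
        return input("Paste your edited version: ")
    return draft


if __name__ == "__main__":
    expand_with_review("Bulk invite 100 teammates at once")
```

The review step is the part that matters here: the generated draft is treated as a starting point for the product manager to edit, not as something that goes straight to the development team.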

There’s been a lot of movement in that space based on the feedback that we saw after that initial launch. That’s awesome. Yeah. And I mean, even tying this back to some of what we were discussing earlier, where just taking feedback at face value is dubious, it was very clear that this was being used heavily and you were getting a lot of feedback about it. I mean, even that alone tends to suggest a really clear feature-customer fit.

Right. And that was very much something that happened with us, too. Funny enough, we launched AI assist quietly. We didn’t even announce it, and a lot of our customers found it without that, and were using it and reaching out to us: oh, this is really cool.

You know what I wish? You know what I wish? You know what I wish? And actually, we’re already planning to do those things, so that’s great to hear. And then it continues to snowball, which tends to be a pretty good indicator that you’re onto something. Yeah, sorry about that.

Big bang. That was my cat trying to be on the podcast. That’s okay. Cats are welcome.

Well, with that, I realize that we are running out of time and I need to be respectful of that for you. But one of the things that I like to do before we wrap everything up is ask everybody that we have on the show: if I were to forget everything we talked about and somebody came to you and said, hey, I heard you were on that podcast, what was that all about? How would you summarize and maybe answer that question of what you chatted about on the podcast?

I mean, today’s podcast, I would say is you and I geeking out over product in general and customer feedback and how you pick and choose which customers to listen to and why and how it ties back, most importantly, to your company goals. Beautiful.

Really appreciate you jumping on and having that conversation with me. Is there anything that you want to share with folks that we didn’t have a chance to talk about yet? Yeah, sure. So I’ve had a really great time on this podcast, as I said, geeking out over this particular angle of product. This is actually something that I do on my own webinar series.

So if there’s anybody who wants to hear about broader product management topics, I run a series of webinars where I flip the script. I am the host, and I host various other product management experts talking about things. Sometimes we cover feedback, but it’ll be things like, last time, it was about how to talk ROI and money with your execs. We’ve got other ones about how to get your execs on board with a timeline roadmap, how to deal with OKRs, all sorts of different topics like that. All sorts of different experts from all over the product world.

Hit it up. It’s prodpad.com/webinars. The past ones are all recorded, and they’re ongoing into the future. I’m recording one tomorrow, and we do them live as well, so you can jump on and ask questions as we go. Nice.

We’re going to have a link to that and to ProdPad in the show notes where you’ll find this episode. If folks wanted to reach out to you, continue the conversation, ask questions, what’s the best way to get in touch? Yeah, cool. So I’m Janna Bastow. You can find me on LinkedIn.

I’m the only Janna Bastow there. So reach out, connect with me, and just let me know where you found me. That’s always really helpful for me to connect the dots. Or if you want to reach me directly, I’m janna@prodpad.com. Excellent, Janna. I really appreciate you taking the time.

Love the conversation. Yeah, likewise. Thanks for having me. Yeah, absolutely. Best of luck with everything.

Um, okay. All right, everybody, we’ll see you next time.