Ben Pearce (00:01.246)
Hi everyone and welcome to the Tech World Human Skills podcast. We have another treat for you today. We're talking about goals, not the football kind, the ones that help you achieve things. Now, I must admit, I have a love-hate relationship with goals, so I am all ears when there's a different way to think about them. Now, our guest today,
is a product leader and an entrepreneur. Not only that, she's the author of the book Radical Product Thinking. And, sneak peek, she is currently writing another book on why goals and targets don't work and what actually does. So please welcome to the show, Radhika Dutt. Radhika, it is a pleasure to have you with us.
Radhika Dutt (00:54.626)
Thanks so much for having me, Ben. I'm so excited to be here.
Ben Pearce (00:57.948)
Well, I believe you're joining us all the way from Boston today. For all of those people that don't know you, could you tell us a little bit about your background?
Radhika Dutt (01:09.912)
Yeah, so my background is that I'm an engineer by training. I did my undergrad and grad at MIT in electrical engineering. And I started in the startup world right out of the dorm rooms at MIT. And you know, even as I say that, it sounds like one of those Silicon Valley tech bro stories that just sort of gives me an allergic reaction. So let me put it out there
that I am very different from that. And in fact, I'm not going to talk just about all the great things I've done. Let me start by talking about the mistakes, right? The first startup that we founded right out of our dorm rooms, it was called Lobby7, and our vision was to revolutionize wireless. And if you ask me 25 years later what that meant, I still don't know.
Ben Pearce (01:40.852)
Okay.
Ben Pearce (02:05.588)
Right?
Radhika Dutt (02:08.066)
But we were going to be big, damn it.
Ben Pearce (02:10.065)
Hahaha!
Radhika Dutt (02:13.144)
And it was all about scale without knowing what problem we were setting out to solve. But it was a fantastic lesson in what I now call product diseases. It was a product disease that we had caught that I call hero syndrome. And 25 years later, it's still just as common today, where companies focus on just being big and scaling and
you know, measuring success by how much fundraising we'd done as opposed to what problem we were solving and how well we were actually solving that problem, right? So that was hero syndrome. But I'll share some more of these hard lessons that I learned along the way. There are other such product diseases that I had caught. And the good thing is, of course, you learn, right?
So over time, I built an intuition for how to avoid these product diseases and build good products. And so here's the good part. As I learned how to do better, I've now been part of five acquisitions, two of which were my own startups. But at some point, there was this burning question in me, which is: are we all doomed to learning from these hard lessons? Or is there a step-by-step process that we can use
to build world-changing products. And that was how radical product thinking came about. So that brings us up to the story of radical product thinking and why I wrote it. And now I'm working on this next book because I realized that that just wasn't radical enough. We just had to go a step further. We can get into all of those things.
Ben Pearce (03:54.085)
Okay.
Ben Pearce (03:59.86)
Right. So the next book that you're currently writing as we speak is all about goals and goal setting and some things that are broken with it.
Radhika Dutt (04:13.486)
That's right. For a long time, I found that there were side effects of goal setting. Just really bad side effects and perverse incentives that goals and targets were creating. But I couldn't articulate for a long time what it was that wasn't working about goals and targets. I thought, maybe it's just me, because...
You know, clearly goal setting is how we're supposed to build companies. That's what helps us grow, right? Everyone can't be wrong about this. Surely it's something I'm doing. But as I started delving into it, I realized, no, it's not just me. In private, a lot of people would say what you said, Ben, which is: I have a love-hate relationship with goals. And many people went further, saying, you know, I hate goals and targets. But the thing was, people also would say, well,
It's the devil I know. What do I do otherwise? And so it's only in the last year, year and a half that I've realized, well, there is an alternative. And I've been trying it out with companies and it's worked so well that I felt like, okay, now's the time to write about it.
Ben Pearce (05:29.081)
Should we start off by looking back? Why have we got goals? Like, where did this idea of having goals come from, and how did we get to where we are today?
Radhika Dutt (05:44.655)
I love this question because, you know, every time someone publicizes or evangelizes a new way of doing goal setting, it's repackaging the same old ideas, but it makes them sound new. Let's look at OKRs, or objectives and key results. If you're not familiar with them, you're lucky. But this was evangelized as a methodology by John Doerr in 2018. And, you know, Larry Page,
Ben Pearce (06:05.071)
Hahaha
Radhika Dutt (06:14.722)
Google's founder, said this was the secret to Google's success. So we think, whoa, this is a new thing. But you dig a little bit into it and you see actually OKRs were instituted by Andy Grove at Intel. He's the one who came up with them in the late 70s and 80s, right? And so you then go back in history and you look at Andy Grove and where did he get goal setting, and OKRs specifically, from?
And he just modified a little bit what Peter Drucker came up with in the 1940s. The 1940s, that's a really important thing to think about, right? Let's think about how different the workforce was in the 1940s versus now. And so let's go back and look at what problem Peter Drucker was solving. Why did he come up with goal setting? And the answer is that he was working with General Motors at the time, and the problem General Motors had was they had a workforce
that was primarily working on assembly lines, where there was little automation at the time and it was unskilled labor. And so his ideas at the time were revolutionary: instead of using command and control, set targets together with employees and then measure them by it. Makes sense. And in that setting, where you're working on an assembly line and it's a repetitive task, it's really easy to tell that Andy here is a better performer than Tyler,
because he installed 45 tires, whereas Tyler installed just 40. There's one right way to install tires, not too many ways you can do it. And so it's easy to tell who's a better performer. But now you take that same mindset and the ideas behind goal setting in the 1940s and you apply it to today's knowledge work. Well, first of all, it's mostly skilled labor.
Any task that could be automated generally is. And so the kinds of problems that we're working on are more like complex problem solving, more like puzzles than repetitive tasks. So let's look at even a manufacturing problem, like the problem at Boeing, right? And you say, okay, what if I just use the same goal setting on Boeing's manufacturing floor? And you see what sort of problems you get, with quality and panels flying off of planes.
Radhika Dutt (08:38.924)
Because you have skilled labor that's doing this work. It's not all about automation, right? You get quality issues where you have electrical wiring right next to metal shavings. Not a good idea.
Ben Pearce (08:55.155)
Okay. So this sort of goal setting goes back, you know, a long time. I'm sure people were probably setting goals way before that, when they were building cathedrals in the Middle Ages or whatever it was. But surely there's gotta be something to it, because, you know, if I just think of the most simple meeting that I might have with somebody, you have a conversation about something and then you go, ah, well, okay, right. You decide something.
Who's gonna do that? When are you gonna do it by? How do we know when it's done? Okay, go. And so in its most simple fashion, it's just saying, right, well, there's something you need to achieve, and how do we know if we've done it? Am I oversimplifying that? Are we starting to think that maybe that's not an appropriate way to do it?
Radhika Dutt (09:46.434)
You know, we want simplicity, right? And so how do we get simplicity? Sometimes that sort of simplicity is alluring, but it's not practical. So let's delve into that example. What you're essentially saying is: but don't we need goal setting to align people on what impact we want to have? And this is exactly what people
evangelizing OKRs say, whether it's OKRs, targets, KPIs, whatever you want to call it, you know, it's a way of creating this alignment. How do we know that we're all marching towards the same thing? So one of the first things that you have to look at is: huh, why goals for alignment? Are there other ways we could do alignment? What I've discovered is that goals and targets, they're like duct tape on foundational cracks.
That's sort of the solution of using goals for alignment. Because one thing that I find missing is a clear enough vision, whether it's for a product or a company. When the vision is fuzzy, then you need the next level of detail, like goals, to be able to really define what in the world we are actually doing, right? Because what kind of a vision do we typically have in a company? Let's go back to the Boeing example.
Ben Pearce (10:41.043)
Okay.
Radhika Dutt (11:07.939)
Boeing's vision is to be the enduring global industrial champion in aerospace. What does that mean actually? Do you measure industrial champion by revenues, by shareholder returns, in which case just do shareholder buybacks? Is it based on market share? Is it based on technological leadership? It's not clear, right? And so when you have a vision,
that is fuzzy and one of these broad visions that captures everything, that's like having a net with holes so big you're not catching anything. You need a filter for a vision so that when you hold up a feature or an initiative against that vision, sometimes the answer has to be no, don't do it. When you have a vision like be number one, anything
falls under that trap of, yeah, just do it, right? I want to give you a concrete example of what I mean by such a detailed vision. And that's what I tackled in Radical Product Thinking. The ideas are radical because we've been taught that vision statements have to be big, broad, et cetera, just like I described. In the radical product thinking way, you write a detailed vision statement that actually defines the narrative of the impact you want to create, not bullshit statements. So,
Ben Pearce (12:16.146)
Okay.
Ben Pearce (12:34.131)
Mm.
Radhika Dutt (12:35.023)
Here's an example of a fill-in-the-blank statement in the radical product thinking book to define your vision. So here's what I would write for a startup I had in 2011, sold it in 2014. It would go, today when amateur wine drinkers want to find wines that they like and learn about wine along the way, they have to find attractive looking wine labels or find wines that are on sale.
This is unacceptable because it leads to so many disappointments and it's hard to learn about wine in this way. We envision a world where finding wines you like is as easy as finding movies you like on Netflix. We're bringing about this world through a recommendations engine that matches wines to your taste and an operational setup that delivers these wines to your door.
And so this is a radical vision because it gives you the kind of information that you wanted, right? Which is: what are we doing? And you can break this down into who's going to do what. And the one problem with using goals to define success is that you might discover things along the way. Goals and targets don't give you space for the
discovering and figuring out. And we can get into that. I know I already covered a lot. Let me pause there for a moment.
Ben Pearce (14:05.203)
No, it is interesting. I mean, like you say, we've got to that kind of vision part of the goals that we've been discussing. I think back, you know, my background was I worked at Microsoft for a long time. And if I think of the mission, it probably wasn't this when I got there, because I joined in the late 90s, but it would certainly have been the mission before, which was a PC on every desk, right? And that was what Microsoft were famous for, right? Let's put a PC on every desk.
When I joined, we were already doing server stuff and all of that, so clearly that wasn't the mission anymore. When I left, the vision was along the lines of, and I'm not going to get this word for word, empowering everyone to achieve more. I think it's words like that. And it might have changed since then. And I used to not like that because it was so broad. It was like, well, that doesn't actually mean anything to me, because empower, you know, that could mean anything now.
People also talk about how you've got to be sharp with it, haven't you? Putting a PC on every desk, right, that's short and sharp. Whereas the example you gave me was quite verbose and sort of went on for a while, so it's not going to roll off the tongue. It's not like a 'just do it', like a tagline, is it? It's not that type of vision.
Radhika Dutt (15:15.791)
Mm-hmm. Yeah.
Radhika Dutt (15:24.365)
Yeah, you're exactly right. I think we always confuse taglines with vision statements, right? The power of a vision statement, like in that fill-in-the-blanks that I described, is you don't fall in love with your own words. And by the way, you also aren't requiring someone to memorize a vision statement. You know, even just in what you were saying about Microsoft's vision or mission, you were like, let me see, I'm not going to get the words right. But here's the thing: the exact words aren't the problem.
Ben Pearce (15:30.172)
Bye.
Radhika Dutt (15:54.032)
What you want is to understand and internalize the problem statement so much that you're excited about solving this problem statement. It's the shared sense of purpose we have. And that's what such a detailed vision brings. It helps everyone have that shared sense of purpose. You never want someone in your team to repeat your words.
in the vision, because that just means they're parroting back what you said. What you want is to hear the vision in their words, and yet it's pretty much the same thing that, you know, you were saying yourself.
Ben Pearce (16:38.355)
So if we start to sort of think, so you've got the vision part of the goal. I mean, if I'm honest, the goal framework I always just remember was SMART, which was, I'm gonna test myself now: specific, measurable, achievable, realistic, timely. I'm gonna guess that I might be right. I might've got something like that.
Radhika Dutt (16:49.999)
Yeah.
Radhika Dutt (17:00.099)
Yeah, yeah, something like that, close enough, but yeah.
Ben Pearce (17:02.705)
But that, and I always used those. So whenever I was sort of setting a goal, I would always go: right, SMART. But what I learned really early in my career was actually: you get what you measure, and what you measure isn't necessarily what you want to get. And the example that I always kind of think about was when I first graduated, I went into a...
Radhika Dutt (17:23.213)
Mm-hmm.
Ben Pearce (17:29.881)
support desk, and so people would ring up and I would fix calls. This was working at Microsoft, and there was a team of us, and one of the leaders that we had was very much measuring our performance and very much looking at who closed the most calls in the shortest period of time. Because ultimately, in customer support, that's kind of what you want, right? Customers want things fixed quickly.
But if you just measure that, the behaviour that it led to was: there was one particular person, I shall not name this individual, but some people listening will remember this person well. And what this person used to do was monitor the queue, you know, monitor the queue like a hawk. And then as soon as a case came in, look at it, and if it was quick and easy, pick it up and put it in their queue,
Radhika Dutt (18:02.061)
Yeah
Ben Pearce (18:22.319)
solve it and get it done. So what that meant was everyone else was left with the harder ones, the ones that weren't so quick and easy to fix. And so actually, your best performers, the people that could solve the most challenges in the most effective way, were left with the hard ones, and their stats looked bad. Whereas the worst performer, who could not solve many of the problems, looked like the highest performer. And I remember looking at that,
and that is when I first ever learned: you get what you measure, and when you measure it, you don't always get what you want.
Radhika Dutt (19:01.698)
I love this example for so many reasons, and in fact I want to quote you on this example, Ben, in one of the chapters I'm working on.
Ben Pearce (19:06.418)
Hey
Ben Pearce (19:10.425)
Do it! I'm always available for any chapter in any book. Ever.
Radhika Dutt (19:15.056)
Brilliant.
That's a great example for so many reasons. Let's unpack that, right? One of the things that you described, and this is what research shows as well, is that goals and targets erode collaboration in a team. And it creates performance theater. What you just described was performance theater. One can make stats look really good. You game stats, and
whoever has the highest and best-looking stats isn't necessarily the highest performer. And one other point about this: in the end, what this really does is it's soul-sucking for the high performers, for the actual high performers. It's demotivating to see that sort of performance theater. You feel like, you know, you want to do good work, you care about your work, and here is
just a way of measurement that is killing your motivation, right? And so the fact is: we want to implement goals and targets to motivate and drive teams, and the exact opposite happens because of all the performance theater. So you're demotivating your highest performers. And on top of that, you're not seeing the best business results either, because people are chasing numbers. And I'll give you one more example of something like this, where
Even in sales, you have targets. And one would say, well, of course you need targets. And yes, you absolutely need to set expectations that this is what I want to achieve in terms of business goals. And I'll go into what do you do instead of just targets. But let's just look at sales as an example for a moment, where I was working at a company called Avid. We absolutely dominated the video editing business.
Radhika Dutt (21:11.192)
And every movie was being edited with Avid's video editors. Like, every Oscar winner was edited with an Avid video editor. And we were hitting all of our sales numbers in the video editing division. You know, our numbers looked great. But if you just delved into those numbers a little bit, what you would have found was our low end was getting eroded by Apple and Adobe. The mid-tier, too, was starting to get eroded.
But how did we make those numbers? By moving further and further into the high end. What happens with goals and targets is you have the incentive to say, ta-da, I hit my numbers, right? I overachieved even. Even without it being malicious, you are inclined to not look at the bad numbers. Think about even sales where you've probably seen this in your company where
You lose a deal in sales. How long does it take for sales to actually take time to reflect and go like, huh, I wonder what happened there. Do you really see that reflection or do you see, okay, guys, let's move on. Moving on to the next deal. Let's make our numbers, right? Like you don't have the sort of reflection, learning, adaptation. That whole muscle of experimentation, learning, adaptation, atrophies.
when you have goals and targets because your temptation is, whether subconsciously or deliberately, it's to sweep away the bad numbers under the rug.
Ben Pearce (22:44.915)
Yeah, I love this goal bashing. I'm going to do some more goal bashing and then we must turn to what we can do about it. The other thing I remember, there was one particular, I remember leading a team, and I always remember sort of having, I'm not sure I'd said it out loud, but I kind of thought: if you do the right thing, the numbers will take care of themselves. And then I started realising that was a very naive way to think about it.
Radhika Dutt (23:06.426)
Hmm.
Ben Pearce (23:13.075)
And I had to modify that to: do the right thing and manage the metric. Because if you didn't manage the metric, then you wouldn't be around for year two to continue doing the right thing. And so the right thing might be, we're selling to the wrong people, we need to... but you needed to manage the metric as well. So it just felt like that was always a big part. You couldn't just do the right thing. You had to do the right thing and then manage the metric as well.
Right, we've done a lot of goal bashing. I imagine what... Yes.
Radhika Dutt (23:47.153)
And just on that note, right, what you said there is so important. I've heard this from so many people. Those who care about their work, those high performers, end up doing double work. One, to do the right thing. Two, to show numbers. And that is sometimes soul-sucking, but other times it's also just causing burnout, because you're doing double the work sometimes. But let's continue. On to the solution then. Go on.
Ben Pearce (24:10.385)
Yeah. Well, yeah, that's it. We've bashed goals, right? Whether they're SMART or OKRs or whatever they are. So what can we actually do about it?
Radhika Dutt (24:24.664)
Yeah, so the answer that I've discovered lies in puzzle setting and puzzle solving. That is really the kind of work that we're really doing, right? We are constantly solving puzzles. So first of all, you know, let's even try this experiment where I say to you, you know, just tell me how these two questions make you feel, right? The first question is,
What are your goals for this year? Just think about the feelings that well up in you when I say, what are your goals for this year? And here's the second question. What puzzles would you like to solve this year for the business, Ben?
Ben Pearce (25:07.139)
Do you want me to tell you how I feel with both of those? Yeah, so when you say what are your goals? Slightly overwhelmed, slightly like, I don't know. Slightly like who's gonna hear them, because what am I gonna commit to? And if I commit to them, then I sort of got to nail them. And are they the right ones? You know, those are sort of the feelings that I sort of had as you said that. The second one was...
Radhika Dutt (25:09.262)
Yeah, go on.
Radhika Dutt (25:21.658)
Yeah.
Ben Pearce (25:35.132)
What would I like to solve? What would I like to solve? What puzzles would you like to solve? Yeah. And so instantly, I guess that feels a lot smaller, a lot more manageable. And I go, well, I'd like to solve the puzzle of how I attract new customers and do that fairly effectively and consistently. Right. So that's a puzzle I'd like to solve. So the second one feels a lot more manageable and feels a bit smaller to me.
Radhika Dutt (25:38.852)
What puzzles would you like to solve for the business?
Radhika Dutt (26:06.609)
And one other thing I noticed is you were coming up with ideas already. It was already sparking ideas for what you're going to do next. And it was driven by your curiosity. Like, what can I do to attract more customers that are right for your business, et cetera? So this is what we want in our teams. We want to drive that curiosity. And we want to harness that motivation and performance. And this is where puzzle setting and puzzle solving comes in. So let's talk about this methodology that
Ben Pearce (26:11.473)
Yeah, okay, yeah, yeah,
Radhika Dutt (26:36.367)
I call OHLs, or Objectives, Hypotheses, and Learnings. It offers scaffolding for this approach of puzzle setting and puzzle solving. Because as a leader, when you hear puzzle setting and puzzle solving, it sounds like, okay, you're just telling me to forget all the business results, forget the rigor and hard work. It sounds like you're telling me: just off you go, team.
You know, go have fun, come back when you're done solving puzzles. And I want to very clearly state upfront that this is not at all the intent, right? You very much are driving to business results, but you're doing it by solving really hard problems together and creating a sense of collaborative learning. So here's what puzzle setting means. The objective can be the puzzle. So instead of OKRs, where you say the objective is 'grow the business',
here's how I would define the objective in a puzzle-setting format. Let's take the sales example. I would say, you know, our objective is: we need to get to X million in sales by the end of this year. That's what the market expects of us. But I see a problem with that. There is a puzzle we need to solve to get to that number. And the puzzle is: our growth
has stalled in the last year after three years of growth. What might be happening here? Is it that something has fundamentally shifted in the market that we've not accounted for? Is it that maybe we knew how to sell to the early adopters, but we haven't figured out how to sell to the mass market? Maybe there's something about our product where we haven't figured out how to address the needs of the mass market.
There's this puzzle solving we need to do to figure out what is happening. And this goes exactly to what you were saying earlier, Ben, which is, you know, we have to figure out the stuff, but at the same time hit metrics, right? But here, instead of doing double work, here's kind of how I would approach the puzzle solving. So first of all, we set the puzzle: we define the objective as this puzzle. And now,
Radhika Dutt (28:54.179)
in terms of solving the puzzle, you ask three questions. So the first question is, how well did it work? And this is where you define your hypothesis. You say, if I try this experiment, then I expect to see this because this is the connection. And this is where I measure leading and lagging indicators. You know, in the sales case, it might be that I'm measuring
how many meetings am I getting with the decision maker based on this messaging? Maybe the problem is with my messaging that I'm testing out, right? And there might be other such leading indicators that you have. The lagging indicator might be how many deals you're closing. If we're looking at a product, you might have defined your hypothesis as if we try this particular experiment, it might be a feature, then here's what we expect to see.
And then you might have leading and lagging indicators to measure, is that feature actually working? And so the first question, how well is it working? It's really important that it's not a binary question. I'm not asking, have you or haven't you hit this target? I want the good and the bad numbers. That is super important because that is a holistic evaluation where I'm going to play detective on the bad numbers. So that's the first question. How well is it working? The second question is,
What did we learn? And this is where I say to teams: don't just spit out a bunch of numbers at me. Don't tell me, you know, here's our weekly active users, time spent on site, time spent on the feature, blah, blah, blah. Tell me, after you've interpreted all of those numbers, what's actually happening? What are people doing? And what have you learned from that? And then comes the third question, which is: based on how well it's working and what you've learned,
What are you going to do next? Meaning, if I were to give you a magic wand, what would you ask for? And so these last two questions, they trigger the creative problem solving part of our brain because you're creating the narrative of what have we learned? What are we going to do next? Whereas the first part, how well did it work? Those are all the analytical parts of our brain, where we're looking at hypotheses, leading and lagging indicators. And this is how we engage the whole brain to solve a complex puzzle.
Radhika Dutt (31:17.241)
And in the example you were saying, Ben, where you felt like you were doing double work, in this case, as a leader, as an individual contributor, you start to present this information across your team in this way of how well is it working, what have you learned, what are you going to do next? And you talk about good and bad numbers. It's not about spinning numbers. You talk about what your learnings are, and you have meaningful discussions as opposed to OKR discussions.
of is this green or yellow and you're negotiating whether it's green or yellow, et cetera. And in terms of what are you going to do next, this is where as a leader, you have the ability to course correct along the way and actually make decisions together with the team, like guide the team in how big a risk do you want to take in that last step of what you're going to try next. Maybe this is too big.
Ben Pearce (32:11.357)
So you've set the puzzle, right, and you've asked that to get the juices flowing. So I'm just thinking about this now in a big team setting, right? So you set the challenge. A million people have a million different ways to solve that puzzle, right? So there's some kind of conversation that happens, some way of ideating and deciding, this is the way we've decided as a group we're gonna solve this puzzle.
And then you start going about asking those questions. Remind me again of those three questions we're now going to ask.
Radhika Dutt (32:44.369)
How well did it work? What did we learn? What are we going to do next?
Ben Pearce (32:45.573)
Yeah. Yeah. What are we going to do next now?
Radhika Dutt (32:51.461)
What will we try next, right? Yeah.
Ben Pearce (32:53.755)
With that, are we then gonna set some targets, or how does that work? So I get the fact that we've set a puzzle, which exercises curiosity, but how do we then execute and make sure that we're executing with appropriate velocity, appropriate energy, and progressing at the pace that we need to progress?
Radhika Dutt (33:13.318)
Yeah.
Radhika Dutt (33:18.619)
Excellent question. And this is where I want to share the example of a company I've been working with. It's called Signal Ocean. They're in the maritime industry. And by the way, of all the industries I've worked in, all the way from telecom, advertising, robotics, wine, even government, this industry is the most complicated I've ever worked in. This is the hardest puzzle I think I've ever worked on.
Ben Pearce (33:42.001)
Wow, right.
Radhika Dutt (33:42.867)
So let's talk about that example, and I'll share how you deal with this thing of driving towards results. We weren't doing it just by constantly looking at targets. So I had the team present this information to me in terms of: how well is this particular experiment working? Like, look at the data and tell me, did it actually work or not? I wanna know the good and bad metrics.
And even in these discussions, we were able to sort of push each other, saying, well, what if we try this instead? And we were driving towards how do we improve the numbers that we are seeing, but it wasn't a matter of chasing numbers. Whatever the learnings were, it was to drive forward the
puzzle solving. So let me actually talk through the details of the Signal example, because it'll give you a better sense of it. So what was happening with Signal when I joined was sales had stalled, and our usage had stalled too. It's a data platform used by the maritime industry so that you can find the right cargoes for the right vessels, the right shipping vessels. And so what we were discovering was, huh, when we look at our data,
our current platform and our approach isn't working very well because in terms of our hypothesis, we thought if we have the most accurate data, everyone is going to flock to the system because they need this data. It turned out that a lot of people in the maritime industry are not tech savvy. So we had the most accurate data for doing this shipping matching, but this felt like magic to our users.
And people who are tech-averse don't like magic. They want control. I don't want this platform that's spitting out magic numbers at me. And I have no idea how you got this, right? And so the puzzle was, how do we attract these tech-averse people who don't like the magic we're offering? And yet if we give them too much information of what's underlying this magic, it's going to send them running for the hills because it looks too complex and scary. So that was the puzzle. And so how did we go about solving it?
Radhika Dutt (36:04.358)
We started figuring out their workflow. We were asking very basic questions, like, you know, just to understand their mindset, their mental models for how they look for data, what they do, like, what are the steps before and after, et cetera. We figured out how to make our data platform fit into their workflow. And then we tried things and we talked about how well is it working.
Every single feature that we would put out there, we would talk through in terms of: how well is it working? What have we learned? What are we going to do next? And the 'what have we learned' was about: are we getting any closer to solving this puzzle that I just described, of attracting tech-averse users in these different roles? One might be a broker. One might be someone who wants to ship cargo. Another one is someone who is managing their ship. We started solving this, and
2023 was when I started working with them. We doubled sales in 2024 and again in 2025. We decreased customer churn from 26% to 4%. And you know, in the parts where we were using this puzzle solving the most, we increased usage of those components by 180% in one year, right?
It was leading to real business results, but it wasn't because we were chasing numbers; it was because we were doing this puzzle solving and we were getting really good at it. So what this means for a leader is you have to go through these feedback cycles, really go through them. You're not leaving the team to solve their puzzle on their own, just off you go.
It's constant feedback. I want to see the progress, right? I want to see what you've done. I want to see the depth of your thinking in what you've learned. If I'm not seeing the depth, you hear that from me. This is about direct, honest feedback. I'm going to push you and the team to show me that you're solving this puzzle. This isn't about slacking off. On the contrary, I get the actual information. I get to see not just bullshit numbers that you're showing me, but the depth of your thinking
Radhika Dutt (38:19.206)
by your answers to what have you learned and what are you going to try next.
Ben Pearce (38:23.421)
So in that example that you just went through there, do you start with the target or do you start with the abstract puzzle? What I mean by that is: you said you doubled sales, I think that was one of the things you said. So was it that somebody said, we need to double sales, and therefore you go and engineer how you're gonna do that? Or does somebody just say, we need to accelerate sales,
but we don't have a target in mind. We're just trying to accelerate or grow sales, or, we're not growing, that's the puzzle, we need to grow, that's the solution state. And now go and figure it out, and let's see what results we get off the back of it.
Radhika Dutt (39:09.616)
Yeah, we did not start with the target of: we have to double sales. Sales had their targets, right? But I was really working with a product team, and I wasn't looking at sales targets or, you know, doing all of this to hit sales numbers. What we were looking at was, first of all, the problem of: why isn't our product selling like hotcakes when we have the best data in the maritime industry?
That was the puzzle. And then uncovering that puzzle and figuring out how we were going to solve it, that was the path that we took. And so you look at numbers as, how do I say this? Yeah, we look at numbers as indicators of: are we doing a good job at solving this puzzle? And
Ben Pearce (39:49.265)
interesting.
Radhika Dutt (40:03.96)
Sales is always going to be a lagging indicator. It's never a leading indicator. And so to get to doubling sales, if you're constantly looking at sales numbers, you'll only know in the rear view mirror if you hit it or not. You'll never know. It's not an actionable number in terms of here is what I'm going to do to be able to double sales. So starting with a target of I want to double sales gets you nowhere.
And in fact, they used to have OKRs. It was OKRs like: I want to double the number of users, I want to double the amount of sales. They did have OKRs, but it wasn't working. Sales had stalled in 2023 despite having OKRs. So what you need to do is shift the mindset to puzzle solving. And that actually helps you tackle those problems. It gives you a toolkit
to solve those puzzles and scaffolding to solve those puzzles as opposed to, oh, whoops, we didn't hit those numbers, but I don't know, what do we do?
Ben Pearce (41:05.987)
Yeah, it's fascinating. Just looking at the clock, we're almost out of time. One thing I just want to ask and get your perspective on before we start to wrap up: lots of people that listen to this podcast will either be leaders that have to assess the performance of their team, or will be individual contributors who every year go through performance reviews, mid-years, whatever it is.
Radhika Dutt (41:28.754)
Mm-hmm.
Ben Pearce (41:34.675)
And that can then be part of their bonus, their remuneration, how well they've done, all that kind of stuff. In this type of world that you're talking about, how do you understand how successful people have been and hold people to account, I guess? Or do you hold them to account for their performance?
Radhika Dutt (41:57.908)
I love this question, and Ben, there's so much more to talk about. I think we will need to do another episode at some point. But let's talk about performance, right? One thing, if we just look inside ourselves, one thing we are so used to is this idea that I constantly need to evaluate performance and I need to rank people. And it's all about quantifying performance. Are you
Ben Pearce (42:02.259)
Yeah, we need more!
Radhika Dutt (42:26.565)
a five or are you a three, et cetera. This whole idea of quantifying, why do we even want to quantify performance in this way? I mean, if you really ask a few whys, what you get to is: well, unless I tell Ben you're a five, whereas this other person is a three who is solving only the easy problems, how do I know who the lowest performers are to fire in the case of layoffs?
So basically, if I'm doing layoffs, I want to know who's the bottom 10%. When you step back, what you realize is this is a mindset of planning performance evaluations, or performance management, for the worst-case scenario of layoffs. That's like planning your business trip or a vacation, planning any trip, thinking: I need to plan for my hospital stay there. Right?
It ruins the trip. It ruins the whole point of the trip. So let's get to the question of performance evaluations. Yes, I do want to know who's performing, but goals work well for telling me Andy did 45 tires and Bob did 40. That doesn't work well in a puzzle-solving format. One thing I observed in my own experience is I used to have all these people
coming from backgrounds at Google and Amazon, who were brilliant at presenting numbers and showing me how they were optimizing for numbers. And when I started using this approach of how well did it work? What have you learned? What are you going to do next? I noticed that there was a total lack of depth in the actual analysis and understanding what have we learned and what are we going to try next.
They're so used to looking at numbers that their own data platforms are telling them and then making small tweaks based on that. But that critical analysis of what have we learned and what are we going to do next was missing. That's what you need in actual performance management and development. I want to see the depth of people's thinking. The accountability comes not from just presenting numbers and spinning numbers to me. The accountability comes from how well do I actually
Radhika Dutt (44:48.871)
feel like you're solving this puzzle, the results I'm seeing from solving this puzzle. You know, in the span of six months, it's going to be very evident to me how well someone is solving this puzzle. So in this approach, performance evaluation is more like a side dish as opposed to the main course, and performance development is the main course. And this approach makes performance development easier, because think about
how much you've had to justify feedback when you were giving feedback to people, right? When you were giving feedback telling them, I want you to improve in this area, you would hear so much defensiveness: well, no, but you know, that's not a fair assessment, et cetera. Why that defensiveness? Because people felt evaluated. The less people feel judged, the more open they are to taking feedback and developing.
Ben Pearce (45:47.728)
I mean, an amazing topic all on its own, right? And we must get you back at some point, maybe when you get closer towards the end of the book, and we can help plug it a little bit. It's been really, really fascinating, and my brain's whirring all over the place. But shall we just summarize: what would be the key takeaways that you'd like people to take away from this episode?
Radhika Dutt (46:12.787)
So the key takeaways would be: goals create this burden, and they take away the curiosity, the playing detective, and the actually solving problems. With goals, you're going to see the results that you say you want to see, but that doesn't actually mean progress in the business. What you need instead is a mindset of puzzle setting and puzzle solving, and you see much better business results as a result of this.
OHLs, Objectives, Hypotheses, and Learnings, give you scaffolding so that you're not just letting your team out in the playground, saying come back when you've solved these puzzles. It gives you scaffolding so that you can experiment, learn, and adapt. And the three questions for puzzle solving are: how well did it work? What have you learned? What are you going to do next? And the final takeaway is that you can get the free OHLs template
at the link that we can share in the show notes. And you can start to apply these ideas. And when you apply them, you're welcome to reach out to me on LinkedIn, because I love hearing these stories. And by the way, just like Ben shared his story about why OKRs or goals don't work, and it's making it into a chapter, your story might too. So feel free to share yours with me.
Ben Pearce (47:33.723)
Yeah, that's brilliant. Just as you were saying that, it reminded me of a phrase that we used to use all the time. I'm sure you've heard it, but it was the phrase 'the watermelon scorecard'. So if you imagine you've got your scorecard, you know, these are all my things with all my numbers, and I need to hit this number and this number and this number. And the watermelon scorecard was where it's green, yes, so there'd be a traffic light system, is it green, is it amber, is it red, and it's a green one,
but as soon as you look inside it, as soon as you chop it open, it's bright red. And we used to call those the watermelons. And those were the people that were always just managing the metric, not doing the right thing. It just popped into my head when you were saying that. But, you know, I've found all of this fascinating. I've really enjoyed a bit of goal bashing. I've really enjoyed thinking about some alternative approaches. And I think what I need to do now to really get this in my brain a little bit is I'm gonna get your toolkit.
So it's on your website radicalproduct.com and it's the OHL. Remind us what OHL stands for.
Radhika Dutt (48:38.227)
objectives, hypotheses, and learnings.
Ben Pearce (48:40.53)
I'm going to have a go with that, and I'm going to have a go at the next level of detail and see if that helps me with some goal setting. I mean, as we record this, we're just getting into autumn, which in the UK, after the summer, is like a new year almost. So I've got some goals to set, and I'm sort of thinking maybe this is another way to think about it. So, thank you very much. For this final thing, I just want to say thank you so much for taking the time to come and talk to us, joining us all the way from Boston.
It's been an absolute pleasure to have you on the show.
Radhika Dutt (49:13.125)
It's been such a pleasure, Ben, such a fantastic conversation, so many wonderful insights. And I'm looking forward to talking to you more closer to when the book comes out.
Ben Pearce (49:22.488)
See you soon, Radhika.
Radhika Dutt (49:25.875)
See you soon.