Kent talks with Will King about bringing an industrial design mindset into software: human factors, observing real users, and why good product engineering starts with caring enough to notice what frustrates people.
They dig into product debt, support as a product superpower, pruning features without breaking trust, and how to use AI agents for exploration and critique instead of only faster implementation.
Will's path runs from designing bucket trucks to self-taught software engineering, education products, and database tooling, and that background gives this episode a distinctive lens: software is still a product people use with bodies, habits, emotions, and mental models. The conversation makes product sense concrete through examples like onboarding timing, course complexity, support workflows, and the small confidence signals that separate stable-feeling products from merely functional ones.
You'll hear why watching users work keeps surfacing across this series, how to tell broken experiences from merely unpopular ones, why user feedback usually improves polish more than strategy, and how product engineers can stay valuable in an agent-heavy future by understanding both the user and the constraints of the software medium.
Homework
Resources
Guests
Transcript
Kent C. Dodds (00:01)
Hey everybody, it's Kent C. Dodds, and I'm joined by my friend Will King. How are you doing, Will?
Will King (00:06)
I am doing great. It is a good day.
Kent C. Dodds (00:09)
Thank you so much for joining me to chat about becoming a product engineer. I just published the episodes last week and sent out an email to the newsletter. You replied and said, hey, I've done some product engineering stuff in the past. So I asked you about your experience, and it sounded really interesting. I'm happy to have you on the podcast to talk about your experience as a product engineer. Can you give us a little bit of your background and why you're so interested in this category?
Will King (00:46)
Yes, I would be happy to. What's funny is that normally your college degree doesn't factor in once you're a decade into your career. It's kind of like, that doesn't matter so much; why are we going all the way back there? But I did not get a comp sci degree. I got my degree in industrial design. And for people who are not familiar with what industrial design is, industrial design is
Kent C. Dodds (00:57)
Haha
Will King (01:13)
designing physical products, that's the easiest way to describe it. But what I want to make a point of throughout is this: if you think about industrial design in the context of software, where we are in the software cycle as a manufacturing process, industrial designers are kind of in this role of product engineers for physical goods.
Because if you think back on the Industrial Revolution, manufacturing, automation: the manufacturing of actual physical goods has already gone through the process that we are now dealing with on the software side of things. And I think there are a lot of takeaways from industrial designers as a discipline. So that is where my career kicked off. And then I met my lovely wife, Allie,
realized I was never moving out of the state of Alabama, and you can't really design physical products from Alabama for companies in California. So I moved into software, and I've been doing that for the past decade. I worked at an education startup for part of that, and at a database company that just got acquired by Snowflake. And throughout all of those roles,
Kent C. Dodds (02:19)
Hmm.
Will King (02:37)
as I self-taught engineering, I have kind of married the disciplines of design, engineering, and product through every single role up until now. So that is the background that landed me in your email inbox with some opinions and thoughts that I thought would be fun to chat about.
Kent C. Dodds (03:00)
I love that. And actually, just to speak to the relationship between physical products and software products, I recently listened to The Design of Everyday Things by Don Norman. And you mentioned this as kind of required reading in industrial design.
Will King (03:15)
So good.
Yes. Yeah. That one is like,
when you start your industrial design course and you make it into the program, your 101 class has a list of books, and The Design of Everyday Things is always in that list, every time.
Kent C. Dodds (03:32)
Yeah, yeah. So when I was listening to it, of course I'm thinking from the perspective of a software developer designing software products. And there were just so many corollaries between what is in the physical world and what's in the software world. He even touches on software as well. But I think in general, in software we've always been trying to kind of emulate the physical world anyway, so it makes sense that there are a lot of related principles. Yeah. So it sounds like you made the shift over to software pretty quickly. Did you ever design any physical products, though?
Will King (04:04)
Mm-hmm.
Yes, lot of corollaries, yep.
I guess yes and no. So my first job out of college was with this company called Altec. They did cherry picker trucks, bucket trucks. And I got to design the sheet metal fabrication for the back bed: how they have all the doors and the different storage organization. Every company has a different specification for how they store their gear,
what gear they have, depending on if they're a telephone company or a water company; all of them have different needs. And so they have a fully customized back sheet metal storage design for their trucks. And I got to do the design and 3D modeling for about a year. It was really fun. Right. Yeah. It was really fun for about six months.
Kent C. Dodds (05:11)
Wow, that's cool. Somebody's gotta decide where the bolts go and stuff, right?
Will King (05:19)
And then once you get through the first six months, what you realize you're doing is, you're just building Legos every day, and all the Legos are about the same shape. I was just like, I need to move on to something that is not monotonous. So.
Kent C. Dodds (05:36)
Yeah, interesting. So let's talk a little bit about some of these opinions that you emailed me about. Once you got out of industrial design and into software, where did you go, and how did that industrial design background help you in the roles that you had?
Will King (05:47)
Yes.
Yeah. So my first job in software was working as a designer slash front-end engineer for an education company, which is something you are extremely familiar with, having run education for, not your entire career, but a large chunk of it. So you'll probably relate to a lot of this. What's so interesting about designing for education is that you immediately hit:
everyone needs a different solution, because everyone learns differently. And the way that industrial design moves into software, I think, is that a lot of people see a flat screen and they forget how important it is to understand how a person uses it, because you don't think of it as
Like you said, you don't think of it as a physical product. You don't think of it as, well, they're interacting with it, they're using it, they need to understand what to do and when to do it. And you lose this concept of human design principles. So a lot of people get into, well, are our buttons the same sizes, do our menu items make sense, and you think about things very clinically, I guess is the best way I
could describe it. You can see this especially across large organizations; they get into this rut of, how can we automate as much of this as possible, how can we make everything as uniform as possible? And you lose the part that software design is still designing for humans as well. In industrial design, what they call it is human factors.
And if you Google "what is human factors design," there's this huge category of material and study around how different people of varying sizes, shapes, disabilities, backgrounds, even how people think emotionally, all of that, to think through how a physical human interacts with the product that you make.
In education, you have to confront that immediately. With certain kinds of software, you can get away with not confronting it right away. But I got dumped right into the deep end with education, because so much of it is about how people learn. So human factors design is such an interesting topic of study that I think is missed a lot when people think about
how software is built.
Kent C. Dodds (08:55)
Can you give us some concrete examples of failure modes for that human factor, or successes? What are some specifics of things that people get wrong about it, and ways to fix them?
Will King (09:12)
Right. So one that came to mind listening through a couple of your other episodes: you had this really good question for Dylan. Dylan was talking about the Verso domains project and how they had this idea for social media sharing. And you asked him, well, how do you know in the onboarding
what features you should show, when you have these interesting features that may not necessarily be something somebody needs to be onboarded onto? What do you decide to show? When do you show it? That kind of thing. The human factors part can play a bigger role in that, which is: you need to understand the demographics of your audience. Who is very active on social media?
Kent C. Dodds (10:01)
Hmm.
Will King (10:05)
Is it going to be the lower age end of the market? Is it going to be the higher age end of the market? Understanding the age groups and demographics for your product can help you make build-this-or-not decisions, but it can also help you make decisions around what you prioritize in the critical path. So, Verso is trendy, generally speaking,
Kent C. Dodds (10:30)
Mmm.
Will King (10:35)
shared around social media; as a company it's just a very social company. So for that product it could make a lot of sense to put that up front. And then one other factor to consider, when I was listening to that, from a product engineering perspective: I think a lot of people forget that time is a metric that you can use for design. You don't have to think of,
Kent C. Dodds (11:02)
Hmm.
Will King (11:04)
well, how do I show everything on this page right now? You can also think of: how do I show this page the first month that a user is interacting with it, versus three months from now? Because one of the first things that popped into my head was, do you need to show your domain searches the first day that you started using domains? How many searches could you have? But maybe in a month or two, suddenly you've had hundreds of searches, and it could be a really fun feature to
Kent C. Dodds (11:15)
Hmm.
Will King (11:33)
have pop out of the woodwork to say, would you like to share some of these searches? So thinking about age, and thinking about time of use on a product as a dimension, are factors you can think about beyond just pixels on a screen.
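Will's "time as a design dimension" idea can be made concrete with a small sketch. Everything here (the function, field names, and thresholds) is hypothetical, not something from the episode or any real product: it just shows a feature gated on tenure and usage rather than on screen layout alone.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class UserActivity:
    signup_date: date   # when the user started using the product
    search_count: int   # how many domain searches they have run so far


def should_offer_share_prompt(activity: UserActivity, today: date,
                              min_tenure_days: int = 30,
                              min_searches: int = 100) -> bool:
    """Gate a 'share your searches' prompt on time of use, not page layout.

    A brand-new user has nothing worth sharing; a user a month or two in,
    with hundreds of searches, might enjoy the feature popping up.
    """
    tenure_days = (today - activity.signup_date).days
    return tenure_days >= min_tenure_days and activity.search_count >= min_searches
```

The same shape works for any "show this later, not sooner" decision: the inputs are cheap to track, and the thresholds are the part you would tune by watching real users.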
Kent C. Dodds (11:51)
Yeah, okay, that makes sense. So a factor would just be, like, a number of categories, or... yeah, I guess, factors.
Will King (12:03)
Yeah, a lot
of things are talked about with, um... there's this book, very expensive, for industrial designers, and it has measurements for people. Like, you think about physical products: when you're designing a single product, you could have people who are seven-two, you could have people who are four-five. But there are these statistics around, here are the average heights of different people in different areas, and you can look through those numbers. It's just this idea of:
Kent C. Dodds (12:26)
Hmm.
Will King (12:33)
understand who the demographics of your users are, understand their measurements, understand how they think and how they interact. And with the internet and with digital products, it's less about how they physically use it and more about, what's their mental model of it? Younger people have a completely different mental model about infrastructure than
Kent C. Dodds (12:39)
Hmm.
Will King (13:02)
50- and 60-year-olds who are maybe coming from having managed their own servers and stuff. It's taking into account those types of factors. Not just, is this easy to use? It's, well, is it easy to use for whom? This age group, that age group; which age group makes up a lot of your user base? Just thinking through a lot of that.
Kent C. Dodds (13:06)
Hmm.
Yeah.
Yeah, yeah, that makes a lot of sense. So how do you avoid just taking a shot in the dark and saying, well, I think that most of my users are this, that, or the other? How do you get a little bit more scientific about those factors?
Will King (13:37)
Right.
Oh man, this is the fun part: I don't think you should. I really don't. I don't think you will find a lot of success waiting until you have scientifically backed evidence for something, because I think what you'll run into is that by the time you feel significantly moved in the right direction statistically,
Kent C. Dodds (13:46)
Hmm.
Mm.
Will King (14:07)
you'll realize that those statistics are now out of date, or the thing you were trying to capture the value of, the value's not there anymore. Or the value has now moved to some other place that another company was able to capture before you, because you waited. So I don't say that to mean you should never be scientific about your approach. It's more, the way I describe
Kent C. Dodds (14:10)
Yeah.
Will King (14:37)
being good at anything, being good at product, being good at design, being good at engineering, it all boils down to two things: you need to be able to observe a problem, and you need to know how to solve a problem. And if you skip either one of those, that's a good signal for why your thing isn't good. And this falls under that observe part. You need to be able to observe these things.
You don't have to get scientific about it, but you do need to factor that into your understanding of a problem when you're trying to observe it.
Kent C. Dodds (15:18)
Okay, yeah, that actually makes a lot of sense. I hesitate to... it feels like what you're saying is, just rely on the golden gut sort of feel, but I don't think that's actually what you're saying. So how do you decide between not being at all scientific, but also not just relying on your gut? Is there anything that you can actively do
Will King (15:39)
Right.
Kent C. Dodds (15:47)
to properly observe the problem?
Will King (15:51)
I thought about it this way, thinking about that question of, is it a gut thing? When it comes to observing a problem, that is not a gut thing. You have to observe reality as it is. That can be done through user interviews. If we want to talk about recommendations, there are these two great books:
Deploy Empathy and The Mom Test. Both are fantastic books on talking to people and getting realistic information from them about how they feel about a problem. In industrial design, the whole first part of any design process, this process of creating a new product, requires you watching people use the existing solutions, because
Kent C. Dodds (16:49)
Hmm.
Will King (16:50)
there's some executive, or someone who had an idea, who's like, hey, this is a problem, people are saying it's a problem. So I need to observe the problem. I need to observe them using it. I need to know how to ask questions, and that's what those books help you do: how to ask questions to get people's real feelings about using it. Yes, you don't like it, but why do you not like it? What about it do you not like? Those are very much watch-people-use-the-thing.
There's also, when you think about engineering and systems, observability: being able to see the facts as they are. You can have a user say, hey, this didn't work. But what didn't work? That's where observability can come in to help. One superpower at Crunchy was support. I think anybody who used the Crunchy Data
database service... yes, it was, and is, a really good Postgres database. But what everyone talked about was how great it was working with our knowledgeable support team, our support engineers. And it's a superpower when you have extremely knowledgeable people in an area you're not an expert in. I'm not a database expert, much to my boss's chagrin, but I was
chatting with our support engineers constantly, because they were the experts. So being able to pull from experts, their opinions, their observing: if you're not an expert on the domain when you're trying to observe a problem, then work with people who are experts, observe together, and get information from them. But that's the first half. You've got to be able to see it in action
before you can do the second half, which is solve the problem. And solving the problem is where the gut kind of kicks in.
Kent C. Dodds (18:55)
Gotcha, yeah. There's maybe intuition and stuff like that; we love to talk about that, and I've got that queued up as a follow-up question too. But I wanted to point out that watching users use software is something that keeps coming up in this podcast series. And I really appreciate that.
Will King (19:18)
Right.
Kent C. Dodds (19:24)
Yes, you want to ask users, what do you not like about the current solutions, and things like that. But typically, unless your users are developers or something, they're not going to have the right words for describing why a solution doesn't work for them. They're just going to say, I don't know, I just don't like it. So watching them use it will reveal a lot. In fact, I was just having this problem this morning.
I was trying to sign in to Spotify on one of my kids' devices, and they didn't have a show-password option. So I'm typing this 1Password password, which is just a bunch of nonsense, on a small keyboard, and I can't figure it out. I want to be able to see the password, but I couldn't. And my thought immediately went to: if they just watched somebody do that, they'd be like, yeah, totally, big miss.
Will King (20:01)
Hmm
If they just watched. Yeah. It becomes painfully
obvious. You could have spent three months thinking, man, why are people... think about onboarding flows: why are people struggling to convert on this product right here? And you're thinking of all these wild reasons. Man, maybe we don't have good positioning. Maybe people aren't resonating with the words we're using, blah, blah, blah. And if you just watched,
it is a lot of times something so stupid, so simple. It's just a paper cut that, if you could just see somebody use it, like you said, slaps you right in the face. You're like, ah, dang it. That's it.
Kent C. Dodds (20:57)
Yeah, you know, an idea just occurred to me for my own products: offer people a discount to watch them go through the purchase flow.
Will King (21:03)
Mm-hmm.
That's actually
a very common tactic. We did that a lot at the education startup. We would offer discounts on courses in exchange for being able to do user interviews, specifically for onboarding. Or if we had a new course, we would pay students: hey, would you go through this course so we can watch you
Kent C. Dodds (21:13)
Hmm.
Will King (21:37)
work through the videos and speak out loud: does this make sense to you or not? Why does it make sense, or why doesn't it? And it makes a huge difference, hearing not feedback after the fact, but feedback in the midst of a frustrating situation.
Kent C. Dodds (21:55)
Yeah. Yep, 100%. And when I am asked to provide feedback on different products, I always fire up a Loom or screen recording of some kind, because I know that I'm going to go through it, I'm going to have frustrations, but I'm just going to, you know, whatever. And then when I go to talk to them later, maybe I'll call out some of the big things, but I'll just forget about those little things. And often those little things are
the polish items that give users confidence in the entire solution.
Will King (22:28)
Yes.
Yes. Confidence is a good word. There is nothing worse... the difference between a good product and a great product is that difference in how confident, how stable you think a product is, through tiny interactions.
Kent C. Dodds (22:46)
Yeah, I'm gonna avoid throwing a product under the bus, but there's one that drives me bonkers, and they have so many little things that, as a software developer, I'm like, this would be so easy to fix. So easy. It's so painful to use.
Will King (22:58)
It would not.
Yeah. This is the way I think about product engineering: usually the issue is not that people couldn't be good at it, it's that they don't care. But once you crack the seal on caring, and noticing... so a good friend of mine, Josh Puckett, has this course called Interface Craft. And
his very first lesson in how to be good at designing interfaces, he calls it being someone who notices. That is the first step to getting better. Taste is the word people throw around a lot, but to get better at having a good gut about any new thing, the first thing you have to get good at is being someone who notices and pays attention. That's it. That's literally the first step. And if you
do a good job of noticing, then you start cataloging all these things, and that's good and bad. You notice this bad thing: well, I don't like that, that's bad, I don't want to do that, that's a bad signal. Or: wow, that was really good, why do I like that? If you can answer why you liked something, then you're a lot closer to being able to answer how do I do something.
Kent C. Dodds (24:18)
Yeah, it sounds like almost journaling on a regular basis would be helpful for that. So, this same product that I'm not going to throw under the bus, but I kind of want to, because that's how I feel about them: I had this problem, and so finally, and I hate doing this, I hate reaching out to support about stuff, you just shouldn't have to. But okay, I reached out to support.
Will King (24:21)
Yeah.
It definitely would be.
Yeah.
Mm-hmm.
Kent C. Dodds (24:44)
And after going through their typical, did you sign out and sign back in again, all of these things where you're just like, fine, let's just cope. But after doing those things, they said, okay, I did this thing on my end, now try it again. And of course it worked. And it just makes you think... I'm very confident, because I know this company, I've had experience with them in the past,
Will King (25:00)
Mm-hmm. So frustrating. If they could have pressed a button.
Kent C. Dodds (25:12)
I'm confident that nobody's paying attention to that support issue. That support person seemed like they cared, but nobody from development, nobody from product is talking to those support people about these problems.
Will King (25:22)
Nope. You know what that sounds like?
It sounds like an internet provider. My internet is down. Oh, okay, let me check. Some magical button gets pressed, and does your internet suddenly work again? Yeah, it does, I don't know what you did. But yeah, nobody cares. There's a button that support can press, if somebody cares enough to call.
Kent C. Dodds (25:27)
Oh yeah.
Yeah, exactly. So this is just, again: talk to your support people. What are the issues that keep coming up? That's your backlog.
Will King (25:51)
Well, here's something that came to mind when thinking about this. Getting feedback from users is a huge part of being a good product engineer. The problem with that being the main message is, you want to be careful that you don't make your entire ethos as a product engineer, or somebody who's good at product,
be based on, well, I listened to all the users' feedback and that's why I'm good. User feedback can only provide incremental improvements. It cannot make order-of-magnitude improvements for your product. That's where...
Kent C. Dodds (26:24)
Mm-hmm.
Maybe in the
early phases of the product, I would say that you can make huge improvements with the right users.
Will King (26:43)
Yeah, if you've got early users who are validating the whole concept of your business, most likely, yeah, they could definitely help. But even then, unless they're somebody who's thinking the same way that you are, as the owner of the product, the owner of the business, the person who cares about the scope of it, you are going to get targeted
feedback on how the experience felt doing the thing that they were trying to do. And that won't give you an order-of-magnitude change. What it will do is help you create stable, better products, because you realize all the little things that aren't going well. But just listening directly to what they're telling you is wrong will not make an order-of-magnitude difference. You have to have...
Kent C. Dodds (27:18)
Mm-hmm.
Will King (27:41)
that's where solving problems comes in. The feedback is all the observing. Once you've observed, you've got to practice the ability to solve. Ira Glass called it... have you heard of the gap, by Ira Glass? So Ira Glass has it, and then there's this really good podcast by a guy named Jay Acunzo, where the whole shtick is the gap between, you know,
Kent C. Dodds (27:56)
Hmm, no.
Will King (28:11)
knowing what something good is, but you can't do it yet. That's the gap. And the only way you cross the gap, from being somebody who can recognize when something's good to being somebody who can do something that is good, is reps: repetition on that problem. Solving that problem over and over, then getting that feedback loop: observe, then try to solve the problem again. That's the gap. You have to get those reps.
Kent C. Dodds (28:14)
Mm.
Yeah. Yeah. Practice.
Will King (28:40)
And when you start getting those reps, what you start realizing is that you start pattern matching. And it's pattern matching you can do because you've done the reps. You start seeing the matrix, is how I like to think of it. And I think Dax talked about this a little in his episode: once you've done something in a product area for long enough, you start seeing
things less as individual requests, and start being able to see: oh, this group of issues matches this primitive; if we can provide this fundamental fix, it will solve them. I liked your use of the phrase, calling it a problem tree. Yeah, yeah, yeah.
Kent C. Dodds (29:28)
Yeah, I was just going to bring that up.
Will King (29:32)
I really like those mental models for this. That's it: when you're trying to solve problems, you want to cut off as much of that tree as you can with every solution, versus just picking leaves off the tree, you know?
Kent C. Dodds (29:46)
Yeah, and I was going to relate that to what you said earlier, where users can give you feedback on the polish of a particular solution, but maybe they're not going to say, this solution shouldn't exist, right? We need to go up the problem tree; that's not what they're thinking about. They're just saying, hey, I couldn't see my password as I was filling out that login form, it'd be nice to have a little see-password icon. But maybe there's a completely different way
Will King (30:00)
Right.
Kent C. Dodds (30:15)
to get to where that screen doesn't even exist, or whatever. Probably a bad example with a login. But yeah, the basic idea is: solutions to problems create more problems. And the goal is to find the solution that takes you down a path that not only creates fewer problems, or at least smaller problems, but opens up more opportunities as well, which the metaphor of the problem tree doesn't quite cover.
I guess we could go the opposite direction, with roots into the ground, and you're hitting fewer... like, don't grow in the direction of the concrete.
Will King (30:45)
Right.
Yeah. That whole concept has always been true, but it's even more true and important now because of how fast people can create more problems. The reality is, every solution is going to make more problems. So don't think of everything that you ship as this great, perfect thing.
Everything comes with trade-offs, and part of being a good product engineer is understanding the trade-offs. It's like a git diff, but for product: are you deleting more than you're adding? You need to have that mental model for every solution you come up with. Otherwise, it's not just technical debt that you can fall into. You can fall into product debt very easily.
Kent C. Dodds (31:27)
Hmm.
That is interesting. Let's talk a little bit more about product debt. Give us some examples. Software engineers are very familiar with technical debt and the pains associated with it; I think that's a very visceral analogy. But what is product debt? How do you fall into it? How do you get out of it?
Will King (32:03)
All right, I'm gonna use a story to explain it, from having actually experienced it. At the first startup, the education startup that we worked at, we had started trying out a bunch of new course types. We had video courses, we had exercise courses, we had courses where you actually
download something, change it, upload it, file upload, all that kind of stuff. And we started adding a bunch of different requirements across the different course types. And we thought, great, a bunch of people like to learn in different ways. You know, education science talks about how learning information in different mediums reinforces learning. But what we found out was that the more course
Kent C. Dodds (32:54)
Mm-hmm.
Will King (32:58)
types, like lesson types inside of a course, that we introduced, the more people stopped completing courses. And so what we ran into there was: yes, we thought all of these ideas were good, so we pushed them out to our learners. And what we learned was that we'd just created product debt. There was too much thrown at them. And what we learned through talking with them was that they felt like
Kent C. Dodds (33:06)
Yeah.
Will King (33:26)
it was harder than it was, because they kept having to do new things when they were already learning something completely brand new. They were having trouble with, okay, I'm learning something new and I'm having to do these lessons in a different way each time. And so they would just get overwhelmed. So we had to talk to our users. We had to ask them what was going on. But product debt is like that, where it's very easy to get to a point in a product where
Kent C. Dodds (33:30)
Hmm
Hmm
Will King (33:57)
users can't understand the scope of the product. They stop understanding the benefit that they get out of it. You can also have internal product debt, where if you start trying to add something new to the product and, oh no, this thing that we added last year keeps coming up as an issue for adding these other things, that's a pretty good signal for: was that thing that you added good?
Kent C. Dodds (34:22)
Mmm.
Will King (34:26)
Never be afraid to prune features from your product if they are not serving the needs that you have right now. One of my favorite companies doing this right now is Amp. There is no space I've seen move as fast as all these agent workflow tools. And one of the things that I super appreciate about observing Amp as a company is that they are not afraid to be like, yo, the puck has moved completely.
The floor has changed out from underneath us. This idea that we had was really good, everyone loved it, people really liked this thing that we built, but it no longer makes sense with how good the models are. So we are removing it completely from our product, because it's just not worth it anymore. It's hurting you, you just don't know it because you're used to it. And you could have this new, better experience with less. And that's always
Kent C. Dodds (35:14)
Hmm
Will King (35:23)
what you want to try and do. They are not afraid to prune their product. If you want to observe a company with a really strong product sense, in their willingness to prune product and keep the product tight, that's a really good company to observe.
Kent C. Dodds (35:44)
Yeah, yeah, that makes sense. I think you'd have to have really good product sense to be able to manage that while also making sure your users don't give up or fail to keep up. Some companies are just shipping something new every week. In fact, I remember a year ago, they've kind of fixed this now, but Cursor, every time you opened it, there was an update.
Will King (35:57)
Right. Yeah.
Yeah, it was a
completely different layout or a completely different starting screen. Yeah, no, I remember that phase and thinking, there is such a thing as stability as well. That matters.
Kent C. Dodds (36:17)
Yeah.
Yeah, yeah. Yeah,
so there's a balance. But I think being able to kill your darlings and say, you know... this is actually a really important part of my problem tree metaphor too: maybe we got to this point not realizing all of the problems that would arise, or those problems didn't exist at the time and now they do. And so you just have to be able to back up, look at the whole space,
Will King (36:41)
Right, right.
Kent C. Dodds (36:49)
and say, you know what, at this node, we solved it this way. Now the technology is better, whatever, things are different, infrastructure is better, models are better. We can go this way now. That way didn't exist back then, but now it does. Let's chop off all this and go this way instead.
Will King (37:06)
Yeah.
One other thing I would add to that: it's not cut and dry. Removing something does not mean you have to put a wall in front of it, per se. What it could also mean is that you provide a path for people to merge back in. One good product practice is that if you are getting to the point where you're like, this is not a good solution anymore,
Kent C. Dodds (37:27)
Hmm
Will King (37:36)
or we have come up with this new idea that should replace this, this, and this, provide a path to help the people who still like the old thing see that. You can do that. We did that with our database company. We had some issues where, hey, this idea that we had before is not as good as this one, and it's going to require these underlying migrations. So we provided a one-click tool: hey, this thing's better, but it could be a lot of work.
Kent C. Dodds (37:46)
Hmm.
Will King (38:06)
If we'd just said, use the new thing, it's great, that wouldn't be enough. Instead, we went the extra mile to make something that they can click and it does it for them. We helped them believe it's as good as we say it is by helping them get there faster and taking some of that work off their shoulders. There's plenty of ways to prune debt without
Kent C. Dodds (38:19)
Hmm.
Will King (38:32)
pruning the trust of your users, I guess.
Kent C. Dodds (38:33)
Yeah, yeah, like you've
got users sitting on that branch of the problem tree, and you cut it off and lose all those users.
Will King (38:39)
Yeah, there's yeah, yeah, yeah.
Exactly.
Kent C. Dodds (38:43)
Yeah, well, very cool. I really like that concept of product debt. I think when you're pruning the tree, balancing that and really thinking about the user, that's part of the user experience too: the experience of interacting with your company, the way that you do these sorts of rollouts, and maintaining that client base. And I think you find some users who are
Will King (39:05)
Mm-hmm.
Kent C. Dodds (39:13)
really good at providing really valuable feedback, and you tell them the problems that you're facing and the solution you have in mind and have them play around with it. Actually, I end up being that user for a number of companies that I use. Cursor is one of them. As they were gearing up for the big launch of Cursor Cloud Agents with computers, where now your agent has a desktop and you can even remote into it and everything,
Will King (39:39)
Mm-hmm.
Kent C. Dodds (39:43)
they had me using this feature a lot. So being able to get a lot of that feedback from active users can help smooth out the transition for everybody else.
Will King (39:57)
Absolutely. I think the term for that is design partners. That's what people call it at a lot of companies: find your most passionate users who want to support you and want to help you win, and make them feel part of it, make them feel like they're partners in it. They're not just a user. They are someone who is working beside you to bring this to reality. And I think a common term, if people want to look
Kent C. Dodds (40:02)
Hmm.
Mm.
Will King (40:25)
up what that might look like, is a design partner.
Kent C. Dodds (40:28)
Hmm, yeah, that's very good. But it seems like a lot of work. Well, can't I just use my intuition that I developed, because I'm such a smart product person, and just do what I think is right? And if they can't figure it out, if they can't get it, then they just don't get it and they're not in my audience.
Will King (40:46)
Oh goodness. I'm speaking at React Miami in, I guess, two weeks at this point, on hiring, and it's so funny how a big part of my talk is just on what you just said. And I think it cuts across all of these things. One of my favorite quotes that I found while researching my talk was:
if your argument is that you meet the minimum requirements for something, expect the minimum consideration. So nobody can make you care. Nobody can make you work hard. You have to work hard. You have to care if you want to be successful in anything. We're not just talking about product engineering; we're talking about as a human. You have to care and you have to try. Because if you don't care, you don't try.
Kent C. Dodds (41:22)
Huh, that's good.
Hmm.
Will King (41:45)
People aren't gonna hand you stuff. A buddy of mine, Aaron Francis (I think you're really good friends with him too), has one of my favorite quotes: you have to operate in the world as it is, not as you wish it was. And sure, you might wish that you didn't have to do these things. You might wish that it was easier, or that people just understood that you knew what you were doing or understood the way you thought. But you have to operate in the world as it is. You have to accept
Kent C. Dodds (41:52)
Mm.
Will King (42:15)
that people just don't understand how something might work. And your job is not to say, well, it works for me. Your job is to understand why it doesn't work. Your job is to understand how you can support not only your users, but also your support team. Dylan had a really good point, which was: it may seem like everything's great, but if you look at the edges, if you look at the people on the front line who have to hear from the people using it... Expose yourself
to noticing those things. Expose yourself to noticing when things don't work the way you thought they should work, because it's humbling when you realize it. That's the reps. The reps is actually just being constantly humbled: you thought you had a good idea, and it was good, but it was flawed. It's always flawed.
Kent C. Dodds (43:07)
So there's no end of users that are going to tell you that you're wrong. How do you... like, there is some level of, no, no, my way is right, you just haven't seen it yet. How do you judge between the users who really don't know what they're talking about and actually listening to your users?
Will King (43:27)
That's a question. I don't think it's one that I've solved personally. I think the nature of building products is the nature of building something that's never done. One thing that I have said a lot recently, especially with the explosion of AI and everything, is that curiosity is currency. Your willingness to be curious about
how something can be done is valuable. There is value in that. But understanding that a product is always gonna be changing, that what you have now is never gonna be the final version, helps you balance that feeling, that need to say, well, I need to make every single change that somebody recommends. Yes, if the product was never gonna get changed again, then sure,
tweak it infinitely as it is and spend your time tweaking it. But you still have to be able to balance the reality that you can't tweak paper cuts forever and still create a better product, a better high-level experience for people. We may have infinite automated agents, but you only have limited
Kent C. Dodds (44:39)
Hmm.
Will King (44:54)
capability to solve problems in the time that we have, and limited energy to solve them with. So you've got to find that balance: how much time do I spend listening to all the little paper cuts, how much time do I spend focusing on the big problems, and worrying about those even if it means
deferring the small ones. Not sweeping them under the rug, per se; always acknowledge them. But it's okay to tell your users, hey, I know that workflow isn't great, we're actually working on this other thing and we don't have time to necessarily solve that one. But I do think there is a class of problems that you should always solve. If the experience works the way you meant for it to work and they just don't like it, then take your time trying to solve it.
But if something's broken, like legit, no, we didn't actually mean for it to work that way, always fix a broken product. If it's just that they don't like the way you solved it, listen to them and factor that in as you try to solve it, like we were talking about earlier: how do you roll that into your strategy for making a better product? Don't necessarily listen to it as it comes in and be like, that's the change we have to make.
Kent C. Dodds (46:18)
Yeah, like user feedback is not a pull request, necessarily.
Will King (46:23)
Right.
Oh goodness, that's a really funny way of thinking about it.
Kent C. Dodds (46:27)
Yeah, so one of the things that I would like to ask you about before we wrap this up is my personal thesis on the way that things are going for software developers specifically: agents and models are continuing to get better, harnesses are getting better. And it's very clear for most people that implementation is primarily managed by agents nowadays. And if it's not for you, listener,
it will be very soon, because it's just inevitable at this point for the vast majority of software developers. I'm sure that there are edge cases, whatever, but for the majority of us, implementation is already handled primarily by agents. And a lot of people are saying, well, okay, now we're gonna be orchestrators, or we're going to, whatever. My thesis is that agents and harnesses will get even better at that. And
Will King (46:59)
All right.
Kent C. Dodds (47:24)
being able to review the code, I think that agents could potentially get good at that too. I don't necessarily make all my life decisions based on what I think the future is going to look like as far as agents getting better or anything; even if agents remained the same, things are completely different. But my thesis is that since the beginning of software development, or the beginning of product building, and even now and far into the future,
Will King (47:27)
Mm-hmm.
Kent C. Dodds (47:51)
being able to design a good product has always been valuable, will continue to be a valuable skill, and likely will be the last valuable skill that software engineers have. I'm curious what your thoughts are on that, Will, and whether you agree or whether you think I might be off base.
Will King (48:01)
Mm-hmm.
I think you're pretty on base. And I'm gonna tie it back to what I said at the beginning, which was: industrial design is a discipline that still exists even after all of the automation that has gone into physical manufacturing. Because there still has to be the person who makes the decisions, sets the direction, and knows what the right thing to build is.
Some things that I want to call out with that: what an industrial designer is, what somebody who built products looked like before and after automated manufacturing, is going to look very different. Before, it was: I had to go pick the wood, I had to plane it down, I had to measure it and do all this stuff myself. That part of it is going to go away. Just like now, there may be artisans who want to
handcraft something because it brings them joy. But as far as what is expected out of a business that is going to compete, it's going to be more automated. The thing that's interesting about industrial design, that I think ties over to your premise of what the future is going to look like, is that for industrial designers, you can't make good design decisions if you don't know the constraints of your material.
Kent C. Dodds (49:37)
Hmm.
Will King (49:38)
Part of industrial design's education is not just, how good are you at making something look good? How good are you at making something that users like to interact with? How good are you at creating something that efficiently solves the problem that people had? There's a whole year-long part of the education that goes into: hey, here's how the machines make things. Here are the materials that you can make things out of. Here are different types of plastics. Here are the properties that those plastics have. Here's
whether they're springy or break easily, or how long those materials last. Are you making a product that only needs to exist for a year, or something that needs to exist for 20 years? All of these things that are part of the manufacturing process, I think, tie directly over to agents in the future. You can't make good software products and product decisions if you don't understand software as a medium. If you don't understand the constraints,
Kent C. Dodds (50:13)
Hmm.
Will King (50:36)
you can't guide the plan in a way that gives you the most effective version of the product out the end of the manufacturing pipeline. So I think that tie-in holds: yes, what ends up happening is that you need a lot more conceptual understanding of the industry, and you work at a higher level, but that doesn't change
how much expertise matters when it comes to the medium that you're working in. And I think that will still apply. You can't write off the past years of experience that you've spent learning engineering, because that's what lets you know whether what agents have put out was successful, and also helps you understand when something didn't work out.
Kent C. Dodds (51:07)
Hmm.
Will King (51:33)
Being able to pattern match those things and understand the medium is gonna be a huge part of the job.
Kent C. Dodds (51:39)
I really appreciate that analogy between manufacturing physical goods and making software products. I think that makes a lot of sense. One thing that has been kind of in the back of my mind is: what about product engineers? Part of my thesis is that software developers are moving away from implementation, getting closer to product managers. And I know that so many software developers want nothing more than to be a manager.
In thinking that, I've been wondering: is there a distinction between what a product manager does, what we've been talking about in this podcast series, and what a product engineer does? And what I just heard you say is that there very much is a distinction. The product engineer brings that level of experience and understanding of the materials used in the development of the software. So when the agent decides it's going to go with SQLite for the database,
Will King (52:09)
Yeah, more people interacting.
Kent C. Dodds (52:39)
and you actually are in a situation where you need multi-tenancy and distributed, you know, multiple whatevers... I mean, SQLite's really popular and really powerful, and I personally use it for everything, even distributed. I love SQLite, so maybe a bad example. But just your level of experience with these technologies, when the agent decides on one direction, is going to help you
Will King (52:45)
Right, exactly.
Yeah.
Right.
Kent C. Dodds (53:07)
course correct that agent. And yes, maybe the agents get better and better and won't make those kinds of mistakes in the future. That's possible. But I still think that there is a distinction between the product manager and the product engineer who has that technical expertise.
Will King (53:27)
The one addition I would make to that, which supports what you're saying, is that, well, unless AGI shows up, you know, I'm not gonna try to predict the future. Right, I mean, caveat with AGI. But with the way that agents work, they only work within the context that they are given, and
Kent C. Dodds (53:37)
Yeah, we have to caveat everything we say with that.
Will King (53:51)
you can think of what we're discussing as your ability to provide the right context. Because yes, the difference between product engineers and product managers is their ability to provide very technical context for implementation solutions. Because a product manager, or somebody who's only thinking from the user side, even a product engineer trying to think more from the user side,
you can still only talk about this one feature. You're explaining, trying to get an implementation for this one specific thing. And what you can hold onto, and what this one single agent session is not holding onto, is the full scope of your product. The full scope of your goals, the full scope of where you plan on taking this product next. Those
Kent C. Dodds (54:42)
Mm-hmm.
Will King (54:50)
all influence the technical decisions that get made in how something gets built. Because you can know, through your experience, that, hey, we may not be doing it right now, but at some point it's going to do this. So we need to have the right separation of concerns across these dimensions that you might not realize are going to exist,
but I know they're going to exist. So let me put that into the context so that you can implement the solution that factors that in. Product managers aren't thinking like that. Product managers aren't thinking in terms of engineering scaling systems. And maybe that changes. Maybe, like industrial design, this is a role that has sucked in a lot of different roles: it is very visual design, it is also very
research oriented, but it is also very technical. You've got to know manufacturing and measurements and material strengths, that kind of stuff. So maybe whatever this role is that we're moving towards is somebody that can do a lot of that at once. But
Kent C. Dodds (56:02)
Hmm.
Will King (56:07)
it's still gonna require expertise in engineering. And the reality right now, for the people that you want to reach, is that that's what they have. You have engineering experience. Yes, start getting interested in these other domains. But your engineering experience matters and is valuable for product.
Kent C. Dodds (56:28)
I sure hope you're right, Will. I think a lot of people listening right now hope that the engineering expertise they've worked on for two to 50 years is still valuable in this future world. So as we wrap up, Will, the one thing that I am having all the guests do is provide a specific, actual action that listeners can take
to improve their product engineering capability or their product sense or taste. Do you have a specific thing that people can actually do that will get them closer?
Will King (57:07)
I do.
All right. The one specific thing I would say is that people should use agents more for gathering, not executing. A lot of the problems that people get into happen because AI only gets used as a tool to amplify decisions that you are already making, when
there is a lot of power and a lot of benefit in using AI so that you can be a little lazier. The reason why you may not have wanted to think through every iteration or every idea or every potential solution is that it was just taking time. It doesn't take that much time anymore. It doesn't take that much time to explore ideas. It doesn't take that much time to explore slightly different domains or
slightly adjacent areas to what you're trying to do. So don't be afraid to slow down and use some agent time, not to push code out, but to bring information back to you, so that you can observe more and have a better understanding of the problems before trying to solve them.
Kent C. Dodds (58:27)
That sounds great. So the next time somebody has a problem that they're looking to solve... sure, you're gonna do what we've always done, think about the different areas, whatever. But go to the agent and say, here's what I'm thinking, and iterate with the agent on not just one solution, but maybe a handful of solutions.
Will King (58:44)
Yeah.
Yeah, I mean, and get interesting with it. Let's bring it back to the human factors stuff. Do you have a good source of user data for your users? Throw that type of information in there too. Hey, this is for this type of demographic, what might I be missing? You know, it's consumed so much content from the internet; it's just got a lot of data. It hallucinates, sure, but you're not using it to implement
at that time. You're using it to bring things back for you to use your judgment on. So give yourself as much information as you can to make a judgment.
Kent C. Dodds (59:27)
I've got this interesting... it's not the system prompt, but it's my personalization that I have in ChatGPT, which I got from Ryan Florence. He actually gave this to me. So in my custom instructions, I say: you are an inhuman intelligence tasked with spotting logical flaws and inconsistencies in my ideas. Never agree with me unless my reasoning is watertight.
Will King (59:35)
Mm-hmm.
Mm-hmm.
Mm-hmm.
Kent C. Dodds (59:56)
Never use friendly or encouraging language. If I'm being vague, ask for clarification before proceeding. Your goal is not to help me feel good, it is to help me think better. And I think one of the things that people really complain about with AI is that it's always just validating, reaffirming, like all your ideas are good. You can tell the agent, or whatever AI you're using, don't do that. And it works. Just yesterday...
Will King (1:00:00)
Yes.
Yes. I love that.
Kent C. Dodds (1:00:25)
I had it generate a couple of ideas for the subject line of an email that I was sending to somebody, and I really wanted them to open the email. So I was like, okay, I wanna workshop this a little bit. It gave me like five options, and I thought they were pretty good, but I liked the one that I wrote. So I pasted that in and I said, tell me why this one is better than the rest of the ones that you wrote. Assuming that it's going to do that. And then, yeah...
Will King (1:00:30)
Mm-hmm.
And specifically saying, mine's better, tell me why.
Yeah.
Kent C. Dodds (1:00:52)
Yeah, exactly. And then
I was planning to follow up with, okay, now tell me why it's worse, and just compare. And it replied with: it's not better, it's materially worse, and here are the reasons. And I was like, yes, that's exactly what I was looking for.
Will King (1:01:04)
There we go. Yeah.
That's the type of stuff that you can do and you should be doing before just shipping. Take some time, question assumptions, explore. AI can be used as a tool for creativity as much as it can be used as a tool for productivity.
Kent C. Dodds (1:01:24)
Yeah, I'm with you, Will. Looking forward to the future and what it has to offer for product engineers. Thank you so much for sharing your industrial design background and your experience as a product engineer over the years. What's the best way for people to follow up with you if they wanna keep up with what you're up to?
Will King (1:01:44)
Wilking on x.com, the everything app, just at Wilking.
Kent C. Dodds (1:01:50)
Awesome, thanks so much Will, appreciate you and we'll see you all later.
Will King (1:01:51)
Awesome. Thanks, Kent.


