Learn about the consequences of addictive design and biased algorithms
Web development isn't as simple as it used to be. We can no longer view the source of a webpage and recreate it; everything is bundled and obfuscated. That increased complexity comes from the ever-larger role the internet plays in our society and the need for our websites to do more than display static information.
The growing role of software in our society has had consequences. Addictive design and biases built into software waste people's time at best, and at worst have the devastating ability to ruin lives. In this episode, Cher Scarlett brings awareness to these issues and recommends some books you can read to educate yourself further.
Read one or more of the following books
Kent C. Dodds (00:00):
Hello friends, this is your friend Kent C. Dodds, and I'm joined by my friend, Cher. Say hello, Cher.
Cher Scarlett (00:05):
Kent C. Dodds (00:08):
This is actually starting to become a joke. You're the third person that I've talked with for season four, and with each one so far there's been some funny joke about hello Cher, or hello my name. Anyway, I'm excited to chat with you. We've known each other for a while now. I can't remember, we first met online somehow, and then I do remember now, we met at React Rally, which seems like forever ago.
Cher Scarlett (00:36):
I think it was.
Kent C. Dodds (00:38):
Yeah, maybe like 2018, or something.
Cher Scarlett (00:40):
No, it had to have been before then because I was still at Starbucks, so it must have been 2017.
Kent C. Dodds (00:46):
Boy, I can't believe it. Time flies, my goodness. So it has been a long time, but yeah, it's been a pleasure to know you online, and I'd love for my audience to get to know you, so could you give us a little intro to yourself please?
Cher Scarlett (01:03):
Sure. So my name is Cher, I'm a single mom. Well, I'm actually not single anymore, I got married.
Kent C. Dodds (01:09):
Cher Scarlett (01:09):
But I was single for a really long time so it takes me a really long time to like, wait.
Kent C. Dodds (01:14):
When my wife and I got married, she was an elementary ed teacher, and she just started her first class right after we got married, and she kept on referring to herself as Miss Jesse, and her students just thought that was hilarious they're like, "You're not Miss Jesse, you're Mrs. Dodds." So yeah, it does take a while to adjust.
Cher Scarlett (01:35):
It does, yeah. So yeah, I taught myself how to code when I was in middle school, so that was in the late nineties, and then I had a rough go of things and went down the wrong path, as people who have a rough go of things tend to do. But when I was 21 I got pregnant with my daughter, and when I was seven or eight months pregnant I just had this, all women go through this phase of where they're like, oh no, I have another human being to care for, what am I going to do? And I was living with my mom, not in her basement, but in an upstairs childhood bedroom, and I was like, oh my gosh, I'm going to have a baby and I'm living in my childhood bedroom at my mom's house, and I don't have a job. Well, I was a waitress, but whatever.
I just didn't want that life, and I had dropped out of high school, and also dropped out of college, and the only skill I had was coding. And I didn't necessarily know that that was going to be a career at all, I was just like, I need a job and this is a thing I know how to do. And so I looked at Craigslist and there was an ad looking for something called a front-end developer, and little known fact, at the time I was the second Google result for front-end developer.
Kent C. Dodds (02:53):
Cher Scarlett (02:53):
Because it wasn't really a job title, this was in 2006, so it wasn't really a job title at the time, and I just had a corner on the market. And so I think that when they Googled front-end developer and they saw my name, and I applied for the job, it was just like, oh yeah, bring her in. And I just told them what I knew, which in a technical aspect wasn't a lot, I didn't know terminology and stuff. But I remember the director of technology, who was the one who interviewed me, he was obviously having some back and forth with the panel that had interviewed me being like, "She has no idea what she's talking about." And he brought out a printout of a webpage and asked me to write HTML on paper for it, and I did that. And I think that because I had spent so much time in the CSS Zen Garden community, semantic HTML was all I knew. You style around semantic HTML, that is how I learned.
And I think that that right there was the thing that he was like, because he walked out and I heard him, he was like, he just put the piece of paper up and he's like, "She's really good." And so then they offered me the job, and it wasn't very much money, it was like $30,000 a year or something.
Kent C. Dodds (04:14):
Cher Scarlett (04:15):
Knowing now that everybody else was making double what I did, I should have negotiated or something, but at the time it was a lifesaver. I went from making $2.13 an hour as a waitress, plus tips, to being like, okay, I can eventually move out of my mom's house, I have a future.
Kent C. Dodds (04:34):
Oh my goodness.
Cher Scarlett (04:34):
So that's really how I got into this career. That was 2006, and I didn't get the job at USA Today until 2011, so the five years following that was me figuring out how to make ends meet between working in the restaurant industry and also doing development, and learning more about development on the side to try to make myself marketable as an employee to get another job after the one that I left. It was called luxuryrealestate.com, that was my first tech job, so it was just real estate websites.
Kent C. Dodds (05:12):
Wow, boy, that is really amazing to me. I can't imagine the pressure that you felt to get a job as you were getting started, and to make sure that you were able to take care of your kid, and it's honestly really inspiring that you put the effort into learning to code so you can get that job. They honestly got you for a steal, my goodness, for 30k in the US, I know in other countries-
Cher Scarlett (05:44):
In Seattle too.
Kent C. Dodds (05:45):
Cher Scarlett (05:46):
So not exactly a cheap place to live.
Kent C. Dodds (05:47):
Yeah, for real. So I'm really amazed. You mentioned that you knew coding, how did you get introduced to coding? You mentioned that you dropped out of high school and college, were you going into college for coding, or where did coding come from?
Cher Scarlett (06:04):
No, so I actually went to... I tried to go to college for many different things. I majored in astrophysics, medical physics, I was pre-med at one point, architecture, a lot of different things, and just math, period. I just was always really good at math and so I tried to use that in different ways, and I think the bottom line is I'm not good at school. And even though I really enjoyed learning, which I think is part of the reason why I have done so well in this career, the structure of school is just really hard for me, maybe because I have ADHD, maybe because I'm bipolar, maybe because I'm on the autism spectrum, maybe all three of those things make it difficult for me.
But coding itself, I think, actually came from ADHD, because I was playing this video game online called EverQuest, and I had a guild that I was running. And at the time, all of the other good guilds on the server that I was on had websites for recruitment, and mine was the only one that didn't, and I wanted to be competitive so I needed a website. But I was like 14 or 15 years old, I was in eighth grade, so how was I going to get a website? And so I was in Internet Explorer and I was looking at a website and I just right-clicked it and I saw view source, and little old learn-how-things-work-brain Cher was like, I bet this is how this works, and I can take it apart and put it back together and make my own, and that's what I did.
Kent C. Dodds (08:35):
Oh my goodness, wow, that's so funny.
Cher Scarlett (08:37):
Kent C. Dodds (08:43):
That is so awesome, oh my goodness.
Cher Scarlett (08:48):
That's exactly how I learned.
Kent C. Dodds (08:49):
I love it, that's a brilliant story. That's so great. And that, to me, is inspiring as well. The people who learned how to write code before it was as accessible to learn as it is now are really inspiring to me. I started learning coding myself right around when Google was already very popular and widely used; it was, let's see, 2011, around there, when I really started getting into coding, and yeah, you could find so much stuff. I mean, there's a lot more now than there was then, but I could Google around and answer my questions. I know a lot of people are sad about bundlers and code obfuscation, and everything, view source means nothing now. But I'm not one of the kids who grew up on view source, so I'm just really amazed by that.
Cher Scarlett (09:50):
I can't imagine anyone trying to learn anything from view source now because there's too much. I feel so lucky that I had this enormous amount of time to learn this much information, and then this much information, kind of like a ladder, and now it's like, here's the whole staircase, the whole house, the mansion, good luck. Pick a room, it's right there. I can't imagine anybody learning from view source now, we have to bundle, we have to do all of these things, and that's because things have changed, and I'm not one of those get off my lawn kind of people, except for if you're actually on my lawn, then probably get off my lawn. Just kidding. I think that the reason that things change is because they have to, and it's for the better. We're not changing things to make things worse and harder for people, we're changing things to make things better and more accessible, as you said.
Kent C. Dodds (10:48):
Thank you for saying that, because there definitely is that alternative opinion of, it's so complicated now. But it really comes down to, our expectations are bigger. You would not be able to learn the stuff that our customers expect us to do from view source, it would not be possible to do that. And so yes, it is a shame that view source is pretty much useless now, for the most part, unless you really know what you're doing already.
Cher Scarlett (11:19):
Kent C. Dodds (11:21):
But there's so many resources out there for learning, and I'm glad of that as well.
Cher Scarlett (11:29):
Kent C. Dodds (12:44):
Cher Scarlett (12:44):
Kent C. Dodds (13:18):
Cher Scarlett (13:20):
So we're doing a lot more and we expect a lot more out of the web applications that we're writing. You can't expect that you would get all of that and you can still just view source, copy and paste the HTML and learn how to make a website, because websites aren't just websites anymore, we're not webmasters anymore.
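As a concrete illustration of the view-source point above, here is a minimal sketch (hypothetical function, not real bundler output): minification preserves exactly what code does while stripping away every readable trace of why, which is what made learning from view source possible.

```python
def estimate_shipping(weight_kg: float, distance_km: float) -> float:
    """Hand-written source: names and structure reveal the intent."""
    base_rate = 1.5   # dollars per kg
    per_km = 0.02     # dollars per km
    return base_rate * weight_kg + per_km * distance_km


# Roughly what a minifier leaves behind: identical behavior,
# but nothing left for a beginner to learn from.
def e(a, b):
    return 1.5 * a + 0.02 * b


assert estimate_shipping(10, 100) == e(10, 100) == 17.0
```

The two functions are interchangeable to the machine; only the first one teaches a human anything, and production sites now ship the second kind.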
Kent C. Dodds (13:39):
Cher Scarlett (14:33):
I think artificial intelligence, machine learning, all of that stuff, I think that... I mean ultimately, what is software? It's trying to do things efficiently, or create efficiency in other processes, and I think as we're doing that, in some cases it's running away from us. And there are so many different outcomes that are extremely negative because we're not really thinking about it beforehand, almost like the Jurassic Park of tech, you never stop to think if they should.
Kent C. Dodds (15:08):
Yeah, yeah, that's a great analogy.
Cher Scarlett (15:11):
And I think about Bitcoin and how I remember buying Bitcoin when it was not ever what it is now. I used it maybe not for the best purposes, and I wish that I hadn't used it on what I did use it on, but I don't do those things anymore. But now Bitcoin mining is an energy and environmental concern, and it's one of those things where we're trying to make things smarter so that we don't have to work as hard, we want the machines to work harder and we work smarter, but we're not thinking about how that is going to affect things outside of the machines, so natural resources and humans themselves, writing biases into our software, things like that.
Kent C. Dodds (16:04):
Yeah, the impact of software on humans, and humans on software, I guess, when you're talking about biases, we write our biases into the software, and I'd love to dive into that some more. But when you said what you said it just made me think of the impact that software is having on humans, and I think immediately about Facebook and Instagram and TikTok and things. Not that these things are bad in themselves, but I think that these platforms should be held accountable for the impact that they are having on people. The research is pointing very directly at Facebook really causing problems, especially for teenagers, from a self-worth perspective. And I think that, like you said, it's running away from us, we didn't really think through the implications of what we are creating.
Cher Scarlett (17:09):
Yeah, and I'm even just thinking about myself, as somebody with ADHD, if I showed you my browser right now, earlier you told me you could hear the fan on my computer and it's probably because I have, I don't know, maybe like 70 tabs open on this particular browser window.
Kent C. Dodds (17:30):
Cher Scarlett (17:32):
And I know that I was keeping that structure in my head a little bit, which is why I'm like, oh, I'm going to do this, I'm going to do that, but I honestly think that browser tabs have made it worse. With windows, I couldn't really keep that many open, eventually I would forget about one and I would close it and it wasn't a big deal. But now not only can I keep all of these tabs open all the time, they reopen when I lose them. If I accidentally lose them, they're not just gone, which, I probably would've never opened this stuff again, that's why they're still open, and it'll ask me if I want to open them again. It's going to try really, really hard to make sure I get back to the stuff that maybe I was going to buy, which is what it all comes back down to.
And I think too about the impact of things like likes, or followers, those sorts of things that have now become an inherent part of somebody's worth. And even if you're not somebody who defines their worth by that, and I don't, I think maybe I could have at one point, but maybe because I grew up with people having very few, it doesn't really have that big of an impact on me. But I know that when somebody that has no followers says something extremely vile to me, I will use that as a way of like, what do you think that your opinion matters? Nobody's even following you. And then it comes out of my mouth and I'm like, I don't even care about that, but it's written into me that that is something that determines the worth of what you were saying to me.
So maybe I'm not going to use that against somebody who is being nice to me, I'm not going to be like, oh, you have no followers, I don't care about what you say. But if you say something mean to me it is still written into me, because of being a part of this software for 10 plus years, being a part of social media and having that, even back to Myspace, top eight, "I'm not in your top eight? You don't even like me." Mine's the default, man.
Kent C. Dodds (19:36):
Yeah, it's so interesting. We just haven't fully appreciated the fact that software writes us. Software that somebody else has written is writing me as I use it. And often, maybe the developer didn't think about it, and maybe the product manager didn't think about it, but altogether this is what happens. And now you throw weird incentives into play, where, okay, well we've got advertisers now, and we've got to make sure we keep them happy, or whatever the case may be. We've just landed in this era of the open web where everything's free, but we've got to make it sustainable, so we start putting in some weird ways to make money, which often ends up being not better for the consumer. So Twitter and Facebook are researching ways to make their platforms more addictive, because that's all that matters, that's what pushes the dollars forward. And yeah, I'm pretty worried about that.
Cher Scarlett (20:46):
Yeah, me too, big time.
Kent C. Dodds (20:49):
Yeah, and it's kind of like, what drives more engagement? Well, it's toxicity that drives engagement. And so I actually recently stopped following people on Twitter and I use lists now, and Twitter hasn't planned around that, and so their algorithm doesn't apply to me. So it was a while ago, I saw a lot of people tweeting about how toxic Twitter was, and that's all I saw, was that people were complaining about how toxic it is, but I didn't see any of that toxicity because Twitter, they just weren't sending me that stuff because I'm outside of the Twitter algorithm now.
Cher Scarlett (21:23):
You know what's interesting about that is that I'm sure that who I follow and your lists have a very strong crossover because we're in the same tech community.
Kent C. Dodds (21:23):
Same groups, mm-hmm (affirmative).
Cher Scarlett (21:34):
I did not see any of those posts talking about how toxic Twitter is, but I did see a lot of toxic Twitter posts.
Kent C. Dodds (21:41):
That's so interesting. And it all comes down to, they feed it into the machine, and the machine says, well, here's how you're going to get all your money, and they don't think about the fact that this is actually negatively impacting humans' lives. So yeah, it's scary. So anyway, I do want to get into the other half of the equation, though. We talked a little bit about how software writes us, but we also write ourselves into the software, and that has a negative impact too.
Cher Scarlett (22:12):
Yeah, it does. And I think that even to your other point of software writes us, I think it's a circle, and I think that there's this almost worse disconnect of we talked about people in the past, people doing mind control, and really it was experiments of how do we get people to do what we want? Which is advertising and marketing, that's what that stuff is used for in present modern times. And when people realize that a person is trying to trick them into purchasing something or influence them in some way, they don't like it. But when a machine is doing it, I think that there's this disconnect that it's like, oh, well it's just a computer, and so they're not thinking, there's a person who is writing the software with the sole purpose of accomplishing the same goal that somebody in a boardroom had an idea for.
And what's worse is when we add in this layer of machine learning, people are like, oh well, it's just an algorithm, the computer is spitting this stuff out, that means it's true. And almost everybody I know that isn't working in or adjacent to software, that is their baseline understanding: an algorithm is math, math is pure science, therefore it's real. And even people who are not in the scientific community and are more faith-based people I know, they're not going to say to me that the algorithm is false, because they don't not believe in math. They believe that one plus one is two, so if you tell them, well, two plus three is X, and that's five, so X is five, then they say, oh yes, that makes sense.
But you're creating a rule somewhere in there, and they're not realizing that you and I have created that rule, and we don't have all of the context of those rules. So a really good example is prison software that is trying to determine lengths of sentences, or whether or not somebody should be released on compassionate release. So they input all of this information, and again, there's a person putting in this information, there's a person writing the software, and then there's a person who is taking the previous data and analyzing it, and they only have their context and their perspective. So in the results that they report, here's the trend, here's what's most likely to happen, regardless of the math, that person's opinion is going to be written into what they're doing, because the context that they have becomes a part of it.
And I think it's hard for people to recognize that. Okay, no, they're not writing into the software, if you're black you're getting a longer sentence, but there are reasons that somebody who is black may have had certain things in their past, things that are maybe more true for people that are black than people who are white because of a long history of slavery in this country and not having access to opportunity, access to finances, access to free land 100 years ago, those sorts of things, not having access to family because they don't even know who their family is, they were stolen.
And so there are all of these other vectors that are totally missing from the data, and so when you put that in, well, if you're somebody who's come from a poor background, it's more likely that maybe your parents had to steal so that you could eat, maybe some of these things are true for you. Maybe you come from a family that's full of despair, and your mom had access to drugs because maybe she had a mental health problem, maybe she was really poor and was very stressed out and couldn't handle life. If you just heard that minus the drugs, maybe you could empathize with that, but then you hear the drugs and suddenly you don't empathize anymore.
There are all these different vectors of data that are missing from the things that we're writing to determine things, and when you talk about creditworthiness, it doesn't sound so bad, but when you start moving into other things you're like, wow, this is really bad. We need to realize that not only are the things we can articulate as bad really bad, but some of these things that seem benign, they're really bad too, because somebody else is writing a bias into the software, or a multitude of people are, because they're trying to make money. And we're taking that information as, okay, well that person is more likely to commit a crime, or this person is more likely to default on their loans so they shouldn't be able to own a home, they shouldn't make a certain wage, they shouldn't be able to have insurance, they shouldn't be able to rent.
All of these things, now I'm getting the bias. I'd never had that bias, but now I'm accepting this other thing as fact, and now the bias is getting written back into me. And maybe I'm going to go write some software, and maybe I get a job in FinTech and I believe all of these things about credit worthiness, or rental worthiness, or insurance risk, and now I'm going to go write this into some more software. And it continues to build back and forth, back and forth, until we don't know what is justice anymore, what is fair.
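The feedback loop described above can be sketched in a few lines. This is a deliberately toy model with made-up data, not any real sentencing or credit system: a scoring rule learned from biased historical records reproduces the bias, and because the denied group stops generating new data, every future retraining inherits it.

```python
from collections import defaultdict

# Hypothetical historical records: (neighborhood, repaid_loan).
# "Neighborhood" stands in for any proxy variable that correlates with
# race or class; the missing context discussed above never enters the data.
history = [
    ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", True),
]


def learn_rule(records):
    """'Model': approve a group if its observed repayment rate is >= 50%."""
    stats = defaultdict(lambda: [0, 0])  # neighborhood -> [repaid, total]
    for hood, repaid in records:
        stats[hood][0] += repaid
        stats[hood][1] += 1
    return {hood: good / total >= 0.5 for hood, (good, total) in stats.items()}


rule = learn_rule(history)
# Group B is now denied outright, so no new repayment data is ever collected
# there; the bias is frozen in and feeds the next generation of models.
print(rule)  # -> {'A': True, 'B': False}
```

The math inside `learn_rule` is correct, which is exactly the trap: the rule is only as fair as the history it was handed, and its own decisions shape that history.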
Kent C. Dodds (27:35):
Yeah, oh my goodness, huge, great words that you've said. And it is a cycle, because I feed those biases into the software that I write, and the software that I write means that those that my software is biased against have an even harder time, and so that biases the software even further. And that's societal, that's not just software, that's forever, for as long as society has been a thing we've had these problems. But I think software enhances, that or amplifies that negativity, because-
Cher Scarlett (28:15):
It makes the negativity more efficient.
Kent C. Dodds (28:17):
Yeah, yes, that's well said. Yeah, I totally agree. Now there's got to be something that people can do about this, at least for themselves, and then maybe for others, and before we started you mentioned that you have some books that you could recommend for people to read. Do you want to just mention a couple of those and what you like about them?
Cher Scarlett (28:39):
Sure. So I have a few of them here, so I thought since we have video I could have some props. The first one I actually don't have because I lent it out to somebody, it's called Race After Technology, and it is by Ruha Benjamin. And it is not a cheap book, but it's very big, and I'm sure you could find it at the library. That book specifically talks about the way that algorithms have created and amplified biases against people that are black, and other indigenous people and people of color. This one is called Technically Wrong, and it's by Sara Wachter-Boettcher, I can't pronounce her last name, and the subtitle is Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. This is a really, really good one. It's a light read relative to the first one, and it talks about how applications have these biases written into them, and how they are starting to shape kids' identities around those biases.
Kent C. Dodds (29:57):
Wow. And when you start messing with kids as they're growing up, my goodness, that is a big problem.
Cher Scarlett (30:04):
And I think, living where I do in what I call rural-suburban Missouri, you see this polarization very clearly. I'm from the west coast, I'm very different from most of the people here, and so my daughter feels that 180-degree difference, and she talks about how it's from these social media bubbles. The creators and influencers and the applications themselves, they're feeding her what she has gravitated towards, which has helped shape her, she's a good person. And she feels that the bubbles these others have entered into because of their parents have shaped them to be what she deems to be bad people, though I don't necessarily think that there are good and bad people, other than some very strong outliers.
Kent C. Dodds (31:00):
Cher Scarlett (31:00):
I think it's really just, we talked about this amplification of negativity, becoming more and more polarized towards these two different sets of ideals, and there's no real gray area anymore.
Kent C. Dodds (31:17):
Cher Scarlett (31:18):
So the next one is really good, it's called Algorithms of Oppression, and this one is by, I took the dust jacket off of it, Safiya Umoja Noble, from New York University Press, and this is about how search engines reinforce your biases. So again, it's not something you would think about when you just search for something, you're just looking for an answer. But the answer that you are getting is telling you what the truth is, and it's determining what it is that you think and feel is the truth.
Kent C. Dodds (31:50):
Even the suggestions as you type something, where it suggests an autocomplete for your search, that can give you a sense for what the answer is sometimes.
Cher Scarlett (32:00):
Yep, absolutely. And I think sometimes too it can deter you from what maybe your initial bias was, it can change that, and change your bias for you. It's basically telling you, as a human, from a machine, you're wrong, that's not what you were thinking, this is the correct thing. So the last one is Hello World: Being Human in the Age of Algorithms, by Hannah Fry, and this book outraged me. It's not that I wasn't thinking about how AI affects literally everything around us, but this one was definitely a bit more broad. Some of these are very specific to sexist applications, or specific biases against a certain group of people; this is more broad, like the conversation that we're having, so it should be a really good takeaway. And it's just over and over about how we're shaping software, and then software is shaping us, and circle, it's a circle.
Kent C. Dodds (33:05):
Yeah, that's terrifying, because it's not intentional, or often it's not, and when it is, it's motivated out of-
Cher Scarlett (33:13):
Kent C. Dodds (33:13):
Impure motives, yeah, yeah, greedy. Yeah, well thank you for those suggestions, that's four books that we can recommend to folks, and that's going to be our homework assignment for you all: maybe not necessarily one of these books, but just in general, read a book, and these are good recommendations, that expands your perspective on bias and technology, and how technology writes us and we write it, and it's a negative cycle if we aren't intentional about making it a good one. Is there anything else that we didn't talk about? We're down to the end of our time here, but is there anything else that we missed that you'd like to bring up before we wrap up?
Cher Scarlett (33:51):
Maybe in the spirit of biases and coming from the background I do, obviously we didn't get too much into it other than I had a rough go of things, and obviously even my first five years in software were very atypical versus what you see now, and especially where I am now, very different. I think it's hard to say straight out, I don't want to say anyone can be a software engineer, because it's really hard, and would I do the same thing again? Other than the fact that it's had a wonderful effect on my daughter, I don't know that I would. It was very hard, and it still is hard a lot of times.
But just remember that if somebody is a software engineer, or they want to be a software engineer, they can be, regardless of where they're coming from. And it's more about who I am now. Sure, is it unlikely that I would have gotten to where I am? Okay, but I got here, so give me the respect. It doesn't matter that I don't have a CS degree and dropped out of high school, and did a bunch of stuff that I am absolutely not proud of before that.
Kent C. Dodds (35:07):
You mentioned earlier that there aren't good or bad people, and I totally agree with that, and I think that it's just people who do good and bad things, because every day is a new day and we can make different decisions, and so we can do good things today, even if we did things that we weren't proud of in the past. And now here you are working at Apple as a software engineer, and that's amazing. It's an awesome accomplishment, especially given where you've come from, and I just appreciate you sharing that story with us, and I appreciate you sharing your thoughts on bias in technology. There were like five minutes where you gave us some really amazing, just non-stop thoughts on that, and I appreciate that. But we do need to-
Cher Scarlett (35:52):
I know you want to wrap things up, but I just thought of this line from this Tool song that always really speaks to me whenever we're talking about recognizing that each day you can have a new intention to consider the impact that you're leaving on the people around you and the world. It's from this song called Right in Two, which is basically about how human beings tend to find something and tear it apart, which can be a positive thing or a negative thing. I take things apart and put them back together because I want to understand how they work, that's not a bad thing of splitting it right in two, but splitting another person right in two, not a good thing, so that's that same aspect.
But the line is, "Repugnant is a creature who would squander the ability to lift an eye to heaven, conscious of his fleeting time here." And that's basically just saying, if you don't take the time every single day to think about it: I have this time on this earth and I'm lucky for that, I need to do more than just think about what I can take from the world, I should be thinking about what I can do to leave the people and the places that I've been better than they were when I found them.
Kent C. Dodds (37:12):
Oh, I love that. Not everybody gets to live as long as you have, however old you are, and you should treat that as a trust and use it for the better. That was great, thank you for sharing that. Okay, so as we wrap up, Cher, what's the best place for people to connect with you online?
Cher Scarlett (37:35):
The toxic place known as Twitter. I'm @cherthedev. And if you are off Twitter these days, or off all of social media, you can also reach me at firstname.lastname@example.org.
Kent C. Dodds (37:48):
Awesome. Thank you so much, Cher, it's been a pleasure chatting with you, and we will catch everybody next time.
Cher Scarlett (37:55):
Kent C. Dodds (37:55):