#78 - Facebook whistleblower Frances Haugen's fight to reform social media and protect children

July 16, 2024
51 MIN
Frances Haugen

Episode Summary

What is it like to challenge one of the world's biggest tech giants? Host Joey Odom sits down with Frances Haugen, the whistleblower who exposed how the pursuit of profit at Facebook jeopardized user safety - especially among teens. In her memoir, ‘The Power of One,’ and through her initiative, Beyond the Screen, Frances champions a movement towards technology that honors human well-being over algorithms. Hear from Frances about the devastating impacts platforms like Facebook have on our kids, and her bold vision for reforming Big Tech to make online spaces safer for all. Frances offers data and insights that urge a reevaluation of our digital priorities.


Watch the Conversation

Episode Transcript

Joey Odom  0:05  

They say not all heroes wear capes, but our guest might as well have one permanently affixed. This West High School Trojan left the Athens of Iowa for Beantown and rose, like Frank the Phoenix, at Olin and Harvard. She has helped you Google, Pin, Yelp, and Friend, but she's best known for her whistleblowing. You see, this Hawkeye had the vision to put Facebook in the public eye, and we can all see better because of it. She has been a recurring guest in Congress, in Parliaments, and in Mark Zuckerberg's nightmares, and she's our guest today on The Aro Podcast. To show us the power of one, please welcome a hero to us all, Frances Haugen. Frances, it's so good to see you.

Frances Haugen  0:44  

So good to be here. Thank you for inviting me.

Joey Odom  0:46  

Absolutely. Thank you for that. Did I get that right? You went to West High School? And I nailed that intro?

Frances Haugen  0:53  

Yeah, you nailed that one. You exceeded expectations, right?

Joey Odom  0:57  

The Phoenix, that was a deep pull.

Frances Haugen  0:58  

I think that is... I don't even know the name of our mascot. We have a lack of...

Joey Odom  1:03  

Olin's mascot being Frank the Phoenix. I would love to jump right into the middle of the story. So it's 2020, 2021, and you are copying tens of thousands of internal documents at Facebook. How terrified were you?

Frances Haugen  1:25  

You know, it's interesting. It's one of these things where the thing I was most scared of was not getting to finish, right? There was this question of, at what point was I going to trip over the wrong tripwire, and whatever I had by then was going to be what we had to work with. Because I was working from a place of fear that the public didn't have enough information to avoid what seemed to be kind of an inevitable course that the company was on. And I knew that only one person would ever get to try, because they would clamp down on the things that I was doing. And so we had to make it work, because it was the one shot.

Joey Odom  2:11  

What an interesting statement. You're so right, only one person would get to try. Did you really know that in the moment, like, this is one shot or never? You actually had that realization in that moment?

Frances Haugen  2:23  

Well, for me, it was like, you know, I made very many assumptions about how good Facebook's security was, which did not turn out to be true. I had a level of care that I probably didn't need. I've since encountered whistleblowers who actually got out similar amounts of documents to what I got out, and people have never heard their names. And my assumption was, if this made the level of impact that I hoped it would, you would make things much harder to access, and Facebook did exactly that. In the wake of when I came forward, they did a project that was meant to prevent another Frances Haugen, and they locked down all the safety information inside the company so that only people who worked on safety could see it. Which is intriguing, because in reality, as people pointed out, I would have still had access. All it did was reduce the number of people with access to safety information inside the company.

Joey Odom  3:30  

If we go back to the very, very beginning - and again, your tech pedigree was great, I mean, Google, Pinterest, Yelp - going back to when you joined Facebook, was it exciting? I mean, this was the company to be at. Were you hoping, maybe I can accomplish some good in connection? Going in, what were those initial hopes and excitements when you joined Facebook?

Frances Haugen  3:55  

So I had a number of opportunities to join Facebook before I joined, and I'd always said no, because I wasn't the only one in Silicon Valley who had some misgivings about how Facebook operated. But at the moment I joined the company, it felt like maybe they were acknowledging they had to do something different. So for example, in the wake of the 2016 election and everything that happened with, say, the Macedonian troll farms - people always like to point to the Trump campaign, but the Trump campaign made a tiny, tiny dent, the Russians made a tiny, tiny dent, compared to the entrepreneurial troll farms of Macedonia, which I talk about in my memoir, The Power of One - Facebook woke up in the wake of getting basically buried by the press about how negligent they had been. And they started building this team, Civic Integrity. And when I joined the company in 2019, it was kind of in the swing up towards 2020. They were rapidly hiring people; it seemed like they were really serious about making different kinds of choices. And going in, I thought I was going to work on making sure that the US 2020 election was safe. But I got there and discovered my job was actually very different than what I had expected.

Joey Odom  5:15  

What did you discover when you said, okay, now wait, I thought I was doing this, and now it turns out I'm doing this?

Frances Haugen  5:20  

So to give you a sense of how rapidly the space was changing, when I came in, the engineering peers I was going to work with hadn't even been hired yet. On a normal team, when you join, you get to meet some of your teammates, you get to meet the people you're going to be collaborating with. None of that existed when I went in and interviewed. So I didn't really have people to ask questions of, like, exactly what am I going to be doing when I work here? I show up, and instead of working on, say, the information environment around the 2020 election in the United States, I was going to be working on misinformation outside the United States, and specifically misinformation beyond the scope of what's known as third-party fact-checking. For some of you out there who have been deep in the internet - when people say things like, they censored Hunter Biden's laptop - third-party fact-checking was the program that went through and picked individual stories or individual pieces of content and said, this is not true. That program only existed in a few countries: it existed in the United States, it existed in some parts of Europe, and it was meant to superficially make the platform look a little cleaner. In many countries in the world, there were not even basic safety systems. This is something I didn't know. Facebook told people they could use magical AI to keep the platform clean, but they declined to build those systems for the vast majority of languages in the world. So my job was: what do you do with misinformation when it could be causing something like ethnic violence, or when, in a crisis, rumors end up causing violence? What do you do if you don't have access to working with content in one of these dangerous situations?

Joey Odom  7:10  

So that's the extreme - obviously there are no guardrails there over misinformation. And then you said something, and I wrote this down because it jumped out at me: within the countries where it did exist, like the US, you said it was used to make the platform look cleaner.

Is that all window dressing? Or is it just so unwieldy that there's nothing you can do about it, so you have to at least make an attempt to look like you're doing something? Or was there a legitimate, no, let's actually do something that's very effective here?

Frances Haugen  7:52  

I would say the core of the work that I do every day now is educating decision makers - be they in government, be they investors, be they litigators, be they advocacy organizations - on what options exist for making these platforms safer. And the reality is, we have lots and lots of options on how to design platforms that are just safer, ones where we don't have to come in after the fact and polish off the rough edges. Third-party fact-checking existed to allow Facebook to say, when some crazy rumor goes viral, oh, we caught it, we took it down. But the reality is that system, even third-party fact-checking, was so slow that content had gotten half or 75% of as many views as it was ever going to get by the time a journalist got to it. So one way to think about Facebook's approach to safety is that it was much more about respectability. Instead of coming in there and saying - I'll give you an example - should you have to click on a link before you reshare it? Should you have to actually look at the content you're about to consume, or blast out to other people, before you can reshare it? It sounds really small, but that tiny moment of reflection reduces misinformation by like 10 or 15%. So there are a lot of these things platforms can do, but they add a little friction to the system. There's a little bit less content distributed, it's a little bit less exciting, because misinformation speaks to something deep in our minds, and that's why it's so delicious. It's the french fries of information, not the kale. And I love kale, you can make...

Joey Odom  9:36  

We're gonna lose our whole kale audience now.

Frances Haugen  9:39  

Yeah, but they don't do it, because it costs tiny slivers of profit. There are many of these decisions where trade-offs can reduce misinformation 30 or 35% without having to choose what's true and false. But you're going to have a system that grows a tiny bit slower, or people consume a little bit less, and in isolation, where there's no kind of feedback loop allowing Facebook to consider an incentive other than growth and profit, you're just not going to make those choices.

Joey Odom  10:14  

You're getting to maybe a real core question I had about all of this. You have this enormous company, and it would be easy to say, ah, it's just so unwieldy. Am I hearing correctly that, no, this is actually deliberate - you need the deniability to say we're doing something about it? You say respectability; it sounds more like deniability. To say, oh no, we're doing something about it, but that is by design, because it's a public company, right? I mean, your responsibility is no longer to your users, or maybe never was to your users; your responsibility is to your shareholders. So am I hearing you correctly that this is, you believe, deliberate? Misinformation spreads, and that's good for clicks, which is good for profitability, so it is a deliberate decision.

Frances Haugen  11:11  

So a big part of what I walk through in my memoir, The Power of One, is that Facebook was full of really kind, conscientious, good people. They were known in the industry for putting a lot more effort into their hiring process, for filtering out, you know, assholes, for lack of a better word. And so the question is, how can you take so many high-EQ, kind, conscientious people and have them produce such horrible outcomes? I think part of what happened was that they devalued judgment. They created a system that was based on metrics, where they said, we need a system where a 23-year-old can join the company, have an insight about social media, and beat a 43-year-old, and have some kind of objective way of making that decision. And that way was metrics. The problem is that any basket of metrics is inherently reductive: it doesn't capture the whole complexity of a given system, it captures what you're choosing to measure. And when you have to trade off between different metrics, like misinformation and profit, or, I don't know, the anxiety of children and profit, you have to make a judgment call: how many points of anxiety, or how many points of sleep deprivation, is worth 1% of profit for the company? Those kinds of decisions are subjective, and unless you go in there and make some judgments about those trade-offs, you keep operating with an extractive mindset where you are acting just in the benefit of your shareholders. I think the thing that's really unfortunate about all this is that if Facebook had been optimizing for the happiness and well-being of its users just a little bit more, I think more of us would still be using Facebook. People loved Facebook in 2009. It was about their family, about their friends. It wasn't about weird clickbaity things, it wasn't about million-person spam groups. It was about your family and friends. But that version of Facebook doesn't keep you online as long, it doesn't keep you consuming, it doesn't keep you looking at ads. And so they kept making decisions that were biased toward short-term gains, and in the end that came at long-term cost.

Joey Odom  13:25  

Wow. That's such an interesting calculus you just described in the trade-off of metrics, and I love the way you put it, in sleep deprivation or harm to children. Because there is a point, obviously, where we'd say, no, that's not okay. But you're butting up against the real edge of that, and in doing that, there's actual human consequence. So interesting.

Frances Haugen  13:50  

Actually, a really good thing we can go into, because I don't think most listeners are aware of it, is that there is a huge lawsuit going on right now, kind of like the tobacco lawsuits or the opiate lawsuits, where 44 US states have sued Meta saying they lied about how they were treating kids and about the consequences to kids. One of the real examples of that - you can read through it, you can look at the quotes from the emails, from the documents inside of Facebook - is that they went in there and talked to kids, and kids said, hey, when you send me notifications really late at night, it makes me anxious and makes it hard for me to sleep. When you send them to me during school, it's hard for me to focus, it hurts my productivity, it makes me anxious. They went and ran experiments where they muted notifications - they just stopped sending them during the school day, or they stopped sending them late at night. And kids came back and said, I feel less anxious, I feel more productive. But it also reduced consumption by like 1%, and the quotes, verbatim from these documents, say things like, Mark really cares about total time spent on the platform, there's no point in us bringing this experiment to him.

Joey Odom  15:30  

Wow. So, and I don't know that we're painting Mark Zuckerberg to be this huge terrible person with all these evil plans for the world - he's not necessarily Dr. Evil petting the bald cat in the war room here - but do you believe he just looked at things in such a one-or-the-other way, like, no, this sacrifices profit? His people clearly knew that, but I guess it gives him a little bit of deniability to say, well, I didn't even know about it, because my people didn't bring it to me. But he had clearly created a culture where people wouldn't bring something to him that would put the public good over profit.

Frances Haugen  16:05  

You know, I'm very careful; I never, ever paint Mark Zuckerberg as an intentionally malicious person. I think one of the big challenges is that he has lived a life that I don't think anyone has really lived since we had boy kings. Think about how much power he has and how unilateral it is. He's been surrounded by people since he was 19 years old who said he was changing the world for the better, and for a long, long time - all the way up until 2016 - people threw parades for Mark Zuckerberg in places like Africa, because he was seen as changing the world that profoundly. And then one day a light switch flipped and said, hey, you are causing a lot of harm. And you have a company where a single person has all the control: he has 55, 56, whatever the number is today, percent of all the shareholder votes, so he controls the whole thing. It doesn't matter how much stock you hold, he still has more votes than you. He's the chairman of the board, he's the CEO. But he's also someone who has spent his entire adult life - he's now spent more than half of his life - building Facebook. There's a classic line: it's almost impossible to convince someone of a truth when their salary depends on not knowing or not believing it. A flip side of that is, it's almost impossible to talk to someone about the idea that something they do is harmful if that thing is so integrated with their own identity. And that percolates down into the culture. One of the things that made me feel very uncomfortable - within a month of joining, it immediately jumped out at me - was that you couldn't talk to anyone about anything negative about Facebook. People would really pull back from you. I think there just wasn't a lot of space for bringing up negative information. And if you think about it, at Google, for example, in Google's golden days - we're talking 2009, 2010 - they're printing money, they have 60% profit margins, there are no other options, they have the goose that lays the golden egg. It was very hard to have conversations where reality really intruded, because you're going to print more money regardless. Facebook is similar. You don't have to have a system that self-corrects if the system has large enough profit margins. In most businesses, if you don't have space for criticism, the business fails. But because Facebook has such a strong network effect, because they are the internet in huge swaths of the world - people don't have their own websites, they just use Facebook - there's no penalty for having a culture that can't deal with negative facts. And one of the ways I was really horrified that played out, and I intentionally brought out documents that documented this, was that even people like the CTO would post publicly, internally, things like: connecting people is so important that even if people die, and people are going to die because of what we do, it is still more important to connect people.
And so that's a mindset where you're not asking, how do we do harm reduction? In every other industry, we don't just say it's an either-or, you get to have it or you don't. We say, oh, your cars kill a lot of people, but every single year we work on making cars safer. And that should be our attitude about social media. It's not meant to be a criticism; it's, how do we make you an even better thing? And that kind of culture, I don't think, is present at the level it should be inside of Facebook.

Joey Odom  20:02  

This is maybe more of a question for the end, but I'm so driven to ask it now. You have 44 states, the state attorneys general, and their lawsuits filed, and you've done such great work with the Facebook Files in the Wall Street Journal and all this stuff, but it still largely feels consequence-free. In fighting the hopelessness in me about this, it feels largely inconsequential - like, oh yeah, okay, we'll get better, I promise, don't worry, wink wink. It still feels like Facebook is more for, you know, me and my parents, and Instagram is a little more for people younger than me, but it seems like there's still a ton of users on the platform and their market cap is still high. What do you think the consequence actually is here, to them?

Frances Haugen  20:53  

I think one of the things is really simple: I'm not frustrated. I think one thing that's kind of subversive about technology is that we live in a world where we expect things to happen on very, very short timescales, and when they don't happen on super short timescales, it's very easy for us to think it'll never happen. And I think within the tech community this is particularly pronounced, because if we look at VC funding, VC funding is on a two-to-three-year cycle at best. Basically, every single time you raise more money in Silicon Valley, you pitch them: this is the experiment we're running next, our thesis is that if we do this, the company will grow. You're not funding us forever, you're funding us for just this next experiment, and if that's successful, you'll fund the next experiment. And that kind of culture of looking two years out means that when things don't happen on a two-year basis, it can feel like, oh, it's never going to happen. Self-driving cars, I think, are a great example of this. When I was at Google in 2013, 2014, right before I left, we were like, oh my God, in the next couple of years Google's going to have self-driving cars. And it's now a decade later and we still mostly don't have them. We do in some places, we have some self-driving cars, but, you know, it's like our hoverboards; we don't have hoverboards. The reality is that the world changes slowly. And part of why we have had such stasis in social media up until now is that, legally, they don't have to give us any information they don't want to give us. One of the core parts of the lawsuit was about them actively and intentionally giving misleading information, like their Transparency Center being intentionally misleading. One of the things that has happened since I came forward, which is a really big deal - it's just still in the setup stage - is that the European Union stepped in and passed something called the Digital Services Act. And the Digital Services Act has provisions like researcher access to data, that say, you have to give us access to data; you can't anymore just give us marketing materials when we ask you about real problems. You also have to engage with us: you have to proactively disclose what risks you know about, and if you don't volunteer something, we can ask you questions about it. And then for those risks, you have to come up with a plan for how you're going to reduce them, and give us enough that we can monitor your progress. That set of incentives is really different from anything we have seen since 2004, when Facebook was born - or, let's be honest, before Facebook there was Friendster; we've had a series of social networks, and all of them have been almost equally guilty. So I'm super optimistic. The world changes, but it changes because we change it. It just takes time.

Joey Odom  23:48  

And by the way, I love that perspective. It's almost like, when you came out, people thought, okay, finally, something's going to happen, something's going to change. And I love that perspective that, yeah, these things take a while. Do you think legislation in the US is particularly slow, or maybe particularly influenced by lobbyists, or something like that? Do you think enough action is being taken? And if not, why isn't more action being taken from a legislative perspective?

Frances Haugen  24:18  

Well, I want to remind you, we passed the TikTok divestment bill.

Joey Odom  24:23  

Listen, you know what, good point. Thank you for that. That's a pretty darn big deal.

Frances Haugen  24:28  

Yeah. And I want to just re-emphasize this here, because every time someone says TikTok ban, I want to be like, no, don't buy into their framing. So for context for our listeners, this is not the first time we've done this. Grindr was also owned by Chinese owners - Grindr, for context for people who haven't heard of it before, think Tinder but for people who are gay - and they came in there and said, not everyone is out in the world. That's a giant thing you can hold over someone if they're not out about their sexuality. It's a national security issue to have that be controlled by a Chinese company. And for context for listeners, I've been saying since I came forward, I don't understand why TikTok is allowed to operate the way it does. Because TikTok is different from other forms of social media we're used to, in that it makes no promises about what you'll receive. It says, trust us, the algorithm will send you things that will make you happy, that will keep you staying there. But that also means that if they want to just show you what they want to show you, they can. And we've had whistleblowers come forward and say that they intentionally heat content - they intentionally give certain content more distribution - and that makes up one to two percent of people's feeds. And when we say numbers like those, remember, not all people get an even one to two percent of this content that TikTok chose; they get to choose who gets the content. So just imagine for a second: you have a country that is actively hostile to us on a regular basis. Do you really want them to get to choose when society gets what information? When I tell my story of what I worked on at Facebook - I got there and had this traumatic experience for maybe three months that was about how social media has caused ethnic violence in multiple places around the world, and I had never thought about that before, and I had been working on social networks at that point for five or six years. And I got taken off that project within three months, because they caught a foreign interference operation. They traced the IP addresses, they traced the accounts, they knew it was coming from Russia - they caught one of these foreign influence operations that was targeting US police officers with misinformation. So think about that for a second. Imagine you're TikTok: I know what content you're interested in, I can figure out if you're a firefighter, I can figure out if you're in the military, I can figure out if you're a police officer, and I can choose to send you content. How is that something we accept? So I'm very proud that the US Congress actually got together and passed that. But on a larger scale, holistically, I totally get the frustration. We've had, for example, the Kids Online Safety Act, which has been in limbo for a couple of years; I think we're getting really close to passing that. One of the key changes that happened between last year and this year was that it used to be the bill was enforced by individual states' attorneys general, and people were concerned that it would be weaponized by some states, so it has now been brought back to the level of the FTC. And so we keep getting close.
These are processes that are complicated. We have a little bit less of a muscle these days for being able to pass bills. When I was in Australia last September and October, as part of answering questions for the government, meeting with universities, meeting with various groups, it was really embarrassing. People would ask me, so why are you here? And I'd be like, well, we haven't had a Speaker of the House for 11 days... for 16 days... It was a joke I got to trot out every single day. And remember that the US goes through these phases; it's happened before, kind of scarily, during high levels of instability, as a precursor to instability, things like the run-up to the Civil War. We are in a place right now where it's a little bit harder for us to think in ambitious ways. So I'm not frustrated about the current progress, but I do spend my time focusing on places like Canada and Australia more than the US for that reason.

Joey Odom  28:55  

So, you know, we say all the time: we can't wait for Congress to do something, we can't wait for the Surgeon General to do something, we can't wait for the school board to do something. We as individuals have to begin to do something for our homes. So many people who listen to our podcast are families - communities and families who need to do something. So I want to get to what we can be doing. But before we get to that - and you've alluded to some of it - I'd love to hear, for people, and I don't want to be fear-mongering in any way, but I do think it's important that we know the structure of the algorithm and the structure of these things. They are built for things that can lead to harm, or at least they're not built to safeguard our kids against harm. So from all the stuff you uncovered then, and what you've learned now, what are some things that parents should just be aware of when it comes to, hey, here's really how it works - whether it's Instagram, Facebook, Snapchat, TikTok, whatever it is - and here's why this should be important to us? I guess that's the question.

Frances Haugen  29:56  

So, one of the big projects I'm working on right now is something called Standard of Care. And it's about trying to get out in the open, in a collaborative way, what information we do have around the harms of these platforms to kids. We're trying to scale up what Jonathan Haidt did: he did an open literature review around mental health and kids, which ended up turning into the book The Anxious Generation. But we're scaling that up, and currently we're looking at 32 separate problems for kids. Those range from things like bullying - people hear about online bullying all the time. One little piece of context that I think is really important for people who are above the age of, say, 30, because above 30 you didn't really experience this: it used to be that if you were really a target of bullying - a small fraction of kids really have a rough time in high school and junior high - eight hours of your day were really, really hard, but you got to go home. Most kids come from at least okay families, and that meant you got 16 hours to reset. You got to go and sleep and come back the next day and tackle it again. Today, because kids carry these computers around in their pockets, when they come home from school the bullying doesn't stop. It goes all the way up until they fall asleep, and the first thing they see when they wake up in the morning is someone being mean to them. That kind of constant, never-ending onslaught is a very different experience psychologically. And even if your kid isn't the target of bullying from other kids, just the fact that we have now largely moved from social media that is directed by people to social media that is driven by algorithms means that people get pushed along by the algorithm. The algorithm is, as Facebook describes it, agnostic: it doesn't understand the significance of content. And so it can end up taking you from a neutral topic to something more severe. I'll give you an example. I did an interview with a journalist who had just had a new baby - this is maybe two years ago. Happy, healthy baby boy. He's a modern father, so he made an Instagram account for this baby. It had maybe four or five baby friends; all the pictures posted by any of these parents were healthy, happy babies; all the things he had ever commented on or liked were healthy, happy babies. And yet about 10% of his feed was suffering children - kids who've had horrible, mangling accidents, who have tubes coming out of them in their hospital beds, who have deformities that look painful. And he was like, Frances, how did we get from healthy, happy babies to this? The reality is these algorithms don't just have your intentional signals, they have your subconscious signals. These are things like, did you stop and dwell. They're literally monitoring you at that level; they monitor how you hold your phone. And that means a new father can't just scroll past a picture of a suffering kid - he's going to pause, even if he doesn't do it consciously. And these algorithms picked up that he is interested in babies, but they didn't understand the idea that a picture of a suffering child is particularly harmful to a new father. It just gets a reaction.
And so let's apply this to a teenage kid, someone who's not as self-aware as that dad was - he was like, I never engaged with this, why am I seeing this? If you're a teenager and you're feeling a little bit uncomfortable with how your body's changing, and you search for maybe healthy recipes, you can very quickly get pushed towards content that glorifies eating disorders or glorifies self-harm. So it's one of these things where parents need to be aware that these applications are designed very reductively. It's like the example I gave earlier: Facebook went and talked to kids, they knew they were making the kids anxious, but if you have a system that just optimizes for how much time people are on the platform, and there's not a human in the loop, you end up with pretty disastrous consequences. So I think that's the one-liner: do you want your kid spending time with kids, or do you want your kid spending time with algorithms that are pretending to show them representative lives of their friends?

Joey Odom  34:24  

I think I know the answer from what you just said. In fact, we just had Erin Loechner on our podcast; she's just releasing a book called The Opt-Out Family. And it's basically saying, hey, let's just opt out - for all the reasons you named - and in my mind pop up all these impracticalities. But that is the answer, right? Hey, let's have them spend more time with kids. So you fed me the answer and I fed it back to you, but I would like to know, as a family, as an adult, as a parent listening right now: what do you do? What is the answer? And you can be as reductive as you want here, and I think that's okay, because I think we need to start listening to some truth - even for myself, all of us need to just begin and stop rationalizing. I'm curious: what are those things that families should be doing right now?

Frances Haugen  35:08  

So there's a movement that I'm a big fan of called Wait Until 8th. And Wait Until 8th says, hey, we need to address the fact that kids exist in community, and kids put pressure on each other to go online. If you talk to individual kids and you say, hey, do you want to spend time in person with your friends, they say yes. If you ask them, do your friends want to spend time in person with you, they say no. It's one of those things where kids are losing out: if they feel like other people don't want to change with them, they're not going to change, because the network effects are so strong. So I think there's this really interesting opportunity, and this is one of the areas where I really want to do something in the next couple of years - I want to get Standard of Care out the door, but I want us to think holistically around how you do an intervention at a school level. How do you come in and say, hey, we live together, our decisions impact each other? You can be a family and say, we're opting out. You can do that, but that's a huge burden. It's different to come in and say, how do you have a conversation with your kids' friends' parents and say, hey, there might be a happier middle - what if we intervened together? I think it's also important to acknowledge that video games matter in this equation too. It's not just social media, it's video games as well, because both of these things take kids away from their in-person lives. Video games are designed to be very pleasurable: they're a world that you can control, they're just challenging enough that you don't get bored. But they're not messy like having to take a risk and ask someone to hang out with you. They're not risky like asking someone out on a date, or like throwing a party and not knowing if people are going to come. And these are the things that kids are missing out on. The more minutes you spend online, the fewer minutes you have available to spend in person. So I think thinking holistically around how we help school communities make space for each other in person is the key step forward.

Joey Odom  37:15  

You're so right. Jonathan Haidt describes this collective action problem that we're in, where there's an individual detriment to one person opting out of the system, even though everybody wants out. To your point, it requires everybody to do it. And again, what an amazing opportunity we have - it's a very cool opportunity, and for the people who are thinking ahead and thinking forward, there are going to be some real benefits to that. And we love Wait Until 8th, Brooke Shannon and their organization; we love that group so much. You said something that I really like, and I want to press against it for just a second. You say that this is solvable. And we've only touched the tip of the iceberg on what this is doing. I read all the Facebook Files stories when they came out, thanks to you, in the Wall Street Journal - I had my popcorn reading it, and it was so devastating, all the effects and all the things they knew about, and know about, that are happening. We haven't even really gotten into the meat of how this is affecting teenage girls - I have a 14-year-old daughter - and all the detriment there. But even with how big it all is, you still say this is solvable. In your heart, do you actually believe this is solvable? It really feels so far gone. And again, I'm playing a little devil's advocate, because clearly we believe it's solvable as well. But why do you think it's solvable? What about it leads you to believe this is solvable, for as huge an issue as it is?

Frances Haugen  38:49  

The book I want to write next - so, my memoir came out last year; for those out there, it's a policy book for people who don't like policy books, so there are lots of fun other stories. The book I want to write next basically takes the attorney general filings, which are amazing. They're freely available to anyone right now - I have a highlighted copy that you can link to for your listeners if you want, so you can see the things they redacted and didn't want you to read. I want to go through and just take the evidence in all those court filings and say, hey, here's a digestible version: these are the things that Facebook knew, and what they chose not to do. And if we were going to build pro-social social media, how would you address these problems? Because it gives you a chance to say, let's stop arguing over whether, hypothetically, it's making kids anxious, or whether it's making teenage girls more insecure and feel worse about their bodies. Facebook went in there and measured these things. They said, on average - I think it's like 10 or 20% of the average girl's feed, so it's going to be much worse for some girls - the average girl's feed induces negative social comparison. You might say, how can they even measure that? Well, they ask people. They say, hey, does this piece of content make you feel worse about yourself? Then they can go in and develop AIs that help them measure and assess the overall content. So I want to take the AG filings and say, let's stop hypothesizing about whether or not these problems are real. Just assume they're real; just take Facebook's word for it. How would you design the product in a way that respects the autonomy and dignity of the user? Because I think right now we often get locked into thinking about these problems as the way they are today, or the way China solves them - it is the only country out there that is actually aggressively saying, our children matter, we have to protect them. For the equivalent of TikTok in China, you can only use it for 40 minutes a day if you're under the age of 18. Same for video games - it cuts off at 10pm. That's a very top-down, controlling, authoritarian way of intervening. What would a way look like that said, hey, I respect your autonomy and your dignity? Right now you don't actually have much of either in our current system. So I'll give an example, in the case of sleep. Imagine, say, you're 16 years old and you stay up till 2am. The next day, when you're kind of hungover in class, maybe 10 or 11 o'clock in the morning, a little thing pops up and says, hey, I noticed you were up quite late last night - when do you want to go to bed tonight? Everyone has willpower at 11 in the morning, right? At 10pm, maybe less willpower; at midnight, much less willpower. So imagine, for two hours before that bedtime you picked - we're not picking it for you, you picked it - Instagram gets a little bit slower and a little bit slower. It's almost like you have to push it harder; the animations are just slower, you get less movement for each little flick.
Remember, right now that amount of movement is finely optimized to maximize how much you consume. But imagine you've slowed it down just a little bit. We've known for 20 years that small amounts of latency cause people to use these products radically less. Around your bedtime, you would just get tired and go to bed. That's something that respects your choices, and the fact that you're a vulnerable human, but it works with you, not against you. Or say you have a girl or a boy who's looking at more and more content that might make them depressed. To give parents context, there is a lot of content on Instagram that glorifies self-harm - that is just something you need to know, this is real. If you don't speak English, it is substantially worse. In Norway, one of the big things that happened was there was a network of over 500 Instagram accounts glorifying self-harm, and something like 35 girls died from that cluster. If you don't speak English, it's much, much worse. Remember I said before, they can make little AI systems that ask, is this content likely to have you say it's making me depressed? They can go in there and say, hey, I noticed you're looking at more content that people say makes them feel depressed than you were a month ago. Here's a little graph. Do you want to keep looking at it? Would you like a little bit less? Maybe you only want it during the day, maybe you don't want it after 6pm. We could be designing software that puts control back in the hands of people. Right now, those conversations, I feel, are really blunt - people will say things like, turn off the algorithm, and that's it. We don't say, what if we were designing with people, instead of designing as if people don't exist?

Joey Odom  43:58  

So, one, it's shocking that China's the one taking this seriously; I think we should all pay attention to that. Man, good for them. But again, like you said, they have a system where they're able to do that, they're able to be authoritarian. We live in a free market, where people really do value their own personal autonomy and really hate being told what to do - we won't get into the goods or bads of that. But in a free market, this product only happens if it's in demand. So how can we, the people listening today, make sure that it's in demand? What can we be doing now to help promote the sort of product you just described? I think anybody listening to what you just said thinks, okay, that sounds amazing, but what can we do to make sure there's a visible public interest in that sort of thing? Does that make sense?

Frances Haugen  44:50  

Totally. You know, the products that we use right now are just the consequence of the incentives we have in place. When we have a system where the only thing that gets rewarded is profit - I kept you online longer, you clicked on more ads, you got more minutes, you saw more ads - this is what we get. Imagine a world where Facebook had to report: this is how many kids were online at midnight, at 1, 2, 3am - and it's one of ten metrics they have to report every week or every month. Very rapidly, they would figure out how to make that number smaller. Because right now, about 30% of teenagers are on screens till midnight or later, which means that something like 10% are on till two, and that's most school nights. So it's one of these things where we can demand more transparency. If you're an advertiser, you can say, hey, I'm joining with a bunch of other advertisers: if you don't release XYZ data by this date... We're not asking for the crown jewels, we're just asking for some. And one of the projects that my nonprofit, Beyond the Screen, is working on is, once we have a taxonomy of harm, we can go in there and start saying, what's the minimum amount of data, what are the minimum viable queries, we need in order to actually hold these platforms accountable?

Joey Odom  46:23  

There's a lot more here - we could go another hour, Frances. Will you come back? I promise you, believe me, I will have you. So how can we help you? For the people listening - one, I do want everybody to go read your memoir, The Power of One; it's fantastic, everybody needs to buy it. Two, be aware of Beyond the Screen, your nonprofit. How can we help you? You're such a powerful voice - what can we be doing for you to promote what you're doing?

Frances Haugen  46:52  

I'm hoping to start doing crowdfunding this year, I'm hoping to start doing YouTube. I would say for now, on my website, franceshaugen.com, you can go and sign up for our mailing list - put your name on there, and I can reach out when we start doing those things. I would say call your congressman, call your senator. Say, we want change; we want things like the Kids Online Safety Act. One of the things I find really interesting right now is that if you Google something like that, you get a lot of scaremongering from people like the Electronic Frontier Foundation. We don't hear as much from the advocates who are pushing for kids. But at this point there are over 60 co-sponsors of that bill. So even just coming in there and saying, hey, we haven't passed a law about kids online since '97 - like, 1997.

Joey Odom  47:43  

Now, that's unreal.

Frances Haugen  47:45  

And so, you know, we're working with laws that came about in an era when we had bulletin boards and dial-up modems. So even just calling and saying, I want something more, I deserve more, my kids deserve more - I think that's a great step forward.

Joey Odom  48:01  

Frances, thank you for your voice. Thank you for your time. We're grateful for you. We'll link everything in here. Please do the one thing - that's your one job, and I'll put this in the intro and the outro: everybody go sign up at franceshaugen.com for her newsletter. Frances, we love you. Please keep us informed, please keep us honest, please keep our kids safe. Thank you for everything you're doing. Have a great day.