How Social Media And AI Impact Our Mental Health: Reclaiming Our Minds And Hearts And Healing A Divided World - Transcript

Introduction: Coming up on this episode of The Doctor's Farmacy,

Tobias Rose-Stockwell: We need to renegotiate our time with these tools. We need to renegotiate our attention with these tools, and there are increasingly better ways of doing that.

Dr. Mark Hyman: Welcome to Doctor's Farmacy. I'm Dr. Mark Hyman. That's Farmacy with an F, a place for conversations that matter. And if you've been worried about the impact of social media and technology on our minds, on our mental health, on our wellbeing, you're going to love this conversation with a good friend of mine, Tobias Rose-Stockwell, who wrote a book called The Outrage Machine: How Tech Amplifies Discontent, Disrupts Democracy, and What We Can Do About It. Thank God for the "what we can do about it," because otherwise it's a very depressing story, and it would be a horrible podcast to listen to. So Tobias is a writer, a designer, and a media researcher. He's worked in media for a long time. His work has been featured in The Atlantic, Wired, NPR, the BBC, CNN, and lots more. His research has been cited in the adoption of key interventions to reduce toxicity and polarization within leading tech platforms. And I don't remember a time in my life when we had more polarization in our society. He has previously led humanitarian projects in Southeast Asia focused on civil war reconstruction efforts, and for that work he was honored with an award by the 14th Dalai Lama. He lives in New York with his cat, Waffles, and he's an awesome dude. Welcome, Tobias.

Tobias Rose-Stockwell: Thanks so much for having me, Mark. Really happy to be here.

Dr. Mark Hyman: Okay. Well, this is a topic we've covered a little bit on the podcast, but I want to dive deep into it, because I think we are not really aware that we're living in the Matrix. We may not be plugged into a pod like Keanu Reeves in the Matrix movie, but it's not much different: our worlds are being curated, controlled, and manipulated by forces that are often invisible, and that change our behavior, change our beliefs, change our actions, and ultimately lead to serious problems, including mental health issues, anxiety, and increased rates of suicide. It's a big problem. Many of you might've seen the movie The Social Dilemma, which was a very sobering movie about the impact of social media. There's a new online video, basically a lecture by Tristan Harris and his colleague Aza Raskin, called The AI Dilemma. And we're going to go into the issues around both AI and social media, because we're at this next inflection point where, if we don't deal with this, it's going to get out of control. And I think there's increasing awareness, even from the leaders in the space, that we need to start being more judicious about how this gets rolled out. With social media, it was just kind of like, shoot the gun and then see what happens.

Tobias Rose-Stockwell: Shoot first, ask questions later. Fire, ready, aim.

Dr. Mark Hyman: Exactly. Fire, ready, aim. So the rise of social media has really revolutionized the way we connect, share information, and interact with each other. It's wonderful. I use it for my business. You get to watch me on Instagram, it's awesome. But it also has significant effects on our health and our wellbeing. So tell us, from your perspective, as you began to research this in your book The Outrage Machine, how has the internet broken our brains, and what do we got to do to fix it? I mean, give us a high level.

Tobias Rose-Stockwell: This is the real simple question, then.

Dr. Mark Hyman: Yeah, let's get a high level there, and then I want to get into why you got into this, what you're doing, and more.

Tobias Rose-Stockwell: Totally. Yeah. So social media specifically has given us these very strange new superpowers. And the primary superpower it has given us, that we didn't really recognize we were getting when it happened, was the superpower of hyper-virality: the ability to instantly say a thing, feel a thing, post a thing, and have everyone instantly feel it themselves. That hasn't really happened before in our species' history. We haven't had this kind of magical power to just beam our thoughts anywhere on earth, and we've been struggling with it ever since this particular set of features was given to us on social media. We don't really think about it. We like the power, but it actually is kind of like Spider-Man, right? Great power, great responsibility. You end up doing a great deal of damage when you don't understand how to use these magical new abilities we've been given. So I think a really big, important piece of it is just recognizing that this is something wholly new in our species' history, and also that it follows an interesting historical path of disruption, renegotiation, and learning to manage the new powers we've been given in the past. So I'm hopeful that we can figure it out. But the chaos of the last decade specifically has been a result of this new superpower of virality.

Dr. Mark Hyman: Yeah, it's fun. If I'm watching a gorilla video and I want to share that with friends, awesome. But it gets very dangerous with misinformation and, increasingly, things that are fabricated. I mean, that's the scary part about artificial intelligence. You have somebody looking like they look, sounding like they sound, and being totally fabricated. So you could have Trump or Biden completely AI-manufactured, saying stuff that they didn't even say. And so we don't know what's true and what's not true. I mean, I'm working on a project called Function with my partner Z, and one day he sent me a video of this doctor talking about these lab tests. He says, what do you think of using this for our platform? Function Health is a lab testing platform. And I'm like, wow, that's great. Who is this doctor? And they go, well, she's an AI-generated doctor, not even a real person.

Tobias Rose-Stockwell: Oh boy.

Dr. Mark Hyman: I'm like, whoa, this is scary. But the virality piece, how is it actually shaping our behavior in a negative way? We know the positive benefits, but how is it actually hurting us? This is where I kind of worry as a doctor: what's happening to the mental health of children, and increasingly of adults, who are hearing one perspective? I mean, you click on one conspiracy theory in your social media and you're going to get every single one. I was in Hawaii once, and I met these hippies who lived up in the country, and they're like, well, there's no such thing as germs, and Bill Gates planted a chip in our necks, and the earth is flat. And I'm like, what?

Tobias Rose-Stockwell: Yeah, totally.

Dr. Mark Hyman: I'm like, how do you get all these things in one?

Tobias Rose-Stockwell: It's funny, I think when we think about conspiracies and stuff, you probably remember back in the day, conspiracies used to be kind of fun. They used to be kind of funny.

Dr. Mark Hyman: Who killed JFK?

Tobias Rose-Stockwell: Who killed JFK, the alien autopsy, that kind of thing. They used to be actually kind of fun entertainment for us. But there was a...

Dr. Mark Hyman: The National Enquirer, right?

Tobias Rose-Stockwell: Exactly. Exactly. Yeah. Hillary Clinton and the bat person had a baby, that kind of thing. And there is something kind of inherently joyous and irreverent about engaging with conspiracies, right? You're like, but what if? And it's like a fun sharing activity with our friends. It feels really positive a lot of the time, especially when we're feeling a little disempowered by the man, and society, and our jobs. It's like, yeah, but what if, maybe? But we don't really have a mental model for understanding reality without what is, basically, historically, the media. We all use proxies to understand the world writ large, and we need to rely on other people to tell us what is true. And that involves a whole system of verification, of checks and balances, of citation, that comes from a hundred-plus years of institutional knowledge that has gone into the world of academia and journalism.

Tobias Rose-Stockwell: And it is not without faults, don't get me wrong. There are problems with that system and ways it can be improved, but it is much, much better than just the instant viral shares and the instant confirmation bias that comes from clicking on something and getting more of it. You're like, oh, see, I told you so, it was that person that did that thing. We can find any thread that confirms our inherent beliefs, and confirmation bias is a real fundamental problem when it comes to understanding the world writ large. So we require these institutions, we require better knowledge systems, to actually understand the world. And social media has exploded our traditional knowledge management systems, which historically used to be journalism. Journalism has really deeply suffered in the age of social media, and I think you can see that across the board with the vast new conspiracy theories that are becoming mainstream.

Dr. Mark Hyman: Yeah, I mean, you had the news, it was three channels. It was like Walter Cronkite and Brinkley, and I forget the other guy, but they all basically said the same thing, reporting on stuff that was factually checked and verified. And now investigative journalism is... I mean, I was reading an article yesterday about a scientific report from the WHO about aspartame and its potential to cause cancer.

Dr. Mark Hyman: And I was like, wow, this journalist really talked to a lot of people, but they didn't actually do the homework of looking at the studies and dissecting them. As a scientific reporter, they should be able to look at the data and talk about the validity of the data, the challenges of the data, the nuances of interpreting it, and help people make an educated decision, so they understand the context. Not just say, oh, well, the American Beverage Association guy said it's fine, but the WHO showed it's not. And the FDA supports the American Beverage Association, which actually used to be called the Soda Pop Association. And it's like, wait a minute, why don't you actually tell us what the science showed, so I can make an informed decision, rather than just telling me what all these experts said, who all have vested interests and conflicts of interest? Right?

Tobias Rose-Stockwell: Totally. And that's hard. I want to acknowledge that it is actually very difficult to figure out what is true. Truth has been something that our species has struggled with forever, finding the actual truth of the matter. There are very few points of reality about the wider world that we understand on our own, that we learn about on our own. We actually rely upon these proxies, these trust proxies, in our extended networks of humans whose job it is, ostensibly, to figure out what is real and what is not real. We only see our day-to-day lives, and we have to hear these stories from other people to really figure out what is true. And so, historically, journalists would do a decent job of that. I think one of the biggest difficulties now is recognizing that the media is very imperfect, and we're exposed to so many anecdotes about journalistic failures that it becomes hard to trust the entire enterprise.

Tobias Rose-Stockwell: And we all have at least one of these in our pocket, where we're like, you can't trust that newspaper outlet, you can't trust that particular journalist, or, you know, I don't really believe them. And that's hard, right? That's hard. But I think it's important to put this on a spectrum of sense-making, in that they probably know better than your neighbor or your Uncle Joe about this particular topic. They don't have the best possible information, they're not in the academy studying this stuff every day, but if you put it on a spectrum, they're probably at a higher point of accuracy than the average person you're going to see on Facebook or the average person you follow on Twitter. So that's been one of the biggest things: reckoning with how truth is hard. Truth is just a very difficult thing. But I want to be a little bit hopeful here: we actually know how to figure it out. We've done this a few times in the past, when we've gone through periods of real chaos and misinformation, and there are tools we have to help sift through what is real and what is not real.

Dr. Mark Hyman: So Tobias, how did you get into all this work? I mean, you've worked in Cambodia, you helped deal with some of the aftermath of the civil war and the traumas there, you won an award for it. How did you get into this whole work around technology and its influence on our health and mental health and wellbeing?

Tobias Rose-Stockwell: Yeah, yeah. So I had an experience, I have kind of a weird backstory here, so I just want to acknowledge that this is kind of a curious entry into the world of technology. When I was 23, I was traveling through Asia as a wee backpacker after college, and I met this monk, this Cambodian monk, who basically invited me out to the middle of the countryside where he lived with his extended family. And I was like, yeah, sure, why not? I'll go out with you. That sounds great. I'd been doing some volunteer work here and there, and I was very curious about the culture. And he brought me out there, brought me to this squat little pagoda in this dusty square, and sat me down on a bamboo mat. And I didn't meet his family. I met hundreds of villagers, a common council of village elders, and they got up one by one and said, thank you for coming and for agreeing to help us rebuild our reservoir. Thank you for helping us with this project. We are so grateful that you've agreed to do this. I'm like, excuse me, sorry, I agreed to do what?

Dr. Mark Hyman: And you're thinking, maybe they've got the wrong person.

Tobias Rose-Stockwell: This was not discussed beforehand. But basically this monk ran a local NGO with a bunch of other monks in this community, in Siir province, this tiny little province in Cambodia, and he was looking to rebuild this massive irrigation system that had been destroyed during the civil war there. And the civil war was really bad in Cambodia. If you've ever heard of The Killing Fields, that movie was entirely based on real-life events in Cambodia. Really a terrible, horrific time. The country saw the highest per capita loss of life in modern times. It was like...

Dr. Mark Hyman: 3 million people in a very small population, right?

Tobias Rose-Stockwell: Yeah, it was over a million people in a very small population. And it was a weird sort of civil war. It wasn't one tribal group against another tribal group, which is usually what happens in civil wars. It was actually kind of an autogenocide, in which this hyperpartisan political extremism just swept through the whole country and turned the nation against itself. It was this really scary, weird thing that happened in the seventies. Anyway, basically I was motivated to help these monks. I said, look, you have the wrong guy, but I'll at least be an advocate on your behalf. And so I wrote an email to some friends and family back home. It was an impassioned email, it had a lot of emotional resonance in it, talking about this experience and meeting these monks. I had an email list of friends and family, but that email specifically went into a listserv that a friend of mine had developed that you would recognize immediately today. This was 2003, before Facebook, before Twitter, before anything.

Tobias Rose-Stockwell: But you'd recognize it as social media. It was a platform to connect friends of friends. It was for my extended community in California. We were all interested in a specific kind of music, and we would hang out there. But if you looked at it today, you'd be like, oh, this is a social media platform. Anyway, my friend was a programmer very ahead of the curve, and along with another person he built this platform; that person went on to be one of the first engineers at Twitter, actually. But the email, rather than just getting an, oh, cool, that's a weird story, actually went hyper-viral in my existing community, and I got this huge outpouring of interest in this project, friends of friends of friends that would pass it on. And all of a sudden, I, this poor backpacker traveling through Asia, had random strangers emailing me, being like, how can I help?

Tobias Rose-Stockwell: What can I do to support this project? This is amazing. And suddenly I had support for this project. I was pulled by this new viral superpower. I touched it a little bit before other people did, and was able to suddenly find this huge reservoir of interest in rebuilding a reservoir in Cambodia. I thought it would take two months and $15,000 to rebuild this thing. I ended up living there for almost seven years, rebuilding an irrigation system, and raised a quarter million dollars for this community to rebuild it. It was a huge project. We found landmines. It was a whole ordeal. But the entirety of its inception came from touching this new superpower of virality a little bit earlier than other people. So I've been tangling with this concept of the influence of virality in our lives for a little bit longer than most people. I went back to Silicon Valley, and I worked there with a lot of designers and activists and developers who were working to make social media a really good thing. Social media had this great promise. You remember the Arab Spring? You remember this era of 2011, 2012? Everyone's like, social media is great for democracy. I mean, think about it.

Dr. Mark Hyman: Good for the world. The Arab Spring allowed people to coordinate and communicate to help liberate themselves from oppressive governments. Absolutely. But in the same way, it was used on January 6th to coordinate and help people attack the Capitol and create insurrection in America. So it can be used for good or bad. It's like a gun: you can use it to feed yourself by hunting deer, or you can kill somebody with it, right?

Tobias Rose-Stockwell: Right. Exactly. Exactly. And I'm a little skeptical of some of the more pernicious narratives about social media. I know it was not designed for this. Mark Zuckerberg is not a Machiavellian genius trying to ruin democracies around the world. I think he really does care about solving these problems. He makes a good scapegoat, and so do a lot of the owners of these companies. But there are these inherent harms, what I call a dark valley of hidden harms, that come when we start using tools, when there is this huge mass adoption and we just don't know how they're going to be used for ill. And understanding that in the context of AI, which you mentioned, is appropriate: it's a similar thing. I think we're going to start to see huge benefits, and then we're also going to see this very distinct set of harms that are going to be hidden from us until they're right in front of us. And I think we need to learn to manage that and approach it with a real sense of caution, because these tools are so, so powerful, and getting more powerful by the day.

Dr. Mark Hyman: Yeah, it's almost like Charles Dickens, right? It was the best of times, it was the worst of times. These tools can be used for incredible good, but also incredible harm. We saw that with the movie The Great Hack. We saw that with The Social Dilemma, and now we're going to see it with increasing awareness around AI. And when you look at the political process, when you look at the polarization of society... I was just at a Dead show at the Gorge in Washington State, and it was so beautiful. There were 30,000 people there, and I just looked around, and there were people of all sizes and shapes and colors, well, many tie-dyed colors for sure. And everybody was sort of in this culture, and the Dead culture is sort of like this, of collaboration, of friendship, of helping each other, of celebrating together, dancing together, and enjoying the music together.

Dr. Mark Hyman: I imagine if you had political conversations with all these people, they'd all be fighting and shooting each other, tearing each other's hair off. But there was this moment of, wow, we're all people first. We're all Americans second, or wherever we're from, and then we're whatever else we are: Republican, Democrat, communist, anarchist, libertarian, whatever. None of that really matters when it comes to our humanity. And it seems like we've lost this common thread of our humanity, the ability to look each other in the eye and understand that maybe we have differences of opinion or different perspectives, and it doesn't mean we have to hate each other. And it seems like the algorithms are so driven by things that are really fostering the worst in humanity. They're designed to keep us addicted, to keep us engaged, and those algorithms are based on a psychographic profile that actually engages our limbic system, our ancient reptilian brain, and keeps us in a state of fight or flight and stress response.

Dr. Mark Hyman: But it also has another possibility. It can really help, in a very different way, to elevate people. The problem is that we haven't built a system of social media that actually encourages this. In other words, the algorithms are set up to increase eyeballs. But what if the algorithms were set up to elevate our consciousness, and the systems of engagement were different? What if you had to pay to use Instagram or Facebook or Twitter, and their revenue came not from advertising, which encourages the stimulation of our dopamine receptors and our worst instincts, but from something that did the opposite? There could be different business models that allowed for this. Now, it's harder to get 3 billion users if you have to pay for it, but I think we might want to rethink how we're doing this. And I'd love to understand, you've been studying mental health and social media for years.

Dr. Mark Hyman: What do we know about the pitfalls and the harms, as well as how it impacts our overall levels of stress and mental health? And then, how can we understand how that technology is used, in a little more granular way? And then eventually, I want you to take us through how we can maybe reimagine this, the promise that's at the end of your book's subtitle, How Tech Amplifies Discontent, Disrupts Democracy, and What We Can Do About It. I'm very interested in the what-we-can-do-about-it part, but I want to sort of outline the problem first, and then I want to get into what the hell we are going to do, because we're screwed.

Tobias Rose-Stockwell: Yeah, absolutely. Absolutely. I think it's important to think about the harms of social media not in terms of just one specific thing. These are very complex systems. Humans are very complex; the human body, the human mind, is very complex. So if we approach it as if there's just one silver bullet, we're not going to get anywhere. If it's just fixing advertising, I don't think that's actually going to solve the problem. I think it's more important to think about it like you would a human body and an illness: you're not going to solve your athlete's foot with the same thing that will solve your broken arm. There are some good analogies from the world of medicine for thinking about how social media influences us and how we can intervene and make it better. So let's just talk about the problem for a second, which I think is important to be clear about.

Tobias Rose-Stockwell: One of the primary problems with social media, in terms of mental health and anxiety, is that it's becoming our primary source of news and information. Even if you still go to the New York Times, or you go to Fox News, or wherever you get your news from, social media is a certain type of filter on the news, and more than half of American adults get news from social media today. And social media posts tend to be negatively valent when it comes to news. We actually tend to respond more immediately to news that is negatively emotionally valent. Stuff that is outrageous, missing context, anger-inducing, disgusting: we tend to respond to that most quickly and immediately. And that's a really good signal for algorithms to track. If you build a basic engagement algorithm that's trying to track what people are interested in, this actually goes back to traditional news: if it bleeds, it leads. Algorithms have figured that out. If it gets you stuck on the item, you are going to keep responding to it, right?
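To make that mechanic concrete, here is a toy sketch of the kind of engagement ranking being described. The signals and weights are hypothetical illustrations, not any platform's actual code; real systems learn these weights, but the incentive structure is the same:

```python
# Toy illustration of a basic engagement ranker. Nothing here is any
# platform's real code; the weights are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int         # fast, reflexive reactions
    comments: int       # angry threads count the same as thoughtful ones
    shares: int         # outrage travels, so shares carry the most weight
    seconds_viewed: float

def engagement_score(p: Post) -> float:
    # Whatever provokes a quick reaction rises to the top; the score has
    # no notion of accuracy, context, or emotional valence.
    return 1.0 * p.clicks + 2.0 * p.comments + 3.0 * p.shares + 0.1 * p.seconds_viewed

def rank_feed(posts: list[Post]) -> list[Post]:
    # Pure engagement ordering, with no penalty for outrage.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Calm, well-sourced explainer", clicks=40, comments=5, shares=3, seconds_viewed=900),
    Post("Outrageous claim, missing context", clicks=200, comments=80, shares=120, seconds_viewed=300),
])
print([p.text for p in feed])  # the outrageous post wins
```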

Dr. Mark Hyman: That's right. Is that why I get so many gorilla videos?

Tobias Rose-Stockwell: Maybe perhaps.

Dr. Mark Hyman: I love it, actually.

Tobias Rose-Stockwell: Totally. And I want to calibrate: we've been with these tools long enough now, I think, to start to see new versions of slightly healthier algorithmic curation, ones that try to keep you happy and not keep you depressed and sad and doomscrolling. But we have a word for it, doomscrolling. It's in the consensus. Everyone knows what that is. Everyone has done it at this point in time. It's a strange new pathology that has no clinical definition yet, but it's something that we absolutely feel. And I think that when we are exposed to too many of these negative stories, too many of these negatively valenced posts, it creates a sense of learned helplessness, which is when we experience a stressful situation consistently and repeatedly, and we basically start to believe that we can't control the situation, that we can't respond, that we can't do anything to solve the world's problems. And that's really problematic. That's a problem in itself. We need to feel like we can actually tackle the problems that we're facing. And if social media has given us this huge new body of available problems and issues, and we are feeling helpless to solve those issues, then that's a recipe for extreme depression, a recipe for disaster in terms of our mental health. And that's really important.

Dr. Mark Hyman: The genie's out of the bottle when we think about our identities, our beliefs. I can't think of any better analogy than the Matrix. I feel like we're all plugged into the Matrix, and we need to fricking unplug so we can actually have a sense of what's true and real. We don't even know where to go to find out what the truth is anymore. It's a little bit disorienting for people. It's disorienting for me, and I feel like I'm fairly well educated, fairly well read.

Dr. Mark Hyman: I pay attention. I travel the world, I meet all sorts of people. I listen to different perspectives, and it's like what actually is true and what is actually a person to do as they're trying to navigate their life and figure out how to make sense of the world. And in a way, social media has helped us make nonsense of the world. And how do we start with sensemaking again? How do we deal with these digital technologies that we have now? Is there a way to put the genie back in the bottle? And I want to eventually get to artificial intelligence where the genie is not quite fully out of the bottle yet, and what do we got to do?

Tobias Rose-Stockwell: Scary genie. It's got a little pinky out of the bottle right now. Yeah. So I think focusing on the issue of sense-making is really important. This book is actually a history book. Once you get through the first half of it, it goes straight back through every previous media disruption in history; I go back as far as the printing press, trying to understand basically what happens when you increase people's ability to see knowledge, share knowledge, and be emotionally excited by knowledge. And every single major media technology has had a tremendous influence on our species. So starting about halfway through, the book goes back to Martin Luther, the printing press, and what happened when we were introduced to it. And it turns out the printing press was arguably the most violent invention introduced to continental Europe up until that point in history. It caused huge schisms within the existing power hierarchies. It totally upset society, and it caused about a hundred years of civil wars.

Dr. Mark Hyman: The printing press?

Tobias Rose-Stockwell: The printing press, literally. Those old books. And you wouldn't, at the end of that period, have said, no, we don't want the printing press, we don't want these books, we want people to go back to the old ways. But there was this deep, deep, dark period in which people were deeply confused. In that era, tolerance, the idea of tolerance, was actually a sin. If someone was of a different political persuasion or religious persuasion, it was kind of your job to go up to them and confront them about it, or be violent with them about it. So think about how disruptive that was, to go from one way of being in the world to another.

Tobias Rose-Stockwell: And so we don't tend to think about information technology as being such a disruptive and violent thing, but it absolutely can be. And the reason is because it confuses us. It confuses us. It gives us access to huge new models of moral reasoning about the world. And it also exposes us to a tremendous number of possible outrages. Many of those outrages are real; many of them are not. And figuring out what is worthy of our attention is really part of the problem that we're facing right now. So, to lean towards some optimism and solutions here, coming back to the beginning of our conversation: I can't emphasize enough how problematic mis- and disinformation is. As Americans, I think we have a healthy skepticism of authorities. We have this kind of anti-authoritarian disposition: don't trust the government, don't trust the experts, we can figure it out on our own. That's a very American disposition.

Tobias Rose-Stockwell: But there is a real difference between authoritarian speech and mediated speech, which is speech and information that comes from media entities that are built to help parse truth from falsehood. They don't always get it right. They're not always going to give you the exact truthful item, but it's oftentimes going to be much better than your average person trying to figure it out by doing their own research online. And so we do need to find these proxies, these middle layers of proxies. And we're actually lucky, insofar as good information has a fingerprint. What I mean by that is that good information, accurate information, has a fingerprint. It tends to be well cited. It tends to go through a few layers of refutation and peer review, of people trying to figure out whether or not it's accurate. And we can look at that in how the information travels. If it's one person's idea that comes to you directly, it's less likely to be true than one person's idea that has gone through three or four cycles of other people calling bullshit on it and trying to actually figure out if it's true or not. We are much better at identifying the failures of other people's logic and the failures of other people's assertions than we are our own. And that's really the point of free speech in the first place: so that we can share, and we can criticize each other openly, and improve the available knowledge for everyone.
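One way to picture that fingerprint is as a crude scoring heuristic. This is a toy illustration only; every signal and weight below is a hypothetical stand-in, not a validated credibility model:

```python
# Toy sketch of the "fingerprint of good information" heuristic described
# above. All signals and weights are hypothetical illustrations.
def credibility_fingerprint(claim: dict) -> float:
    score = 0.0
    score += 2.0 * claim.get("citations", 0)            # well cited
    score += 3.0 * claim.get("review_rounds", 0)        # survived refutation / peer review
    score += 1.0 * claim.get("independent_sources", 0)  # corroborated by others
    if claim.get("single_direct_source", False):        # one person's idea, unreviewed
        score -= 2.0
    return score

rumor = {"citations": 0, "review_rounds": 0, "independent_sources": 0,
         "single_direct_source": True}
reported = {"citations": 8, "review_rounds": 3, "independent_sources": 4}
# The claim that has passed through cycles of checking scores far higher.
print(credibility_fingerprint(rumor), credibility_fingerprint(reported))
```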

Dr. Mark Hyman: I mean, it's increasingly disturbing to hear about censorship happening, the free range of opinion not being heard, and debate being shunned. And I think, wow, what kind of a society are we in? We're burning books. I just watched the Ken Burns documentary, The U.S. and the Holocaust, and I thought, gee, we have this sort of relatively new rise of division in society, but it always existed. It's always been a thread in America, right? The North and the South, the slave owners and not, the isolationists and the globalists in America.

Dr. Mark Hyman: And we're sort of, I think, always prone to this. But how do we find, as Abraham Lincoln talked about, our better angels? How do we get to our better angels who are going to inspire us, instead of taking us down the path of the worst aspects of humanity? There's been war and violence and rape and destruction for millennia, but I feel like human consciousness seems like it's slowly getting better. And yet these forces at play now in technology are seemingly taking us out of any age of enlightenment that we were in for any period of time. How do we find our way back from that?

Tobias Rose-Stockwell: Yeah. So I think it's really important to recognize, and this is why I come back to the misinformation point a lot, that the feeling right now is one of threats everywhere. We feel deeply threatened by the world, by our political enemies, our ideological enemies, the enemies of the identities and the people that we hold dear. That is something social media does very well: it serves us threats, more than traditional media would. Traditional media would focus on a single threat at a time. Social media has exposed us to basically infinite threats, always. If you're worried about something, you can find an anecdote that represents that deepest fear, and social media is very, very good at serving those up. The algorithms that prioritize certain content over other content and our own biases play together to make us see these threats more than we otherwise would.

Tobias Rose-Stockwell: And something strange happens when we are exposed to a lot of threats. There's basically this kind of tribal switch that happens in our brains, in which we start to affiliate ourselves more strongly with in-group and out-group behavior. So we look for safety in identities that feel like they're more like us, and we look to denigrate the out-groups that are threatening. And if you're curious about why there's so much hashtag-this-identity, hashtag-that-identity on social media, that is actually one of the reasons why: everyone feels a little bit threatened on social media, and they feel like, I need to declare my allegiance, I need to protect the things that I hold dear in this space. I think it really does come down to this fundamental idea of threat: if we're exposed to too many threats, it causes these basic tribal emotions to increase dramatically.

Tobias Rose-Stockwell: And there's some decent research showing this. Jay Van Bavel at NYU has a great book called The Power of Us, which I'd recommend on this topic as well, which shows how much identity shapes our behavior, particularly online. It really does influence how we see others. It's like a lens we're suddenly putting over our eyes: I see you, and you are not just a human, you are now a Republican, you're now a Democrat. These identities become far more instantaneously salient to us because we're looking at everything through the lens of social media. And unfortunately, it doesn't just stay on social media. I think one of the biggest problems here is that these narratives stick with us. They follow us to our dinner tables, to our congregations. They follow us everywhere we go. And if the threat is pernicious enough, if it's scary enough, it will keep running a little process in the back of your head, and it will kind of infect most of your interactions.

Dr. Mark Hyman: So how do we start to roll this back? I mean, what's in the last part of your book? Let's go into that. I think people understand that we're in this crisis, but maybe before we get into how to fix it, let's touch a little bit on artificial intelligence. I hope to have Tristan Harris on the podcast soon; he is fighting this fight in a very vigorous way, ringing the alarm bell before it's too late, and it's accelerating so fast. And I wonder if you could talk about how maybe we're at the year 2005, in terms of where Facebook was then. We're now there with artificial intelligence, and nobody was really paying attention. Everybody thought this was great. There were no downsides, there was no regulation, there was no oversight. I mean, it was just striking. Even regulators don't have an idea of what the heck it is. As recently as a year or two ago, one of the senators questioning Mark Zuckerberg at a hearing said, well, how does Facebook make money again? And he's like, well, advertisements. Which was just amazing.

Tobias Rose-Stockwell: Not our best moment in our political understanding of the world. Yeah.

Dr. Mark Hyman: So tell us, why should we be concerned? There's a lot of promise to social... I mean, to artificial intelligence. I'm building a platform using artificial intelligence to help people with their own health information, with their lab tests, and to bring functional medicine into wider acceptance, using the power of machine learning, of artificial intelligence, with your own health data from wearables, from your lab data, from your medical history. And it's a good thing, but there's a dark side. So how do we think about this? Can you explain where the challenges are, and where the opportunities are for us to shift things before it's too late?

Tobias Rose-Stockwell: Yeah, absolutely. So first, I think it's really important to note that AI is going to be, and already is in many ways, a true miracle: an amazing economic engine, an incredible tool for medicine, for understanding the world better, for creating incredible new opportunities for everyone. I want to note that upfront: I think the opportunities here are truly magical. Anyone that's used DALL-E or Midjourney or ChatGPT can instantly see that these are powerful, powerful tools with tremendous potential. But the greatest danger I can see in it, as it relates to our cohesion as a society, our ability to coordinate together and cohere, is when it comes to our perception of reality itself. Reality is becoming a little tenuous even without AI, right? Capital-R Reality. It's just very difficult to figure out what is true.

Tobias Rose-Stockwell: And when there are enough countering narratives, when you see enough AI-generated images of a political thing that happened, we stop believing in base reality. We start just going with our gut, with our confirmation bias, with our initial gut reaction: oh, that thing didn't really happen, I saw another image online. If you can't know what's true, what feels most important is just to go with your gut, to go with the feeling that felt true, to go with the guy that resonates with your political message. And in a world like that, politicians basically operate with impunity, right? They can say, oh no, that didn't happen. Oh no, this thing that I said? I didn't say that, that's just a deepfake. And that's a huge problem for democratic norms and discourse, because we're not able to check our biases, we're not able to check them against reality. So the coupling of AI plus social media, I think, poses the greatest kind of systemic risk to our ability to understand what is actually happening. Again, fortunately, we do have systems in place to help us figure out what is true and what is not true. A good heuristic for this is: don't believe everything you see on social media, especially in this upcoming election. I think this is going to be a very, very, very hard time for our country.

Tobias Rose-Stockwell: Because AI is getting extremely good. It is becoming basically zero-cost to generate deepfakes, to generate any images that are politically triggering, at the same time as social media companies are pulling back from the moderation systems that might actually provide some mitigation for these things. And there are big problems with bias in moderation tools, there are big problems with censorship on these platforms, for sure, but there are ways to manage it that help us reduce the spread of the most viral and the most fake content. And it's important to note that information has patterns. If you look on social media, the stuff that tends to travel the fastest tends to be lacking context, tends to be emotional, and can often be false as well. I think a good model for this is Danny Kahneman's work, the behavioral psychologist who studied our brains and how we process information. He found that we have these two fundamental systems for processing information. There's system one, which is heuristics and snap judgments; it tends to be emotional, reactive, reflexive. And there's system two, which tends to be more thoughtful and reflective. It's a little more expensive for our brains to run, but it's the part of our thinking where we do hard math problems and where we try to make hard decisions.

Tobias Rose-Stockwell: Social media is built for system one, not system two, though there are elements of it that could be oriented towards system two, I think. When it comes to parsing good information versus bad information, social media could be designed for better parsing of good information. You can look at a couple of different platforms online that we use regularly right now. They have their own problems, but they're far better than social media at parsing the truth. One is Wikipedia, which is a great reference point for a lot of information, a free resource for the world. And the other is Google, which actually has citation embedded in its design. That's how Google became a search powerhouse: it rank-ordered results based on references and citations. So I just want to note that.
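For the curious, here is a minimal sketch of citation-based ranking in the PageRank spirit Tobias is referring to. It is an illustrative toy, not Google's production algorithm:

```python
# Minimal PageRank-style scorer: pages cited by other well-cited pages
# score higher. A toy illustration of citation embedded in ranking.
def pagerank(links: dict[str, list[str]], damping: float = 0.85, iters: int = 50) -> dict[str, float]:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:  # a citation passes a share of the citer's own rank
                for q in outgoing:
                    new[q] += damping * rank[p] / len(outgoing)
        rank = new
    return rank

# "a" and "b" both cite "c", so "c" ends up ranked highest.
print(pagerank({"a": ["c"], "b": ["c"], "c": ["a"]}))
```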

Dr. Mark Hyman: Although Wikipedia is challenging. If you read Wikipedia, I'm a quack and functional medicine is bullshit, so there's somebody controlling that. We tried to fix it, and there are people who have agendas who are actually manipulating the content to their particular perspective. Pharma has billions of dollars to spend on its agenda; it's going to undercut things that actually challenge the prevailing paradigm. So I've been shocked to find, because you think, Wikipedia, okay, I can rely on this information, but it's actually not accurate, and it's often wrong. When it's me who's in the Wikipedia article, or something I know very well, I'm like, geez, it's kind of a...

Tobias Rose-Stockwell: A shit show. Not a hundred percent, yeah, definitely. People make mistakes in these.

Dr. Mark Hyman: Maybe I'm a quack, but so far people are getting better. So maybe it's just because I'm a nice guy. I had one doctor at Cleveland Clinic tell me, oh, your patients get better because you're just a nice guy. I'm like, I don't know, you're a nice guy too, and they don't get better...

Tobias Rose-Stockwell: From the same problem. No, I mean, you're absolutely right that each one of these tools has problems associated with it, for sure. I think it's, again, just really important to put them on a spectrum: is it hearsay, or is it someone whose job it is to actually try to figure out what is right and what is wrong? And I think it's important to try to improve the tools that we have, to really fight to make things better, and not throw out the entire baby with the bathwater, if that makes sense. Because we need better tools for understanding the world.

Dr. Mark Hyman: I mean, I'm looking at it right now, and it says I was born in New York City. I was born in Barcelona. So, no.

Tobias Rose-Stockwell: Oh gosh.

Dr. Mark Hyman: I'm just kind of like, okay.

Tobias Rose-Stockwell: Yeah, I'm very curious.

Dr. Mark Hyman: It's better, it's gotten better. It's gotten longer and better. Instead of saying it's unproven and quackery, it says it's a controversial form of alternative medicine. Which actually it isn't; it's not even alternative medicine. It's just an operating system based on systems biology.

Tobias Rose-Stockwell: Yeah. I'm actually really curious if you've gone through the process... you've probably done this before, and tried to fight with the moderators on this. I'm curious.

Dr. Mark Hyman: Yeah, we spent hundreds of thousands of dollars through the Institute for Functional Medicine, and it's brutal. There was just no way to fix it.

Tobias Rose-Stockwell: Yeah, yeah. That's interesting. Okay. Well, I would say don't distrust Wikipedia entirely, but you can be skeptical of Wikipedia, for sure. I think that's important: you should be skeptical of things you read, especially stuff that is, again, emotionally valenced content. I'm sorry you've had that interaction with the platform. I wouldn't worry about it; it probably doesn't matter to you that much. But I do think it's important to look for tools and areas in which there are more people involved in the process of verifying stuff. And I would hope that they're able to update that in the process. It sounds like there's probably one moderator there that has a bee in his bonnet when it comes to your work, so that's not great.

Dr. Mark Hyman: No, I'm saying it's just tough to know where the truth is, even with stuff that you think is reliable.

Tobias Rose-Stockwell: I agree. And just to be clear, back in the day, when there was an encyclopedia person whose job it was to go out there and find a particular fact about a particular notable individual, they probably got it wrong often as well. They probably got it wrong a lot of the time, and that went to print, and then people could see it. So putting information on a spectrum, I think, is really important, and trying to check our biases wherever we can is super critical as well.

Dr. Mark Hyman: So the AI genie, how do we keep that in the bottle? Do we need regulation, legislation? How do we deal with that? And then I want to get into social media. This next part of your book, I think, is important: what we can do about it. So take us through both social media and AI, and how do we address this crisis we're in? Because it's not just creating divisions politically, it's creating serious mental health issues for children, increasing suicides, and wars and destruction and violence, really serious things. I mean, literally an insurrection at our Capitol. It's not trivial. So how do we think about fixing this?

Tobias Rose-Stockwell: Yeah, so I think it's really...

Dr. Mark Hyman: Now the depressing part is over, everybody.

Tobias Rose-Stockwell: Now it's optimism and joy. Yeah. So look, the harms of social media, I think, are very clear right now. My colleague at NYU, Jonathan Haidt, has focused really, really closely on the harms, particularly to a very specific subset of the population: teen girls seem to have very, very negative reactions to social media at a very special and tender part of their lives. There's a lot of evidence showing increased risk of depression and suicide for young girls as a result of increased social media usage. That's partially a result of social comparison, and partially a result, I think, of the social hierarchies that young girls are very prone to at a young age. I cannot imagine growing up with social media in middle school or elementary school, or even high school, which would probably be worse. I cannot imagine what it would be like these days to actually have these tools, the likes of your friends, and your enemies plotting against you, in the hierarchy of middle school and high school. I think it's terrible. And so I think there is a role for governments to step in and actually enforce age limitations, because it's a coordination problem for girls.

Tobias Rose-Stockwell: If all of your friends are on it and you're not on it, then you're actually ostracized from the group, which is a huge problem. So I think there's a role for authorities to step in and say, look, no social media use before a certain age. Now, what can we do about these broader problems? To come back to the more specific items that I think are really important to focus on here, and I think this is somewhat applicable to AI too, it comes down to three different buckets: what individuals can do, what governments can do, and what platforms can do. As for governments, I think it's focusing on Section 230 and making platforms more liable.

Dr. Mark Hyman: What is Section 230?

Tobias Rose-Stockwell: Yeah, sorry, Section 230 of the CDA. This is a law that was passed, and thank you for asking.

Dr. Mark Hyman: I'm not sure everybody's familiar with all the bills in Congress and what they mean, or a hundred percent on the sections of all the bills. I personally haven't kept up.

Tobias Rose-Stockwell: This is the Communications Decency Act, which was passed in 1996; Section 230, it's called. If you hear people talking about social media and regulation, usually it involves Section 230. Basically, it makes platforms not liable for the harms that come from people using their tools. It was a super instrumental and important law for regulating the internet in a way that allowed for the free and open exchange of information: you can post something, and the company that is responsible for serving that content is not liable for what you post. In general, it was a fantastic idea. But when it comes to the algorithmic amplification of content, there are opportunities for great harm that make this not a neutral telephone system. It's not a neutral, oh, I'm just sharing this with my friends; it's actually serving content that might cause real harm in the world. And so I think focusing on Section 230, updating it to make sure that platforms are more liable for some of the things that happen on them, is really important.

Tobias Rose-Stockwell: On the platform side, I think there are many, many interventions that can be done to help improve our relationship with these tools. There's a bunch of interventions that actually use AI to help identify content, not to demote content based on that identification, but to give users a pause. For instance, when a kid is about to post a comment that is extremely toxic and bullying of someone else, the platform can identify that content and cause you to pause and say, actually, maybe you should think about not sharing this, or you should just take a minute. You can still share it, but here's a little bit of friction in place that keeps you from doing the worst thing. And I think there's a tremendous number of possible frictions that can be employed that reduce a little bit of the outrageous engagement and the addiction problems, which are, again, part of the bottom line of these companies.

Tobias Rose-Stockwell: But they make it a much less toxic place for us all. So if you give users more choice and more frictions at key intervals, it can really dramatically improve the kind of content that's shared on these tools. And that's been shown in studies: frictions really do dramatically help. They help us make sense of misinformation, they keep us from sharing the worst types of information with all of our friends, and they keep us a little bit saner, because we're seeing fewer threats served up on our feeds.
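As a concrete illustration, here is a minimal sketch of the kind of posting friction being described. The toxicity scorer, threshold, and prompt text are hypothetical placeholders; real platforms use learned classifiers rather than keyword lists:

```python
# Minimal sketch of a pre-posting friction. The classifier below is a
# crude keyword stand-in for a learned toxicity model, and the threshold
# and messages are hypothetical.
import time

TOXICITY_THRESHOLD = 0.8  # hypothetical cutoff

def toxicity_score(text: str) -> float:
    # Stand-in for a real classifier: counts crude signal words.
    toxic_words = {"idiot", "loser", "hate you"}
    hits = sum(w in text.lower() for w in toxic_words)
    return min(1.0, hits / 2)

def submit_post(text: str) -> bool:
    """Return True if the post goes through."""
    if toxicity_score(text) >= TOXICITY_THRESHOLD:
        # Friction, not censorship: the user can still post after a pause.
        print("This may come across as hurtful. Take a minute to reconsider?")
        time.sleep(1)  # a real app would show a dialog instead of sleeping
        answer = input("Post anyway? [y/N] ")
        return answer.strip().lower() == "y"
    return True

if submit_post("you're an idiot and a loser"):
    print("Posted.")
else:
    print("Post withheld.")
```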

Dr. Mark Hyman: So that's one way to create these regulated structures within social media to give pause when there's content that might be damaging.

Tobias Rose-Stockwell: Right. And that goes a long way. I just want to note that it actually goes a long way. If you think about these vectors of information sharing, if you stop a single share of a single piece of outrageous, inflammatory content... In Myanmar, during the ethnic pogroms that happened there, if you had put small frictions in place at that moment in time, it probably could have saved thousands of lives. And that's the kind of thing that's really important to recognize. These are really influential things.

Dr. Mark Hyman: Yeah, I mean, I think that's powerful, but do you think it's enough? I mean, do we need to go further and figure out how to just change the economics of social media, so that it doesn't create this perverse incentive where the worse the information is, the more inflammatory, the more disturbing it is, the more likely it is to spread? Is there a way to say, wait, guys, the business model has to change? Is that even possible? How do we fix that? It just seems like such a pernicious force, where all the incentives point one way. Even if you put in a little pause, okay, you have to be asked a question: do you really want to drink that bottle of wine? Maybe it's not so good for you. And you say, I'll just drink that bottle of wine anyway. I don't know, is it really going to work?

Tobias Rose-Stockwell: Yeah. I think that we have small frictions like this throughout society writ large, to try to keep people a little bit more on track in their lives, to keep them from getting totally sucked into a problematic thing, a problematic behavior. Certain extremely addictive drugs are illegal for a reason. I'm not for the legalization of every single drug that's available. I think the world would maybe be worse off if we had heroin that you could buy at the corner store.

Dr. Mark Hyman: Well, we do. It's called sugar.

Tobias Rose-Stockwell: Well, that's actually true. That's a good point. That's a huge other problem, for sure, and thank you for fighting the good fight on that. And just to qualify and put some historical context here around advertising as a business model: before we had advertising as a business model, we actually didn't have newspapers. Newspapers rely upon advertising as a real foundational subsidizer of our sense-making capacity. NPR just laid off 10% of its workforce earlier this year because they had an advertising shortfall, right? So there is a huge subsidy on sense-making that comes from advertising, and I think it's important to note that advertising itself is not the most demonic thing in the world. It is the internal structures at these platforms that are pushing engagement to the extreme. And it's really important to note this: we're going to fix it if enough of us are pissed off about the extreme stuff we're getting served on a regular basis. But we can't really coordinate well right now, because there are so many other issues that seem pressing and urgent. We've been flooded with threatening information and problems, and it's very difficult to coordinate collectively when everyone has a different problem. And I think this is one of the core problems we're facing: we can't cohere if there is more misinformation than real information. So that's a big part of it.

Dr. Mark Hyman: Yeah. So that's one strategy, this regulation and enforcing of a pause. What other things can we do to solve this problem?

Tobias Rose-Stockwell: So as individuals, I think it's really important to recognize what I call healthy influence online: recognizing that we are not just influencers, we are being influenced by these platforms as well. It's important to see that this influence is multidimensional, omnidirectional. It goes both ways. We are influenced by these tools and by our communities, just as we're influencing our communities. And taking some responsibility for the stuff we post is really critical. Don't just do it for the metrics; do it because you think it's actually a healthy thing for the rest of the world. Recognize that you are a steward of an audience, that you are creating content other people will see. And I'm sure we've all felt this to some degree: the FOMO that comes from someone posting about something that's happening, and you're like, ah, I wasn't invited to that. That sucks, right? Oh, bummer.

Tobias Rose-Stockwell: There's a simple way around that, which is holding on to the post for a couple of weeks and then posting it well after the fact, so that people in your existing audience don't feel as left out by that particular thing. And I think it's important to think about social media as a community. We are in these communities together, and you need to approach your communities online as if they are real-life communities. These are real humans that you're impacting with your content. It's not just for you; this is for other people too. So I think that's a really important piece of it.

Dr. Mark Hyman: Interesting. So individuals can be more conscious about their use. I mean, I try to go on social media holidays. I don't really look at Facebook. I don't really pay attention to Twitter. I use Instagram to post things to educate people about health issues. Sometimes I'll just scroll while I'm standing in line or something, but I don't spend that much time doing it. And I feel like if I do, it's just a big suck, and my life energy is really important to me. And I think the question is, what are we missing by not picking our heads up? What are we not getting? You talk a lot in the book about how we need to reconnect. We call it social media, but it's almost anti-social media. How do we get back to being in connection with each other in person, in real-world relationships, and why are those so important? And when you think about what's going on with our children today, the rising rates of mental illness, depression, anxiety, suicidality, ADD, obesity, it's all connected. So how do we help a generation of kids who are growing up in this social media world to actually shift what they're doing? It's like taking a heroin needle out of an addict's hand. Good luck.

Tobias Rose-Stockwell: Right? Yeah, yeah. No, the addiction is very real. And I think it's important to recognize that it is real and that these are pretty powerful tools for addicting us and keeping us there. I can offer a couple of pragmatic, easy things that people can do that are helpful. There's an app called One Sec, which is great. It takes about five to ten minutes to set up on your phone, but it will probably save you dozens of hours of your life over the next few months. Basically it's a content blocker, though not exactly a blocker; it's actually just a little piece of friction that you can employ. We open up our phones and impulsively, reflexively go to the same app; we black out for a second and wake up 20 minutes later in someone else's stream, or watching a crazy video, or in TikTok, whatever it is. But this app forces you to take a breath before you enter the app. And something as simple as taking a breath snaps you out of that impulsive click and makes you think about your intention, why you are doing this. I've found it to be amazingly helpful. Content blockers like that keep you from

Tobias Rose-Stockwell: Just defaulting to the instant, habitual behavior. There's another app called SelfControl for the desktop that's really fantastic. It's a content blocker where you can set time limits for yourself, which is extremely helpful for managing your time if you find yourself automatically going to a news site or to Instagram or Facebook or TikTok. I think a lot of these tools will become much more prevalent, because we need to renegotiate our time with these tools. We need to renegotiate our attention with these tools, and there are increasingly better ways of doing that. They snap us right back into that System 2 processing, right back into that better part of ourselves. We're like, ah, this is why I'm doing this. I'm not just an impulsive, emotive human doing things with no control. I actually have agency.
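As a rough illustration of the kind of friction these apps add, here is a minimal sketch of a "take a breath" gate plus a daily time budget. This is not how One Sec or SelfControl are actually implemented; the pause length, budget, and function names are assumptions made up for this example.

```python
# Toy sketch of an attention friction: an enforced pause before an app opens,
# plus a self-imposed daily time budget. Numbers and names are hypothetical.
import time

PAUSE_SECONDS = 10           # length of the enforced breath
DAILY_BUDGET_MINUTES = 30.0  # self-imposed per-app limit for the day

usage_minutes: dict[str, float] = {}  # minutes used per app today

def open_app(app_name: str) -> bool:
    """Gate an app launch behind a pause, a question, and a time budget."""
    if usage_minutes.get(app_name, 0.0) >= DAILY_BUDGET_MINUTES:
        print(f"{app_name}: daily budget reached. Try again tomorrow.")
        return False

    print(f"Take a breath: waiting {PAUSE_SECONDS} seconds before opening {app_name}.")
    time.sleep(PAUSE_SECONDS)

    answer = input(f"Do you still want to open {app_name}? (yes/no): ")
    if answer.strip().lower() != "yes":
        print("Good call. Staying out.")
        return False

    print(f"Opening {app_name}; time now counts against today's budget.")
    return True

def record_usage(app_name: str, minutes: float) -> None:
    """Log time spent so the budget check above has something to enforce."""
    usage_minutes[app_name] = usage_minutes.get(app_name, 0.0) + minutes
```

The design point is the same one Tobias makes: the pause itself does the work, pulling you back into deliberate, System 2 thinking before the habit completes.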

Dr. Mark Hyman: And what can the platforms do? I mean, are there ways that they're self-policing or regulating?

Tobias Rose-Stockwell: Yeah, there are a lot of great people at these companies working on trust and safety teams who are trying their hardest to figure out how to help people. I think these platforms could invest more in those teams; they've been some of the first to get cut in this recent round of layoffs. But the trust and safety teams are incredibly important for helping us understand what is best for us. All the research that Frances Haugen pulled from Facebook was the result of trust and safety teams internally doing good research on what Facebook's harms to the world are. So there are good people inside these companies really trying to make these tools better. And there's all the work you don't see behind the scenes to keep the really toxic stuff out of social media. I think if social media had zero moderation, you would be horrified by the kinds of things that would end up there on a regular basis. And you kind of see this in different countries with WhatsApp. This is

Dr. Mark Hyman: Actually the moderated version we're getting, is what you're saying.

Tobias Rose-Stockwell: Right? Oh, absolutely. Yeah, definitely. Which is creepy and strange to think about, but social media could be so much worse. And in fact, in a lot of other countries, trust and safety teams don't have as significant a say or presence. So a lot of the social media that goes to other countries is actually some of the worst stuff, which is bad; we're kind of exporting the worst version of social media. We think what we have is bad, and it's actually much, much worse in a lot of other countries. And then I think it's really important to recognize that the moderation decisions at these companies have a huge influence and say in the information that we see or don't see. So TikTok, just to touch on this for a moment, is a huge problem in terms of getting accurate news, because we are facing one of the most sophisticated algorithms at tracking our attention and giving us more of what we want. It's used by so many people in this country right now, and it could throw 2024. They could make some algorithmic decisions at TikTok that would throw the election in 2024. That's how powerful it is. And you can think about this in terms of an analogy from Tristan Harris. Well,

Dr. Mark Hyman: Facebook did it in 2016.

Tobias Rose-Stockwell: Inadvertently, right? Facebook accidentally did it. And I'm not saying

Dr. Mark Hyman: I'm sure. I don't think people at Facebook

Tobias Rose-Stockwell: Were like, we're going to throw the election, necessarily. But think about the equivalent, and this is Tristan Harris's analogy: during the Cold War, if PBS Kids was owned by the Soviet Union, would that be okay? And that's what we have with TikTok right now. We have a huge number of our children addicted to this tool and being served information based on the whims and the algorithmic decisions being made by a company overseas that is embedded with the CCP. I think that's really important to note, because these are hugely influential tools. And the version of TikTok that they have in China is eat-your-vegetables TikTok. It's highly regulated, and they're serving kids how-to-be-an-astronaut videos. So I think there's a midpoint between that authoritarian regulation, which influences what we can and cannot see in a draconian style, and something moderate and reasonable that is not just the race to the bottom of the brainstem, which is where the majority of these tools stand today.

Dr. Mark Hyman: Yeah, I think that's a great line, the race to the bottom of the brainstem, meaning our reptilian, lizard self that is impulsive, reactive, instinctive, not rational and not controlled by our frontal lobe, which is the executive function, the adult in the room. And I think that's what I'm seeing increasingly across the world: this activation of the limbic brain and the loss of our ability to have deep conversations about challenging topics, to have disagreements with people without vilifying them, to be curious and understand what's happening in the world rather than being stuck in ideological battles. And I think it is having such a detrimental effect on our wellbeing. As a physician, it really worries me. And we all use it. We're all in it. We're all doing it. We're going to promote your book on it; this podcast is going to be on it. But it feels like we need some really deep thinking about this, and we need some really deep solutions. And it seems like it has to come from the government; there have to be regulatory or legislative approaches. It's not something where the industry is going to self-police, just like with food. They're going to keep selling addictive, deadly substances unless

Tobias Rose-Stockwell: They're forced not to. I think that's a fantastic note and a good metaphor for what we're dealing with, right? It's like fast food. We have dietary restrictions because we've been exposed to so many caloric problems as a result of fast food and this industrialization of how our food is created and disseminated. We used to think fast food was just food that's fast; it turns out it's really bad for us, and there are a lot of things we should do to mitigate and manage our diets. Information diets are very real. Information diets are a real thing, and we need to manage them in much the same way.

Dr. Mark Hyman: And they're currently like junk food for your brain.

Tobias Rose-Stockwell: A hundred percent. A hundred percent, absolutely.

Dr. Mark Hyman: So any final thoughts on a hopeful note we can close on?

Tobias Rose-Stockwell: Yeah, absolutely. So like I said, this book has really been focused on the history of these media disruptions, and I am very hopeful. If you buy the book, you can read through these different periods of tremendous disruption that came from new media technologies, and they all follow a pretty well-worn pattern: increased virality, increased confusion, oftentimes violence, and then a set of fixes employed by governments and by people alike, concerned citizens trying to fix things. The book is full of those anecdotes. And looking to history, I'm very hopeful that we can find our way through this dark valley into a better place.

Dr. Mark Hyman: Well, thank you so much, Tobias, for your work, for digging into this and having a hopeful view that we can actually find our way to a better place, where we can search for our better angels and create a better world. I think we're in a very precarious moment, particularly with the advent of AI. So thanks for what you're doing and for the awareness you're bringing. Everybody, go check out the book. It's called The Outrage Machine: How Tech Amplifies Discontent, Disrupts Democracy, and What We Can Do About It. You can get it everywhere you get books; it's available for sale now, so do the right thing and grab a copy. And for those of you who love this podcast, please share it with your friends on, dare I say it, social media; it can be used for good. Leave a comment: how has your life been compromised or enhanced by social media? We'd love to hear. And what are your concerns about AI? How have you seen it affecting your life? Subscribe wherever you get your podcasts, and we'll see you next week on The Doctor's Farmacy.

Closing: Hi everyone. I hope you enjoyed this week's episode. Just a reminder that this podcast is for educational purposes only. This podcast is not a substitute for professional care by a doctor or other qualified medical professional. This podcast is provided on the understanding that it does not constitute medical or other professional advice or services. If you're looking for help in your journey, seek out a qualified medical practitioner. If you're looking for a functional medicine practitioner, you can visit ifm.org and search their Find a Practitioner database. It's important that you have someone in your corner who's trained, who's a licensed healthcare practitioner, and who can help you make changes, especially when it comes to your health.