
Was the Internet a Horrible Mistake?

2021/10/13

Honestly with Bari Weiss

Chapters

Jaron Lanier discusses the dangers of groupthink, digital maoism, and the impact of algorithms on society, emphasizing the need for a fundamental shift in how we structure our relationship with technology.

Transcript

This episode is brought to you by Shopify. Forget the frustration of picking commerce platforms when you switch your business to Shopify, the global commerce platform that supercharges your selling wherever you sell. With Shopify, you'll harness the same intuitive features, trusted apps, and powerful analytics used by the world's leading brands. Sign up today for your $1 per month trial period at shopify.com slash tech, all lowercase. That's shopify.com slash tech.

I'm Bari Weiss, and this is Honestly. A few months back, I went to this gorgeous neighborhood in the Berkeley Hills to have coffee with one of the original visionaries of the internet.

Hey. Hi. The computer scientist, technologist, philosopher, composer, virtual reality pioneer, author, and legendary eccentric, Jaron Lanier. Hi, I'm Bari. Hi, I'm Jaron. It's really nice to meet you. Well, do we do this? I don't even know what to... I'm vaccinated. I am too, but... We're fine. You can hug me. It's all good. Wow, okay. It's all good.

Jaron's appearance is striking. He has these piercing pale blue eyes. He wears a black t-shirt and long dreadlocks well past his butt. It's the same look he's had ever since he arrived in Silicon Valley back in the now legendary garage era days of the 1980s. Your neighborhood is gorgeous beyond belief.

Oh, thanks. His house sits on a winding road next to the homes of other legends of science and technology. A friend of mine who discovered dark energy and won a Nobel Prize is a few doors that way. And like Danny Kahneman is a couple houses that way. Oh, wow. And the woman who just died next door was one of the founders of 20th century number theory, and et cetera. I mean, there's just... Hi, kitty.

And perhaps as you'd expect from someone who helped dream up the internet, he's got a lot of cats. When I think about the aesthetic of Silicon Valley, the way that its founders lived, but also the aesthetic of the internet itself, I think of sleekness, of flatness, white, glossy, sheeny, clean and minimal. But Jaron Lanier is the opposite of that.

I feel like I need to apologize for the house because during the COVID year, we kind of dug in and everything kind of went to seed and we're sort of gradually digging out. His house is the most unusual house I've ever seen. You do not need to apologize at all. It's painted in these pastel stripes and it looks partially like a shrine to the god of science with sculptures of atoms and electrons. This is a caffeine molecule. It's a molecule we care about a great deal here. And partially like a psychedelic...

like playroom. Jaron has over a thousand rare instruments, including mouth flutes and ouds, all of which he plays, hanging all over the walls. He's got lava lamps and toys, stuffed animals, bright colored tapestries, and also a very nice espresso machine. This is the espresso machine, and it is starting, the pump is operating, the espresso is coming out. I love your narration.

And I went over to have coffee with Jaron, which was delicious, by the way, because even though he had a big hand in creating what we know as the internet today, he's also become something of a Jeremiah, shouting warnings about what the internet has become. I worked really hard to get the internet working in the early days, you know, and I still believe in this idea of having this information thing between us. And I think it has potentially more benefits. And the benefits are real, even in social media as exists. It would be silly to

to say that everybody who finds someone else with some commonality, maybe rare illness, or everybody who enjoys a silly cat video or whatever, it would be silly to condemn all that. That stuff can often be either innocuous or wonderful, but it's the manipulative algorithms that are the problem. Especially the internet, as ruled by the algorithms that have been making headlines lately in light of the Facebook whistleblower.

What that results in is people being directed rather than exploring, and that makes the world small. And I think that that is fundamental. And so when you talk to people who do this stuff at Google or Facebook, they'll say, well, it just means we need to make our algorithms better. But you can't. I mean, like you can't say...

We want to have a better form of constant incremental manipulation of every person. It's like the whole concept from the start is poison.

Sometimes to me, the internet feels like oxygen. It's so all-encompassing, affecting every single part of our lives, that it's almost like it's not just invisible, but also inevitable. It's hard to remember sometimes how new it is, that nothing about it was inevitable, and that the people who brought it into the world, arguably the most transformative technological change in human history, that they're still alive, and that we can ask them what they make of where it's all ended up.

Even if you don't think of yourself as curious about technology, even if you think of yourself as a Luddite, even if you need to call your kids to figure out how to log into Netflix, I still think you should give this conversation a listen. Ultimately, what comes out is that so much of the pain in our society, the extremism, the polarization, the isolation that we're feeling, it comes down to the machines and the technology that are climbing deeper inside ourselves.

What this is about is how we stay human in the age of the machine. Please stay with us.

There are no other shows that are cutting straight to the point when it comes to the unprecedented lawfare debilitating and affecting the 2024 presidential election. We do all of that every single day right here on America on Trial with Josh Hammer. Subscribe and download your episodes wherever you get your podcasts. It's America on Trial with Josh Hammer.

Something interesting happened last year, which is...

Or maybe the year before that. So I had originally written books about very big picture things related to information technology and humans, like the spiritual angle and the economic angle and all that. And then I wrote just a book that was popular called the 10 Arguments book about the dangers of social media that was a bit more directed at individuals. I read that.

But what happened after that is that whole beat, which had been a desert before, got very populated and with excellent people. So you have Shoshana Harris, Shoshana Zuboff, and a bunch of other just really excellent people writing on it. At this point, it's almost like the normative sort of thing that one reads. Like the Times has two columns this morning that...

might have been written by me a few years ago or something or longer. I mean, if anybody's counting, in '92 I wrote a piece predicting that bots could throw elections and all this stuff. But anyway, so I've been on this thing for a long time, but it's satisfying to see people take up your arguments or, well, you know what it is, is it's freeing. Um,

They're not exactly mine. I mean, they're different. Like Shoshana is a bit more anti-capitalist. So she talks about surveillance capitalism. And I'm not really. I view capitalism as a tool that has proven useful in some cases and quite destructive in other cases and that we don't know how to use well yet. To me, capitalism is this thing like nuclear power that's gotten a bad name, but

in truth, probably can be used to good effect. We just have to figure it out better, you know? That's my feeling about it. But Shoshana is a bit more anti-capitalist, and Tristan is on the perils of being psychologically manipulated by algorithms, whereas I'm a little bit more focused on the emergent

economic perils of the whole civilization running on that kind of manipulation more and more. Sort of like what Andrew Yang was talking about during the presidential campaign. Yeah, Andrew Yang... I was actually so curious what you think about him before we get into it. Yeah, if you want, you can... He interviewed me for his podcast and that sort of thing. There's a record of us talking. And the thing about Andrew Yang is...

His particular approach to it, which is to emphasize a basic income solution, I don't think

is good. I think the basic income idea should only apply to help in kind of limited situations. There are just a lot of things wrong with it. The main thing that bothers me about it is political, which is that if you have a basic income, there has to be some organization that operates it of some kind, whatever it is. And if the basic income is only applying to some small number of people in the society,

then the only question is whether it's helpful. But if it's applying to a lot of people, like maybe a majority because of there being a lot of robots and whatnot, then you start to have a power center that becomes a temptation for seizure by bad actors. And so the way I've always put it for my students is that you start with Bolsheviks and you end up with Stalinists.

You start with, perhaps, the Bolsheviks, and whatever you think of the Bolsheviks, they were certainly better than the Stalinists, right? And then after that, you end up with a corrupt plutocracy run by a madman. One of the things about the American experiment that I think was really well informed and enlightened is the idea of having enough overlap and confusion in the structures of society that power is hard to seize.

You know, and so we have this idea of checks and balances, but also of a lot of different domains that there's both government and business and society and journalists who are protected and all that, like that layered overlapping confusion. And I think that should also apply to human identity. I mean, I really like the idea of people having hyphenate identities in various ways. So it's very hard to concentrate a group of people.

on a particular belief system or a particular ethnicity or whatever it is. I think that this kind of ambiguity of where power lies is really essential to decency and stability.

But anyway, so that's one reason I don't like the basic income. But the other thing is it's all kind of based on a lie because the whole motivation, if you're talking about basic income for a large number of people instead of for a small number, it means you think that AI is real because you think that there's some brain in a box that'll operate robots that take on human jobs. Whereas in fact, if you look at these things, it's just as easy to interpret these devices as collaborations of people. Yeah.

a ton of people contribute example data and training data and desires and feedback to these algorithms to make them work. And so if people were compensated for that as a new kind of labor, then the more algorithms and robots were functioning in society, the more

compensation there'd be for people and the more paths to livelihood and status and pride. So like every time I fill out a reCAPTCHA, I should get... And like I'm training Google AI. That's exactly right. I should get paid for it. Yeah, I think you should get a buck per reCAPTCHA session. No, seriously, that seems fair to me. In which case...

Yeah, I do think that what a high-tech prosperous future should look like is a sense of expanding creative classes in the economy. And the way to get expanding creative classes is for people to be paid for information that exists because they exist. And so in order to have AI and robots as commonly conceived, you have to pretend that people aren't contributing when in fact we are.

So, okay, before we get to... Let's rewind the tape and start from kind of the big picture.

I was at a dinner party recently where the argument was, is the world getting better or worse? And by every metric, of course, as Steven Pinker and others have pointed out, it's getting better. There's less poverty, like life expectancy is so much longer. And so I found myself arguing for that position because intellectually I know it to be true, even though in my heart I feel like things are getting worse and

Or at least I feel scared about where things are going. And I wondered if we could start there. Sure. Who's right? Yeah. Is it getting better or is it getting worse? There are a few things that are true simultaneously. One thing is that if you measure grand averages for how things are going, they tend to be getting better. There's often...

a great divide hidden by those averages, in which things are getting worse for some people. So for instance, in the United States, life expectancy has actually been going down for a lot of people instead of up. But if you look at the average, you could say, oh, look how great this is. If you look at some measures of the economy, it's amazing. And yet so many people are living on the edge. And this is where

I think we have to acknowledge a special kind of problem that's a little different from the way problems have presented themselves historically. The problem now is living on the edge in a multitude of ways all the time and the constant stress of that. So on the average, the American economy is great. In actual practice, there are a whole lot of people who would lose their homes if they got sick and so on.

And that living on the edge creates a constant stress and I think also a challenge to one's pride and self-concept that's really disturbing. But also on a large scale, we're living on the edge. Like here in California, we love our neighborhood, but the last five years, every fire season has been terrifying.

terrifying, like where you're under evacuation orders repeatedly, the air is unbreathable for weeks at a time, and you're ordered to stay inside and so on. I mean, it's just extraordinary. And so we're, you know, in a well-to-do place here, and yet we're on edge. And so I get calls from my Silicon Valley friends saying, oh, California's been messed up, we're all going to New Zealand, come and buy into New Zealand. And like, if we could fuck up Cal... pardon my language here, but

If we could fuck up California, what is going to stop us from fucking up New Zealand? Right. From one paradise to the next. We totally have the power to wipe out anything good about New Zealand in a few years. Like that's in our power. I mean, have you looked at us lately? We can go screw them up. And so there's this notion that we don't take paradise and put up a parking lot like Joni Mitchell said. We take paradise and we turn it into an escape room.

You know, we turn it into this weird, stressful, like almost on edge, weird, like, are we going to make it through this? And that's the character of our time. The looming, slow-moving climate challenge, which is potentially existential. We have a situation that...

You know, it's a funny thing. If you look at the world historically, and Steven Pinker will certainly point this out, as will many others, there probably was more violence of all kinds than there is now. And yet, I don't think people had in the back of their minds, if they just went to work, that there might be a mass shooting here, or if they sent their kid to high school, that there might be a mass shooting. Like this kind of weird way that

Sudden violence. And it reminds me a little bit of the Shirley Jackson short story, The Lottery, about...

a creepy little town that on the surface is... Pleasantville. Yeah, Pleasantville, you know, all-American, very, you know, decent and whatever. But once a year, they draw lots and somebody gets stoned to death by everyone else. It's proto-Hunger Games a little bit. Oh yeah, very much so. But that's the character of our times, that we're all on edge. And it doesn't matter, you know, the thing is,

In the life I lead, I know a lot of billionaires. I'm not one of them, but I know many people who are. And they're on edge, too. Like, everybody is sort of living on the edge of catastrophe in a way that wasn't typical before. I think it used to be that we had a little bit more resources.

wiggle room. Perhaps that wiggle room was illusionary. Maybe we always were living on the edge of catastrophe, but people used to feel like, this is nice. I can relax here. I don't think people are relaxing these days. I think that this idea of having an interval where things are okay, at least for a while, and enjoying that is harder and harder to come by. Well, I think you said this thing that I thought was so powerful that connects to me to this idea that we're in paradise, but we're in the escape room.

You said, "The thing about technology is that it's made the world of information ever more dominant, and there's so much loss in that.

It feels as if we've sworn allegiance to a dwarf world rather than a giant world." And it's like the Internet. Did I say that? Yes. And I think it's so true, because it feels like the Internet should make everything feel more expansive, bigger, more elevated, more giant. But it feels like it's exactly this metaphor of the escape room. It feels like it's shrunken us down. Like, so to what extent do you blame,

I guess, the Internet and social media for making us feel this trapped, on-edge feeling? Well, you know, casting blame is a little difficult because we haven't had enough real-world experiences with alternatives. But my working assumption is that the problem isn't the Internet or social media in the broad sense, but rather it's specifically the use of the algorithms. So what happened was when Google and Facebook and others

went to what they call the advertising business model, where anytime anybody did anything, anytime anybody connected with somebody else, it was financed by a third party whose motivation was to manipulate what happened. Then the whole business model was about how to manipulate more and more. And what that results in is people being directed rather than exploring, and that makes the world small.

That is fundamental. And so when you talk to people who do this stuff at Google or Facebook, they'll say, well, it just means we need to make our algorithms better. But you can't. I mean, like you can't say we want to have a better form of constant

incremental manipulation of every person. It's like the whole concept from the start is poison. That is the original sin. The idea of the internet itself, I worked really hard to get the internet working in the early days, you know, and I still believe in this idea of having this information thing between us. And I think it has potentially more benefits. And the benefits are real, even in social media as exists. It would be silly to

to say that everybody who finds someone else with some commonality, maybe rare illness, or everybody who enjoys a silly cat video or whatever, it would be silly to condemn all that. That stuff can often be either innocuous or wonderful, but it's the manipulative algorithms that are the problem. And

I mean, lots of people have compared this era to the 15th century and the printing press. Obviously, the advent of the printing press changed the entire world, including setting off hundreds of years of war, tons of bodies piled up. And by comparison to that, you could say, actually, this is a lot more peaceful. By that comparison, we should expect so much more violence. Well, I mean...

The point of comparison I usually draw is the way the Nazi regime was able to use radio and even television to such effect, and also grand public spectacles, grand public rallies, that they were innovators in media, and that was part of how their propaganda was able to be so effective, because people had no experience with it, and it was raw and fresh and had amplified power. The printing press comparison is...

is an interesting one. There are a few differences. One is that humanity didn't have the power to destroy itself back then. That's a big difference. No nuclear weapons or bioweapons or ability to change the global climate quickly or many other things. There were plagues and droughts and all those things, but they were always local. We can do global ones now. And so

Our peril level is higher. Maybe let's pick up on the Nazi analogy. Explain that analogy. So who's the second Hitler? It's like reductio ad Hitlerum, as someone says. The second you bring up Nazis or Hitler, you kind of say, I deliberately didn't bring up Hitler. I talked about the Nazi regime. And the reason why is I don't want to focus on one person's personality here. What I'm interested in instead is a society of probably...

no more than hundreds or maybe a few thousand people who figured out a way to exploit emerging technologies for propaganda. That's a different question than Hitler. These were people who were using cinema in not entirely new ways. There had been propaganda cinema before, but maybe not quite so, you know? And radio certainly, and as I say, television, but also the nature of the mass rallies, the way of creating...

a mass rally that would also be filmed and that was organized in a certain way to create a visual effect and a sense of inclusion and power was a little different from what there had been before. It was innovative. And it's continued today, like in North Korea or something, you still see a similar kind of sort of very conformist organized rally. And that body of people who created that propaganda machine is...

So here, I don't want to say they're the same as Silicon Valley, but, you know, there's some, and in fact, some of them, you know, after the war,

kind of were rehabilitated because it sort of felt like they were like Leni, how do you say it? Leni Riefenstahl. Riefenstahl, right. You know, was somewhat rehabilitated because there's a sense, well, here's, you know. The Nazi propagandists. Yes, exactly. There's a sense that maybe these people were more technicians who might have done propaganda for anything than real ideologues. And

I suspect there's some truth to that, you know? That's not to say they should have been rehabilitated. That's another question. But all I'm saying is that that professional class at that time bears some analogy to current Silicon Valley. How do you think your friends in Silicon Valley would hear you saying that? Well, I hope they wouldn't hear me saying they're Hitler because that's not what I'm saying. What I'm saying is that mass manipulation generally ends in tears, right?

Now, that said, I've been thinking about another thing lately, which is after World War II, there was another generation of people in media and education

who were horrified by the success of Hitler's propaganda machine and attempted to counter it with a new style of think-for-yourself anti-propaganda. And what are those movements? Well, I'm thinking of Dr. Spock's advice to parents. I'm thinking of Mr. Rogers' Neighborhood. I'm thinking of... Great Pittsburgher. That's where I'm from. Right, right. That's right. Yeah. Yeah, I think there was this feeling that

Just this idea of mass, of groupthink, was in itself a hazard, totally aside from what the groupthink was. Okay, well, this is a phrase of yours that I've become obsessed with, and I kind of can't believe that you coined it in 2006, is this phrase, digital Maoism. Explain to me what digital Maoism is. Do you even remember coining that phrase? I do. That was an essay. Digital Maoism was an essay. And I got a certain amount of...

criticism for it as red-baiting or something. But, you know, what's funny is, when it was translated into Mainland Chinese, the translator called me and said, what do you want to say here? And I explained it, and he said, oh, okay, that sounds right. And they published it, really, as Digital Maoism. I don't know if they would now. Maybe they were proud of it. I mean, maybe they were hearing it in a different way. Well, also, you know, times change. I don't know what would happen today.

But anyway, the idea of digital Maoism was that the way engineers like to insert algorithms between people tends to create feedback loops that make people fit more and more into what the algorithms expect because the people see the information from the algorithms and are able to just do whatever they want to do if they conform to what the algorithms expect more and more until you start to have this kind of...

an official single reality that is not unlike a Cultural Revolution feeling. And so I thought digital Maoism was an apt title for it. And one of the reasons I wanted to use it was that, especially at that time, there was this hyper-libertarianism that was supposed to be guiding everything. And I wanted to point out to people that sometimes...

I used to call them ideology sluts, that you think you're adhering to this one ideology, but you're actually slipping into this other one because you're such a slut. And that had happened in Silicon Valley. And that we'd moved from this supposed free thinking thing into this technocratic way of enforcing the same thinking. Now, at the time, some of my targets seemed relatively unbiased.

now compared to other stuff. So, for instance, Wikipedia, by having only a single article for something, like if you compare the encyclopedias in print that competed with each other because they actually made money from selling copies, you'd have an Encyclopedia Britannica and an Encyclopedia Americana. You wouldn't expect, in fact, you would be horrified if there were identical entries in both of them. They reflected...

a different perspective because there's no such thing as a view from nowhere. There's no such thing as a universal, absolute perspective in most human affairs. And that's even true in technical things. Like you might say, well, math is always the same. Not really, because the exposition is really important.

And so the Wikipedia created this illusion of a single truth. And so that's an example of people being herded for the convenience of an algorithm because to write an algorithm for pluralism is a whole pain in the butt compared to making the Wikipedia, which rejects pluralism. Well, it's like...

I mean, just to draw out the Maoist analogy a little bit more, it's like Maoism is sold as the idea of like a collective will. But in fact, you know, it like it's sold to us as the will of the people, but actually it's run by this small, powerful insular elite. Right.

Another interesting thing about Maoism is that it was a youth movement that was engineered to manipulate youth. And I think that that's another quality it shares with Silicon Valley. And this is something that's hard to talk about because I don't want to become the cranky old person who's saying, oh, these kids today, they just don't know and whatever. But there's some truth to it. What I've observed is that

The first year or two of a social media platform that kids like, we could be talking about Snap or TikTok or Twitter, even Facebook. Well, Facebook was always kind of creepy from the start, but the first year or two of most of these things is actually...

kind of charming, you know? But the thing is, it's inexorably on a path to the manipulation machine, and the manipulation machine intrinsically makes everything dark and paranoid and creepy and exploitative and horrible and turns people on each other, intrinsically and irrevocably. And so we're seeing that transition happen on TikTok now. And so, you know, one has to be subtle about this. It's hard to...

It's hard to criticize somebody who's having fun dancing on TikTok, and yet it is part of this thing. And it reminds—I hate to say this, but it does remind me a little bit, you know, the Cultural Revolution as an engineered youth movement had them dancing, you know, and they'd go on these little dance trips visiting villages doing their dances. And—

It's a little like TikTok, you know? It's a little engineered. And it's a little bit. It's not totally. It's not exactly the same. There's nobody at a desk at TikTok engineering the dance. And yet, there's a certain kind of... But isn't there to some extent? Well, okay. Isn't the Chinese Communist Party scraping everyone's data on TikTok? Right. So I have to say, I'm...

I'm in a bit of an unusual position in that I've chosen to remain an insider in this world. And whether that decision is the right one or not is hard to know with certainty. But anyway, it's a decision I've made and very few have. So I currently have this arrangement with Microsoft where I'm encouraged to speak my mind. And I do, as you know. But on the other hand, there's some places where I have to kind of be careful. And since we almost got TikTok at one point, I kind of shouldn't talk too much about it. But

It just happens to be the thing of the moment, you know, that kids are probably more into than other things. And there are a number of things that are Chinese-owned that everybody's using. Zoom's another one. Would this lead to something that's potentially a problem? It's a real issue. TikTok is definitely doing some overt things, the stupid censorship of stuff about Uyghurs or Tibetans and all that. And

I almost feel that that's so blatant that it might backfire. I feel like that's almost like a fetish or a tic of the Chinese Communist Party that

creates incompetence. At least I hope so, for the sake of my Tibetan friends and even my Uyghur friends. I don't know that many Uyghurs, but I know a few. But what I worry about more is societal destabilization. So what Putin's psychological operatives proved is that the United States can be destabilized very inexpensively just by promoting craziness and paranoia and pitting people against each other. And the reason it's so inexpensive and easy to do

is that our social media platforms are designed precisely to do that for different reasons. But it was like incredibly easy for Putin's people to step into Facebook or Twitter or YouTube and just amp up what was happening naturally. Let's go back to digital Maoism for a second, because if Maoism is about creating a new norm and creating groupthink and herding us into packs and mobs, I just see that

Not being confined to the Internet. Like if I look at the world of, you know, legacy press, for example, which is the world that I'm coming from, but really the world of publishing or the world of the academy, it's all a monoculture. There is no marketplace of ideas. It's all hyper-conformity everywhere we look. What is driving it? Yeah.

The unique thing today is this algorithmically generated conformity. And just to be clear, I don't believe there's an engineer saying, hey, I want everybody to think the same. What it is, though, is that there's a feedback loop where things are a little easier if people meet the expectations of the algorithms. Like if you do what the algorithms expect, everything happens a little more conveniently and a little more easily. If you try to rub against the grain, it's actually work.

Like, if you say, wow, those search results aren't what I was looking for. Let me try to modify it. It actually gets to be work. And you have to kind of try to guess what the algorithm responds to and all that. It's actually pretty difficult. Yeah. And it's like, I don't know anymore to what extent I like something, because the Internet is telling me what, like, a 37-year-old lesbian who likes muumuus and, like, you know.

Chelsea boots should like, you know, or if it's that I genuinely like it. Okay, wait, let me work on this. Is it Subarus? Yes, exactly. T-Bus. T-Bus, yeah. But it's like, I walked into the doctor's office the other day and the doctor came in, similar profile to me. I was wearing these shoes. And she was like, oh my God.

you know, whatever the brand was, something like, I think they were called like Thursdays. She was like, I keep getting the same ad for it on Instagram. And then we were sort of like comparing what the targeted ads that we were getting on Facebook and Instagram were, and they were almost exactly the same thing. And I had this moment of like, did I actually like the shoes? Or did the internet convince me that I like the shoes because I've seen them repeated so many times before? So that's like a...

You know, that's a weird little horror story that's foreseen in the many, many stories from fiction about people unsure if a simulated dead person is really the person brought to life or not and that sort of thing. This peculiar modern problem of the ambiguity of reality, I think, creeps into you and makes you lose yourself. And it's...

That's one of the reasons I advocate going off this stuff, at least for periods, just so you can kind of find yourself again. But it's a hard one. Well, is there something fundamentally flattening about it? Like, is there something intrinsic to the technology that makes me feel like...

It rubs the edges off what it means to be an individual and makes us blobby. It's very true. If you say, is there something intrinsic to the algorithms? The answer is yes. If you're saying, is having an internet at all? I would say no. It's the algorithms. But if the internet is about connecting us all.

Well, I guess explain the distinction. Okay. Let's suppose we entered into what we call the data dignity regime instead of the artificial intelligence regime. So in that case— Explain data dignity. Data dignity is where you don't believe there's ever a brain in a box. There's no autonomous robot. All it is is giant human collaborations for which people are acknowledged and paid. So—

In that world, there would be people who make a living curating shoe recommendations and fighting with each other about it. You would have a direct engagement with a real professional

person or a few people who would say, we like these shoes, we like those shoes, and it's time for lesbians to move on to other shoes, for God's sakes, what's wrong with you? And whatever it is. And that would be a different world in which you would know you're being talked to by a person, you would know who was saying it, that person would be paid

you would be able to have a relationship with that information. It doesn't mean you wouldn't be swayed. I mean, attempting some sort of absolutely pure lack of manipulation is senseless because that would end all communication. What we want is to end algorithmic, massive-scale, sneaky, disingenuous, constant ambient manipulation.

That the world is damaged by. That we should not have. But, you know, a little manipulation. Like if somebody says, hey, I'm a book reviewer. This is a great book. You should read this. And they throw in a few zingers. Great. Like I'm not a purist about lack of manipulation. I think that we manipulate each other all the time. I think...

We would be quite lonely if we absolutely refused to manipulate one another at all. How do we get people to fall in love with us? Yeah, yeah, yeah. I mean, like, let's face it. And like, you wouldn't want to remove illusion from love, right? That would be horrible. So explain to me, like, play it out for me in a practical way. Right now, I go on Google and I Google, you know, clogs. Okay, since we're on the subject of shoes. Okay.

Who decides what clogs are coming up for me? An algorithm, right? So in the data dignity model, if we adopted Jaron Lanier's data dignity model, how would that search play out? Well, Google would be a very different thing. Google would charge you a penny per search or something. You'd have a subscription to it like Netflix. You'd be paying Google $10 a month.

Google would still thrive. Their shareholders would be happy. But Google's interest would be in keeping your business. It would not have – Google has a network effect lock on advertisers where they can't afford to leave it. It would no longer have that because it wouldn't have advertisers. It would instead compete with other search engines that would be opinionated in the same way that the Britannica versus Americana encyclopedias are. I think Google would do great. I think they'd be fine. I like the people at Google.

I sold them a company. Hey, you know, I'm not anti-Google. I just think their business model and their algorithms are destroying humanity. But that's all. That's all. Really, really. They're lovely people. So the advertising is the original sin. Oh, it totally is. It totally is. Because then we're the product. Exactly. Yeah. The advertising model is the original sin. So then what I would say is that, within that, the current search algorithm is based on averaging out what other people have done. You know, it's based on this idea that

It's a Maoist idea that the collective knows best. That was the original Google search algorithm. I would trash that. I would instead have a network of human publications about human publications. So I would have shoe reviewers who are paid, but not by the shoe companies, but by subscribers through some sort of indirect means that would accumulate. I would have reviews of reviewers. I would have a world of humans who are recommending things, not algorithms.

And I think that would be a better world. That would be a human curated world with a lot of people employed. It would be an opinionated world. It would be a lively world. It would be a less autistic world. It would be a more colorful and changing world. Let me give you an example I use a lot for data dignity. Where we live might burn down this year or next year or something because of the climate change. And...

So one of the things that keeps this particular hillside relatively safe is that there are groundskeepers who run around trimming trees and are thinking about it, and they cooperate across lots to prevent trees from kissing to spread fires. So I talked to some of them. A lot of them are undocumented. Some of them are very documented and have great businesses that have lasted for generations. There's like a whole variety of people.

There's no cliche about them, although it must be said most are Hispanic. And so when I talk to them, they have this anxiety, and this is part of this living on the edge thing we were talking about, that the robots will come and their children or their grandchildren will not do this anymore because there'll be robots trimming the trees. And you know what? There will be. I mean, I've played around with robots that are getting there, and there should be. You know, there should be a bunch of robots on a larger area than people can cover doing a better job of it.

And it would be not just for fire prevention. It would be for climate change mitigation, for all kinds of... There are all kinds of things that could happen. But then here's the question. How does this happen? Does it happen because some company like Google comes out with a groundskeeping robot that then just replaces everybody and they all have to go on Andrew Yang's basic income? Or could it be something else that's more interesting? What if instead...

There's a data collective created by the people who used to be groundskeepers where they're providing examples and directives and goals. And what happens is they gradually become a creative class where instead of one program running the robot groundskeepers forever, it becomes an area of culture that's changing. And you have weird spiral pumpkin patches for a few years, and then you have this other, some sort of

expressionist, realist fusion idea of gardening or whatever. I don't know. And crazy topiary. Like what if it becomes something creative that's ever changing and an expanded part of culture and these people get paid for it and are proud and become known? Like, isn't that a better future? And so the thing is both of them are possible and the difference isn't technological. They have exactly the same robots, exactly the same algorithms. It's just a different ideology in one versus the other.

And the data dignity future is clearly the superior one. It clearly has more dignity, more beauty, more creativity, more respect for the future, more of a sense that we don't know everything. It doesn't have this idea that future generations should accept our wisdom as absolute. It has an open-ended generosity towards the future, which is what we should have. And so that's data dignity.

After the break, the dangers of groupthink, why humans can be so cruel on the internet, and what wokeness and billionaires have in common. We'll be right back.

Can we articulate a little bit why a monoculture is bad beyond just the fact that it's not as like aesthetically pleasing or exciting or serendipitous? Like what are the political effects of every single Airbnb, whether you're in like Barcelona or Berkeley or Brooklyn, having the same like macrame wall hanging? Or like...

Like, do you know what I mean? Like, there's like this way. Oh, my God. Like the way my friend has described it is like there's like a flattening aesthetic of all of this stuff. So, yeah, I know. I know. Well, can I nerd out for you slightly? I mean, I think you're already there, but sure. Oh, no, no, no. You have no idea how bad it gets. You don't know. So, look, there's...

There's this thing that happens when if everything is channeled through a single central hub in the same way, the whole world gets sorted on the same criteria.

And then there are a few things about that, whatever it is, whether it's the interior decoration on an Airbnb or what kind of writers get through or anything at all, who becomes a billionaire, all that stuff. Who gets the prizes, like everything of any kind.

It follows a mathematical power law where there's just like a small number of winners and then this long tail that's incredibly desiccated. The alternative to that is a bushy network where there's all kinds of little local situations, which is what used to happen with local news and local music clubs. You know, when I was researching my first book, I found census data showing that there were 100,000 recording musicians making over $100,000 a year in the U.S. at the turn of the century,

which seems incredible now because I doubt there's a thousand. There might be, but for a little while I was trying to track musicians who were actually earning enough to send kids through college just off the internet, and it was like paltry numbers. And I think it has gotten a little better, but it's still really rare. And this is also this issue of living on the edge. Like there's these Horatio Alger stories. You know the reference? Okay, I don't know if... So there are Horatio Alger stories that, oh, you know, so-and-so is making a million dollars.

a year just talking about sneakers on this social media platform. But the truth is, there's hardly anybody like that. It's just a token handful and it won't last for them.

And so that's a Horatio Alger story. That's an illusion. And so you end up with all these little peak illusions. But if you have locality, if you have jazz clubs and local news, and if you have just local settings that are different, then you have a different mathematical result, which starts to resemble a bell curve, where you have a middle class in outcomes. And the middle class is where you get stability. It's where you can get nations and politics and things that aren't insane. There are problems with the middle class.

My whole generation was rebelling against it. But the thing about the middle class is that weirdly, in America, we think of it as kind of 50s television and suburban houses that are hyperconformist. But in fact, it isn't. What it is, is it's broader.

It's actually broader than that peak, which is what we get with the hub-and-spoke central filter. So if you have the central hub filter, you have just a tiny number of super economic winners. You have a narrowing of what expression is possible, and they result from the same mathematical process.

So the issue of wokeness and the issue of billionaires is the same issue. It's a mathematical mistake. That is fascinating. I want to hear more about that. Well, it's what I was just saying about—so if everything goes through the same central filter, then there's going to be a peak. Sometimes we call these things Zipf curves, where you have like a tiny number of winners, and then this sort of almost everybody's a hopeful. They might feel like they're really close to

to the elbow of getting into that little tiny peak, but they aren't. It's an illusion. And increasingly, you don't have an in-between in anything. Increasingly, if you can't tie into that peak, you just disappear. Like, okay, let's imagine you have a thousand people.

And you're going to tell those thousand people, we want you to form affinity groups and we want you to find other people who like similar music or whatever. You'll end up with all these clusters. And then within them, you'll say, oh, which music do you like the best? And you'll end up with a bunch of different answers from the different clusters. And then if you look at all that, as it gets bigger and bigger, it'll start to approach a bell curve,

meaning that there'll be a kind of an average that comes out. And that's a mathematical feature of reality. If you make accurate measurements of some... a large number of accurate measurements of some phenomenon, you should get a bell curve of sort of an average coming out, which is where middle class would come from in an economy, in some hypothetically perfect market economy, which, of course, has never existed. But, you know, you can kind of approach it more than not, perhaps. But if you tell all those people...

You're not going to talk to each other. We're just going to talk to you. There's only a central authority here, and you're going to all say, who's your favorite artist? What's going to happen is there's going to be a single reckoning for all of them. And then I'm going to say, well, since you chose this one, this is the one I'm going to show you more. And then what you'll start to do is you'll start to heighten that peak until it becomes more and more of what we might call a Zipf curve, until everything that's not right at the tippy top starts to disappear. Yeah.

That's true in the economy, which is where this incredible, like the billionaires of Silicon Valley come from. But it's also true kind of in the woke world because there's this way that everything's going through a common filter. And so it gets accentuated more and more. And you start to get more and more of a refinement of a top peak. The top peak in that case being like abolish the police. Or whatever, yeah. And then everything falls away from that. And so that's where you lose the middle and you lose diversity and you lose –

And so it's a mathematical phenomenon, and it's intrinsic to the way we've done computer networks with this sort of global perspective from nowhere, which is a fiction and yet is guiding us. I can't say it's a fiction since it's operative, but it's a falsehood. What we're doing is we're amplifying noise until it becomes the only thing.
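(Editor's aside: a minimal sketch, in Python, of the thought experiment Lanier describes above. The thousand people, the fifty hypothetical artists, the cluster size, and the 90 percent follow-the-feed rate are made-up parameters for illustration, not anything from the conversation.)

```python
# Illustrative sketch only: compare the two network shapes Lanier describes.
# (a) local clusters: small groups form their own preferences independently.
# (b) central hub: one global feed that mostly re-shows whatever already leads.
import random
from collections import Counter

random.seed(0)
PEOPLE, ARTISTS, CLUSTER_SIZE = 1000, 50, 20

def local_clusters():
    """Each cluster of 20 people votes by its own uncorrelated tastes."""
    votes = Counter()
    for _ in range(PEOPLE // CLUSTER_SIZE):
        tastes = [random.random() for _ in range(ARTISTS)]  # local preferences
        for _ in range(CLUSTER_SIZE):
            votes[random.choices(range(ARTISTS), weights=tastes)[0]] += 1
    return votes

def central_hub(follow_rate=0.9):
    """One feed shows the current top 3; most people pick from what is shown."""
    votes = Counter({a: 1 for a in range(ARTISTS)})  # start roughly equal
    for _ in range(PEOPLE):
        if random.random() < follow_rate:
            shown = [a for a, _ in votes.most_common(3)]  # amplify the leaders
            votes[random.choice(shown)] += 1
        else:
            votes[random.choice(range(ARTISTS))] += 1     # occasional explorer
    return votes

for name, votes in (("local clusters", local_clusters()),
                    ("central hub", central_hub())):
    total = sum(votes.values())
    top3 = sum(count for _, count in votes.most_common(3))
    print(f"{name}: top 3 artists take {top3 / total:.0%} of all attention")
```

By construction, the hub run funnels most picks toward whichever few artists happen to lead early, while the cluster run spreads attention out; the contrast, not the exact percentages, is the point being made.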

And so I'm speaking now mathematically, sort of. This is me, this is nerd me talking. So, civilian me talking back to you: as a person that wants to resist the cacophony, both participating in it and adding to it, and wanting to create a

cultural version, let's say, of the bushy network that you described, where diversity is possible and local life is possible and, you know, our friction and edges are possible rather than the flattening effect of the algorithm. How do we do that?

As far as what to do now, oh, my God. I was just reading somebody's dissertation where they had a quote from Camus about that, you know, that if you mess up somebody's economics, you mess up their thinking and vice versa. I'm paraphrasing. But I think it's true. So I think...

One of the first things to do is to try to diversify the economics of whatever you do. Not with a greedy eye to finding every possible way to make a nickel, but with an eye towards not being beholden to any one stream. So I don't have anything against subscriptions. Like I say, everything's horrible, so have some horrible subscriptions. But also maybe I believe in books. I think books...

Books have an interesting quality in that they take so long to write, and they're such a pain in the butt to write, that they require you to make a statement about who you are. In a sense, a book is a stand-in for a person. It's not fleeting. It's not immediately reactive to whatever is there. It's saying, like, no, this is—I am saying I've climbed the mountaintop, and this is what I see. And it took me a whole year to climb this mountain and—

I think that's incredibly valuable. In other words, the inefficiency of the book process is its value. It forces you to really be a person and to really have a point of view, have an integration of your experience and thinking, hopefully, if it's a good book. But economically, the interesting thing about books is that it's actually a pretty diverse channel because what happens is...

You get publishers all over the world that are totally different. You get, you know, your audio book and whatever. And then you have to go out on your speaking tour, which is utterly miserable. But some of it is paid speaking. Well, you know what I'm talking about. I do. Yeah. But I think that increasingly...

It's hard to get through books that don't suit the intellectual orthodoxy. Yeah. And it's possible that the moment of the book has passed. I don't know. I'm going to try again. I have a couple more coming out. I'm really interested in picking up on what you're saying about personhood and like the idea of sort of remaining three dimensional and remaining whole and

I'm addicted to this hunk of glass. Like, I check Twitter like it's a smoke break. Like, it's like a fix. And, you know, do you think that...

there is something, like if I spent less time with this, would I maintain my personhood more effectively? Well, you know, only you can answer that question. I really feel it's, if it's possible to notice when one's overstepping and talking about things that one doesn't know about, it's worth trying to not do that. I think it's very human to overstep that way because we're always trying to understand the world and our place in it. And we naturally over-interpret our own interpretation. Yeah.

I feel really certain that I don't know what's best for you, especially with you right here. But you know, okay, but let's choose another example. Like you've often, not often, sometimes have compared, you know, our use of social media to smoking, right? Well, actually, you know, the better scientific analogy is gambling addiction. Okay. You're getting really close there. Okay. So we would, I think, rightly be concerned.

about a person that is spending, you know, 12 hours of their day hitting a slot machine and not seeing sunlight. Let's just take the example of the average American teenage girl who's spending six hours a day on average on her phone. I have a 14-year-old here, yeah. And so it's a tremendous challenge right now. It's a tremendous, tremendous challenge.

I try to look at the bright side, and like I say, kids getting together for Zoom or watch parties or even synchronized TikTok exploration, while imperfect, at least is volitional. And I really do think that's—like, I want to look for the bright side, you know? And I think there's a little bit of good there, although, God, honestly, it's still crap. So—

Oh, God, I want to be positive. I want to be constructive. I want you to be honest. Well, look, you know, our society's never been... Let's just talk about the U.S., although what's funny is how similar we are to the rest of the world. We've never been more similar to the rest of the world than we are now because the whole world is being run by the same algorithms. So one of the things that's amazing to me is, like, I was in Brazil for the Bolsonaro election. I was in Turkey for Erdogan and...

Like, wherever you go, it feels the same, you know? Or UK for Brexit. It has, like, the same weird quality. So it's like Thomas Friedman was actually right about the world being flat. He got made fun of a lot for that. Yeah, but not in the way he thought it would be. Not with the mechanism he thought. But yeah, he was right.

But it's a funny thing. Like, what is the flatness? One of the things I've noticed is that the stupid leaders we have now are different from the stupid leaders we used to have. They're whiny and immature. They're like little babies. And I think there were some like that before, but not so much. I mean, it used to be that leadership meant pretending that you didn't have vulnerabilities. Stalin was the man of steel. And, you know, the leader would be imperturbable and would be...

the iron, the iron man with the, you know, the dour unchanging expression, watching the troops go by. And now it's like, oh, this one paid attention to me, that was so nice. And oh, this other one doesn't like me, so I don't like them. It's like, it's like, you know, toddlers, or mean girls. Mean girls. Yeah. Mean girls. It's like, and what's funny is that you see the same personality in so many of these people. Like it's strange the degree to which there's this kind of snippiness of

What to do about all these 14-year-old girls online? And there is this problem that the whole society is being turned into this weird, snippy, self-conscious, insecure kind of unreal thing.

It's always been a little bit like that. Once again, nothing's entirely new, and I can already hear people say, oh, but you don't recognize how much it used to be like this. But it's different. It's the uniform, monotonous way it's true that is different. Well, what do you read... You're someone who seems really concerned about maintaining personhood. So, without being preachy, because that seems like something you're worried about doing, although I don't find you that way at all, like, what do you do?

To be a human on the Internet. So one thing is that I try never to read anything that isn't authored by a real person with a real date. I think that's really important. The real person problem is getting more and more problematic because a lot of stuff that claims to be by a person is actually a fake. My friends at Facebook tell me that during the COVID year, the chances of a new account sign on being a real person instead of a bot were only about 1%.

But they can – so like it's a massive fake – like most people online are fake at this point, which is – and the companies throw out a lot of fake accounts. But if they threw out all of them, they would lose value. So they're incentivized to not be quite as good as perhaps they could be at removing the fake accounts.

So you have this giant world of fake people, but I try to only read things by real people whose other work I can find who have real histories and who dated their work. And that's really important. But I just want to point out something. I've never been on social media. I've never had a Twitter account. I've never had one of the Facebook brands. I've never had a Google account. I've never had a TikTok account. I've never had a Snap account.

And that's despite knowing some of these people and having had some kind of shared history in the companies of one sort or another, especially Google. And with all of that, somehow or other, I'm able to write bestselling books and get booked as a speaker. I still have my A-list speaker status. I can still, you know, get...

I can still sell books and get reviewed in the good places and all that stuff. So why is that? And so people will say, oh, you're special because you're older and you started before. And, well, yeah, that's true. But on the other hand, being older kind of works against you, too. Or it could be, well, but you have nerd cred. Well, yeah, but the nerds hate me. You know?

Nerds aren't the people buying my books, you know. And I sometimes wonder if this whole thing about having to be on all the time is actually just false and that you would – whatever it is you're doing, whether it's selling books or getting subscribers on –

What's that thing called? Substack. Substack, right. Or whatever. Whatever it is you're doing. It might be about the same if you just dropped all this stuff, you know? Like, you might not need it. It might just be a drag. It's actually not giving you much. It seems like it is, but maybe it isn't. I don't know. I mean, honestly, I don't know. Maybe I am special and unusual. But I just haven't ever felt some lack from not being on these things. I feel like I'm about as successful as I would want to be, and I don't see...

Like, why do I need all that stuff? Like, what would it have done for me? Well, given that you're not on any of this stuff, and I imagine pretty much everyone who surrounds you in your life is, do you feel the ways that it's changing the attention spans and personalities of the people you love? Sure, of course. It changes people quite a bit. And I think it's...

It's damaging, you know? But once again, that's me being judgy. But yeah, sure. I think it's generally degraded people. What's so wrong about being judgy, by the way? Why are you worried about being discerning? I think the problem is that

Judging other people has its own addictive potential. I think you can get drawn into a cycle where you get more and more that way. And then you turn into one of those people at the family Seder who's annoying. And I don't want to do that. I don't want to be that person. Well, Twitter is...

Twitter is basically the most addictive video game in the world because it uses real human beings. It's not just Nintendo where you get to smash the duck or whatever; you get to smash real people. And to watch some of the most celebrated figures in the world spend their brainpower doing that is disturbing to me. And I've felt the way that it's even changed me. Right. And it's often

the parts of society that should be allies with one another devouring each other instead. Like, I have one friend who's the head of an important Black professional society and is constantly under pressure for not being woke enough – and this is a person, obviously, who's Black. And the way that people

devour each other in an unproductive way from this thing is just horrible and under-acknowledged. It's not some guardians at the gate trying to expel the worst of the worst from our midst. It's actually often

people just trying to find whoever they can find. Well, this gets us back to the Shirley Jackson story. But it's also like an inter-elite struggle, right? The struggle inside the professional association or the newspaper is about what's going to be the prevailing opinion of that thing. And so it becomes a sort of intramural struggle

rather than fighting the person across the street in the different tribe. Like I was saying, part of this might be specific to the inevitable, only-human, unavoidable reaction to Trump and might kind of improve itself over a couple of years anyway. I always try to look for the positive, and I want to hold out hope for that. And yet, and yet. It's true that Twitter does tend to have the structure of people piling on and, you know,

There is a schoolyard bully sort of thing where everybody's afraid of being the victim, and so they pile on when another victim is identified so that it isn't them. And I think it's an excellent reason not to be on it. It's like the most potent vehicle for public shaming I've ever seen. Yeah. So, you know, there's an interesting history to this. Do you know who designed the very first networked computer experience ever?

B.F. Skinner, the famous behaviorist. Oh, yeah, of course. So he designed the user experience for this very first connected screens experiment in the Midwest between universities. And he had this idea that he could take the behavior modification techniques that he'd perfected for pigeons and rodents and stuff and...

apply them to screens. And, you know, if you look at Facebook, the like button is the button that the pigeon was trained to press. But what's the electric shock? What's the candy? It's social pain and social affirmation. So what Skinner learned was that a slightly randomized, noisy,

reward and punishment feedback system was actually more motivating and had more of a behavior mod effect than a perfect one.
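To make that idea concrete, here is a minimal, purely illustrative sketch – not anything from the episode – of the two kinds of reinforcement schedule being contrasted: one that pays off on a fixed rhythm, and one that pays off at unpredictable moments while delivering roughly the same number of rewards overall. The function names and parameters are hypothetical.

```python
import random

# Toy comparison of reinforcement schedules (illustrative only).
# A "fixed" schedule rewards exactly every Nth action; a "variable"
# schedule rewards with the same average frequency, but at
# unpredictable moments -- the pattern behaviorists found to be
# the more compelling one in animal experiments.

def fixed_schedule(n_actions: int, every: int = 5) -> list:
    """Reward exactly every `every`-th action."""
    return [(i + 1) % every == 0 for i in range(n_actions)]

def variable_schedule(n_actions: int, every: int = 5, seed: int = 0) -> list:
    """Reward each action with probability 1/every, so timing is unpredictable."""
    rng = random.Random(seed)
    return [rng.random() < 1 / every for _ in range(n_actions)]

if __name__ == "__main__":
    fixed = fixed_schedule(20)
    variable = variable_schedule(20)
    print("fixed:   ", "".join("x" if r else "." for r in fixed))
    print("variable:", "".join("x" if r else "." for r in variable))
    # Both rows contain a similar number of rewards ("x"), but only in
    # the variable row is the timing of the next one unknowable.
```

Running it prints two rows of hits and misses with similar totals; the only difference is that the variable row never lets you predict when the next hit arrives, which is the quality being attributed here to likes, replies, and pile-ons.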

So a little bit of randomness is actually built into the system. So you don't know who the victim will be. And that actually makes it more powerful, because everybody's on edge all the time. It's unpredictable. But why do we like that? The reason we like it is that there are some parts of our brain that are very ancient that helped us survive by having very quick responses if there was a predator threat or

or a possibility for food that somebody else might grab before we did, or a mate, or all kinds of things. We have a whole system of very rapid, very deep, very old pathways that go back very, very far in the phylogenetic tree – the ones people sometimes call the fight-or-flight responses or the lizard brain or whatever. There are different popular terms for these things, and they're not fully cataloged or fully understood, but there definitely are

channels for rapid response that have priority over other channels in our brains. And they deal particularly with threats. And the behaviorists learned to exploit them in humans without actual electric shocks or candy. And so you have these social stimuli that are profound. They're very deep. You can't consciously override them.

This is what we call social cognition. If people are paying attention to a certain thing, you will too. We rely on each other for this. Like if there's a group of people walking, they're all subconsciously relying on each other to be watching for hazards. And if somebody is suddenly startled by something on the left, everybody's aware of it. And if it's like an out-of-control truck hurtling towards you, that's very functional. Or a saber-toothed tiger. I don't know. Whatever it might have been.

That we were concerned... Well, actually, this predates the appearance of humans, so it would have been other things. But these circuits are just like sitting there waiting to be exploited. And so you have the Skinner box we created in Facebook with the button, with the candy, and with the electric shock. But the button is still there as a button. That's the like button. But the other two things are now social phenomena. And...

It's an awful thing. It's an awful thing. You're a pigeon or a rat in Skinner's old lab when you use these things. One more break and then we'll get some hope from Jaron Lanier.

One of the downstream effects of watching people get turned into the main character of the day, the person that's being publicly shamed, is that it leads to this epidemic of self-censorship in everyone else. Like, why would you take a risk to say something different on these platforms in public if you see other people getting nuked?

How much of that aspect of it worries you? In the sense that we need free speech in order to pursue progress and to pursue the truth. And if we're living in a time where everyone is self-censoring, or more than that, holding a private truth that they would share around a table like this but a lie that they say publicly so they can avoid that pain...

isn't that an existential threat to the pursuit of truth? I mean, my own history with this is that way back in Usenet days, I noticed myself being mean to somebody in a way that I didn't think was healthy for me and certainly was cruel to that person. And I haven't been on a social media-like thing since. Because you saw what it was doing. I noticed myself degrading. And look, none of us are perfect, and all of us have bad moments, and all of us have sometimes regretted

how we've dealt with other people. And I'm certainly no exception to that. But I think we have a responsibility to try to be less horrible than we might otherwise be. You know, I just think we have to... We have to at least dig our fingernails into the funnel of shit that's trying to draw us down, you know? And... That's your Hillel on one foot. Yeah, that's right. That's my Hillel on one foot. Exactly. So...

Yeah.

Hopefully it is not. And, you know, it should have a diversity of business models instead of just one, and it should emphasize people paying for stuff they want. Because I think you have to have a real stake in whatever you do for it to be real. And if everything's 'free' in that fake way, there's a kind of casualness that contributes to the problem. So, in other words,

Is it more likely that somebody is going to try to be a disruptive jerk at some book reading where everybody was admitted free or at some paid lecture where people bought tickets? Obviously, it's the free one because that person has less of a stake. And now I know the objection there is, well, what about the poor? Well, you have to make some allowance for that. There has to be some sort of arrangement. But

this idea that everything's totally free and there's no barrier to entry ever does encourage a kind of casual, low-stakes way of thinking about things that becomes part of the problem. It's like, if there were a paid playground, where every kid who went to the playground had to give a candy bar in order to get access to it,

I have a feeling there'd be less bullying there, honestly, because they would have a direct interest in something there, like a water slide or whatever it is. You know, there would be a purpose. I haven't done it in a while, but I used to keep admittedly subjective assessments of how horrible the different online gathering places were. And the horribleness does track a few things. It tracks the degree to which people have a real stake in it.

It tracks the degree to which people have persistent stakes as a group with others who are known to one another. It tracks the degree to which people have some stake in each other's reputations. So what I mean by that is, let's say there was some truly awful person on Substack who was nothing but poison and nothing but criminality and whatever...

you might want that person removed because that has an impact on you. And that's not cancel culture. That's legitimate quality seeking. This comes to another issue about the future, which is that

Mixing individuals together tends to lead to an all-against-all competition, which tends to bring out the worst. But having what we call societal institutions tends to create people with shared stakes and a possibility for quality to emerge. And it's the only mechanism that ever has. It's the only technique we know. And the master at talking about this is Hannah Arendt, who talks about societal institutions well. And de Tocqueville is another good one.

And so the Internet just completely plowed over societal institutions and flattened everything. That gets back to my math thing about the bushy graph versus the Povins book. So we need institutions, obviously. We need to have shared stakes. We need to have...

stakes in each other in the sense of ongoing effort for a purpose. And we need to have reasons that we rise and fall together. So like right now online, there's no perfect online platform. The one with the least poison and the least crap is probably this thing called GitHub, which you probably don't know about. It's a nerdy thing where people do projects online.

And I have to disclose, it's a Microsoft thing, so I might have a bias. But people have shared projects, and that means they're supporting each other in the project, and there's this kind of intrinsic positivity about it. If it's only each person for themselves, you tend to have this thing where, if somebody else is down, like the kid getting bullied in the schoolyard, then relatively speaking, maybe you're up a little bit, or at least you're not down. So it creates a different dynamic. And all-against-all competition is

intrinsically cruel, whereas societal institutions, even if they're competing with one another, can do so with more of a collegial spirit, if you like. And societal institutions have included newspapers and soccer teams and colleges and bands and, you know, all these things. And whenever that happens, things are a bit more positive. Jaron, just picking up on what you were saying about how

institutional structures are the thing that keeps us from being in a sort of zero-sum game of all against all. And in a world in which, you know, it feels like everyone's an influencer and there are no leaders, and where people's commitment to or even trust in institutional authority is so diminished, and everyone's focused on

I don't know, on apps and social networks – like, how do we recover the things that we know, from looking at all of human history, are necessary for human flourishing, without sounding too high-minded about it? Sure. No, human flourishing is a good term. So in the wake of World War II, there were a lot of people trying to find some new path to institutions that

didn't involve the potential for massive manipulation and power catastrophes, as had happened with Nazi propaganda. So B.F. Skinner was not one of them. He just wanted a center of power and control over humans.

Another interesting counterexample was the VALS program at SRI, which was the creation of modern marketing, which tried to focus on inner directed people as the highest value. It's a very interesting thing. It didn't really last, but that was something. When I grew up, my dad had been in the science fiction world. He was a science fact writer for Hugo Gernsback's pulp magazines, like Fantastic and Amazing.

And he was an early promoter of UFO nonsense. And I grew up in southern New Mexico, you know, in the nexus of UFO nonsense.

And the thing about the UFO world is that people have a lot of fun with it. And it provides an answer to a human need. And people need to be able to have something to obsess over. People need to be able to exercise their brains, thinking about something beyond the edges of what's official. They need to be able to have common quests. They need to be able to

explore things that might not be true, because otherwise truth calcifies, you know? Like, I actually think this is legitimate. I'm not making fun of these people or looking down my nose at them. I actually think this is something we all need. And so there are some of these things that are popular that are genuinely harmless, so far as I can tell. Bigfoot is a great one, you know? Like,

I'd love there to be some big, you know, North American great ape that's wandering around. I don't think there is, but maybe, you know, it's not absolutely impossible. And,

Same thing with UFOs. I mean, great. You know, I don't think we're seeing alien spaceships, but you know what? It's not absolutely impossible. And do you really believe that we've seen every possible weird thing in the atmosphere? Of course we haven't. There's other stuff. And I think it's great. But anyway, the point is, I think that being obsessed with something at the edge of thought is really important, and being able to do that with other people is really healthy and maybe even vital.

The thing is, some of these things are damaging. An example is people who are trying to talk to dead relatives and get you to pay them money, the seance type people.

That's exploitative – people who are just making a lot of money, extracting money from it. Like, you know, I lived in Marin, so I have really had my fill of astrology. Let me tell you, I've heard enough astrology for many lives. On the other hand, whatever, as long as somebody isn't draining someone else's finances for it. I have an example of somebody who's

facing a difficult illness and has lost a lot of money to charlatans. And that really bothers me. So the exploiters. And then there's the sort of QAnon or Stop the Steal type people who are using the same impulse for political power. So the thing is, there are a lot of genuine mysteries on the edge of knowledge that people could focus on. One can do that

by exploring where music can go. One can do that by exploring where movies can go. And there are all kinds of edges to explore that aren't Bigfoot or UFOs that are also available. But I think having these things – a lot of the humanist project, if you go back to the origins of science, was having this edge of knowledge that people could explore together. And there's something very beautiful about that. Jaron Lanier.

Thank you so much. Well, I'm delighted you went to all the trouble to come by. I appreciate it a lot. And I wish you the greatest success in whatever paths you take here. And don't pay attention to Twitter so much. Oh, my God. Like, let go of that stupid thing. I'll try. Thanks for listening. And as always, visit us at honestlypod.com. See you next week.