
No Mercy / No Malice: Age Gating

2024/6/29

The Prof G Pod with Scott Galloway


Support for this show comes from Atlassian. Atlassian software like Jira, Confluence, and Loom helps power the collaboration needed for teams to accomplish what would otherwise be impossible alone. Because individually, we're great, but together, we're so much better. That's why millions of teams around the world, including 75% of the Fortune 500, trust Atlassian software for everything from space exploration and green energy to delivering pizzas and podcasts. Whether you're a team of two, 200, or 2 million, Atlassian software is built to help keep you connected and moving together as one.

Learn how to unleash the potential of your team at Atlassian.com. That's A-T-L-A-S-S-I-A-N.com. Atlassian.

This episode is brought to you by Shopify.

Forget the frustration of picking commerce platforms when you switch your business to Shopify, the global commerce platform that supercharges your selling wherever you sell. With Shopify, you'll harness the same intuitive features, trusted apps, and powerful analytics used by the world's leading brands. Sign up today for your $1 per month trial period at shopify.com slash tech, all lowercase. That's shopify.com slash tech.

I'm Scott Galloway, and this is No Mercy, No Malice. We've decided that children should not have access to the military, alcohol, driving, pornography, and many other things. Social media? Come one, come all. Age-gating, as read by George Hahn. I'm in a dark place. I just watched democracy collapsing as a con man abused an old man.

I haven't hit rock bottom yet, so let's discuss social media and age gating. Social media is unprecedented in its reach and addictive potential: a bottomless dopa bag that fits in your pocket. For kids, it poses heightened risks. The evidence is overwhelming and has been for a while; it just took a beat to absorb how brazen the lies ("We're proud of our progress") were. Social media can be dangerous.

That doesn't make it net bad. There are plenty of good things about it. But similar to automobiles, alcohol, and AK-47s, it has a mixed impact on our lives. It presents dangers. And one of the things a healthy society does is limit the availability of dangerous products to children, who lack the capacity to use them safely.

Yet two decades into the social media era, we permit unlimited, all-ages access to this dangerous, addictive product. The reason? Incentives. Specifically, the platforms, disincentivized to age-gate their products, throw sand in the gears of any effort to limit access. To change the outcome, we must change the incentives.

I'm a better person when I drink: more interesting and more interested. One of the reasons I work out so much is so I can continue to drink; muscle absorbs alcohol better than fat does. Kids are different, and we've long been comfortable treating them differently. In 1838, Wisconsin forbade the sale of liquor to minors without parental consent, and by 1900, state laws setting minimum drinking ages were common.

There's a good case to be made that the U.S. alcohol limit of 21 is too high, but nobody would argue we should dispense with age-gating booze altogether. That trend has paralleled laws restricting childhood access to other things. The right to bear arms is enshrined in the Constitution, yet courts don't blink at keeping guns out of the hands of children, even as they dismantle every other limitation on gun ownership.

If there's a lobbying group trying to give driver's licenses to 13-year-olds, I can't find it. Age of consent laws make sex with children a crime. Minors are not permitted to enter into contracts. We limit the hours and conditions in which they can work. They cannot serve in the military or on juries, nor are they allowed to vote. That last one we may want to reconsider.

These are not trivial things. On the contrary, we exclude children from, or substantially limit their participation in, many core activities of society. The only time I have appeared on late-night TV was when Jimmy Fallon mocked me, showing a CNN video clip in which I said I would rather give my 14-year-old son a bottle of Jack Daniels and marijuana than an Instagram and a Snap account. 4,000 likes and 265,000 views later, it appears America agrees. My now almost-17-year-old son has engaged with all three substances.

Alcohol and Instagram make him feel worse afterward; not sure about weed. However, he is restricted from carrying a bottle of Jack in his pocket, and his parents would ask for a word if his face were hermetically sealed to a bong. Note: spare me any bullshit parenting advice from non-parents or therapists whose kids don't come home for the holidays.

He, we, and society restrict his access to these substances. And when he abstains from drinking or smoking, he isn't sequestered from all social contact and the connective tissue of his peer group. We freaked out when we found, as you will if you have boys, porn on one of his devices. But the research is clear. We should be more alarmed when we find Instagram, Snap, or TikTok on his phone.

Mark Zuckerberg and Sheryl Sandberg are the pornographers of our global economy. Actually, that's unfair to pornographers. Age-gating social media is hugely popular. Over 80% of adults believe parental consent should be required for social media, and almost 70% want platforms to limit the time minors spend on them.

Those numbers are from last fall, before my NYU colleague Jonathan Haidt published The Anxious Generation, which builds on the work of Jean Twenge and others, making the most forceful case yet that social media is hurting our children.

Reviewing the shocking increase in depression, self-harm, and general suffering our children are experiencing, and the explanations offered by the platform apologists, Professor Haidt highlights the twin specters of social media and mobile devices and the lasting damage they're doing to a generation.

Unconstrained smartphone use, Haidt observes, has been "the largest uncontrolled experiment humanity has ever performed on its own children." And the results are in. Legislatures are responding. States from California to Utah to Louisiana have passed laws that limit access to social media based on age.

If you haven't noticed any change in the behavior of the platforms, however, that's because courts have blocked nearly all of them. A social media and digital commerce trade group called NetChoice is quick to sue any state that interferes with its members' ability to exploit children for maximum profit.

Judges are siding with the platforms, and probably not because they enjoy seeing depressed teenagers fed content glorifying self-harm or teenage boys committing suicide after being sextorted. The platforms and other opponents of these laws, such as the ACLU, make two main points.

First, they claim that verifying age online is too complicated, requiring the collection of all sorts of information about users, and won't work in all cases. Second, requiring platforms to collect this information creates free speech, privacy, and security concerns. The platforms also deny their products are harmful to children. On their face, these points are valid.

It is more difficult to confirm age online, where there's no clerk at the counter who can ask to see your driver's license and reference your face. And these platforms have proven reckless with personal data. It's sort of a "they're so irresponsible, but we can't take action" dilemma. But these objections are not about age verification, children's rights, free speech, or privacy. They are concerns about the platform companies' capabilities.

Their arguments boil down to the assertion that these multi-billion-dollar organizations, which have assembled vast pools of human capital and wield godlike technology, can't figure out how to build effective, efficient, constitutionally compliant age verification systems to protect children. If this sounds like bullshit, trust your instincts.

This isn't a conversation regarding the realm of the possible, but the profitable. When you pay an industry not to understand something, it will never figure it out. Just look at the tobacco industry's inability to see a link with cancer. What's more challenging?

Figuring out if someone is younger than 16? Or building a global real-time communication network that stores a near-infinite amount of text, video, and audio retrievable by billions of simultaneous users in milliseconds with 24-7 uptime? The social media giants know where you are, what you're doing, how you're feeling, and if you're experiencing suicidal ideation. But they can't figure out your age.

You can't make this shit up. The platforms could design technology that reliably collects sufficient information to confirm a user's age, then wipes the information from its servers. They could create a private or public entity that processes age verification anonymously. Remember the blockchain? Isn't this exactly the kind of problem it was supposed to solve?

They could deploy AI to estimate when a user is likely underage based on their online behaviors and seek age verification from at-risk people. If device manufacturers, or just the device OS duopoly of Apple and Alphabet, were properly incentivized, they could implement age verification on the device itself. This is what Meta says it wants when it isn't fighting age verification requirements.
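To make the feasibility point concrete, here is a minimal sketch of the kind of behavior-based flagging described above, written in Python. Everything in it, the signal names, the weights, the 0.8 threshold, and the estimate_underage_probability helper, is an illustrative assumption, not any platform's actual system or API.

```python
# Hypothetical sketch: flag accounts whose behavior suggests the user is likely
# underage, then route only those accounts to an age-verification step.
# Signal names, weights, and the 0.8 threshold are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class BehaviorSignals:
    follows_mostly_teen_creators: bool   # graph-based signal
    school_hours_usage_ratio: float      # share of activity during school hours (0-1)
    self_reported_grade_mentions: int    # e.g., "8th grade" appearing in bios or posts
    account_age_days: int


def estimate_underage_probability(s: BehaviorSignals) -> float:
    """Toy heuristic score in [0, 1]; a real system would use a trained model."""
    score = 0.0
    if s.follows_mostly_teen_creators:
        score += 0.4
    score += 0.3 * min(s.school_hours_usage_ratio, 1.0)
    score += 0.2 * min(s.self_reported_grade_mentions, 3) / 3
    if s.account_age_days < 90:
        score += 0.1
    return min(score, 1.0)


def needs_age_verification(s: BehaviorSignals, threshold: float = 0.8) -> bool:
    # Accounts above the threshold get asked to verify their age; the data used
    # for the check could then be discarded, per the "verify, then wipe" idea above.
    return estimate_underage_probability(s) >= threshold


if __name__ == "__main__":
    sample = BehaviorSignals(True, 0.6, 2, 30)
    print(needs_age_verification(sample))  # True for this illustrative profile
```

The point isn't the specific heuristics; it's that routing only flagged accounts to a verification step, and discarding the data afterward, is a modest engineering problem for companies operating at this scale.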

Or, crazy idea, they could stop glorifying suicide and pushing pornography to everyone. The reason Zuck and other Axis powers haven't built age verification into their platforms is that it would reduce their profits: they would serve fewer ads to kids, which would suppress their stock prices, and the job of a public company CEO is to increase the stock price.

Period. Full stop. End of strategic plan. So long as the negative impact to the stock price caused by the bad PR of teen suicide and depression is less than the positive impact of the incremental ad revenue obtained through unrestricted algorithmic manipulation of those teens, the rational, shareholder-driven thing to do is fight age verification requirements.

If we want the platforms to make their products safe for children, we need to change the incentives. Force them to bear the cost of their damage. Internalize the externalities, in economist speak. There are three forces powerful enough to do this: the market, plaintiff lawyers, and the government.

The market solution would be to let consumers decide if they want to be exploited and manipulated. And by consumers, I mean teenagers. One big shortcoming of this approach is that teenagers are idiots. I have proof here as I'm raising two and I used to be one. My job as their dad is to be their prefrontal cortex until it shows up.

I told my son on a Thursday it was Thursday and he disagreed. The next approach is to let the platforms do whatever they want, but if they harm someone, let that person sue them for damages. This is how we police product safety in almost all contexts. Did your car's airbag explode shrapnel into your neck? Sue Takata. Did talcum powder give you cancer? Sue J&J.

Did your phone burn the skin off your leg? Sue Samsung. People don't like plaintiff lawyers, but lawsuits are a big part of the reason that more products don't give you cancer or scald you. Nobody can successfully sue social media platforms, however, because of a 28-year-old law known as Section 230, which gives them blanket protection against litigation.

I've written about the need to limit Section 230 before, and whenever I do, a zombie apocalypse of free speech absolutists is unleashed. The proposition remains unchanged, however: if social media platforms believe they've done everything reasonable to protect children from the dangers of their product, then let them prove it in court. Or better yet, let the fear of tobacco- or asbestos-shaped litigation gorging on their profits motivate them to age-gate their products.

Finally, the government can go after companies whose products harm consumers. The Federal Trade Commission has fined Meta $5 billion over privacy violations, to no apparent effect. This was perfect, except it was missing a zero. For these firms, $5 billion is a nuisance, not a deterrent.

There's a bill in the Senate right now, the Kids Online Safety Act, which would give the FTC new authority to go after platforms which fail to build guardrails for kids. It's not without risk. Some right-wing groups are supporting it because they believe it can be used to suppress LGBT content or anything else the patriarchy deems undesirable.

But I have more faith in Congress's ability to refine a law than I do in the social platforms' willingness to change without one. Until we change the incentives and put the costs of these platforms where they belong, on the platforms themselves, they will not change. Legislators trying to design age-gating systems or craft detailed policies for platforms are playing a fool's game.

The social media companies can just shoot holes in every piece of legislation, fund endless lawsuits, and deploy their armies of lobbyists and faux heat shields (Lean In), all the while making their systems ever more addictive and exploitative. Or maybe we have it wrong, and we should let our kids drink, drive, and join the military at 12.

After slitting their wrists, survivors often get tattoos to cover the scars. Maybe teens should skip social media and just get tattoos. I warned you: dark. Life is so rich.