Big Tech's Big Tobacco Moment

2026-04-04 07:05:00 • 58:56

-

Offline is brought to you by Zbiotics Prealcohol.

0:03

Let's face it.

0:04

After a night with drinks, we're not bouncing back.

0:07

Like we used to.

0:08

Cause we're in our 40s.

0:10

And honestly, if you're 30 or over

0:13

and you're not using Zbiotics,

0:14

I want to pull you aside

0:16

and have a personal talk with you about your life choices.

0:18

And also, don't think you're so great

0:19

if you're in your 20s.

0:20

You could use Zbiotics in your 20s too.

0:22

It's very good.

0:22

I think you just feel better.

0:23

You probably should be more varsity.

0:26

Zbiotics Prealcohol Probiotic Drink

0:27

is the world's first genetically engineered probiotic.

0:29

It was invented by PhD scientists

0:31

to tackle rough mornings after drinking.

0:33

Here's how it works.

0:34

When you drink, alcohol gets converted

0:35

into a toxic byproduct in the gut.

0:37

It's the buildup of this byproduct,

0:38

not dehydration, that's to blame for rough days after drinking.

0:41

Prealcohol produces an enzyme to break this byproduct down.

0:44

Just remember to make prealcohol your first drink

0:46

of the night, drink responsibly,

0:47

and you'll feel your best tomorrow.

0:49

I have Zbiotics everywhere all over the place.

0:52

My house and in my bag.

0:53

I kept running out so I bought like a giant stockpile.

0:56

Yeah.

0:57

It's like it's my strategic oil reserve.

0:59

That's my SPR.

1:00

I gotta release the Zbiotics.

1:02

It's indispensable.

1:05

From the fairways in Augusta to the first pitch

1:07

of baseball season and the start of the festival circuit,

1:11

April is a sprint of outdoor celebrations.

1:13

Don't let a rough next day keep you on the sidelines.

1:15

Drink prealcohol to stay ahead of the game

1:17

and make the most of every sunny Saturday.

1:20

Go to zbiotics.com slash offline to learn more

1:22

and get 15% off your first order

1:24

when you use offline at checkout.

1:26

Zbiotics is backed with a 100% money-back guarantee.

1:28

So if you're unsatisfied for any reason

1:30

they'll refund your money.

1:31

No questions asked.

1:32

Remember to head to zbiotics.com slash offline

1:34

and use the code offline at checkout for 15% off.

1:38

In moments like these, it's easy to feel overwhelmed

1:42

and even easier to feel powerless, but we are neither.

1:46

I'm Stacey Abrams and on my podcast,

1:48

Assembly Required, I take on each executive action,

1:52

legislative battle and breaking news moment

1:55

by asking three questions.

1:57

What's really happening?

1:58

What can we do about it?

2:00

And how do we keep going together?

2:03

This is a space for clarity, strategy and hope

2:07

rooted in action, not denial.

2:10

New episodes of Assembly Required drop Tuesdays,

2:13

tune in wherever you get your podcasts and on YouTube.

2:18

With so many options, why choose Arizona State University?

2:21

For me, the only online option was ASU because of the quality

2:24

where the faculty were really involved with their students

2:27

and cared about your personal journey,

2:29

the dedication to my personal development

2:32

from my professors.

2:33

That's extremely valuable to me.

2:35

Earn your degree from the nation's most innovative university,

2:39

online, that's a degree better.

2:41

Explore more than 350 undergraduate,

2:43

graduate and certificate programs at ASUonline.asu.edu.

2:49

Once a juror understands that a company has been researching

2:53

this and that the more they looked into it,

2:55

sort of the worse stuff they found

2:57

and then also that research kind of gets canceled

2:58

or the researchers get moved to other projects,

3:00

it kind of does start to feel like a big tobacco moment, right?

3:03

Yeah.

3:04

Yeah.

3:04

I'm Jon Favreau, and you just heard from tech journalist

3:15

Casey Newton, who is our guest this week,

3:17

along with New Mexico Attorney General Raul Torres.

3:20

I think there was a huge development last week

3:22

in the fight to free kids from having their lives

3:24

controlled by what's on their screens.

3:26

Something that worried me before I had kids

3:28

and keeps me up at night now that I do.

3:31

It has to do with Mark Zuckerberg,

3:32

one of the world's richest men

3:34

who runs one of the world's richest companies.

3:37

Someone who spent most of his charmed life

3:39

using money and power to remove whatever obstacles

3:42

get in the way of what he wants.

3:44

And what he wants always seems to be more.

3:48

More users, more money, more market share,

3:51

growth at any cost, even if that's meant

3:54

violating people's privacy.

3:56

Even if it's meant stealing data or lying to investors.

3:59

Even if it's meant trying to bury mountains

4:01

of Meta's own internal research

4:04

about the harms that Facebook and Instagram have unleashed,

4:08

about how addictive their products are,

4:10

especially to children.

4:12

Something the employees knew and talked to each other about.

4:17

Quote, oh good.

4:19

We're going after 13-year-olds now.

4:21

One wrote,

4:23

targeting 11-year-olds feels like tobacco companies

4:26

a couple decades ago.

4:28

Here's another.

4:30

No one wakes up thinking they want to maximize

4:32

the number of times they open Instagram that day,

4:35

but that's exactly what our product teams are trying to do.

4:39

Then this exchange between two meta researchers.

4:42

Oh my gosh, y'all.

4:44

Instagram is a drug.

4:46

We're basically pushers.

4:49

We are causing reward deficit disorder

4:51

because people are binging on Instagram so much

4:54

they can't feel reward anymore.

4:57

Mark Zuckerberg has basically escaped

4:59

any kind of meaningful accountability for this

5:01

or anything else.

5:03

Huge regulatory fines haven't fazed him.

5:06

They're basically a rounding error.

5:08

Congress hauls him in to testify from time to time

5:10

so they can yell at him,

5:11

but they haven't really touched him.

5:12

Whistleblowers from inside his company

5:14

who've come forward have been smeared

5:16

and threatened with lawsuits.

5:18

I know, I was supposed to interview one

5:20

until she got hit with a gag order.

5:22

Zuckerberg is so used to getting his way

5:24

that when local Hawaiians objected to his $300 million

5:27

2,000-acre compound and underground bunker

5:30

because it was their land

5:32

where their ancestors are buried,

5:34

Mark actually tried to sue them

5:37

because he thinks the money and the power

5:39

allow him to get away with anything.

5:42

And until last week, he was basically right.

5:46

But now he isn't.

5:47

When he walked out of a courtroom here in Los Angeles

5:49

last week, after taking the stand in front of a jury

5:52

for the first time,

5:54

Mark Zuckerberg was finally held accountable.

5:57

Not by government regulators or board members

6:00

or shareholders,

6:01

but by a young woman named Kayleigh

6:03

from his own backyard in Northern California.

6:06

Kayleigh was on YouTube when she was six

6:09

and Instagram by nine.

6:12

She said that she initially got a rush

6:13

from all the likes and notifications,

6:15

which during class,

6:17

she would run to the bathroom to check

6:19

because she was panicked

6:20

that she might be missing out on something.

6:23

Pretty soon, she was spending all her time on the platform.

6:26

She stopped hanging out with her family,

6:28

she stopped making friends,

6:29

she hit 16 hours a day on Instagram.

6:34

She tried setting time limits, it didn't work.

6:36

Her mom tried parental controls,

6:39

but those didn't work either.

6:40

She was bullied and sexually extorted

6:42

and she still couldn't keep herself off the platform.

6:45

She bought likes and she added filters,

6:47

but all the other filtered photos

6:49

made her more insecure about how she looked.

6:52

She couldn't sleep, she became depressed,

6:55

she started cutting herself,

6:57

she contemplated suicide,

6:59

but eventually she got help.

7:01

She also got a lawyer,

7:03

and when she was 17 years old,

7:05

she sued Mark Zuckerberg.

7:07

For the first time in history,

7:09

a jury held that both Meta and YouTube

7:11

which was also a defendant in this case,

7:13

were negligent in the design of their platforms.

7:16

The jurors found that the company's negligence

7:18

was a substantial factor in causing Kayleigh harm,

7:22

and that they had failed to warn users

7:24

about dangers that the companies themselves

7:26

had long been aware of, but there was more.

7:29

The day before the LA verdict,

7:31

a jury in New Mexico found that Meta violated

7:34

their state's consumer protection laws

7:36

by designing a product that fails to protect children

7:39

from predators.

7:40

The result of a lawsuit brought by

7:42

New Mexico Attorney General Raul Torres,

7:45

whose office set up an undercover investigation,

7:47

where they created a fake Instagram profile

7:50

of a 13-year-old girl that was almost immediately flooded

7:53

with messages from child predators,

7:55

three of whom were then arrested.

7:58

The combined damages in the LA and New Mexico cases

8:00

amount to a few hundred million dollars,

8:02

which is, again, a rounding error for Meta.

8:05

But the money isn't really what matters here.

8:07

What matters is that Meta and the rest

8:09

of the social media giants have now lost

8:11

a legal shield that has protected them for 30 years.

8:16

Because Kayleigh didn't sue them

8:17

over the content on their platforms.

8:19

She sued them because their platforms are defective,

8:23

because the product's design isn't safe for all users,

8:26

especially children.

8:28

Meta knew that and didn't tell us.

8:31

None of them did.

8:33

And so for the first time, these verdicts

8:35

might finally force tech giants

8:37

to do what no one else has been able to make them do.

8:40

Fix the design, make it safer,

8:43

get rid of social media's most addictive,

8:45

harmful features, infinite scroll, auto play,

8:49

push notifications, beauty filters,

8:51

even algorithmic recommendations.

8:53

This is all on the table now for these juries and judges.

8:57

And there will be many more.

8:59

2,000 similar pending lawsuits will now move forward,

9:02

including a massive federal case with 1,600 plaintiffs

9:05

that starts this summer.

9:07

Meta is not happy.

9:09

They will appeal.

9:10

They will keep making the same argument

9:12

they made with Mark in the LA trial.

9:14

That Kayleigh's problem wasn't Instagram,

9:16

it was Kayleigh, or her mother,

9:18

or anything else in her life that wasn't Instagram.

9:21

They'll keep arguing that their right to free expression

9:24

protects them from being forced to change their platforms.

9:28

And to be honest, I totally get why so many people are concerned

9:31

that these verdicts could also end up forcing

9:33

social media companies to have more censorship

9:36

and surveillance on their platforms.

9:39

Ideally, you would pass a law that deals with social media's

9:42

most harmful features while still protecting speech

9:45

and privacy, especially for adults.

9:48

But that would require a functioning Congress,

9:50

and a president who wasn't the most powerful living example

9:53

of social media brain rot.

9:55

So here we are.

9:57

And I think that whatever reservations people might have,

10:00

most Americans understand what those jurors understood.

10:04

That freedom of expression does not include the freedom

10:07

to design an addictive product that you know to be harmful,

10:11

especially to children.

10:14

This isn't some abstract legal debate.

10:16

It isn't some moral panic.

10:19

It's what the people who've built and sold these products

10:21

have said themselves, even though their bosses

10:25

tried to bury the truth.

10:27

And most of us are sick of it.

10:29

All kinds of people.

10:31

People with different politics, different backgrounds,

10:33

people without kids, people with kids,

10:36

and the kids themselves.

10:38

They don't want to spend their childhoods

10:39

stuck in their feeds.

10:41

Most of us don't want these tech companies

10:43

to keep stealing more and more of our attention

10:46

just so they can make another billion.

10:48

And we certainly don't have much confidence

10:50

that the next set of tech gods creating super intelligent robots

10:54

will do a better job than the geniuses who

10:56

blessed us with the algorithm, probably

10:58

because they're run by some of the very same people,

11:01

like Mark Zuckerberg.

11:03

The anger and disgust that most Americans

11:05

feel towards big tech is real.

11:08

It's become a potent political force

11:10

with an organized, growing movement behind it.

11:13

What's needed now are political leaders

11:15

willing to listen, take up this fight,

11:18

and rally the country around a future

11:20

where we control the technology that shapes our lives,

11:23

not the other way around.

11:26

At the end of the day, that's all the families who

11:28

filed these lawsuits and cheered these verdicts really want.

11:32

As the trial ended here in LA, some of those families

11:35

were standing outside the courthouse,

11:36

holding up photos of their children.

11:39

Their sons and daughters who struggled with depression

11:42

and eating disorders, kids who had taken their own lives.

11:47

These parents have been showing up to courthouses

11:49

and congressional hearings and school board meetings

11:51

for years now, holding up those photos, begging someone

11:56

to listen.

11:57

Thank God that last week, a jury of 12 people

12:00

in Los Angeles finally did.

12:03

Up next, my conversation with New Mexico Attorney General,

12:06

Raul Torres.

12:08

Attorney General Torres, welcome to Offline.

12:13

Thanks for having me, I appreciate it.

12:15

So you just won a landmark verdict against Meta

12:18

based on a lawsuit you filed in 2023

12:22

after an undercover operation where your office

12:25

created a fake profile of a 13-year-old girl.

12:30

What happened after you created that profile

12:33

and what did it tell you about what Meta already

12:37

knew about their product?

12:38

Well, what we were trying to do is recreate

12:40

the actual experience of a young person who is new

12:45

to the platform.

12:47

We had been hearing from our law enforcement officers

12:50

inside the agency that a lot of the predatory behavior

12:53

that we were most concerned about had migrated

12:55

to these spaces.

12:56

And so we were just trying to test and see what happened.

12:59

She was flooded with sexually explicit material, requests

13:04

for some kind of real world interaction.

13:09

And what was most shocking is instead

13:13

of flagging this explosive growth in this young girl's account,

13:18

the company actually sent her information

13:21

about how to monetize her following

13:23

and how to grow her following.

13:24

And that was the moment for me.

13:26

I was like, we really got to dig into this

13:28

and go a whole lot deeper.

13:30

So I guess the sort of parental controls

13:34

that Instagram offers didn't really do anything in this case?

13:38

Yeah, no.

13:40

I mean, what you saw again and again,

13:44

every time we pulled back another curtain inside the company,

13:47

you saw all of these communications, emails,

13:53

and information that was being shared about not only

13:56

the addictive nature of the product,

13:58

how harmful it was to kids,

14:00

but their very clear awareness of all the predators

14:03

that were there.

14:04

And to match that and compare that with what they have been

14:08

saying publicly with what Mark Zuckerberg has been saying

14:10

publicly was something I think really prompted the jury.

14:13

I mean, they heard six weeks of testimony

14:16

and came back with a decision in less than a day.

14:18

And my sense is, they were trying to send a message.

14:21

And so hopefully everyone who's been paying attention

14:24

to this case really starts to understand the sense of urgency

14:27

that I think people in the community have about it.

14:29

So the jury awarded the maximum penalty per violation,

14:32

$5,000 each, but the total of $375 million

14:36

was under the $2 billion you asked for.

14:40

One juror said they compromised on the number of violations

14:43

but maxed out on the penalty per child.

14:45

How did you read that?

14:47

Well, I mean, to your point, I think they did compromise.

14:49

We were looking for something that captured

14:54

the full extent of the harm, all the underage kids

14:57

that were on the platform.

14:59

I think they took a compromise and went with a number

15:03

that represented the estimate of kids

15:06

that might have actually been harmed.

15:08

The thing to remember though is that that $5,000 penalty

15:11

hasn't been changed since 1970,

15:13

since we first enacted this consumer protection law.

15:16

Had it been adjusted for inflation,

15:18

it would have been $40,000 per violation.

15:20

That would have pushed the result

15:23

to just under $3 billion.

15:25

And so one of the things that we're doing

15:27

in the aftermath of this verdict

15:29

is pushing to both expand the definition

15:32

of what's covered under the act,

15:33

and really ratchet up those penalties.

15:35

Because I recognize, I think everybody recognizes

15:38

that that's not a big enough stick

15:40

for a company that has this many resources

15:43

and engages in this kind of commerce all over the world.

15:46

We need to have stronger deterrence.

15:48

It's something that I'm working on.

15:50

I'm also really trying to push the other AGs

15:53

around the country to really re-examine

15:55

our consumer protection laws.

15:56

Because most of them haven't been updated in years.

15:58

I read that you're also going back to the table

16:00

and may ask the judge for additional financial penalties

16:05

and a ruling that would force meta

16:07

to make changes in their apps.

16:10

Can you talk more about what specifically

16:12

you'll be asking for?

16:13

So the judge separated out our public nuisance claim.

16:17

And so we're gonna come back.

16:18

We're gonna really present more evidence

16:20

about how much harm the company's products have caused here

16:24

in New Mexico. We'll be asking for additional monetary penalties.

16:27

But the more important piece of the presentation

16:31

that's gonna happen in May

16:32

is on our request for injunctive relief.

16:34

That means real age verification, changes to the algorithm

16:38

where they stop bombarding kids with notifications

16:41

during the school day and the middle of the night.

16:43

Changes to infinite scroll, to autoplay of videos.

16:47

And we're gonna actually be asking the court

16:49

to set up an independent monitor,

16:52

hopefully relying on technologists and experts

16:54

from around the country to help us design

16:58

very clear and specific features

17:00

to create a safer environment there.

17:02

The cool thing about it is that if we can do this here

17:04

effectively, we can actually establish a blueprint

17:08

for what can happen around the rest of the country

17:10

and around the world.

17:11

So I think it's a real opportunity

17:14

for us to fundamentally change

17:16

the way this company does business.

17:17

So this one case, if it's upheld on appeal

17:21

and if the judge agrees, it could lead to maybe the end

17:24

of infinite scroll, of some of these notifications,

17:28

push notifications for children, age verification,

17:30

just all across Meta and perhaps other

17:34

social media companies as well.

17:35

The jurisdiction of this court

17:37

is obviously limited to the state of New Mexico.

17:39

So what we would effectively be doing is asking them,

17:42

if they're gonna continue to do this business in New Mexico,

17:45

they're gonna have to come up with a different standard

17:47

of doing that business here.

17:49

But once they've established that,

17:50

like once we've gone through the process of doing it,

17:54

if we prevail on appeal and can establish

17:57

the feasibility of implementing these changes,

18:00

we could actually change it across the board

18:03

for this company and set a new benchmark

18:06

for the industry.

18:06

Now, look, I wish Congress would wake up

18:11

and like put this at the top of their agenda.

18:13

I think this is a place where there's a lot of bipartisan

18:17

opportunity for meaningful change,

18:19

but they have been stuck in place.

18:21

And so if we have to do this through a court process,

18:24

through a litigation process,

18:25

I'm gonna just push forward.

18:28

But I think this is an opportunity to kind of use litigation

18:33

to prompt some higher level policy engagement in Congress.

18:36

And that's what we're really hoping for.

18:37

So Meta's argument was that this case is still really

18:41

about content, not design.

18:44

That calling it consumer protection

18:46

is just a way to get around Section 230,

18:48

which essentially shields social media companies

18:51

like Meta from being held liable for the content

18:53

on their platforms.

18:55

But it's not just Meta making this argument.

18:57

Mike Masnick of Techdirt called your verdict,

19:00

quote, a really problematic result

19:02

that easily should have been tossed on 230 grounds.

19:05

What's your response to people who say,

19:08

this is a speech case,

19:09

dressed up as a consumer protection case?

19:11

Well, I think they don't really understand

19:13

the nature of the evidence that was presented.

19:16

I don't think they understand the nature

19:17

of the legal arguments that were made.

19:19

We weren't focused on specific third party content,

19:24

which is what Section 230 is all about.

19:27

This is about specific design choices and features

19:31

that have made this an addictive and dangerous product.

19:34

And it's also about the affirmative misrepresentations

19:36

that the company has made.

19:38

And one thing that is clear is that when you build a product

19:43

and the design choices that you have

19:46

built into it create known harms

19:49

and then you lie to people about those harms,

19:52

that is outside of the ambit of Section 230.

19:55

And so again, Meta and other tech companies

19:59

have been hiding behind Section 230 for the last 30 years.

20:04

And I'm assuming they're going to be,

20:07

essentially focused on that in their appeal.

20:10

I don't have a sense that this is going to change

20:13

at least with respect to the judiciary here in New Mexico.

20:17

Now, whether or not they can get some

20:19

of the more conservative justices on the court to bite

20:21

or even some others who are concerned about that aspect

20:26

of their defense, it remains to be seen.

20:28

I think from the public's perspective,

20:30

we ought to be able to create some basic safety standards

20:35

around these types of spaces without infringing

20:40

on expression, content, things of that nature,

20:43

because I'm sensitive to that.

20:45

But I also don't want to live in a world

20:47

where we have to live with exploitation and addiction

20:52

and all of this harmful activity as a price

20:56

that we're forced to pay because Mark Zuckerberg claims

20:59

that he's some pamphleteer from the 18th century

21:02

when he's not.

21:03

Yeah, I wanted to get into some of the tension

21:05

around balancing sort of protecting users

21:09

with protecting privacy.

21:11

So internal Meta documents showed that encrypting Messenger

21:14

would impact roughly 7.5 million child sexual abuse

21:18

reports to law enforcement, and then mid-trial Meta announced

21:21

that they were going to roll back encryption

21:23

on Instagram direct messages.

21:27

I also talked to a tech journalist, Casey Newton,

21:29

for this episode, who noted with some alarm

21:33

that this is the first time a major platform

21:34

has ever rolled back encryption protections

21:37

and he said that we shouldn't have to give up

21:40

our basic right to privacy so cops can make fewer phone calls.

21:44

What do you say to that?

21:45

Sort of the general concern about,

21:47

because I've heard this from a few places,

21:48

that like, infinite scroll, auto play,

21:52

some of these features, people could live without

21:54

and they say, okay, those aren't 230,

21:56

those aren't content, but encryption,

22:00

once government, especially this government,

22:02

could break encryption, that's not only going to affect

22:06

children, but people's privacy

22:08

all over the country.

22:09

Yeah, so I read that same comment.

22:12

Again, I think this is probably the view of somebody

22:17

who doesn't share the perspective of people,

22:20

like myself, who worked on child pornography

22:23

and child solicitation cases for a number of years.

22:25

And one really important piece of context is that Meta

22:28

and Mark Zuckerberg decided to go to end-to-end encryption

22:31

the day after we filed this lawsuit.

22:33

So I'll leave it to you to decide whether or not

22:36

their motivation was really protecting the privacy interests

22:39

of their users or whether it had to do

22:42

with shielding themselves from liability.

22:43

My view is that the lawyers got around the table and said,

22:47

hey, as long as we can see all of this solicitation

22:52

between minors whom we've lured onto this platform

22:54

and predators that we've failed to kick off,

22:57

we're on the hook.

22:58

But if we blind ourselves by implementing

23:00

end-to-end encryption, we get to hide behind that.

23:04

And by the way, you can tell the marketing department

23:07

to dress it up as privacy, even though we literally

23:10

track every single piece of information

23:13

that we can track about every single user that we have.

23:16

I don't think people were buying that.

23:18

And I also think that to your point,

23:20

the fact that they were, as a result of that decision,

23:23

shielding referrals to law enforcement,

23:28

I think that got to the ultimate decision

23:31

to roll that back because it wasn't something

23:34

that was defensible in court.

23:35

And to that last piece about cops having to make phone calls,

23:40

it's not cops making phone calls.

23:41

I don't have access to that information.

23:44

This is about a company that can see whether or not

23:47

a 40-year-old man is trying to solicit a 12-year-old girl

23:50

on their platform for sex.

23:52

And if they have that information, then I

23:54

would hope that they were going to be sharing that

23:57

with law enforcement.

23:58

But I think it's a distortion to equate the lack

24:02

of end-to-end encryption with someone in government

24:06

having immediate access to everyone's private communication,

24:09

because that's not what this has been about.

24:11

The other piece is when it comes to having kids online,

24:15

look, if it's adults communicating with other adults

24:19

and there's end-to-end encryption, I don't have any problem

24:22

with that.

24:22

When it's a 50-year-old man communicating

24:24

with a kid down the street from me,

24:25

I have a very serious problem with that.

24:27

And I think most Americans can walk and chew gum

24:30

at the same time.

24:31

We can craft solutions that both protect

24:34

basic privacy interests without putting kids at risk.

24:37

Yeah, I mean, the way I was looking at this

24:39

is I can see on an app like Instagram where it seems like

24:44

if you're going to have encrypted DMs on an app that

24:47

is also algorithmically connecting you to strangers,

24:52

then that's a problem, especially for children.

24:54

I wonder, does this mean that for encrypted apps,

24:58

WhatsApp, Signal, even iMessage,

25:02

that there has to be age verification

25:05

because you don't want kids on encrypted messaging apps at all?

25:09

Yeah, age verification is going to be key.

25:11

It's going to be part of what we talk about

25:14

in the May presentation on public nuisance.

25:17

And we're going to be asking the judge

25:19

to really start exploring real age verification

25:23

for precisely that reason: we have

25:25

to have different guardrails based on the ages

25:29

of the users that are in these spaces

25:31

and the potential harm to those users.

25:33

Again, if it's end-to-end encryption

25:36

between adults in these spaces, I'm not really interested

25:40

in talking about that.

25:41

You could solve part of the end-to-end problem

25:45

by just adopting a blanket rule where no one over a certain age

25:50

who is unknown to a minor can connect with that minor,

25:53

they can't communicate with them.

25:54

There are companies in the space

25:56

that have taken that step.

25:57

With respect to coming up with a more nuanced solution,

26:01

there are opportunities to develop actual technology.

26:05

It's imperfect, but it can do age estimation

26:08

based on some of the sort of the angles, right?

26:12

Every time you look at a camera,

26:14

it has the ability to estimate age.

26:15

Now it's not perfect, but it sidesteps the problem

26:18

that other people have correctly identified

26:20

of uploading and sharing maybe sensitive personal information

26:23

on an ID or something like that.

26:25

But I think the real way we have to start thinking about it

26:29

is lawmakers and policymakers,

26:32

if they're gonna engage in meaningful tech regulation,

26:34

they have to start iterating the way technologists do.

26:37

The problem is we created Section 230 in 1996

26:41

and we walked away and decided not to do anything.

26:44

It sat there for 30 years and it went from a moment

26:47

when I was waiting for my dial-up tone on AOL

26:50

to now a time where there's more computing power

26:53

in my pocket than there used to be in my laptop

26:56

and we haven't changed the regulatory or legislative framework

27:00

to keep pace with technology.

27:02

I think policymakers have to just get comfortable

27:05

with iterating around these spaces,

27:08

understanding that you're never gonna be completely in alignment

27:11

but having some basic priorities

27:14

and that should start with making sure kids are safe

27:16

in these spaces.

27:18

So there are now over 40 state AGs with lawsuits

27:21

against Meta, thousands of pending cases

27:24

that will now move forward.

27:25

Are you coordinating with the other attorneys general?

27:28

Is there a legal strategy here that is analogous

27:31

to what happened with Big Tobacco in the 90s?

27:34

Yeah, I mean, I've been hearing from my colleagues

27:36

around the country, I'm aware of the action

27:38

that they put together.

27:39

Ours was a little different because we focused on

27:42

exploitation so heavily.

27:44

And so there was a different sort of evidentiary basis

27:48

but we did have elements or we talked about addictive design,

27:51

we talked about some of those other features.

27:53

We are sharing some of the notes and the feedback

27:56

that we have from our litigation team with them

27:59

to sort of inform how to make those presentations

28:02

and those arguments.

28:03

I think more generally I'm trying to get all of my colleagues

28:09

to re-examine their underlying consumer protection laws.

28:12

I'm in the process of trying to redesign ours, right?

28:16

I mean, the 1970s is a long time to go without meaningful changes

28:21

in those spaces, but I think that instead of coming up

28:24

with all of these specific sort of bespoke solutions

28:29

to technology challenges that are really pressing

28:34

in the moment but change over time,

28:36

I think we should look more broadly

28:38

at the kind of authority that we have

28:40

to really get into this space and try to protect people.

28:43

So we're working both on litigation

28:45

and potential legislation at the same time.

28:47

And hopefully, like I said, it's a moment where

28:52

after six weeks of evidence,

28:55

this jury came back in less than a day.

28:57

That's a pretty powerful signal.

28:58

And I hope that the company's heard that signal

29:00

but more importantly members of Congress did too

29:03

because I think that's where we really need

29:06

to see some action taken on these issues.

29:09

You mentioned the Supreme Court where this case

29:11

or one of these cases could end up.

29:13

Have you thought about this court

29:17

with this composition of justices,

29:20

what kind of arguments you think would be persuasive

29:23

to some of the more conservative members of the court

29:26

or just members of the court who maybe haven't been

29:30

as forward leaning as you were on this case?

29:33

Yeah, I think it's actually something

29:36

that will be centered probably more at the middle

29:40

of the court because I can see folks on both the left

29:44

or the right who have a maximalist interpretation

29:50

of some of the sort of the free speech rulings

29:54

when it comes to corporations being more susceptible

29:59

to an argument advanced by Meta.

30:01

But my sense is that there is a middle ground

30:07

where you can start identifying the unique harms

30:10

that product design and misrepresentation

30:13

presents to kids and to young people

30:15

in the vulnerable populations

30:16

and that that will be a way to distinguish

30:21

this type of action from those that are obviously

30:24

based on content obviously motivated

30:27

by a political or ideological motivation.

30:31

I think by keeping this centered on child welfare

30:34

there's a real possibility that you can get

30:37

some combination of moderates or persuadable Republicans

30:43

to step up and sign on to a decision

30:46

that better protects these kids.

30:48

Attorney General Torrez, thank you so much

30:50

for taking the time and talking about this case

30:53

and the strategy going forward, really appreciate it.

30:56

Thanks for taking the time.

30:59

Up next, my conversation with Hard Fork co-host

31:01

and Platformer author Casey Newton.

31:03

But first, if you love Dan's analysis on Pod Save America

31:06

take a listen to our subscriber exclusive pod Polercoaster.

31:09

It's like having a really smart friend

31:11

break it all down for you.

31:12

I love Polercoaster.

31:13

I never miss an episode.

31:15

It is great to hear Dan, one of the smartest political

31:17

strategists I know, break down polls.

31:20

He's also one of the biggest polling nerds I know

31:22

and it's a fantastic show, so check it out.

31:24

You can get that show and a whole bunch of other subscriber

31:26

only shows if you subscribe to Friends of the Pod.

31:30

You can also get ad-free episodes of Pod Save America, Offline,

31:33

Lovett or Leave It, Pod Save the World.

31:35

All your favorite Crooked pods.

31:36

We have an extra episode of Pod Save America

31:38

called Pod Save America: Only Friends

31:40

that subscribers get access to.

31:42

You also get access to our growing list

31:44

of excellent substack newsletters

31:46

and you get to feel good about supporting

31:48

independent pro-democracy media.

31:50

So hit pause and subscribe to Friends of the Pod

31:53

right now at crooked.com slash friends.

31:55

This episode is sponsored by BetterHelp.

32:05

Whether you're dealing with anxiety, depression,

32:06

conflict in relationships, or simply need an impartial

32:09

third party to help you deal with daily stress,

32:12

BetterHelp is there to connect you with the support you need.

32:14

BetterHelp therapists work according to a strict

32:16

code of conduct and are fully licensed in the US.

32:19

BetterHelp does the initial matching work for you

32:21

so you can focus on your therapy goals.

32:23

A short questionnaire helps identify your needs and preferences

32:25

and their 12-plus years of experience

32:27

and industry-leading match fulfillment rate

32:29

mean they typically get it right the first time.

32:31

If you aren't happy with your match,

32:33

switch to a different therapist at any time

32:34

from their tailored recs.

32:36

With over 30,000 therapists,

32:37

BetterHelp is the world's largest online therapy platform

32:39

having served over 6 million people globally

32:42

and it works with an average rating of 4.9 out of 5

32:44

for a live session based on over 1.7 million client reviews.

32:48

When life feels overwhelming, therapy can help.

32:50

Sign up and get 10% off at BetterHelp.com slash offline.

32:54

That's better H-E-L-P.com slash offline.

32:58

Offline is brought to you by Mint Mobile.

33:00

I don't know about you, but I like keeping my money

33:02

where I can see it.

33:03

Unfortunately, traditional big wireless carriers

33:05

also seem to like keeping your money too.

33:07

After years of overpaying for wireless,

33:09

if you're fed up with crazy high wireless bills,

33:11

bogus fees and unfree perks

33:14

that actually cost more in the long run.

33:15

Say free perks?

33:16

Oh, different thing.

33:17

Yeah.

33:19

Then switch to Mint Mobile.

33:22

You could be saving a lot with Mint Mobile.

33:23

Have you checked how much you're paying a month

33:25

for your mobile phone bill?

33:28

Probably not.

33:29

Unfortunately, my mobile phone bill gets texted to me

33:34

by my mobile phone company.

33:35

And every time I think that is insane,

33:37

how is that possible?

33:38

That is an outrageous number.

33:39

So maybe I should think about switching my mobile phone.

33:42

Stop overpaying for wireless just because that's how it's always been.

33:44

Mint exists purely to fix that.

33:46

Mint Mobile is here to rescue you

33:47

with premium wireless plans starting at $15 a month.

33:49

All plans come with high speed data

33:51

and unlimited talk and text delivered

33:52

on the nation's largest 5G network.

33:54

Bring your own phone and number, activate with eSIM

33:57

in minutes and start saving immediately.

33:58

No long-term contracts, no hassle.

34:00

Ditch overpriced wireless

34:01

and get three months of premium wireless service

34:03

from Mint Mobile for $15 a month.

34:06

If you like your money,

34:07

Mint Mobile is for you.

34:08

Shop plans at MintMobile.com slash offline.

34:10

That's MintMobile.com slash offline.

34:12

Upfront payment of $45 for three-month,

34:14

five gigabyte plan required,

34:16

equivalent to $15 a month.

34:17

New customer offer for first three months

34:19

only, then full-price plan options available.

34:21

Taxes and fees extra,

34:23

see Mint Mobile for details.

34:29

Casey, welcome to offline.

34:30

Hey, thanks for having me, John.

34:31

I want to talk to you about Meta's rough week in court.

34:34

Juries in two different cases

34:36

held the company liable for designing a product

34:39

that harmed consumers, in these cases children.

34:40

I'm also talking to New Mexico Attorney General,

34:43

Raúl Torrez, for this episode.

34:45

The other big case was here in LA

34:47

where Mark Zuckerberg himself took the stand

34:51

and the jury found that Meta's design features

34:52

as well as YouTube's harmed a young woman's health.

34:56

I know you've followed that case closely.

34:58

The common take I've seen is that this is Big Tech's

35:01

Big Tobacco moment.

35:03

Do you agree?

35:04

How big of a deal is this?

35:05

I agree that it is a big deal.

35:07

And I think that over the past couple of years,

35:10

the world has been coming around more and more

35:13

to this framing of the issues surrounding social media

35:17

as a kind of public health crisis, right?

35:20

It seems like there is something about these apps

35:23

that produce really harmful effects

35:26

for some subset of the population.

35:28

And this was the first moment that juries actually

35:31

were able to find a legal path to hold them accountable.

35:34

What was some of the most damning testimony and evidence

35:38

against Meta in your view from this trial?

35:42

Yeah, so I mean, in the trial itself,

35:45

it seems like jurors were really swayed

35:47

by the internal research that Meta had done

35:50

in which their own researchers had found that again,

35:53

for some subset of users of Instagram,

35:56

there were negative mental health effects.

35:58

Now, you know, Meta would say,

35:59

well, you know, those effects were exaggerated

36:02

and you're sort of leaving out a lot of context here.

36:05

But I think once a juror understands

36:07

that a company has been researching this

36:10

and that the more they looked into it,

36:11

sort of the worse stuff they found.

36:13

And then also that research kind of gets canceled

36:15

or the researchers get moved to other projects.

36:17

It kind of does start to feel like a big tobacco moment, right?

36:19

Yeah.

36:20

Well, what was Meta's defense to that in the trial?

36:23

Well, they said essentially the effects

36:27

that you are talking about at trial were cherry picked

36:31

and we can show you lots of other data

36:33

that shows that the vast majority of people

36:35

never experience a problem here.

36:37

And also some of the research that we have done

36:40

is why we have added various features that are designed

36:43

to help you mitigate the effects of the thing that we built.

36:46

Yeah. And it seems like they also tried to argue

36:50

that this young woman had pre-existing problems

36:53

and issues with her family and with other struggles

36:59

and that somehow because of that,

37:03

they couldn't be held liable.

37:05

Yes, although the surgeon general under President Biden,

37:10

when he did a big report on this subject,

37:12

one of the things that he found was that it was precisely

37:15

the teens who have pre-existing mental health conditions

37:18

who are at more at risk of these terrible outcomes

37:22

on these platforms.

37:23

So simply to say, oh, well, she doesn't count

37:26

because she was already having mental health problems.

37:28

It's like the whole problem is that you're serving millions

37:30

of people who have mental health problems

37:31

and we just know that Instagram

37:32

and other social apps can be really bad for those folks.

37:35

Yeah, and it seems like the key is that the jurors didn't have

37:39

to find that Meta and YouTube were the sole cause

37:45

of the mental health problems,

37:46

but that they were a, I think it was like a significant factor.

37:49

Yeah. And again, that really is a big deal

37:52

because for the past 30 years,

37:54

platforms have been insulated from these kinds of attacks.

37:58

They've been able to hold up Section 230

38:00

and say, we are not responsible essentially

38:02

for anything that happens here.

38:04

And so what's really been fascinating to me about this case

38:07

is that it seems like the plaintiffs' lawyers

38:08

have finally found a way through that shield

38:11

and juries are responding to it.

38:13

Yeah, I want to get into that shield even more,

38:15

but I did see the jurors said they were unimpressed

38:19

by Mark's testimony, shocking, I know.

38:23

The judge also didn't seem all that impressed

38:24

with his team recording the proceedings via their Meta AI glasses.

38:29

Guess that was a no-no.

38:31

What did you make of Mark's testimony

38:33

in his general posture throughout the trial?

38:36

I think basically since Cambridge Analytica,

38:40

the 2017 post-Trump election backlash,

38:44

meta has been in this posture of delay, deny, deflect.

38:50

And Zuckerberg has been carefully trained

38:54

to give the least that he can get away with.

38:58

And this has just mostly worked for him.

39:00

This is a guy who's gone before Congress a lot,

39:02

has been asked a lot of the same questions.

39:04

He chokes out a few words, then he gets interrupted.

39:07

And I think it really wasn't all that different at trial.

39:11

He doesn't really give folks almost anything,

39:15

but that wound up costing him.

39:16

Because I think a lot of what the jurors are responding to

39:19

is the idea that the people who are getting hurt,

39:22

the plaintiff in this trial, this is a real person.

39:26

This is not some statistical abstraction,

39:28

and there are a lot more people like her.

39:29

And because the executives of these companies

39:31

can't really speak to that,

39:33

increasingly they're getting in trouble.

39:35

So you just published a really thoughtful piece

39:37

about what these verdicts mean for the wider internet.

39:40

And you sort of laid out three camps,

39:42

three different reactions to the verdict.

39:44

The plaintiffs who are euphoric, the defendants,

39:47

who plan to appeal.

39:49

And then writers and thinkers who worry these verdicts

39:52

could break the basic compact

39:54

that holds the internet together.

39:57

This is what you were just getting at with Section 230.

40:00

Talk to me about the concerns of that third group.

40:03

So a good thing about the internet that we have,

40:07

arguably, I don't know, maybe some people would disagree,

40:09

is that you can have very wide-ranging political discussions on there.

40:14

You can say really edgy things.

40:16

You can say ideas that are sort of fringy

40:18

and even a little bit dangerous.

40:19

And one of the big reasons that you can do that

40:21

is that the platforms are just confident

40:23

that if they get sued over this,

40:26

they can get the suit tossed rather easily.

40:29

So you can imagine a lot of stuff

40:31

that people were saying about COVID in the early days.

40:34

Like, turned out to be true,

40:35

but was super edgy at the time,

40:36

the platforms just sort of mostly let it happen.

40:39

The fear is that if the Section 230 shield disappears,

40:43

all of a sudden platforms are going to start

40:45

overmoderating content.

40:47

They're going to say,

40:47

hey, this is starting to feel a little bit spicy.

40:50

Like, maybe it's a red state where we have a lot of laws

40:53

targeting LGBT people.

40:55

Maybe in that state, we don't want to permit

40:57

quite as much discussion of LGBT issues, right?

40:59

And all of a sudden,

41:00

like the surface area available for us

41:02

to have public conversations shrinks.

41:05

So that's one of the big fears,

41:07

but depending on how the cases get adjudicated,

41:10

there are even worse ones.

41:11

And there's one in particular about New Mexico

41:13

that I'd love to talk about.

41:14

Yes, and I do want to get to that.

41:16

But like, it seems like with this case,

41:18

and this is what the New Mexico,

41:20

what happened in New Mexico,

41:21

which is about encrypted communications,

41:23

I think we should put that aside for now.

41:25

Because this case, and I think what was novel about it,

41:29

and innovative in the legal strategy,

41:31

is they did not go after content moderation.

41:34

And they basically said,

41:35

yeah, of course platforms can still be shielded

41:39

from legal or have legal liability shields

41:42

from getting sued for content, for user content.

41:45

But this is about the design itself.

41:48

And so we should be able to regulate

41:52

some of these features, infinite scroll, algorithmic choices

41:58

that are made, some of the,

42:01

trying to think of what are the other ones.

42:03

Auto play video.

42:04

Auto play, yeah, that was the other big one.

42:05

Auto play.

42:06

And those don't have to do with,

42:09

necessarily with free speech and free expression.

42:12

Right. And so like, this is the argument

42:14

that I'm trying to make is that,

42:16

content and design sort of exist along a spectrum.

42:20

There are some things that I think most of us can agree

42:23

are mostly just design.

42:24

Like the decision to send you 12 push notifications

42:27

after midnight when you're a teenager trying to sleep,

42:29

that's really like a design decision,

42:31

not a content decision, right?

42:33

And then, then there's like, literally what subjects

42:36

can you talk about and will we remove them from the platform?

42:38

That's like obviously a content decision.

42:40

My argument has been like,

42:41

let's try to find those design things

42:44

that like we can develop a consensus around.

42:46

And like, particularly when they seem to serve

42:48

no real social purpose,

42:49

I would argue that like auto play video,

42:51

infinite scroll are like probably in that category.

42:54

And maybe we can go after those

42:56

and still have a Section 230 that enables the rest of us

42:59

to have political discussions.

43:01

Where I think it gets really tricky is around the algorithms.

43:04

Because I think most of us have this sense in our gut

43:07

that the reason that I can't stop looking at Instagram

43:09

and the reason I keep reinstalling it every time I delete it

43:12

is because I just know it's going to show me something good, right?

43:15

That casino effect is working

43:17

and I just want to pull the lever of that slot machine.

43:20

There are real difficult questions there

43:22

about whether these algorithmic recommendations

43:24

are protected speech under either Section 230

43:27

or the first amendment.

43:28

And that I think is just going to be a lot harder to untangle.

43:30

Yeah, that seems like the trickiest feature to me

43:32

because infinite scroll, auto play, notifications.

43:37

I do think it's hard to argue that those are expressions

43:41

of free speech.

43:42

But a recommendation algorithm, like basically

43:45

if you're telling a platform what it can and can't recommend,

43:50

does that start to feel like regulating speech?

43:54

Because is that like telling a newspaper

43:57

or a TV news program,

43:59

which stories they can air and which they can't?

44:02

Absolutely.

44:03

And you can just see the way that that could be used

44:05

against the media in ways that we wouldn't really like.

44:08

I do think there is a potential path forward here, though,

44:11

which is just trying to regulate this by age, right?

44:15

I think, look, once you're an adult,

44:18

your hippocampus is fully formed.

44:20

If you want to spend eight hours staring at TikTok every day,

44:22

like God bless, go for it.

44:24

If you're 14, we might want to give you a little bit more protection.

44:27

And so maybe they don't regulate the actual content of the algorithm,

44:30

but they say, look, if you're under 18,

44:32

we're going to prevent these companies from personalizing it too much, right?

44:35

Like maybe we'll allow them to do some very high level personalization,

44:39

but we're not going to like fixate on your absolute exact interests.

44:43

So if there's any path forward there, I think it might look something like that.

44:46

So you said the most alarming part of these verdicts

44:48

was how the New Mexico case implicated encryption.

44:51

Meta actually, and you wrote about this as well,

44:53

actually ended encryption on Instagram DMs mid-trial in the New Mexico case.

45:00

You noted that that's the first time a major platform has ever

45:03

rolled back encryption protections.

45:05

You know, AG Torrez would say that encryption enabled predators to go after children in the dark.

45:13

You'd say, and I'm quoting your piece here, that, you know,

45:15

we shouldn't have to give up our basic rights of privacy,

45:18

so cops can make fewer phone calls.

45:21

How do you resolve that?

45:23

Well, I think AG Torrez needs to mind his own business.

45:27

Like we know that cops want to spy on us.

45:30

They have always wanted to spy on us.

45:32

And what we have said is, no, you're not allowed to,

45:34

because we have privacy rights.

45:36

So like, look, I don't want to be too glib about this.

45:39

I understand there are really painful trade-offs involved

45:42

when you allow folks to have encrypted speech.

45:44

But in the world we're living in, I truly do not want the state to be able to spy on all

45:49

of my communications.

45:51

And I think we just have to absorb the cost of that and find other ways to catch predators.

45:55

And by the way, there are other ways to catch predators, right?

45:58

Yeah.

45:59

To me, I've thought a lot about this.

46:02

And it feels like you need spaces like,

46:05

or you need platforms like WhatsApp,

46:08

Signal, or iMessage, I guess,

46:13

where encryption is protected and guaranteed.

46:15

And there are places where you can communicate with people

46:19

where you do not have to worry about the government spying on you.

46:22

Just like in real life, right?

46:24

Just like pretend we didn't have any of these.

46:26

There should be places where you can go with someone one-on-one

46:29

and have a conversation with them.

46:30

I wonder, I was thinking about the Instagram DMs and encryption there.

46:34

Platforms where they also have these recommendation algorithms discovery,

46:40

where they are connecting you with a bunch of strangers.

46:43

And then those strangers can have conversations with you that are encrypted.

46:47

That seems like less of a, you know, a sure thing in terms of like keeping that encrypted.

46:53

Yeah, I think that's fair.

46:55

And I've spoken with employees at Meta who have made the same case to me.

46:59

Like even folks who are generally pro encryption,

47:01

they're like, look, on the subject of Instagram,

47:03

because it is a place where strangers meet,

47:06

we might want to make encryption like at the very least not the default.

47:10

I talked to some who are sort of happy to see it go away.

47:13

I can live with encryption on Instagram going away.

47:16

In fact, they never even rolled it out to most people.

47:19

But what I object to is for the Attorney General of New Mexico

47:22

to be able to say that because Meta offered encryption,

47:26

the platform was inherently unsafe.

47:28

In fact, I'd be willing to bet that to the extent

47:30

any of these teenagers did have encryption on Instagram,

47:32

it probably did keep some of them safe.

47:33

But just by allowing them to have private conversations

47:36

without the state snooping.

47:37

And by the way, I guess if that is the finding,

47:41

then that means that WhatsApp is a defective product just by its nature.

47:46

And so is Signal, and so are these other places

47:49

where people are having encrypted communications.

47:51

Yeah, that just feels like a true slippery slope.

47:54

And it is why, like, you know, while I want to be reasonable on most issues of tech policy,

47:57

I try to be just kind of a real hardliner about encryption

48:00

because it's just so easy for the whole thing to unravel

48:02

once we start going down this road.

48:04

Yeah.

48:12

Offline is brought to you by Quince.

48:13

This time of year might make you rethink what's in your closet.

48:16

You want to move away from clutter toward high quality pieces

48:18

you can actually live in.

48:19

That's why you should check out Quince.

48:21

The fabrics feel elevated, the fits are thoughtful,

48:23

and the pricing actually makes sense too.

48:25

Quince makes high quality everyday essentials

48:27

using premium materials.

48:28

Their 100% European linen pants and shirts for men are lightweight,

48:31

breathable and comfortable,

48:33

basically the perfect layer for spring.

48:35

The pants strike the right balance between laid back and refined.

48:38

So you look put together without trying too hard

48:40

and their Flowknit activewear is moisture-wicking, anti-odor.

48:44

I love anti-odor.

48:45

That's important.

48:46

It's soft enough that you'll actually want to wear it all day.

48:49

The best part is their prices are 50 to 60% less than similar brands.

48:52

How? Quince works directly with ethical factories

48:55

and cuts out the middlemen.

48:56

So you're paying for quality,

48:57

not brand markup.

48:58

Everything is designed to last and make getting dressed easy.

49:02

I love Quince.

49:03

go online and get some more spring stuff

49:06

because I go online like once a month to go to Quince

49:09

and see what they got and they always get new stuff

49:11

and it's always comfortable and it's always affordable.

49:12

It's getting hot out.

49:13

Refresh your wardrobe with Quince.

49:15

Go to Quince.com slash offline for free shipping

49:16

and 365 day returns.

49:18

Now available in Canada too. Go to quince.com slash offline

49:23

for free shipping and 365 day returns.

49:25

Quince.com slash offline.

49:56

And hope rooted in action, not denial.

50:12

Eric Goldman, the Section 230 scholar,

50:14

you cite him in your piece, says the social media industry now faces

50:17

existential legal liability and will need to reconfigure their core offerings

50:21

if they can't get relief on appeal.

50:23

There are about 2,000 pending lawsuits, a massive federal trial this summer

50:26

with 1600 plaintiffs, 40 plus state attorneys general

50:30

have filed suits against Meta.

50:32

Do you agree it's existential and like what kind of design changes

50:37

do you think Meta might contemplate making or be forced to make

50:43

to settle or prevent future litigation?

50:45

Yeah, it's a great question.

50:47

Is it like existential in the sense that maybe Meta will be out of business

50:51

by the end of the year?

50:53

No, I don't think it's existential in that way.

50:55

Are they going to have to rethink some of the features of the platform

50:59

if these cases get upheld on appeal?

51:01

Yeah, I think they will.

51:03

Where it gets tricky is, and this is one of the problems with having

51:06

juries decide this sort of thing instead of Congress,

51:09

is there's no legal standard now for what constitutes a safe platform.

51:13

Right?

51:13

Like there's no rule anywhere that says, well,

51:15

if you just get rid of auto play video and infinite scroll

51:18

and you don't personalize the algorithm too much,

51:20

we will consider you non-defective.

51:22

And so to some extent, the platforms are just going to have to guess.

51:25

On the other hand, these platforms also employ behavioral scientists

51:29

with PhDs who are working around the clock,

51:31

trying to exploit every feature of your brain that will get you to stare at the glass

51:34

rectangle longer.

51:35

Maybe the platforms could just say, hey, stop that.

51:37

Knock it off.

51:38

Let's maybe roll back the last 15 things we did in that regard.

51:41

Maybe they would be a little bit less hypnotic.

51:43

Yeah, because I thought about this too.

51:44

And I'm like, okay.

51:46

What makes this different from any kind of media company trying to keep its audience?

51:51

Right?

51:52

Which is you design your programming, whether it's TV, whether it's film,

51:57

whether it's a newspaper magazine,

51:59

because you want people coming back, even with a book.

52:01

Right?

52:01

Right.

52:02

It has cliffhangers.

52:03

Right?

52:03

You want people coming back for more.

52:05

But what's different is at least, you know,

52:08

all of those media are produced for,

52:10

it's the same media produced for everyone.

52:12

This is now like individualized,

52:15

bored down into your brain.

52:16

It knows what you want, at a level that we've just never dealt with before.

52:20

And so the psychological effects of that, as we're seeing in the psychological harms,

52:24

are just so much different than any other media we've had.

52:27

Absolutely.

52:28

Like again, Section 230 was a law passed because people were defaming each other on platforms,

52:34

and people were suing the platforms.

52:35

And lawmakers at the time said, hey, we're just never going to have an internet

52:39

if you can sue a platform out of existence because two users were mean to each other.

52:43

We did not predict the world of infinite scroll and auto play video and cognitive scientists

52:48

who were measuring the scroll depth on your phone to the exact pixel that you scrolled down.

52:52

And understanding exactly what video you were watching,

52:55

and how that relates to the 80 million other videos they might show you in the moment.

52:58

Right?

52:59

So we just have to kind of account for the growing technological sophistication of these platforms,

53:03

and how good they've gotten at hacking our brains.

53:05

I do want to just zoom out on meta for a second.

53:08

They have pivoted away from the metaverse,

53:11

despite renaming the entire company after it.

53:14

To the tune of about $80 billion in losses,

53:17

hundreds of layoffs, just this month.

53:19

What is meta's identity right now?

53:21

Does Zuckerberg have a coherent strategy, or is he just trying to survive?

53:25

I think he really has been in survival mode.

53:29

You know, interestingly, the metaverse was also a survival thing,

53:32

because at the time he was just having such huge conflicts with Tim Cook over at Apple,

53:36

he felt like unless I own the hardware of the next generation,

53:40

like I'm always going to be subject to this one person's whim,

53:43

so he wanted to go out and build it himself.

53:45

It turned out not too many people wanted to follow him along on that journey.

53:48

But while they were building headsets and glasses,

53:51

Silicon Valley started to make huge advances in AI.

53:55

Meta in fairness also made big investments in AI.

53:58

They just didn't work out as well, right?

54:00

And so now Zuckerberg is in a situation where he's really behind in AI,

54:04

and I think just having a very difficult time getting the company anywhere close to the frontier.

54:10

So I mean, look, if you look at most of the numbers that investors care about,

54:15

Meta is still doing just fine.

54:17

But I do think you're starting to see some cracks in the armor there,

54:21

and the next couple of years, like there are scenarios where it just goes pretty badly for them.

54:25

What's their case on AI?

54:26

Like what is there?

54:27

What do they think they're competitive advantages in this field with all these other AI giants?

54:32

I mean, it's so grim, John.

54:34

I mean, like the true vision for like an AI version of meta is that, again,

54:39

using all the tricks we've just been talking about to understand what are your exact particular interests,

54:44

they're going to use the models they have to generate synthetic content

54:49

that keep you looking at the glass rectangle as long as they can.

54:52

So, you know, this is a company that to the extent it had any social mission at all,

54:57

it was to like connect human beings.

54:59

That has been thrown out the window because they now want to connect you with personalized slop.

55:03

And like I'm not even exaggerating.

55:04

Like this, it just is the vision of the company now.

55:06

Yeah, they're connecting us just with robots, not even people now.

55:11

I mean, we just spent this whole conversation talking about sort of the harms of the algorithmic

55:15

feed internet. Is there any reason to think that the AI internet will be better for people?

55:22

Or do we think we're just going to have the same conversation in five years about chat

55:26

bots and AI agents?

55:27

Well, you know, there was an interesting study this week that said that large language models

55:31

generally do a better job of connecting people to expertise, right?

55:35

Like the big language models, they're less likely to like guide you to,

55:40

I don't know, you know, Breitbart and the Gateway Pundit.

55:43

Like they'll tell you something that actually happened.

55:45

So, that's a good thing.

55:46

But on the whole, I'm basically just as worried about the AI era, if not more so,

55:51

because we've already seen how hypnotic these chat bots can be for some people.

55:55

I get emails every day from people who think that they've woken up

55:59

Claude or ChatGPT.

56:01

And some of these people have really terrible outcomes, right?

56:03

So, my fear is, particularly, again, for the young folks whose brains haven't fully developed,

56:09

there are so many, like it's very hard to be a teenager.

56:12

And it's just so easy for me to imagine a generation getting addicted to these chat bots

56:16

that never really push back on them, but always tell them they're doing great, they look good,

56:20

you know, and I just think it's going to be a big problem.

56:23

Yeah, it reminds me of sort of the first wave of concern about social media, or at least,

56:29

you know, a couple of years ago, which was like the misinformation.

56:31

And it's going to push us into like political bubbles and all that.

56:34

And that was like the first.

56:35

And there could be some of that with AI, like when I look at Grok,

56:40

I certainly see, like, you got Elon Musk, you know, doing his own thing with his biased AI

56:45

LLM, and I guess other companies could do that as well.

56:48

But I am more concerned about what the second wave of concern was with the,

56:53

with the social media companies, which is people spending all day long just hooked

56:58

to AI that is already sycophantic now,

57:04

because it wants to keep you on the platform, because that's how they make money.

57:07

Right. You know, there's a company, Character.AI, maybe you're familiar with it,

57:11

and they came along and they said, we're going to let you create a chatbot out of any

57:14

fictional character that you can imagine.

57:16

It started to get some momentum. And so Zuckerberg said, like, oh, we just need to do that.

57:20

And so this now exists on Meta's platforms.

57:22

You can connect with any number of chat bots.

57:24

There was one they got in trouble for named Nasty Nancy, who I guess was sort of

57:28

a stepmom who was doing things she shouldn't. But yeah, that's kind of the present of Meta.

57:32

So yeah, you can imagine what that's going to look like in two years as the models improve.

57:36

The last thing is, there seems to be this growing gap between people in Silicon Valley,

57:43

in the tech world, who use AI, and they're saying, oh, it's here, the future is here.

57:50

You wouldn't believe what you can do with this. And then everyone else who either isn't using AI,

57:55

or who is just asking an LLM some basic research questions. Can you talk about that gap, and,

58:03

sort of, for the people who use AI all the time and are very proficient with it,

58:09

why they're so excited or why they've been so compelled by this?

58:13

My view of the gap is that it's really about the folks outside of the bubble,

58:19

not wanting to believe what the folks inside the bubble are saying. And I think they have a lot

58:24

of really legitimate reasons for that, right? Because what are the folks inside the bubble saying?

58:28

We're creating an existential threat to humanity. It's probably going to take your job.

58:32

It requires the largest energy and infrastructure build out in the history of America.

58:36

And that might wind up in your backyard and raise your electricity prices.

58:39

Like, of course, Americans do not want to have that vision come true.

58:43

I think the AI industry is doing a really bad job of selling itself, in that way.

58:47

I think what the technologists would say is like, look, whether you want to believe it or not,

58:53

we actually do basically have the recipe figured out. We know we can just keep pouring more data

58:58

and compute into these systems and the amount of intelligence that we have is going to scale.

59:03

And so that just is going to create huge consequences for all of us.

59:07

So to me, it's really not about like, who is right and who is wrong? It's about like, what do you

59:12

want to believe? Yeah. And it's like, if the future is here and this is happening, then like,

59:18

what is the best way to adapt in a way so that we don't find ourselves in a really bad

59:23

situation? Yeah. And I mean, one of my big critiques of the AI industry is it's just like,

59:27

so anti-democratic, right? It's like, I mean, you know, a big criticism, and it's a legitimate one,

59:32

is like, nobody asked for this, right? Like people aren't asking for their jobs to be

59:37

taken away and for all of the rest. So I wish we would bring more kind of democratic governance

59:42

to these systems. Agreed. Agreed. Casey Newton, thanks for jumping on Offline and helping us

59:48

get smarter on this. It was my pleasure, Jon. All right. Okay.

1:00:02

Offline is a Crooked Media production. It's written and hosted by me, Jon Favreau.

1:00:07

It's produced by Emma Illick-Frank. Austin Fisher is our senior producer and Anisha Banerjee

1:00:11

is our associate producer, with audio support from Charlotte Landes. Adrian Hill is our head of

1:00:16

news and politics. Matt DeGroot is our VP of production. Jordan Katz and Kenny Siegel take care

1:00:21

of our music. Thanks to Dilan Villanueva, Eric Shoot, and our digital team who film and share our

1:00:25

episodes as videos every week. Our production staff is proudly unionized with the Writers Guild

1:00:30

of America East.

1:00:47

In moments like these, it's easy to feel overwhelmed and even easier to feel powerless.

1:00:53

But we are neither. I'm Stacey Abrams and on my podcast, Assembly Required, I take on each

1:00:59

executive action, legislative battle and breaking news moment by asking three questions. What's

1:01:05

really happening? What can we do about it? And how do we keep going together? This is a space for

1:01:12

clarity, strategy, and hope rooted in action, not denial. New episodes of Assembly Required

1:01:20

drop Tuesdays. Tune in wherever you get your podcasts and on YouTube.