Nancy Guthrie Disappearance Raises New Surveillance Questions | Pivot

Transcript

0:00

I'm glad they got these pictures of this

0:01

guy. At the same time, this is an edge

0:04

case. They're keeping your video,

0:06

which everyone thought they

0:08

were doing and they said they weren't.

0:16

>> Hi everyone, this is Pivot from New York

0:18

Magazine and the Vox Media Podcast

0:19

Network. I'm Kara Swisher.

0:21

>> And I'm Scott Galloway.

0:22

>> Scott, we just did a great On with Kara

0:25

Swisher about resist and unsubscribe,

0:27

but I'd like

0:28

>> you have another podcast. Did I find

0:29

that out? This

0:30

>> dude, you were quite substantive. Where

0:32

are we right now? Give us a quick

0:33

update.

0:34

>> Uh, it lulled Tuesday and Wednesday. It

0:36

appears to have come back today because

0:38

Chelsea Handler, who reached out to me,

0:41

posted something of all the things she

0:43

was unsubscribing to. And just to give

0:44

you an example of how much impact one

0:46

person can have. Uh, I went on, I

0:50

went on to my site analytics. I think

0:51

she just that one video she did on

0:54

Instagram, that one post is going to

0:55

inspire 6 to 7,000 unique site visits.

0:58

A conversion of 5%, that's 300 people

1:01

unsubscribing average of two platforms

1:04

600 unsubscribes average 200 that's

1:06

$12,000

1:08

times, or excuse me, $120,000, times 10, so

1:12

$1.2 million in market cap getting

1:15

taken out of these companies because of

1:16

one Insta Post. So,
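
Scott's back-of-envelope math here can be checked in a few lines. Every input, including the 10x revenue-to-market-cap multiple and the $200 average annual subscription value, is his stated assumption from the conversation, not measured data:

```python
# Back-of-envelope check of the unsubscribe math above.
# All inputs are assumptions quoted from the conversation.
site_visits = 6_000      # low end of the 6,000-7,000 unique visits
conversion = 0.05        # 5% of visitors actually unsubscribe
platforms = 2            # average subscriptions dropped per person
annual_value = 200       # assumed annual revenue per subscription, $
revenue_multiple = 10    # assumed market-cap-to-revenue multiple

people = site_visits * conversion           # 300 people
unsubscribes = people * platforms           # 600 subscriptions
lost_revenue = unsubscribes * annual_value  # $120,000 a year
market_cap_hit = lost_revenue * revenue_multiple

print(f"~${market_cap_hit:,.0f} in market cap")  # ~$1,200,000
```

The chain reproduces the numbers as corrected on air: 300 people, 600 unsubscribes, $120,000 in lost annual revenue, $1.2 million in market cap.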

1:17

>> Right. Exactly. And, you know, I'm going

1:19

to see her uh tomorrow night, I think.

1:22

Um, tomorrow night. She's here in DC. We

1:24

should get them all to do things like

1:26

that. Let's let's reach into the celebs,

1:28

we know, and get them to do.

1:30

>> I'm going to bug them all.

1:32

>> Okay.

1:32

>> I like it. Thank you.

1:33

>> Yeah. If they do that and put even just

1:35

one thing up, it matters and it's an

1:37

easy thing for a lot of them. And they

1:39

>> what people don't realize about about

1:40

economic protests, the most famous one

1:42

was the Montgomery bus boycott. It wasn't

1:44

the one cinematic moment. It was it was

1:46

um an organization of thousands of car

1:49

pools over the course of a year.

1:50

>> Yeah.

1:51

>> So it takes it takes a while, but

1:54

>> any individual who unsubscribes

1:56

from OpenAI right now is taking

1:58

$10,000 out of their market valuation,

2:01

>> which is great. And that

2:02

>> and there's a substitute, the free

2:04

ChatGPT

2:05

>> and also all kinds of other free

2:07

services. Gemini, all the others, you

2:09

don't have to pay for it necessarily.

2:11

And by the way, you can use things for

2:13

free. You're taking stuff from them,

2:15

right, without paying them. Like paying

2:17

is the issue is what you pay for. So

2:19

just keep that in mind. Everyone's like,

2:20

"Oh, now I can't use Google." I'm like,

2:22

"No, it's free."

2:24

>> Well, just to give an example of how

2:25

these unsubscribes hit these

2:27

recurring revenue tech platforms or tech

2:30

companies, T-Mobile just had an earnings

2:33

call. They were projected to add 992,000

2:36

new subscribers. They added 962,000. So

2:39

30,000 fewer because 30,000 people

2:42

didn't show up to subscribe. So not

2:44

only do these do these actions punch

2:46

above their weight class in terms of

2:47

economic impact, if Sam

2:51

Altman grows his subscriptions 7 1/2%

2:55

versus 8% a month, he's not going to close

2:58

his $850 billion round.

3:00

>> Yep.

3:01

>> So this is literally this is the string.

3:03

If you don't have the time or

3:06

the energy to do some of the other very

3:07

important work, whether it's protests or

3:09

or calling your congressman, you can

3:12

have a massive impact by unsubscribing

3:14

right now.

3:15

>> Yeah, you can. Now, speaking of which,

3:17

and something the administration does

3:18

care about, Attorney General Pam Bondi,

3:20

who we'll talk about more in a minute,

3:21

was testifying in front of

3:24

Congress about Epstein on Wednesday. She

3:28

made it clear she'd prefer to be talking

3:29

about other things. What did she zero in

3:31

on? Let's listen. Dow is over $50,000. I

3:35

don't know why you're laughing. You're a

3:37

great stock trader, as I hear, Raskin. The

3:39

Dow is over $50,000

3:42

right now. The S&P at almost 7,000

3:46

and the NASDAQ smashing records.

3:50

Americans 401ks and retirement savings

3:53

are booming. That's what we should be

3:56

talking about.

3:57

>> Well, she's not the Treasury Secretary,

3:59

but this shows what they care

4:00

about. They really do. The fact is it

4:02

was inappropriate to bring this up in

4:04

here given they were talking about

4:05

sexual abuse victims, but

4:09

nonetheless, this is what floats their

4:11

boat is this money, right? And so,

4:14

let's also listen to a great idea one of

4:15

our listeners sent in.

4:17

>> Every child of an elderly person should

4:21

also go through all of their parents

4:24

subscriptions. I went through my

4:26

mother's this weekend and was able to

4:28

take $125

4:30

off of some bills by unsubscribing from

4:33

subscriptions she didn't even know she

4:35

had.

4:36

>> That is a great idea. I do that with my

4:37

mom all the time and I'm trying very

4:39

hard to take the New York Post off of

4:41

her subscriptions.

4:42

>> Two years after my mother died, I found

4:44

that Geico was still taking $220 out of

4:47

her bank account a month for car

4:49

insurance.

4:50

>> Wow. Crazy. If you don't, and I've used

4:52

this example before, when I unsubscribed

4:54

from AT&T, went to Noble, I'm saving

4:56

about 20 or 30 bucks a month, but in

4:58

addition, I found out

4:59

>> I had four accounts with AT&T for

5:01

Blackberries and iPads, which have been

5:03

in landfills for years, cuz I never went

5:05

on and unsubscribed them. And even

5:07

though they know they're not getting a

5:08

GPS signal from these things, and they

5:10

could send you an email saying, "Hey,

5:11

you know, you're paying 70 bucks a month

5:13

>> for something you haven't used in 5

5:15

years,

5:15

>> you're going to save money. These

5:17

companies are very good at figuring out

5:19

a way to get you to subscribe and get

5:20

you to forget that this

5:23

money is coming out of your pocket every

5:24

month.

5:24

>> Yeah. You know, there's a couple

5:25

services, and I don't have the names, that

5:27

show where your subscriptions are and let you

5:29

unsubscribe, but this is a better way to

5:31

do it. And you can use those

5:32

services to find them all over the

5:34

place. You'd be surprised at what you find.

5:36

I found an AT&T thing that was still from

5:38

when Apple first had the iPhone when

5:40

they had unlimited if you remember.

5:42

Anyway, uh it's a great thing to do.

5:44

Keep going. We're going to do more.

5:45

We're gonna pull every little thing we

5:47

can. The administration cares about

5:48

this issue. Uh it's the only thing left

5:51

is the Dow at this point. The fallout

5:53

from the

5:54

>> What's that $50,000? She's the [ __ ]

5:56

attorney general. She clearly knows

5:58

nothing about economics. What is she

5:59

talking about? She called the Dow.

6:02

>> I know. Also, also calling a

6:03

Representative Raskin. Who does [ __ ]

6:06

does she think she is? She's in his

6:08

house. She's in his house. She calls him

6:10

Raskin. I'm going to call you Galloway

6:12

when I use your house. Hey, Galloway.

6:14

Anyway, the fallout from the Epstein

6:17

files continues. Speaking of which, as I

6:18

mentioned, crazy Attorney General Pam

6:20

Bondi, who really needs to be medicated,

6:23

testified before the House Judiciary

6:25

Committee on Wednesday, and things got

6:27

heated. She soiled herself multiple

6:29

times. Um, Bondi sparred with

6:31

Democrats, not just Democrats, over DOJ's

6:33

handling of the Epstein files and

6:35

refused to apologize to survivors. She

6:37

wouldn't even look at them there. It

6:38

turned out she's never talked to them.

6:40

She also clashed with uh GOP Congressman

6:42

Thomas Massie. Massie criticized Bondi

6:45

and the DOJ for failing to redact

6:46

victims' names while blacking out the

6:48

name of businessman Les

6:50

Wexner. Let's listen to the exchange.

6:53

>> Within 40 minutes, Wexner's name was

6:56

added back

6:57

>> within 40 minutes of me catching you

6:59

redhanded.

7:00

>> Red hand. There was one redaction out

7:06

and we invited you in. This guy has

7:09

Trump derangement syndrome. He needs to

7:11

get You're a failed politician.

7:14

Uh really crazy crazy craziness I have

7:18

to say. I just don't know what to say.

7:19

She's What is wrong with her? Like, seriously,

7:22

speaking of derangement syndrome. Like

7:25

honestly I don't know what she was doing

7:27

up there. I know it's an audience of one

7:29

but he can't even find this impressive.

7:31

It's grotesque. I mean I don't know.

7:35

>> Yeah. It really feels like we have

7:38

>> the wheels are coming off. I mean it's

7:39

it's a shame because it's just so

7:42

>> it's a serious issue

7:43

>> weird and it's the attorney general

7:46

making a mockery of the institution and

7:48

just

7:49

um, no decorum. But I'm curious what

7:54

you thought about the hearings but the

7:56

moment that I found really chilling

7:59

was when I think it was representative

8:02

Jayapal

8:04

had um some of the survivors uh

8:08

stand up and asked how many of them have

8:10

reached out to the DOJ

8:12

>> to provide evidence or input, but all

8:14

these survivors stood up.

8:16

>> Yeah.

8:16

>> And it was clear they've reached out to

8:18

the DOJ and the DOJ has um

8:22

ignored them. And you thought, let me

8:23

get this, the Department of Justice

8:26

>> investigating what arguably may go

8:28

down as the crime of the century to date

8:31

>> and survivors and people with direct

8:33

knowledge about what happened or what

8:35

didn't happen. They could also, quite

8:36

frankly, they might exonerate some

8:38

people.

8:38

>> Right. Exactly.

8:39

>> They don't want to talk to them.

8:41

>> Right. Right. And she wouldn't look at

8:42

them. That was another moment. She

8:44

wouldn't turn around. She wouldn't do

8:46

it. This woman is insane. I just I

8:49

don't. She's a crazy one. It

8:52

was so strange. And And I know this

8:54

audience of one they always do, but in

8:56

this case, I was like, "Wow, you people

8:57

are desperate and terrified of what's

8:59

coming next for you." You know, I

9:01

thought Massie was effective. I thought

9:03

Becca Balint was effective. I thought

9:05

Jayapal was effective, Raskin, um I thought

9:08

they all did well. Someone who

9:10

works there said, "How do you think it

9:11

went?" And I said, "The only problem

9:13

with this kind of thing is you lie down

9:17

with pigs. The only people that

9:20

like wrestling with pigs are the pigs,

9:21

right? If you get in the mud with them."

9:23

But I thought they relatively

9:25

handled it well. It's just that the

9:27

craziness is what gets attention and not

9:29

the victims, right? It becomes a

9:30

ridiculous circus. And on some level,

9:33

what was interesting is Fox didn't show

9:35

it, right?

9:37

They're obsessed with the Nancy

9:39

Guthrie kidnapping, which is a terrible

9:41

thing, too. But they're not even airing

9:43

it. They don't want to show you the

9:45

crazy like and any normal person looking

9:47

at this would be like, "What? Honey, you

9:50

need some you need some therapy like

9:52

stat kind of thing." And you're you

9:54

know, and what happened to you? So, I

9:56

thought it was a really

9:58

interesting moment. This Epstein thing isn't going

9:59

away, Pam. I'm sorry. It's just not now

10:01

cuz it's so very clear that you didn't

10:03

do your job and neither did people

10:05

before you, by the way. But guess what?

10:07

>> It's a valid point.

10:08

>> You're in the chair now. I don't really

10:10

>> It's her It's her DOJ.

10:12

>> It's her DOJ

10:12

>> and her boss her boss is mentioned

10:15

>> in the Epstein files more times than

10:17

Jesus is mentioned in the Bible or the

10:19

term meth is mentioned in Breaking Bad

10:21

over eight seasons. And I felt like

10:23

every day, every time yesterday, she

10:25

>> she claimed that, you know, the

10:26

president had been the most quote

10:27

unquote transparent president. When she

10:30

uses the term transparent, I think some

10:32

somewhere there's a thesaurus filing for

10:34

protective custody. It's just

10:36

>> "Why are you laughing at me?" It was just,

10:41

it was so weird. It's

10:43

so weird. It's so culty. It's so

10:45

strange. One of the things I do think is

10:46

effective is a lot of these Congress

10:48

people are going in and seeing

10:49

unredacted versions which are very

10:50

upsetting. um

10:52

>> when they come out and they look like

10:53

they've seen a ghost.

10:54

>> I know. Even Cynthia Lummis, who I

10:57

didn't know was there, now. Whoa.

10:59

Whoa. Folks, like Cynthia Lummis, I'm so

11:02

glad she's leaving politics. But I have

11:04

to say, even someone like that who

11:06

literally puts in the least effort

11:08

possible. Um same thing. They're looking

11:10

like, "Oh my [ __ ] god, you're kidding

11:12

me here." And you know, nobody's

11:14

>> I got to be honest. I didn't I didn't

11:16

realize it was this bad.

11:18

>> Yeah. The more information you

11:20

read about this

11:21

>> Yeah.

11:22

>> in terms of the number of victims.

11:24

>> Yeah.

11:25

>> In terms of how many people were

11:27

involved, uh how many

11:29

>> how many opportunities there were to

11:31

stop it.

11:32

>> Yeah.

11:33

>> And it just gets the web keeps getting

11:35

deeper and uglier.

11:36

>> Yes. And the lies, like when Commerce

11:39

Secretary Howard Lutnick was on Capitol

11:39

Hill this week as well. He told the

11:41

Senate committee he and his family had

11:42

lunch on Epstein's island in 2012, but

11:44

insisted he did not have a relationship

11:45

with him. Of course, he had

11:47

given this sort of haha interview with

11:50

one of these right-wing outfits where he

11:52

said, "I never in that."

11:53

>> He was indignant. I was disgusted by him

11:56

and I said, "We're going to

11:58

have no contact with him again." And

12:00

here's the thing.

12:01

>> He took his kids. I took my four

12:03

kids and their nannies and I got all the

12:05

kids off the island. That by

12:07

>> But this is the thing. It's

12:09

usually not the

12:11

infraction itself. It's the cover-up. If

12:13

he had said,

12:14

>> why'd he say the first thing? That's

12:16

>> But if he had just said right up front,

12:18

he's a neighbor. He had powerful

12:20

friends. I didn't do the diligence I

12:22

should have. I went with me and my kids

12:23

to his island once cuz it sounded like

12:25

fun.

12:26

>> Yeah.

12:26

>> Okay. Poor judgment, but go along, get

12:29

along.

12:30

>> Instead of trying to wrap yourself in

12:32

some sort of indignation that you

12:33

immediately smelled a rat and you're

12:35

lying

12:36

>> and you decided to

12:38

>> I mean, if he had just come clean in the

12:40

beginning, said like, "Yeah, it was bad

12:42

judgment. Took my kids to his island,

12:43

had a lunch. I'd heard he was a big

12:45

philanthropist and who knows maybe it

12:47

was okay. All right. Bad judgment. Move

12:51

along. But again, it's the cover-up.

12:54

>> He had to take a laugh. He had to take a

12:56

I'm so pure laugh. And he that's cuz

12:58

he's a [ __ ] Let's let's be clear. This

13:00

guy's a [ __ ] And people are asking for

13:02

him to resign. He really is a liar. He's

13:03

a liar and a [ __ ] And it doesn't mean

13:06

he had to do anything, but he's a liar

13:08

and a [ __ ] The one that's

13:10

interesting under scrutiny is

13:12

entertainment executive Casey Wasserman.

13:14

Chappell Roan and other artists have cut

13:15

ties with Wasserman, as is their right,

13:17

after the latest files show he exchanged emails

13:19

with Ghislaine Maxwell and seemed to have some

13:21

kind of relationship with her, probably

13:24

extramarital, who knows. He serves as

13:26

chairman of the LA Olympics organizing committee and

13:28

appears to be holding on to that role.

13:30

They're backing him. There were other

13:31

names floated to take his place. Let

13:33

me be clear, for people, I'm not letting him

13:35

off, but it was 2003, before any of this

13:38

was known. He may have been able to pick

13:40

up on it. That's different. Um, but

13:43

this was well before the first

13:45

conviction, the first um sweetheart deal

13:48

that Epstein did in Florida. Um so

13:52

even he's under scrutiny and

13:54

people are cutting ties. And again, this

13:57

is the artists' right. They don't like

13:58

the cut of his jib. That's perfectly

14:00

fine. In his case, there's just the

14:03

blast zone of this is so far right. It's

14:06

really

14:07

>> it's so indiscriminate. And again, I go

14:09

to the following.

14:10

>> Yeah.

14:10

>> If we had an institution we could trust,

14:12

including the Department of Justice and

14:14

the institutions that actually assembled

14:15

these files, if they could go through it

14:17

and go, "Okay, there are three circles

14:19

here. There's people who either engaged

14:22

in, provided infrastructure for, or trafficked

14:25

and facilitated crimes, we are going to

14:27

release those names in the form of grand

14:29

jury indictments, and we're going to go

14:31

after these people." That's the headline

14:32

here. The Department of

14:34

Justice isn't supposed to ruin people's

14:36

careers. It's supposed to create an

14:37

incentive system where people follow the

14:39

law by prosecuting criminals and

14:41

exonerating people who are not guilty.

14:43

That is what they are there to do. And

14:45

then the second circle, and this is a

14:47

harder one, is okay, if a cabinet, if a

14:50

cabinet member has clearly lied in

14:52

their testimony or under oath, should

14:54

they release that information? Didn't

14:56

didn't commit a crime. This is Howard

14:57

Lutnik. Should the president, who has

15:00

not so far been accused of a crime, if

15:02

he's mentioned in this thing 6,000

15:04

times, should we release that

15:05

information? I think that is a really

15:08

important point. The biggest circle,

15:10

quite frankly, is, go, I have seen on

15:14

TikTok and on Instagram people talking

15:16

about models, how they talked about

15:19

going to a museum with Jeffrey Epstein

15:21

and we should no longer uh have anything

15:24

to do. They're trying to shame all these

15:25

people and it's like, you know what,

15:27

folks, that's just pure gossip. And

15:30

unfortunately, the ring light shaming of

15:33

all these courageous, virtuous people

15:35

when they're behind a a keyboard and

15:37

have much higher standards for other

15:38

people than they do for themselves, that

15:40

is distracting from what the Department

15:42

of Justice is supposed to do. And that

15:44

>> is to put pedophiles in prison.

15:47

>> Yeah, I would urge people to read. It

15:48

was really interesting. You know,

15:50

Kathryn Ruemmler, who's the legal

15:52

head of Goldman, you know, she

15:55

had a lot of emails and very chummy kind

15:57

of emails with Epstein going on for a

15:59

while. Um, I thought Bill Cohan did a

16:03

great job talking about why she was in

16:06

that relationship and most of it was in

16:08

fact she was professional. She's looking

16:10

for work, right? And that's a whole

16:12

different

16:12

>> guy who knows rich guys who can send me

16:14

a for wealth management.

16:15

>> Yes, exactly. So I would urge people to

16:17

read that. And again, one or two of them,

16:20

in one or two places, when he

16:22

said, oh, it was only prostitution, she

16:24

goes, that's just abusive, Jeffrey. Like,

16:27

she, unfortunately, he kept saying,

16:28

there are gifts, there was a business

16:30

relationship. I thought it was

16:32

actually a really um complex situation

16:36

that made me think, god, if she was a guy

16:38

and she did like golf with him, she'd get

16:41

off. Because she was a woman and was vaguely

16:43

flirty, kind of, she wasn't, like, it

16:47

was a great piece cuz it made me rethink.

16:49

I was like, okay, not great judgment,

16:53

right, should have known better, should

16:55

have stopped talking to him after the

16:58

first thing, um, but didn't. It

17:01

was just interesting, it made me

17:03

think a lot. I recommend Bill Cohan's

17:04

column in Puck, and I thought this was

17:08

this is his area of expertise in finance

17:10

and I thought, okay, I get it, this is why

17:12

she was there. He was trying to explain

17:14

why they haven't let her go, right? So,

17:16

I thought that was interesting. Anyway,

17:18

um speaking of uh um power, six

17:21

Republicans joined Democrats in the

17:22

House on Wednesday to vote for a

17:23

resolution aimed at ending President

17:25

Trump's tariffs on Canada. It's a

17:27

symbolic gesture, even if it clears the

17:29

Senate. Uh Trump would veto it, but that

17:31

didn't stop him from making threats.

17:32

Trump posted on Truth Social that any

17:34

Republican who votes against the

17:35

tariffs would seriously uh suffer

17:38

consequences come election time, and

17:39

that includes primaries. Uh I think he's

17:41

losing his grip, as they say. What

17:44

do you think?

17:44

>> Well, there's some new data that shows

17:46

this. So, the initial notion was

17:48

the tariffs would

17:50

uh mostly be paid by either

17:52

corporations, sort of a populist thing,

17:55

or the uh uh importer or excuse me, the

17:59

exporter themselves, the the country

18:01

would absorb it or whoever was sending

18:03

the products. It ends up, and there's

18:04

finally analysis, that 94%

18:07

of the costs have been borne by US

18:09

consumers and then the other 6% have

18:11

been borne by companies either deciding

18:12

to take a bit of a hit or the

18:15

importer themselves or excuse me the

18:17

exporter themselves reducing their

18:19

prices. You have about 15% of the

18:22

economy is um imports.

18:25

The tariffs average

18:27

around 20%, so that's 3%. Some managed to

18:30

get out of it. So, call it a 2% hit to the

18:32

economy, but the problem is it's an

18:34

unnecessary 2% hit to the economy. To be

18:37

fair, it hasn't had the catastrophic

18:39

effect a lot of people thought it was

18:40

going to have, but in a weird way.
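
Scott's tariff arithmetic can be sketched the same way; the import share, average tariff rate, and the 94% consumer-incidence figure are all round numbers from the conversation, not official statistics:

```python
# Rough sketch of the tariff-cost math above.
# Every input is a round number quoted in the conversation, not official data.
import_share = 0.15     # imports as a share of the economy (~15%)
avg_tariff = 0.20       # assumed average tariff rate (~20%)
consumer_share = 0.94   # share of tariff costs borne by US consumers

gross_hit = import_share * avg_tariff  # 3% of the economy if fully passed through
net_hit = 0.02                         # "call it 2%" after some tariffs are avoided
consumer_hit = net_hit * consumer_share

print(f"gross {gross_hit:.1%}, net ~{net_hit:.0%}, consumers bear ~{consumer_hit:.1%}")
```

The point of the sketch is that 15% of the economy times a 20% average tariff gives the 3% gross figure, which avoidance trims to the "call it 2%" hit quoted above.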

18:44

>> Well, if Yeah, it's just

18:46

>> I feel it myself and the shelves are

18:48

emptier. It's weird. I never have

18:50

noticed that.

18:50

>> Well, but why reduce people's prosperity

18:52

by 2% for no real reason? It doesn't

18:54

cause growth. It doesn't cause

18:55

innovation. And all it's doing is is is

18:58

urging or reconfiguring the supply chain

19:01

>> around the United States. The EU is

19:02

entering into an agreement with

19:04

Mercosur. There are all kinds of new trade

19:06

zones being opened up such that people

19:08

are not as reliant on the US. And a

19:10

weird a weird thing though is that if

19:13

his tariffs are overturned

19:16

by the Supreme Court or by the

19:18

Congress, I actually think the markets

19:20

will rip. So, in a weird way, it could

19:23

end up it could end up helping him if if

19:26

these things are turned back. I think

19:28

the markets will scream if these tariffs

19:31

are found to be uh illegal.

19:33

>> Yeah. Well, we'll see. And although

19:35

apparently he's got all these plans to

19:36

put other kinds of fees in place to take

19:39

their place, that they'll

19:41

have to go back to court to stop him

19:43

on. He's doubling down. This is

19:45

something he's talked about for years.

19:46

So, I don't know if he's going to back

19:47

off so quickly and take the

19:49

>> take the victory here. He'd like to take

19:52

the L. Honestly,

19:54

>> I don't know. No, they you know that

19:57

lunatic Peter Navarro talks about how

19:58

we have a whole bunch of

20:00

things that'll happen if the Supreme Court

20:02

>> overturns this. What's taking the

20:04

Supreme Court so long by the way?

20:06

Anyway, uh we'll see what happens. I do

20:07

think on the broader sense that there's

20:09

lots more um Republicans willing to push

20:12

back because their own political

20:14

survival is not linked to Donald Trump

20:16

as much anymore. The other thing is it

20:18

looks like they may lose control of the

20:19

House.

20:20

>> That's right. Another person's

20:21

resigning, right?

20:23

>> So, you know, they're one they're one

20:25

sick person away from having the

20:28

Democrats in control. So, it's a really

20:30

interesting time. He doesn't have it. The

20:32

power is slipping away and that's

20:33

why you get screamy Pam and this nonsense and

20:36

stuff. So we'll see more of that I

20:38

think. Um okay Scott let's go on a quick

20:40

break. When we come back, social media on

20:42

trial, a very important case.

20:45

Support for Pivot comes from Anthropic.

20:48

There are bumps in the road. The ones

20:50

you can just throw a band-aid on and be

20:51

done with it. And then there are the

20:53

bigger problems. The ones where you

20:55

really have to stop and think through.

20:57

The ones when you finally crack it feels

20:59

unbelievable. And for those problems,

21:01

you're going to need a partner to help

21:02

you understand where you're at, where

21:04

you're going, and how you're getting

21:05

there. Claude from Anthropic is that

21:08

partner. Claude is the AI for minds that

21:10

don't stop at good enough. It's a

21:12

collaborator that actually understands

21:14

your entire workflow and thinks with

21:15

you, whether you're debugging code at

21:17

midnight or strategizing your next

21:19

business move. Claude extends your

21:21

thinking to tackle the problems that

21:22

matter. Plus, Claude's research

21:24

capabilities go deeper than basic web

21:26

search. It can use comprehensive,

21:28

reliable analysis with proper citations,

21:30

turning hours of research into minutes.

21:33

Ready to tackle bigger problems? Start

21:35

with Claude today at claude.ai/pivot.

21:39

That's claude.ai/pivot.

21:41

And check out Claude Pro, which includes

21:43

access to all the features mentioned in

21:45

today's episode. Claude.ai/pivot.

21:56

Support for the show comes from

21:57

CoreWeave. AI isn't just a new tool. It

21:59

encompasses so much more. It's spurring

22:02

a revolution across all industries and

22:03

reshaping itself to become a big part of

22:05

our future together. CoreWeave is at the

22:07

center, powering some of the biggest

22:09

names in AI. As the essential cloud for

22:11

AI, CoreWeave provides an AI platform

22:13

that combines next generation

22:15

infrastructure, intelligent tools, and

22:16

expert support. It's powering the

22:18

world's most complex AI workloads faster

22:20

and more efficiently. From medical

22:21

research and diagnosis to education,

22:23

from complex visual effects for movies

22:25

to breakthroughs in science and

22:26

technology. If it's AI, CoreWeave is

22:29

uniquely ready to power it with

22:30

purpose-built tech. The big ideas, the

22:32

wild visions and what-ifs and why nots.

22:35

CoreWeave is working to build what's

22:37

never been built before. CoreWeave is

22:39

the essential cloud for AI. Ready for

22:41

anything, ready for AI. To learn more

22:43

about how CoreWeave powers the world's best

22:44

AI, go to coreweave.com/refor.

22:54

Scott, we're back with more news. A

22:56

landmark social media trial got underway

22:57

this week with Meta and YouTube accused

22:59

of deliberately designing their

23:00

platforms to addict young users. The

23:02

lawsuit is the first of more

23:04

than 500 similar cases to go to trial.

23:06

This is something that's been building

23:07

for a long time. The plaintiff's lawyer

23:09

is arguing that his client, a

23:10

20-year-old woman, got hooked on these

23:12

apps as a kid because they were like

23:13

digital casinos delivering dopamine

23:15

hits. Instagram head Adam Mosseri uh

23:18

testified on Wednesday that he doesn't

23:19

think users can be quote clinically

23:21

addicted to the app. Adam Mosseri is not a

23:24

doctor, just so you know. I can't

23:26

believe he said that. It was kind of a

23:28

mistake on his part. Meanwhile, and also

23:30

he's wrong. Meanwhile, YouTube is

23:32

arguing it's not social media, it's an

23:34

entertainment platform like Netflix and

23:36

it's not addictive. That is also not

23:38

true. The jury, anyone who has kids

23:40

knows that. Uh it's very different from

23:41

Netflix. I mean, it's

23:44

become more like Netflix recently, but

23:46

it's also an addictive situation. The

23:48

jury trial is expected to last six to

23:50

eight weeks with Mark Zuckerberg and

23:51

YouTube's Neal Mohan expected to testify.

23:54

This is a really important trial. The

23:56

big names are coming out and talking

23:57

about an issue you and I have talked

23:59

about for years. Um, what what are the

24:02

actual effects and who is responsible

24:04

for creating an addictive product? And

24:07

I'm sorry, Adam. I'm not a doctor

24:08

either, but any fool will tell you it's

24:11

anyone, not fool, any person will tell

24:13

you it's addictive. Anyone who uses it.

24:15

Um, and you design and there's so much

24:17

proof that you've designed it like a

24:19

casino or a cigarette or whatever it

24:21

happens to be. Thoughts?

24:22

>> Well, imagine you're 14 and you

24:26

go into your room and if you were like

24:28

me, your mom wasn't home until 6:00 or 7

24:30

p.m. and you're home alone.

24:31

>> Gilligan's Island

24:32

>> and yeah, it was Bugs Bunny and

24:34

Gilligan's Island and I Dream of Jeannie

24:36

for me. But what if in the corner there

24:37

was a casino? What if there was an

24:41

arcade? What if there was?

24:45

What if there was unlimited music? What

24:47

if there And then you say, "No, no, no.

24:50

Study. What if there was the high school

24:52

cafeteria where I could say something

24:54

mean about someone else or someone could

24:57

say something mean about me and all I

24:58

could think about the rest of the day

24:59

and night was what they were saying

25:00

about me?" That the high school

25:02

cafeteria never left. And it ends

25:06

up that about 6% of teenagers are

25:08

clinically addicted, or meet the

25:11

clinical definition of addicted to

25:12

either drugs or alcohol. But under

25:15

those same standards, 24% are

25:17

addicted to social media. And just some

25:20

data, the average American teen

25:22

spends 4.8 hours a day using social

25:24

media. 16% of teens or one in six use

25:27

TikTok almost constantly, 15% for

25:30

YouTube, 13% for Snap, 12% for

25:33

Instagram. And roughly half of all teens

25:35

report feeling addicted to social media.

25:38

And you say, well, okay, fine. What's

25:40

the impact? Teens who are in the highest

25:43

use group expressed two times more

25:45

suicidal intent or self harm than those

25:47

in the lowest use group. And the highest

25:50

use group also expressed poor body

25:52

image at three times more than the

25:56

lowest use group. And it typically takes

25:58

America 20 to 30

26:01

years to respond to really negative

26:04

externalities. Took us 30 years with

26:06

tobacco. It took us 20 years with

26:07

opiates. And if you think about social

26:10

going mobile in 2012, 20 years is

26:14

probably the right number. I think when

26:15

I mean, parents always ask me, what

26:17

should I do with my kids and I say how

26:18

old are your kids? And if they say three

26:20

or five, I'm like, we'll have it figured

26:21

out by then because the data here is so

26:24

overwhelming and we're up against uh

26:28

intransigence and people trying to delay and

26:30

obfuscate, similar to those tobacco

26:32

executives and they have more money and

26:33

they're more skilled this time. But

26:35

eventually the tide, the tsunami of

26:38

parental concern here, you know,

26:41

understandable parental concern is

26:43

washing over all this [ __ ]. So

26:46

I would say, I mean, you have

26:48

entire countries now age gating. Look at

26:50

what Australia is doing. I think another

26:53

two to three years I'm hopeful the

26:55

landscape's going to be much different

26:57

for children. The remedies would be

26:59

warning signs. There are lots of remedies,

27:01

like with cigarettes: age

27:03

gating, warning signals; um, they check

27:07

ages; legal liability. The age checking is

27:10

harder

27:11

>> What every other substance

27:13

company, manufacturer, and media company

27:15

is subject to

27:16

>> They've got to be kidding. You know,

27:18

there's so much. They have so many

27:20

emails of them talking about this that's

27:22

the problem for Adam. I'm sorry to say

27:24

this, but he doesn't think it's clinically

27:25

addictive. Come on, Adam, come on. We all

27:28

think it is. The problem is every

27:30

adult knows this in their bones, right?

27:33

It's like

27:33

>> cuz we're addicted.

27:34

>> We're addicted. Like, we are. It's a

27:36

problem. You cannot put it down. And it

27:38

is different from television. It is very

27:40

different. And television. Listen,

27:42

Gilligan's Island is addictive enough. I

27:44

can't believe I watched all that [ __ ]

27:45

But you can walk away from it in a way.

27:47

You cannot walk away from this. I

27:50

find myself... I have to throw the

27:51

phone across the room, right? Sometimes

27:54

I'm like, "Put it down." Um, you know,

27:57

Amanda, same thing. It's

27:59

really interesting. And sometimes I

28:01

think about it. I'm like, I like news

28:03

and I'm mostly reading news,

28:05

but I don't stop. That's the difference.

28:07

I put down magazines. I put down

28:09

newspapers. And I love news. So, this is

28:12

the thing: all this stuff as it gets out, as

28:14

you see the emails inside the company

28:17

talking about it. And especially early

28:20

on, they knew just what they were doing.

28:21

And um perhaps they weren't meaning to

28:24

be malevolent at the beginning, but it's

28:26

malevolent for many young people and the

28:28

impact is huge. And then they just keep

28:30

doubling down with AI relationships and

28:32

synthetic relationships and everything

28:34

else. The time has come round at

28:37

last for these companies. We'll see how

28:38

this trial does, but it's

28:41

going to just uncover more

28:43

and more about what they knew. very much

28:45

like the cigarette companies.

28:46

>> When you have hundreds of billions of

28:48

dollars in shareholder value, trillions

28:50

of dollars of shareholder value, hundreds of

28:51

billions in revenue, some of

28:53

the brightest people in the world, and

28:55

trillions of data points, all trying,

28:56

all aiming towards one thing. How do we

28:58

get people to spend one more second

29:02

every day on social and less time

29:06

somewhere else, whether it's sports,

29:08

friends, studying, sleep, and they're

29:10

winning. And young people, especially

29:12

young men, who have this tremendous flaw

29:13

in their brains where they're constantly

29:15

dopamine-hungry, they're up against an

29:17

indomitable foe. And then the other

29:21

>> like sugar. It's like sugar. It's the

29:22

same thing. It's the same.

29:23

>> And then there's two or three. But your

29:25

kid can take, you know, a 10-pound

29:27

bag of sugar into his bedroom with

29:29

him.

29:30

>> The other kid... my kid could, but

29:33

go ahead.

29:34

>> The other two things, a cumulative

29:36

effect that I think has really hurt our

29:37

youth, are: one, I do think parents have

29:40

some culpability here and that is we

29:44

have decided that our job is to clear

29:46

out all barriers and obstacles for our

29:48

kids. We engage in concierge and

29:49

bulldozer parenting and by the time the

29:51

kid gets to college he or she has never

29:53

had a C or a disappointment

29:55

>> and we've created this princess-and-the-pea

29:58

generation with good intentions. We

30:00

thought we were doing our kids a good

30:02

thing. And then something that doesn't

30:03

get talked about a lot, but I absolutely

30:06

think is adding up to a generation that

30:09

is at a disadvantage and that is if you

30:11

are 21, since the age of 10, the person

30:15

you are supposed to look up to most in

30:17

the world is Donald Trump. So

30:21

performative virality, coarseness, and

30:23

cruelty,

30:25

>> online scams,

30:27

>> crypto, doubling down on lies. This has

30:31

been the role model

30:33

>> as kids brains are being wired during

30:36

puberty. And no matter who is president

30:38

or what you think of the office,

30:40

>> president is the person that millions of

30:44

young Americans look to as the

30:46

ultimate of success in American values.

30:49

So what have we done? We've raised a

30:51

generation of kids who are dopamine-hungry

30:53

and their primary role model maybe with

30:55

a close second the richest man in the

30:58

world

30:58

>> the greatest troll of all time

31:00

>> are exhibiting values that are very

31:04

>> I mean, and what do you know, these

31:06

21-year-olds... it's shocking, it's

31:09

shocking what good people they are, given what

31:11

they have to deal with

31:12

>> I would agree. I think they do resist

31:14

more than you think and actually there

31:15

are a lot of parents. One of the things I

31:17

spent a lot of time doing with my kids

31:19

whenever, like, can you go get this

31:21

from me? Can you talk to that person if

31:22

they wanted something, I'm like, you

31:23

need to do it. "You figure it

31:26

out" became one of my lines with my

31:29

kids, my older kids. You figure it out.

31:31

I do it with my younger kids now. With

31:33

Saul, I'm like, you figure it out. I

31:34

don't know. I know, but you can do it

31:36

yourself. And so that's the best

31:38

piece of advice you can give to like a

31:40

kid. You

31:41

>> I've started giving my kid pounds when

31:43

he gets good grades. Is that wrong?

31:45

>> I slip him a 20-pound note. And I

31:48

slip him a note when he gets an A on a

31:49

test.

31:50

>> Do not do that. Well,

31:51

>> totally.

31:52

>> No. No.

31:53

>> Anyway, um

31:54

>> That's called capitalism.

31:56

>> Okay.

31:56

>> You got to get a bunch of money.

31:58

>> Okay. All right. Whatever. Whatever you

32:00

want to do there, Scott. We should write

32:01

competing parenting books. Uh in the

32:03

same genre about surveillance, as you

32:05

know, that's another thing I go crazy

32:06

about. Um investigators in the Nancy

32:09

Guthrie abduction case have recovered

32:10

footage from the Nest doorbell. Nest is

32:12

owned by Google. It was initially

32:13

thought to have no video because there

32:14

was no active subscription. When you

32:16

sign up, for people who don't

32:18

know, for Nest or any of these things,

32:20

you can buy a subscription. If you

32:22

don't, they say they don't keep the

32:23

video. As it turns out, they do. The

32:26

incident shows that Nest uploads video

32:28

to Google Cloud before you decide to

32:30

keep it with a paid plan so it can

32:31

linger after it says it's been deleted,

32:34

or is supposed to be deleted. I'm glad they

32:37

got these pictures of this guy. At the

32:38

same time, this is an edge case. They're

32:41

keeping your video, which

32:43

everyone thought they were doing

32:45

and they said they weren't. The FBI

32:47

working with Google engineers took 10

32:48

days to recover the footage from

32:49

Guthrie's camera. The companies need to

32:52

spell out in plain English how long

32:55

deleted footage actually remains on

32:57

their servers. And by the way, they're

32:58

also getting incredible pushback for

33:00

the Ring ad for the Super Bowl, which is

33:02

like, "We're watching everybody, but

33:04

only for your dogs." And there's been a

33:06

million memes about only for people we

33:08

need to take away. Like, the surveillance

33:11

of these kinds of things, and the ease

33:14

with which they are hacked, by the way, not

33:16

just taken off the door like this

33:18

terrible person did um but hacked into

33:21

are quite something. A lot of people are

33:22

getting them hardwired into their house

33:24

so that they can't do that and also so

33:27

that they can't be, um, taken via wireless.

33:31

there's a lot of wireless activity here

33:32

but there are ways to. A lot of these

33:35

things are open season on your home. I

33:38

don't... Just speaking of my son, my

33:40

kids: Alex... I had one of them up at

33:43

one of our houses when we bought it. It

33:45

was there, one of these Amazon or Echo

33:47

or whatever. He took them all out.

33:49

One day I came back and everything

33:50

was gone. And I was like, "Why?" And he

33:53

goes, "Because they can watch us." And I

33:55

was like, "Don't be paranoid." He goes,

33:56

"I'm not." And he was right. So

33:59

>> I think we have a bit of a

34:01

different view on this in the sense that

34:03

I think we gave up

34:05

our privacy a long time ago. Yes, Scott

34:07

McNealy, we did

34:08

>> what I want to see. Oh, remember Scott?

34:11

>> Yeah, he said that

34:12

>> privacy doesn't exist. Get used to it.

34:14

Remember,

34:14

>> if you are in London or New York, you

34:17

can't go more than I think it's 30 feet

34:19

>> Agree.

34:19

>> outside without a camera.

34:21

>> And the reason they did that was they

34:23

implemented massive... they have like a

34:25

security headquarters because of 9/11.

34:28

And I actually what I think you need

34:31

though is really, really well-thought-out

34:34

laws and institutions that say we're not

34:38

going to go fishing unless it's a felony

34:41

crime. We don't investigate it.

34:43

>> In other words, people have the right

34:45

You said something I've thought about a

34:46

lot and that is people have the right to

34:48

have secrets.

34:49

>> Yeah.

34:50

>> And if you want to if you want to go

34:52

into a store, if you're I don't know,

34:55

you should be able to do what you

34:57

want. If you murder somebody then quite

35:00

frankly, there's

35:03

enough evidence to say that you are a

35:07

reasonable person of interest then we

35:10

are going to utilize, uh, cameras, data,

35:14

video footage

35:15

>> I agree with you. I just think: you buy

35:16

this product and it says it isn't

35:18

keeping it if you don't pay for it, then

35:20

it's not keeping it. Like, I'm sorry,

35:22

that's just the deal. That's just the

35:24

deal when you buy. I have several of

35:25

these and I've taken most of them off my

35:27

house, but they say X, and I pay a lot

35:30

of attention. If you don't pay, this

35:33

stuff is deleted. This is deleted. If it

35:36

says it's deleted, it should be deleted.

35:38

That's all. It's just the deal you make

35:40

with them. And so I don't think they

35:41

should keep it if it's supposed to be

35:43

deleted. Same thing with Echo. It

35:45

shouldn't be listening if it says it's

35:47

not listening. Right. That's the deal:

35:49

if you want it to listen, you can tell

35:51

it. That's in your home. I'm talking

35:53

about this outside. I think we've lost

35:55

that battle. Their

35:56

cameras are everywhere. Talk about

35:58

London. Monte Carlo is really wired. So

36:00

is the United States of America. And

36:02

that's a good thing when it comes to

36:04

crime, but it's a very bad thing when it

36:06

comes inside of your house. Cuz Scott, I

36:08

know if you want to wear your frilly

36:10

underwear... Oh, wait. Was that

36:12

a secret? Um, I back people's privacy in their

36:15

homes. I'm just

36:16

>> Daddy goes commando. Big and the twins

36:19

want to be free. But I think

36:22

in this case it was good to be able to

36:24

get the picture of this guy. At the same

36:26

time, the intent wasn't to do that.

36:30

So: plain English about what you're doing

36:33

and how long it remains and then it

36:34

should tell you when it's deleted and

36:36

permanently deleted. If they say

36:39

permanently deleted, it needs to be

36:40

deleted. That's... I feel like that's

36:42

>> at some point you should be able to

36:44

have, you know, I have cameras around my

36:45

house. You can see almost everything.

36:47

I try to sneak in all the time

36:48

>> if someone were to break in. But I think

36:51

what you want is like

36:54

this is the hack that I think is coming.

36:58

Somebody hacks into Uber, into your Uber data.

37:01

If you use Uber a lot, I think you can

37:03

find out, with a thin

37:05

layer of AI on top of your Uber trips

37:06

where

37:06

>> they go.

37:07

>> Mhm.

37:08

>> They'll be able to know if you just

37:09

terminated a pregnancy.

37:10

>> Yep.

37:11

>> Or if you're a Russian spy. Why is this

37:13

person continually going to the Russian

37:14

embassy? Why are you having

37:17

same-sex affairs?

37:19

A thin layer of AI on top of your ride

37:22

history when and where you are going

37:24

places.

37:26

>> It would be very

37:28

easy to say, okay, this person is

37:31

clearly suffering from diabetes. This is

37:34

why they keep going to this

37:36

>> type of clinic. You could. This person

37:39

is clearly engaged in a love affair with

37:44

this dude at this address. This person

37:48

is clearly

37:49

>> sure is constantly going to Amtrak. But

37:50

go ahead.

37:51

>> This person is clearly cooperating with

37:53

the CIA as evidenced by the fact they

37:55

keep going to this one address that is a

37:58

co. They could find out. So that hack,

38:04

folks, this is this is the trade we all

38:07

make and we all talk a big game. Anyone

38:09

who talks about privacy is typically

38:11

over the age of 50 and in Brussels or

38:12

DC. We consistently trade our privacy

38:16

for utility.

38:18

>> Yep, we do.

38:18

>> And what I want is massively

38:24

Okay. Unless it's a felony, maybe even

38:26

more than that, a felony

38:29

that has a threat of violence and

38:31

there's really strong evidence against

38:33

one person, all that [ __ ] is off limits.

38:36

No one can use it.

38:37

>> All I'm saying is if they say it's off,

38:39

it needs to be off like

38:40

>> or at least give you the power to delete

38:42

it.

38:42

>> It's like if you buy, I don't know, an

38:45

organic apple, and it's not organic. You

38:47

can't do that. I mean,

38:48

>> it's the same thing. You're selling a

38:50

product, you say what it is, stay with

38:52

what you say it is. But at the same

38:53

time, I love the fact that, okay, when

38:56

there's a crime,

38:57

>> Crime is dropping despite all the

38:59

scariness and everyone saying whether

39:00

it's,

39:02

you know, Eric Adams or Mamdani, or

39:04

it's bedlam in the streets,

39:06

>> crime, the number of shootings in New

39:08

York last year, I think, hit like an

39:09

all-time low.

39:11

>> Violence is going down, and violent

39:13

crime has consistently gone down the

39:16

last several decades. Is it

39:17

because we're a better people? I don't

39:19

think so. It's because if you commit

39:21

crimes now, everyone has seen those Law

39:23

& Order: SVU episodes

39:25

>> that if you if you go into a 7-Eleven in

39:28

the middle of [ __ ] nowhere

39:30

>> and shoot the clerk,

39:32

>> they're filming

39:32

>> ATMs have cameras. So, were there any

39:35

ATMs outside? Then they check the

39:37

footage on the ATM. I don't like

39:39

a surveillance state. I

39:42

like a place where really strong

39:45

lawyers consistently

39:47

say, "I get that you think a crime was

39:49

committed here. There's not enough

39:50

evidence. You do not have access to this

39:52

video.

39:53

>> Right.

39:53

>> Stop. There's evidence that you're

39:56

planning a terrorist attack. Sorry,

39:58

boss. We're violating your privacy

39:59

rights. Every Ring light, but we have

40:02

Uber. I still think we have

40:05

due process. We can't have the wrong

40:07

people getting a hold of stuff. Anyway,

40:09

I hope they find Nancy Guthrie, and I

40:12

hope it helps that they have these, but

40:15

it brings up a big issue

40:17

about surveillance and we should pay

40:19

attention to it. Um, and I hope it helps

40:22

find her and bring her home safely

40:24

to her family. Um, anyway, let's go on a

40:27

quick break. When we come back, uh, we'll

40:29

talk about the latest in AI news.

40:31

There's a lot of it.

40:33

Support for the show comes from Indeed.

40:35

Hiring isn't just about finding someone

40:37

willing to take the job. It's about

40:38

connecting with someone who can move

40:40

your business forward. For that, check

40:42

out Indeed Sponsored Jobs. Indeed

40:44

sponsored jobs boosts your job post for

40:46

quality candidates so you can reach

40:48

people that can help your business

40:49

thrive. People are finding quality hires

40:51

on Indeed right now as we speak. In the

40:53

minute I've been talking to you, 27

40:55

hires were made on Indeed, according to

40:56

Indeed data worldwide. Join the 3.3

40:59

million employers worldwide that use

41:00

Indeed to connect with quality talent

41:02

that fits their needs. Spend less time

41:04

searching and more time actually

41:06

interviewing candidates who check all

41:07

your boxes. Less stress, less time, more

41:09

results now with Indeed sponsored jobs.

41:12

And listeners to this show will get a

41:13

$75 sponsored job credit to help get

41:16

your job the premium status it deserves

41:18

at indeed.com/pivot.

41:20

Just go to indeed.com/pivot

41:22

right now and support our show by saying

41:24

you heard about Indeed on this podcast,

41:26

indeed.com/pivot.

41:28

Terms and conditions apply. Hiring? Do it

41:30

the right way with Indeed.

41:36

Support for this show comes from Quince.

41:39

Style doesn't come from chasing new

41:41

trends every season. Real style comes

41:43

from slowly and intentionally

41:44

cultivating a wardrobe filled with

41:46

highquality staples that will last. And

41:47

if you're on the lookout for a perfect

41:49

addition to your closet, look no further

41:50

than Quince. You'll find organic cotton

41:52

sweaters, polos for every occasion,

41:54

light jackets that will help keep you

41:55

warm as the seasons change year after

41:57

year. Not to mention their famous 100%

42:00

Mongolian cashmere. If there's anything

42:01

better than cashmere, I'd love to hear

42:03

it. Every Quince item is built for

42:05

everyday wear and made with ethically

42:07

sourced materials from top factories.

42:09

And by partnering with manufacturers

42:10

directly, Quince keeps things affordable.

42:12

So, you're only paying for the quality

42:14

clothing and not the brand markup. I

42:16

have finally bought new Quince clothes,

42:18

not just uh soft pants that I can wear

42:20

when I do sports. I actually bought more

42:22

of those, but I also bought a lovely

42:23

cardigan that is so soft. I wear it all

42:25

the time. I fell asleep in it the other

42:27

day. I bought a beautiful jacket and I

42:29

just love it. I have to say this this

42:31

cardigan I'm wearing is so comfortable.

42:33

It's really good-looking. The fabric,

42:35

everything else, it feels richer than it

42:37

was. Um, and the same thing with the

42:39

coat. It's really good-looking and I

42:40

really like wearing it. Again,

42:42

comfortable, simple, uh, and just

42:45

lovely. I really, really like it.

42:47

Refresh your wardrobe with Quince. Don't

42:49

wait. Go to quince.com/pivot

42:51

for free shipping on your orders and

42:53

365-day returns. Now available in Canada,

42:56

too. That's quince.com/pivot

43:00

to get free shipping and 365-day

43:03

returns. quince.com/pivot.

43:08

Scott, we're back with more news. Time

43:09

for rapid fire of AI news. First up,

43:12

Anthropic is in the final stages of

43:13

raising $20 billion in new capital at a

43:15

valuation of $350 billion

43:19

And also at Anthropic, a

43:21

researcher submitted a resignation

43:23

letter saying the world is in peril,

43:25

saying employees constantly face

43:27

pressures to set aside what matters

43:29

most. That researcher is going off to

43:31

write poetry, by the way, which should

43:33

trouble you. Over at xAI, Elon Musk has

43:36

lost two co-founders, Jimmy Ba and Tony

43:38

Wu. Both announced their departures. There's a

43:40

big restructuring over there too

43:42

as he's brought it into, uh, SpaceX.

43:45

At OpenAI, the company's fired

43:47

an executive after she opposed plans for

43:49

an AI erotica feature in ChatGPT, citing

43:52

sexual discrimination. We don't actually

43:54

know what happened here. Uh Anthropic

43:56

raised twice the

43:59

funding initially sought based on

44:00

investor demand. Uh so thoughts on any

44:03

of these stories? Lots

44:05

of stuff happening around AI again.

44:07

>> Yeah, why people get fired, or why

44:10

they say they were fired? I don't know.

44:11

I haven't sorted through that. What I

44:13

think has already happened, whether it's

44:15

reflected in the valuations or not. I

44:18

think Anthropic is now worth more than

44:19

OpenAI. I think OpenAI

44:22

>> what was their valuation? 800 billion.

44:24

>> Well, they're I think they're trying to

44:25

close a round at 850.

44:26

>> Yeah. 850.

44:27

>> But that one VC... if there was

44:32

a moment where the balloon was

44:34

burst, if you will, the bubble was

44:35

burst. It was when that VC had Sam Altman

44:37

on his podcast and said, "You've made a

44:40

trillion dollars in spending commitments

44:41

on a company with 20 billion in revenue.

44:43

How are you going to do that?" And he

44:45

got very defensive about it. And they've

44:47

gone consumer; Anthropic's gone

44:50

enterprise. Uh they haven't made the

44:53

kind of crazy commitments. I think

44:56

there's been kind of the mother of

44:58

all industrial pivots. I think now, if

45:01

you will, Avis is now Hertz. I think

45:03

Anthropic is now worth more or will be

45:06

soon than Open AI.

45:07

>> They are not making the money.

45:10

>> Yeah, but they're

45:12

stronger in the enterprise. Anyways, I

45:14

none of this makes any sense in terms of

45:16

a multiple on revenues, but uh I think I

45:19

think OpenAI is in real

45:23

um I don't know, crisis is the wrong

45:25

word. There's a lot of argument over on

45:28

X about what they have and don't have now.

45:30

His big thing was, "I have the best AI

45:32

researchers." Now he does not, right, from

45:35

what most intelligent people are

45:37

saying about it. But you know, he always

45:40

does this: he always goes in and shakes

45:42

the tree and then shakes the tree again

45:44

that's his M.O. I guess they're a

45:47

distant, what, third or fourth, something

45:49

>> Well, these guys are all... here's a

45:53

symbol of how easy it is and how

45:55

difficult or how vulnerable they are. It

45:58

says, here are some: Dario and Daniela

46:00

Amodei,

46:02

uh, were at OpenAI, now at Anthropic.

46:05

Ilya Sutskever, OpenAI, now at Safe

46:08

Superintelligence. Aravind Srinivas,

46:11

OpenAI, now at Perplexity. Mira Murati,

46:13

OpenAI, now at Thinking Machines.

46:15

Arthur Mensch was at Google, now at

46:18

Mistral AI. The brightest minds here

46:21

are supposedly in play. I used to work with a

46:24

lot of luxury brands and they said the

46:25

biggest problem they were having in

46:26

China

46:27

>> Mhm. is that at the biggest malls, if

46:29

Prada was had a store across the street

46:31

from Bautega Vanetta,

46:33

>> if if the manager of that Prada didn't

46:36

have people show up, he could go across

46:38

the street during the lunch hour to the

46:41

lunch court and offer someone 11 bucks

46:43

an hour from the Bautga store who was

46:45

making 10 and they wouldn't even go back

46:47

after their lunch break. They would go

46:49

over and start working. It was just so easy

46:51

to pick off people by offering them a

46:52

dollar more per hour. And it feels so

46:55

many of these people who

46:57

have, you know, fairly or unfairly have

47:01

established themselves as some of the

47:03

few minds that really understand this

47:04

stuff. The amount of money and

47:07

temptation to go do their own thing or

47:09

join another firm is enormous. I mean,

47:12

supposedly, weren't there reports that

47:15

Zuckerberg was paying some people $100 or

47:17

$300 million and then he wasn't.

47:19

>> I mean, it just feels like it's total, I

47:22

don't know, bedlam right now. Right. It's

47:24

it's they all think they're going to be

47:26

the one, right? I'm going to be the

47:28

final one standing and I'm going to own

47:29

the world essentially, which is a bet.

47:32

It's a bet, right? I think one of the

47:34

things that continues to plague these

47:35

companies are these researchers who are

47:37

like, we're [ __ ] everybody. Like they

47:40

come out and almost, you know, like

47:43

they're sort of like, shh, it's going to

47:45

kill us,

47:46

>> I think. But quite frankly, Kara, I think a

47:47

lot of it is people

47:51

backfilling

47:53

uh, the reason why they're

47:55

leaving with morality sometimes or some

47:57

sort of victimhood. If you look at just

48:00

to go back to musical chairs here, if

48:02

you look at xAI, the company lost its

48:04

second co-founder in just two days. And

48:06

that means that half of xAI's founding

48:09

team, six of the 12, have left the

48:11

company in less than three years of

48:13

existence. And Musk said, you know, we

48:16

reorganized XAI to improve the speed of

48:18

execution, which required parting

48:20

ways with some people. And I think for

48:23

some of these founders, there's legal

48:24

risk to staying at xAI. The EU is

48:27

currently investigating the company for

48:29

its creation of non-consensual sexual

48:31

deep fakes based on real people,

48:33

including children. So, this really is

48:36

the wild west. This is um you know, I

48:41

don't know. I think it's just so

48:43

difficult to even keep track of

48:45

>> Yeah. Yeah.

48:46

>> You know, who ends up where and why.

48:48

>> It's as if the science people went

48:50

crazy, right? But I do think the

48:53

warnings are getting really interesting.

48:55

They're like I wish someone would just

48:56

explain why we're in peril. How are

48:59

we in

49:00

>> Yeah. How does that manifest? What does

49:01

that mean? Like, oh, it's like

49:04

the people who knew that we were about,

49:06

you know, in those movies where a bunch

49:07

of people know we're about to get hit by

49:09

a comet or something and they're

49:12

not telling us. They're like, I would

49:15

>> love your family. Why

49:17

>> is it Arnold Schwarzenegger showing up

49:19

at your door wearing Oakleys and a lot

49:21

of leather? Like, what is it?

49:22

>> What is happening?

49:23

>> What does it look like here? What does

49:24

it mean? 'Cause the employment

49:26

destruction that was supposed to be

49:27

already well underway, I would argue is

49:29

not happening yet. I don't know. But why

49:31

would someone say they're in peril?

49:33

We're in peril and set aside what

49:36

matters most, which is safety

49:37

presumably. And then they go off and

49:38

write poetry. I would like some more

49:40

information if you don't mind. If you're

49:42

going to do that, you need to tell me.

49:44

>> Yeah. Why exactly why are we in peril?

49:46

>> Why are we in peril? But

49:48

>> from what? Tell me. Tell us. I know. I

49:51

know there are these legal things, but if

49:52

it's so terrifying, you need to like

49:55

step out and, like, tell us what

49:57

it is, and bring proof, too. By the

50:00

way, would love to know when the comet's

50:02

going to hit us. In any case, uh

50:04

>> but the VP of product policy at OpenAI

50:06

was fired after she voiced opposition to

50:09

OpenAI's upcoming erotica features for

50:11

adult users.

50:12

>> Yeah,

50:13

>> She did something else

50:15

>> that enabling erotica would likely

50:17

strengthen feelings that users already

50:19

have for the chatbot. Based on a recent

50:21

report released by OpenAI, out of

50:23

ChatGPT's 800 million weekly users,

50:26

>> 1.2 million users are prioritizing

50:29

talking to ChatGPT over their family,

50:31

friends, school, or work. That's less

50:32

than I would have thought. Roughly 560K

50:36

are experiencing psychosis or mania.

50:38

This is shitty research. As a

50:41

ratio of 800 million people, is that

50:42

normal or not normal?

50:44

>> That's a lot.

50:44

>> And about 1.2 million people discuss

50:46

suicide with ChatGPT. Again, what I

50:49

want to see is someone to say,

50:51

>> "All right, is that just a function of

50:52

people who are depressed thinking they

50:53

can talk to ChatGPT just as they would

50:55

talk to a friend or a therapist, right?

50:57

>> Or is it something about talking to Chat

50:59

GPT?"

51:00

>> Right? You get the psychosis,

51:01

>> suicidal ideation or psychosis.
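The question of whether these counts are "normal" turns on base rates, which are easy to back out of the figures quoted above (a minimal sketch using only the numbers stated in the episode; any comparison to population prevalence would need outside data not given here):

```python
# Base rates implied by the OpenAI figures quoted in the discussion.
weekly_users = 800_000_000

cohorts = {
    "prioritizing ChatGPT over family/friends/work": 1_200_000,
    "possible psychosis or mania signals": 560_000,
    "discussing suicide": 1_200_000,
}

for label, count in cohorts.items():
    rate = count / weekly_users
    # 1.2M / 800M = 0.15%; 560K / 800M = 0.07%
    print(f"{label}: {rate:.4%}")
```

Whether 0.07% showing psychosis or mania signals is above or below what you'd expect in any random sample of 800 million people is exactly the unanswered question Scott is raising.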

51:04

>> The latter. You know, interesting. I just

51:05

did an interview with Sherry Turkle from

51:06

my doc series and she's been saying it

51:08

for years and she's like, I've never

51:10

seen anything like it now. It was before

51:12

on the sidelines and in the darker

51:14

places or people had, you know, it was a

51:16

small group of people. She goes, "It's

51:18

really gone mainstream in a way." I

51:20

would like the information from these

51:22

people. Would you come out and bring a

51:23

bag and bring it to me or Scott or

51:25

something like that?

51:26

>> Anyway,

51:28

Kara Swisher, on a separate note, speaking

51:31

of sort of normal journalism and getting

51:33

information out, one of the most

51:35

depressing things, Hong Kong media mogul

51:37

and pro-democracy activist Jimmy Lai was

51:39

sentenced this week to 20 years in

51:40

prison after he was found guilty of

51:42

sedition and collusion with foreign

51:44

forces. It's the longest sentence ever

51:46

handed down. Essentially a death sentence.

51:48

>> Death sentence. Uh, Lai's children are

51:50

saying a potential visit by President

51:51

Trump in April could be crucial in securing

51:53

the release of their 78-year-old father.

51:55

This is something Trump should do. Back

51:56

in December, Trump said he asked uh

51:58

President Xi to consider releasing Lai.

52:01

But on the campaign trail in 2024, he

52:03

was a lot more confident saying 100%

52:05

I'll get him out. He'll be easy to get

52:06

out. He's not so easy to get out. Let's

52:09

not forget the real surveillance

52:11

economy, the real control economy. We've

52:13

talked about these issues around control

52:15

and the uses of AI for badness. Um,

52:19

China is winning everywhere, and they

52:21

they go after this guy who's a really

52:24

important figure in

52:29

this area. And so if President Trump can

52:32

do anything, please do it. If anyone can

52:34

do anything. But Jimmy Lai is a hero,

52:37

and what's happened to him is, as you

52:39

say a death sentence.

52:40

>> Look, I go to the economics. When you

52:42

start imprisoning journalists, whether

52:44

it was Turkey in 2012, Soviet Union at

52:47

the turn of the century or China, uh,

52:50

putting the, you know, taking a very

52:53

heavy-handed approach to Hong Kong in 2021, as

52:56

kind of best epitomized by Jimmy Lai being

52:58

imprisoned,

53:00

distinct of the morality of it, distinct

53:02

of the importance it plays in a society,

53:05

the nation gets poorer and angrier. It

53:06

is literally a canary in

53:09

the coal mine saying we are about to

53:11

send a chill across some of the most

53:13

talented people and scrutiny about what

53:15

can be said about companies, which hurts

53:17

the economy. The nations get poorer and

53:20

angrier and it's literally a symbol of

53:24

when an economy is about to move to an

53:26

authoritarian state which is really bad

53:29

for innovation for attracting outside

53:30

capital.

53:32

When you're thinking about investing

53:35

in Turkey, and all of a sudden they

53:36

start locking up journalists, if you're

53:40

Google, you think, "Yeah, I'm going to

53:41

open an office in Istanbul."

53:46

You think, you know, I'm going to wait

53:47

and see if they sort that out. If you're

53:49

one of the brightest PhDs in the world

53:52

and you're doing research on

53:55

authoritarian governments or you're

53:56

doing research on innovation and you're

53:58

worried that your research might

54:01

contradict something that the leadership

54:03

is espousing, do you go teach at those

54:05

universities? No, you go somewhere else.

54:08

So

54:10

this is, look, China is, you know, not

54:13

a model for...

54:16

but having said that I was just supposed

54:17

to be on with Don Lemon who got

54:19

arrested. Why the [ __ ] are they

54:21

arresting Don Lemon?

54:22

>> Don Lemon like give me a break. They

54:24

shouldn't be arresting any journalist

54:26

like this. It's just ridiculous. I would

54:28

agree. Um I'm going to finish up with

54:30

Jimmy Lai. Let's get him out. Let's

54:32

let's get him out. He's a hero. Um I'm

54:34

going to finish up with something that

54:35

just happened. Um, Gail Slater, who is a

54:38

hugely respected lawyer, antitrust

54:40

lawyer who was running antitrust at DOJ,

54:42

just announced she's stepping down. It

54:44

follows the resignation of a guy

54:46

named Mark Hammer, who was one of

54:48

her top deputies. She's had clashes with

54:51

Pam Bondi over the handling of antitrust

54:53

investigations. I have heard she was in

54:55

a real bind over the Paramount thing.

54:58

They're trying to like shove through

54:59

things that are friendly to the Trump

55:02

administration and she just can't do it.

55:04

She can't do it. During her 11

55:05

months on the job, she found herself in

55:07

this bind caught between the Trump

55:09

administration's um

55:12

she was close to JD Vance. This is a

55:14

very respected and well-regarded

55:16

antitrust person. This should be an

55:18

enormous signal that Gail Slater is

55:20

stepping down. Um I had hoped to talk to

55:22

her, but everyone had told me they

55:24

didn't know what she was going to do

55:25

about the Netflix-Paramount thing.

55:28

Um you cannot be against the Netflix

55:30

thing if you're not against the

55:31

Paramount thing. I'm sorry. And of

55:34

course she's, you know, been

55:36

put in a bind all over

55:38

the place. A talented and well

55:41

regarded person has been put into a bind, and

55:43

so she's stepping down. Um I just don't

55:45

know who they'll put in, some idiot like

55:47

a Brendan Carr type of person who will

55:49

just do what they say. Um but it really

55:51

brings it down rather significantly.

55:53

Even, um, Makan Delrahim, who works for

55:56

Paramount now, actually, is very well

55:58

regarded. Like, they're going to have to

56:00

put in a village [ __ ] idiot in

56:02

the Pam Bondi mode. So not a good sign.

56:04

Not a good sign.

56:05

>> Yeah, great.

56:06

>> Anyway, uh one more quick break. We'll

56:08

be back for predictions.

56:11

>> Support for the show comes from

56:12

NetSuite. We all hear all the time how

56:14

AI can push businesses to new frontiers.

56:16

If you're still not sure what that

56:17

actually looks like, particularly for

56:19

you and your company, then look no

56:21

further than NetSuite by Oracle.

56:23

NetSuite is a top AI cloud ERP trusted

56:26

by over 43,000 businesses. It's a

56:28

unified suite that brings your

56:29

financials, inventory, commerce, HR, and

56:31

CRM into a single source of truth. With

56:34

all that connected data, your AI doesn't

56:35

throw out its best guess. It actually

56:37

knows what it's talking about. It can

56:39

intelligently automate routine tasks,

56:40

deliver actionable insights, and help

56:42

make fast AI powered decisions with

56:44

confidence. Now with NetSuite AI

56:46

connector, you can use the AI of your

56:48

choice to connect to your actual

56:49

business data. Plus, you can automate

56:51

those tiresome manual processes. It's AI

56:54

built into the system that runs your

56:55

business, affording you total

56:57

flexibility. Get ahead of the game and

56:59

put AI to work today with NetSuite. If

57:02

your revenues are at least in the seven

57:04

figures, get the free business guide,

57:05

Demystifying AI, at netsuite.com/pivot.

57:09

The guide is free to you at

57:10

netsuite.com/pivot.

57:11

netsuite.com/pivot.

57:15

>> Okay, Scott, let's hear a prediction.

57:18

>> I think that

57:20

what was supposed to be the most

57:21

anticipated IPO, maybe with the

57:24

exception of kind of SpaceX, xAI, Tesla,

57:28

the

57:29

>> whatever Tesla's not in there yet.

57:30

>> Probably the most anticipated was

57:33

the IPO of OpenAI in 2026, sometime

57:36

this year, early 27. I don't think

57:38

that's going to happen. Um I think that

57:42

yeah, I think this company has now

57:44

gone into full

57:47

um I don't call it panic mode but it

57:50

feels as if momentum has a habit

57:53

of creating more momentum and I think

57:54

the momentum is really negative around

57:56

this company

57:57

>> what happens where does it go what does

57:59

it do

58:00

>> Well, I think they'll substantially

58:03

um, scale back their... I mean, have you

58:05

already seen the war? Have you already

58:06

seen Jensen Huang and Sam Altman, who were

58:09

you know,

58:09

bud buddies, are already [ __ ] posting

58:11

each other,

58:12

>> right?

58:12

>> Claiming that the hundred billion dollar

58:14

agreement was a framework and

58:16

not actually the hundred billion

58:17

investment.

58:18

>> May I just say you said that

58:21

>> Well, that was ridiculous. These circular

58:23

deals: I'll give you a hundred billion.

58:25

I'll invest 100 billion if you invest

58:26

100 billion in our chips. And now and

58:28

now quote unquote Jensen's backtracking

58:30

and saying, well, it was just a framework;

58:33

they couldn't justify it. Nvidia stock

58:35

has gone down because people are worried

58:37

about exposure to open AI. Right. So

58:38

what does OpenAI do? They start [ __ ]

58:40

posting Nvidia and saying, no, it was because

58:42

their chips didn't live up to our

58:44

expectations. When the biggest

58:47

player in the space, Jensen Huang, and kind

58:50

of the young gun, OpenAI, start [ __ ]

58:52

posting each other, and they back out

58:55

of this hundred billion dollar

58:56

investment framework. That is a really

59:00

bad sign.

59:01

>> He kept using what was the word? We're

59:02

honored to be invited. What was he

59:05

saying? It was so funny. Yeah, but

59:06

they're both going on background now and

59:07

blaming each other.

59:08

>> Oh, totally. Utterly. Like, can I just

59:10

give people a lesson? When you hear

59:12

sources close to the situation, if they

59:14

were any closer to either of them,

59:15

they'd be on the other side of them.

59:17

>> That's them, right? The

59:20

>> So, I think the momentum, the

59:23

worm has turned. And it's not that

59:24

OpenAI isn't an unbelievable company

59:26

that could go public at like a $50

59:28

billion market capitalization. But the

59:29

problem is when you sell some investors

59:32

in at 250, 450, and then if he's able to

59:34

close this round at 850, they're not

59:37

willing to go public or let you have a

59:39

liquidity event that cuts there. What

59:41

happens in an IPO? Say he went public at

59:44

300 billion next year and said, "Okay,

59:45

the market isn't what we thought."

59:47

Unless there's a couple years where the

59:49

latest round of investors get so

59:51

fatigued they're willing to take a 60%

59:53

haircut. All of your shares, the last

59:55

round of investment has a preference,

59:58

meaning they're the first money

60:00

out. So the 50 or 100 billion going in

60:01

at 850 doesn't want to give up their

60:04

liquidity preference and let them go

60:06

public if they're going public at less

60:07

than 850, which I think they would. So

60:10

your last round of investors become a

60:12

veto block for going public unless

60:14

you're going to go public at a valuation

60:16

greater than 850.
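The veto-block mechanics Scott is describing can be sketched as a simple liquidation-preference waterfall (an illustrative sketch only; the $100 billion round size at an $850 billion valuation comes from the discussion, but a standard 1x non-participating preference is my assumption, not a disclosed term):

```python
def exit_payout(exit_value_b, pref_invested_b, pref_ownership):
    """1x non-participating preference: the last round takes the greater
    of its money back or its as-converted pro-rata share; common holders
    split whatever is left. All values in billions of dollars."""
    as_preference = min(pref_invested_b, exit_value_b)
    as_converted = pref_ownership * exit_value_b
    pref_take = max(as_preference, as_converted)
    return pref_take, exit_value_b - pref_take

# Hypothetical: $100B in at an $850B post-money valuation (~11.8%).
invested, ownership = 100, 100 / 850

# IPO above $850B: the preferred converts and takes its pro-rata share.
print(exit_payout(900, invested, ownership))

# IPO at $300B: converting would yield only ~$35B, so the preferred
# takes its full $100B off the top first. Common holders absorb the
# entire markdown, which is why the last round can block a down-round
# IPO unless it is forced or fatigued into waiving its preference.
print(exit_payout(300, invested, ownership))
```

At the $300 billion exit, the preferred recovers all $100 billion while everyone else splits $200 billion, roughly the fatigue-driven haircut Scott describes.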

60:17

>> So what do they do? You haven't answered

60:19

my question.

60:20

>> In my opinion, they'll

60:22

dramatically scale back their capital,

60:24

their capex and they'll end up with a

60:26

much smaller, much less ambitious

60:28

amazing company that's only worth a

60:30

$100 or $200 billion. It's only one of

60:31

the 30 most valuable companies in

60:34

America. Not the

60:35

>> get bought what? Well, that means

60:36

everyone else will get collapsed, right?

60:38

Or not. In my opinion, if

60:40

you look at weird signals, and I look at

60:42

the percentage of ads at the Super Bowl

60:44

right if you look at all this I think

60:47

there's a ton of anecdotal evidence

60:49

showing that while AI may live up to its

60:51

potential the market cap of the biggest

60:54

players this year is about to throw up

60:57

which isn't to say, similar to 2000

60:59

when the market cap of Amazon went down

61:00

95%, that it's not still going to be an

61:03

unbelievable company but I think we're

61:05

about to see a dramatic recalibration in

61:07

the markets, which includes OpenAI's

61:09

IPO plans getting queered. Now, who's

61:11

going to take their place? And this is

61:13

the prediction.

61:14

>> Mhm.

61:15

>> The most impressive numbers hands down

61:17

that I wasn't expecting:

61:20

>> Kalshi's

61:23

year on year

61:24

>> and not Polymarket, right?

61:25

>> Well, Kalshi is actually, of the two,

61:29

the clean, well-lit space of this. I see.

61:32

Okay.

61:33

>> Of this marketplace, right? It's a

61:35

little... Kalshi is CFTC-regulated. It's also

61:39

in the US. It's peer-to-peer trading.

61:41

It's federally regulated. Um I have some

61:45

I don't have moral clarity around these

61:46

issues because I do think they tap into

61:48

the dopamine of a young, more risk-aggressive

61:50

male brain. But just let me go straight

61:52

to the numbers here.

61:54

>> In 2026 or in this Super Bowl, right,

61:58

>> over a billion dollars in trading volume

62:01

on Kalshi. That's up 2,700%.
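The two growth figures quoted here are the same claim in different units: an n-fold increase is a gain of (n − 1) × 100 percent, so 28-fold works out to exactly +2,700% (a quick arithmetic check on the numbers as stated):

```python
def fold_to_pct(fold):
    # An n-fold increase means the new value is n times the old one,
    # so the percentage gain over the old value is (n - 1) * 100.
    return (fold - 1) * 100

# "up 28-fold" and "up 2,700%" describe the same move.
print(fold_to_pct(28))  # 2700
```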

62:05

It was up 28-fold this year. And you know

62:08

who's getting absolutely the [ __ ] kicked

62:10

out of them is Flutter, the gambling

62:13

sites.

62:15

They're killing these guys. The sports

62:17

market accounted for about 90% of Kalshi's

62:19

activity this month. And it's

62:22

having an incredible impact on

62:24

traditional gambling and sportsbooks. Um

62:28

analysts have noted that Kalshi's

62:30

rise coincides with the underperformance

62:32

in major sportsbook stock prices, like

62:34

DraftKings and Flutter, as traders shift some

62:36

activity towards prediction markets and

62:39

with a venue that's easy to access

62:40

nationwide, which Kalshi is, even in states

62:43

without legal sports betting the firm is

62:45

attracting bettors who might otherwise

62:47

um, have used traditional sportsbooks.

62:49

So this is the company, and my

62:53

prediction is the following.

62:55

OpenAI, way to the downside, doesn't go

62:58

public. Kalshi is going to be, in

63:01

my opinion, the kind of IPO we're all

63:04

trying to get into, in Q2 or Q3 of this

63:06

year.

63:07

>> Kalshi it is. All right. Well, that's

63:09

interesting. You've been sounding this

63:10

alarm for these companies. Interesting.

63:12

Fascinating. That's a big one, Scott.

63:14

That's a big one.

63:15

>> We'll see. Right.

63:16

>> Yeah. Anyway, we want to hear from you.

63:19

Send us your questions about business,

63:20

tech, or whatever's on your mind. Go to

63:21

nymag.com/pivot

63:23

to submit a question for the show or

63:25

call 855-51-PIVOT. Uh, elsewhere in the

63:28

Cara and Scott universe this week on

63:29

Prof G Markets, Scott spoke with Eswar

63:32

Prasad, professor of trade policy and

63:34

economics at Cornell University to

63:36

discuss why he thinks economics,

63:38

domestic politics, and geopolitics are

63:40

stuck in a doom loop. Doom loop. Let's

63:43

listen to a clip. Globalization used to

63:46

be seen as a positive sum game where

63:48

countries could benefit mutually from

63:50

trade and that would be an offset to

63:53

what is intrinsically the zero sum game

63:55

of geopolitics where one country can

63:57

gain influence only at the expense of

64:00

another. But now even globalization has

64:02

become seen as a zero sum game. So it

64:05

isn't offsetting the zero sum game of

64:07

geopolitics and worse some of the

64:10

negative dynamics of globalization have

64:12

started infecting domestic politics not

64:14

just in the US but in many other

64:16

countries.

64:17

>> God, I feel smarter already. My people.

64:19

>> you know there's

64:22

professor um

64:23

>> Prasad that one of the things that

64:26

struck me and I said this

64:27

>> we graduated the same year from

64:29

undergraduate, me from UCLA, him from the

64:32

University of Madras.

64:33

>> Mhm. I graduated with a 2.27 GPA with an

64:37

incredible ability to make bongs out of

64:39

any household item.

64:40

>> He won a scholarship in India that

64:42

identified like one of the 50 smartest

64:44

kids of a billion kids.

64:46

>> And so what does a guy

64:49

who's one of the 50 do? Like, this guy

64:52

could walk into the Rose Bowl and take

64:54

the average IQ of those 80,000 people up

64:56

a couple points. That's how smart and

64:58

hardworking this man is. So what are we

64:59

doing to what are we saying to these

65:01

people now? Can you imagine a kid coming

65:02

out of the University of Memphis right

65:04

now in 2026? Is he going to go to Brown?

65:07

>> Yeah.

65:07

>> No, he's going to go to McGill or he's

65:09

going to go to Instituta or he's going

65:11

to go to

65:11

>> INSEAD

65:12

>> or who knows maybe

65:14

>> maybe the University of Córdoba in

65:16

Argentina. I mean,

65:17

>> speaking of doom loops, academic doom

65:20

loops.

65:20

>> We're the sports team that used to have

65:22

access to the number one draft pick at any

65:25

college in the world and we've said no,

65:27

we don't want

65:27

>> and now we just have Prof coming back to

65:30

the

65:32

Anyway, it sounds like a great

65:34

interview. I'll be listening to it. That

65:35

is the show. Thanks for listening to

65:37

Pivot and be sure to like and subscribe

65:39

to our YouTube channel. Uh we'll be back

65:41

next week.

Interactive Summary

The podcast discusses several current events, beginning with the "unsubscribe" economic protest, highlighting its impact on companies like OpenAI and the financial benefits for individuals. It then shifts to the controversial testimony of Attorney General Pam Bondi regarding the Epstein files, critiquing her focus on economic prosperity over victim concerns and the DOJ's perceived inaction. The conversation delves into surveillance and privacy issues, particularly concerning smart home devices like Nest doorbells storing "deleted" footage. A major segment addresses the social media addiction lawsuit against Meta and YouTube, presenting alarming statistics on teen addiction and its severe psychological effects. The hosts also touch upon the economic implications of Trump's tariffs, recent developments in the AI industry including Anthropic's funding and OpenAI's internal issues, the imprisonment of Hong Kong media mogul Jimmy Lai, and a prediction about OpenAI's IPO prospects versus the rise of prediction market platform, Kalshi.
