
"Elon Musk: Wikipedia is not for sale" - Nish Kumar meets Wikipedia founder Jimmy Wales (pt 2)


Transcript


0:04 You're not argument averse... I just would like to note the smoothness with which the staff managed to switch my microphone. That was like a magic trick. Um, you're not averse to arguments, though. In the book you talk about how that's actually quite an important part of building trust, and how trust is a fundamental part of having a good argument with somebody, because if you both trust that you're both coming from a place of trying to improve the thing,

0:28 >> Yeah, then you can actually get into it properly with people.

0:31 >> Yeah. No, and I think what's great is, you know, the example I like to use, because it's so clear and kind of simple: think about our article on abortion,

0:42 >> a controversial topic, abortion, and imagine a kind and thoughtful Catholic priest and a kind and thoughtful Planned Parenthood activist. And the key is I specified kind and thoughtful. Um, well, they are never going to agree on the fundamental issue, but what they can agree is to write about the issue in a fair way. So the article will say, you know, the Catholic Church's position on abortion is thus and such, and critics have responded thus, and the Pope said this, and so on. And at the end of the day, as they've been working together, they may even make friends, if they're civil and kind people, and they can both point at that and say, "Right, this is good. Like, this is something, if you read this, you would really understand what the debate is. You'd understand some of the ins and outs and all of the different factors." Whereas if the space isn't designed to support that kind of conversation, what do you get? You get people just screaming at each other.

1:33 >> Yeah.

1:33 >> Uh, and that's not really helpful for anybody.

1:37 >> So how do we... I mean, the very fact that you have felt the need to write this book, and the very fact that you talk about the loss of trust being the symptom and the cause of the problem, suggests that, you know, we've all collectively identified that there is a problem here. There is an issue: that we can no longer agree on basic facts. So there is a kind of higher purpose to you writing this book. In some ways it's to essentially provide a blueprint for a return to a fact-based discourse, right? I mean, I'm ascribing that motivation; it's not in the text. The text is very much consistent with the principles of Wikipedia. It's about having a reasoned and, you know, impassioned argument that's based in fact. But there is a higher purpose here, isn't there? You are trying to get us back to a place where we can all agree on a set of facts that we have different opinions about.

2:38 >> I mean, it's crucial that we do, uh, because we have a lot of problems in society, a lot of conflicts, and if we can agree on some basic sets of facts, then we can discuss the values that we have, and how do we accommodate, you know, how do we compromise, what are the pros and cons. But if we're living in completely different universes where we don't share the facts at all... I mean, a great example, you know: uh, Donald Trump. There was a famous time when he claimed that, uh, Mexico was just sending murderers and rapists.

3:14 >> Yeah.

3:14 >> Right. And, you know, if you actually look at the statistics, the crime rate of people who have come illegally to the US is actually lower than the crime rate of police officers. So that's not the same as "they're only sending murderers and rapists." And obviously this country has a big, uh, immigration debate, and, you know... but that question, like the question of how do we deal with immigration: should we have more people here? Should we have less people here? What do we do about people who are coming here on small boats? How do we manage it? Those are legitimate. Like, people can have very different views on that. But if your views are so colored by misinformation that you don't even know what's going on, then you begin to think people on the other side are supporting things they don't. Like... so I've seen there's been a lot of discourse, uh, online, and actually a couple of tech bros who I know from back in the day, I messaged them privately, because, you know, they're posting this stuff about London, that basically there's, you know, a minority of English people in London. And I was like, well, that's not correct. And I sent some statistics to say, yes, the number of foreigners in London has increased some in the last years; it's nowhere close to 50/50.

4:33 >> And he sent back another thing. And it wasn't about whether people were British or not. It was about whether they were brown or not.

4:44 >> Yeah.

4:44 >> And I said to him, "Oh, so what you really meant was race. I'm kind of disappointed in you." And he's never answered me again.

4:50 >> Yeah.

4:51 >> But also there's...

4:52 >> I'm a Londoner. I was born in Britain, but I don't think I figure in some of the conceptions of what a British person looks like.

5:01 >> Yeah. And that's insane, right? We all agree that's completely insane. And I think in a different context he was, "Oh, yeah, right." Cuz, you know, I was tempted... I just decided arguing with people on the internet's a waste of time.

5:13 >> Yeah.

5:13 >> But I nearly said, "So, do you think African-Americans aren't real Americans? Because you're basically saying the same thing." Anyway, but also there's this sort of vision of London as being a lawless hellhole.

5:27 >> Yeah.

5:27 >> And I'm like, I live in London and it's really nice. [laughter] Like, you can walk home at night and you'll be fine. And you know what? It's a big city, and there are places you probably shouldn't go at 2:00 in the morning. I don't know of any big city that doesn't have that.

5:42 >> Yeah.

5:42 >> And...

5:43 >> I wouldn't hang around the Tiger Tiger in Leicester Square at 2:00. [laughter]

5:48 >> You see some really interesting stuff. [laughter]

5:52 >> Exactly. Uh, but you know...

5:55 >> I've seen a guy poop in the street. Okay, let's not go down that road. Anyway.

5:58 >> Well, and by the way, I'm like... someone living in San Francisco has the gall to say something about... I mean, San Francisco, it is not uncommon to be accosted in the street by a mentally ill person who's screaming at the sky. And that's a sad... like, it's a really tragic thing about San Francisco, one of the wealthiest places in the world, that also tolerates, you know, a really pretty pathetic situation. And I'm like, it's a big city. I suppose you might occasionally see a crazy person screaming at the sky, but it's not that common.

6:31 >> But how do you get back... if we...

6:32 >> If we look at... if we just look at the trajectory of someone like Donald Trump. So Donald Trump becomes a politician,

6:38 >> becomes a political figure. I mean, there are various interventions in the past, when he took out an advert, you know, calling for the death penalty to be applied to the Central Park Five. But really his meaningful political career begins pushing a conspiracy theory that Barack Obama was a Kenyan, known as the birther conspiracy. He then essentially uses becoming the figurehead of that birther conspiracy movement as a platform to run for the Republican nomination. His opening speech... you've already mentioned the fact that he said that Mexico was sending murderers and rapists to America. He becomes president in a kind of blizzard of lies. Then he loses the presidency. He spreads a lot of, again, unverified claims about electoral interference and electoral fraud in November 2020. He then makes claims about the events of January the 6th that would not pass muster on Wikipedia; his characterization of January the 6th would be taken down by moderators. But he becomes president again. So in a world where somebody is being rewarded for that mendacity,

7:51 >> repeatedly, politically, how do we get back to a place of fact-based discussion? Because if you were looking at this objectively, you'd say, well, there's no consequence for essentially just weaponized mendacity.

8:08 >> Yeah. So this is where I would point to him and his, you know, career arc as a good example of being both the cause and the symptom. So obviously he has caused a lot of, uh, mistrust and distrust. He rants against the media all the time. He says things that are patently untrue and then two days later denies he ever said it, even though it's right there, you know. Um, and, you know, his followers, uh, really believe. I mean, here's a sad moment: um, a friend of mine from high school posted something on Facebook, and I was like, I don't think that's right. So I quickly checked, and I found a debunking of this rumor on social media in the New York Times, and I posted that. And he said to me, "I can't believe you would send a link to the New York Times. They just make stuff up." As if I had sent some crazy blog or something. And if he had said, as he might have at another point in life, "Uh, you do realize the New York Times is a left-leaning paper and has a liberal bias," then we can engage with that. Of course, that's a thing.

9:12 >> But, you know, his lack of trust in the media is so much that if Trump says one thing, he's as likely to believe that as the debunking. And, you know, so some of it is uniquely that Donald Trump has deliberately undermined trust in the media. So he's part of the cause. But it's also a symptom, because, you know, the decline in media trust, the decline in trust in journalism, if you look at the numbers from the Edelman Trust Barometer survey, it started a long time before that. It's been a problem for a very long time. And I think that there are things that the media can do; there's things that everybody can do. Um, in terms of, you know, recommendations for journalism: one of the things I talk about in the book is, uh, quite famously, just before the most recent election, uh, the Washington Post made the decision... well, we should say Jeff Bezos made the decision for the Washington Post... that they would not endorse a presidential candidate. And this caused quite an outcry and was widely regarded as Jeff trying to support Trump, or do a favor for Trump, or whatever. And my slightly contrary position is that actually the Post should have stopped doing endorsements a long time ago.

10:28 >> Um...

10:29 >> This is in the book, and it's one of the things I found so interesting, because I'm not sure that I agree with it, but I respect the way that you've articulated it so much. I think it's such an interesting...

10:39 >> You're almost a Wikipedia now. [laughter] That's good. My work is done here.

10:44 >> I thought it was... I think it's such an interesting point, because newspaper endorsements in my lifetime are something that are pretty commonplace in elections, and then it's almost more of a story when a newspaper doesn't endorse a candidate in British general elections. I think... I think the Guardian may not have endorsed the Labour Party in 2019; I can't remember. Somebody in this room will definitely be able to correct me on it. But it is relatively commonplace. But it is an interesting perspective. Your argument is newspapers should not take a kind of full editorial position where they endorse a candidate in either direction, right?

11:21 >> Yeah. I mean, I think in general it's just something that doesn't do any good. It's performative on their part, and I don't think it convinces anybody. There's evidence from research that says it reduces trust in the paper. Uh, and so that's, you know, a problem. Now, they timed it very badly. Like, if they had stopped doing endorsements at a less fraught moment, that would have been a great thing.

11:45 >> Uh, but we also have seen... and part of this is, you know, the terrible sort of decline of the business model for journalism, particularly local journalism, is a big piece of this whole thing.

11:58 >> And one of the responses we've seen to that... and there are other responses that are better... one of the responses we've seen is, um, in order to chase clicks, uh, online, you get clickbait headlines and more inflammatory stuff. You also get outlets becoming more and more partisan.

12:14 >> Yeah.

12:15 >> And, you know, you don't feel like... my example: I have an electric car. I like electric cars. I don't have a Tesla. Um, [laughter] but I have an electric car, and, uh, so I'm interested. And so, you know, in my feed, uh, when I get news stories, if I see a headline in the Guardian, it's going to be in favor of electric cars and it's going to be all great stuff. If I see it in the Telegraph, it's going to be against. And I think that's unfortunate, right? Because then you think, when you're reading either of them, am I getting the full story here, or are they just campaigning for the thing they believe in? And I do really think it's quite important to have news that we can trust, which means maybe don't campaign all the time. Maybe actually keep your editorials on the editorial page, in the columnists and that sort of thing, but try to be as straight as possible with the news. And one of the good things that is happening is the rise of, uh, subscriptions. You know, like, the New York Times has managed to... this hasn't helped local journalism,

13:21 >> Yeah.

13:22 >> uh, because it's hard to get enough subscriptions at the local level, particularly when you've already destroyed your paper. So that's a problem; getting back to that is going to be a long job. But, you know, if people are paying... and this is actually why I think the financial papers tend to be a little calmer and all that; their business model is much more about paid subscriptions. If people are paying, uh, then, you know, you don't have to try and get as many clicks as possible, right? You have to actually be a little bit more broad with what you're doing, and sort of informative, and so on. So that's just one piece, a sort of recommendation to media: try not to be so hyper-partisan that nobody believes you're anything other than yet another campaigning organization.

14:05 >> That's a perfect moment to segue into the real purpose of this evening: please donate to Wikipedia. [laughter] Well, that is important, right? Because again we sort of return to the model of Wikipedia as something that's going right on the internet, and there are increasingly few things on the internet that we can say are not actively harming the way that we live our lives. But part of the thing with Wikipedia is the fact that people donate, the fact that you're not relying on, you know, advertising revenue or anything like that. It preserves the independence of the thing, right?

14:41 >> Yeah. Yeah. Actually, one of the things that is very, very important is independence. That's one of the rules: be independent. Uh, and for us that means intellectual independence is very important. So, our, uh, financial model... uh, you know, when, uh, Elon Musk got his chainsaw out with DOGE, you know, cutting programs, uh, and things that the US government was doing, we weren't worried, because we don't have any funding from the US government, or any governments. Um, I just saw some big foundation in the US has decided to stop making grants, and a lot of corporations are pulling back on their, uh, you know, sort of social-giving kind of work.

15:22 >> Well, that's not how we get our money either. The vast majority of the money that funds Wikipedia is from the small donors.

15:29 >> So, people who see the little banner... um, thank god they don't use my picture nearly as much as they used to.

15:36 >> They're using it today, be aware.

15:38 >> Yeah. I don't know whether that's cuz my phone is reading the fact that it's near your phone or something, [laughter] and they're like, "You better stick the Jimmy picture up if you stand next to him." Um, but yeah.

15:46 >> No, but it's people, uh, giving their 20 quid. Yeah.

15:50 >> Uh, which is, like, hugely important, because that means... if we were mostly funded... and sometimes people say, "Oh, why don't you just get, uh, you know, Google and Facebook to pay for it?" And I'm like, "Well, think it through a little bit," right? Would you really be comfortable with us having major donors who basically could call the tune, and say, oh, if you don't change...

16:11 >> He's sort of implied that he wants to buy Wikipedia a couple of times, right?

16:13 >> Yeah, yeah. He... yeah. I mean, my most successful tweet in history: there was a New York Post journalist who said to Elon, who was moaning about something in Wikipedia... he said, "Elon, you should just buy Wikipedia." And I just retweeted that and said, "Not for sale." And, um, that was very successful; people loved that. Um, and then, you know, there are other things. Like, you know, the day he said, uh, "defund Wikipedia," I think we brought in $5 million that day. [laughter]

16:45 >> So I'm like, bring it, man. Bring it.

16:50 >> Because I think there are a lot of people who, uh, will say, like, actually, it's very concerning to have this guy thinking he's going to call the shots, and it's really important that we have, like, super geeky Wikipedia. So, actually, a fun story: we have an annual conference every year, uh, and I was in Alexandria, Egypt a few years ago, and a friend of mine was there at the conference who's not a Wikipedian. And, um, we had dinner one night. We sat at a table... and this is, like, a geeky conference, you know. So we sat at a table for dinner, and we just happened to sit down with the English arbitration committee, uh, several of the members of the ArbCom, as we call it. So this is like the Supreme Court of English Wikipedia. And we're having this sort of super geeky discussion over dinner, and then we got up to walk away, and I said, "Do you realize you just had dinner with some of the most powerful people in media in the entire world? This is the Supreme Court of English Wikipedia." And do you know who they are? A bunch of freaking geeks, right? They're absolute nerds, and they're really passionate about Wikipedia. And, by the way, they're not woke leftists, as Elon would have you believe. They're, like, not very political in many ways, right? Except they're quite political about truth and facts and things like that. And so I'm like, this is great, right? This is actually really, really cool. And their concerns are never about, um, how do we increase our revenue? Are we going to lose the grant from that rich organization? Uh, is the government going to cut our funding if we are critical of government policy? Nothing like that. They're just like, let's follow the facts. Let's document the world and be as fair as we can. Fantastic.

18:45 But don't...

