
How Information Integrity Shapes Climate, Peace, and Society - SDG Media Zone | United Nations


Transcript


0:01

Hi, good morning everyone. Welcome to

0:04

the SDG media zone. We are live from the

0:07

UN in New York and it's my pleasure to

0:12

pass the microphone for a

0:15

conversation between UN Under-Secretary-General

0:17

for Global Communications Melissa

0:19

Fleming

0:21

>> and founder and chairman of Purpose

0:24

Jeremy Heimans. Over to you.

0:26

>> Thank you Alexandra. Hi everybody. Um,

0:30

great to be at the media zone, the SDG

0:33

media zone today with Jeremy Heimans, who

0:36

works very closely together with the

0:38

United Nations. In fact, I think you

0:40

have been for your whole career. Just

0:42

quick aside, can you tell your story of

0:44

how you first encountered the UN,

0:46

Jeremy?

0:48

So I was a child ambassador for

0:51

peace from the age of about eight uh

0:54

when the Cold War was winding up and

0:57

they wanted young people to

1:00

make the case for peace. So I had the

1:03

pleasure and the weirdness of

1:06

going to various UN events

1:08

and campaigning in advance of the World

1:10

Summit for Children all those years

1:13

ago. So, yes, I've long

1:17

understood the idealism and the

1:21

hope for common humanity that the UN

1:23

represents and all of that potential is

1:25

something that I really, I really

1:27

believe in.

1:27

>> Right. And we're going to talk a bit

1:29

about how we work together because you

1:31

have an organization called Purpose. I

1:35

think there is a way that you describe

1:37

it. I always said it's kind of like a

1:39

social movement agency. How would you

1:41

describe Purpose?
>> Yeah, I think we, you

1:44

know, we are uh an agency that works all

1:46

over the world to mobilize the public to

1:49

try to shift narratives on global

1:52

issues. And we operate on

1:56

pretty much every continent. We have

1:58

offices in eight or nine countries. So,

2:01

it's fascinating work.
>> And I first

2:03

started working with you when I was

2:05

working at UNHCR and Purpose was helping

2:07

us

2:09

be more strategic in how we reached

2:12

audiences and how we segmented our

2:14

audiences to look at the concerned

2:18

middle, the people who were kind of

2:20

undecided and to come up with effective

2:22

campaigns for reaching them when it was

2:25

an issue that was being so polarized.

2:27

And now when I came here to take on this

2:31

role in 2019, we were about to start

2:34

working on how do we communicate climate

2:38

change more effectively? How do we reach

2:40

communities in the languages they speak

2:43

on the platforms where they get their

2:44

information. Um, I really turned to you

2:47

because you had just published a book

2:48

called New Power and you were really

2:50

expert in how these kinds of

2:54

movements start, where concerned

2:57

citizens find a way to join

3:01

forces for the common good and we really

3:03

needed to join forces with the citizens

3:06

of the world to help us reach our

3:08

climate goals. But then we had the

3:10

global pandemic and so then we started

3:12

an initiative called Verified that

3:16

was quite novel. Can you describe what

3:19

was so different about Verified from the

3:22

usual communications that come out of an

3:24

institution like the United Nations?

3:26

>> Yeah. Well, I mean, as you say, it

3:28

was March, April 2020. We knew, and

3:33

already we were seeing there would be

3:35

what was described at the time as the

3:37

infodemic, right, this massive information

3:40

challenge associated with COVID. And you

3:43

know there was a lot that people didn't

3:44

understand at the beginning of the virus

3:46

and there was a lot that was

3:47

deliberately being put into the

3:49

information ecosystem that was wrong. We

3:51

also realized that traditional

3:53

institutions would probably not be

3:58

fit for the task on their own of

4:01

engaging with that information warfare.

4:03

You know, the vaccine skeptics, for

4:05

example, they were organized in these

4:07

very close-knit online communities. They

4:10

weren't necessarily pulling the lever of

4:12

big institutions. They are now in some

4:15

places. We'll come back to that.

4:17

But they were very effective: even a

4:19

small number of actors could reach millions

4:22

or hundreds of millions of people with

4:24

rumors, with myths and disinformation.

4:25

So, we knew we had to do something about

4:27

it. And so we formed this I think really

4:29

wonderful and very novel alliance where

4:31

we took all of the institutional

4:33

credibility, the science, and the

4:36

global reach of the UN. Only the UN

4:39

could, for example, reach out to

4:41

broadcasters in Africa and very quickly

4:44

reach hundreds of

4:46

millions of Africans with public health

4:49

messages. But what we could do at

4:50

Purpose was, you know, fight on this new

4:53

battlefield, right? Um, develop content

4:57

that sometimes had nothing to do with

4:58

the UN, that was unbranded, but that was

5:01

reaching audiences. I always like the

5:03

example of like we were trying to reach

5:05

young men in India. [clears throat] We

5:07

weren't going to reach young men in

5:08

India with a WHO fact sheet, right? But

5:11

we could reach them through the

5:13

YouTube comedians that they, you know,

5:17

they would watch and love. So if we

5:18

could get a YouTube comedian to create

5:21

content that specifically addressed some

5:23

of that COVID mis- and disinformation, that

5:27

was the way we were going to get those

5:28

people in large numbers engaged. So it

5:30

was a massive distributed effort to

5:33

create content in all forms in all

5:35

languages and to your earlier point

5:37

about refugees to reach the people who

5:40

were not already in our camp. The people

5:43

reading the New York Times, they were

5:44

going to be okay, right? They were

5:46

getting mostly accurate information, but

5:49

it was the people, you know,

5:51

outside of those more traditional

5:53

mainstream media ecosystems that we

5:56

really needed to reach. And I'm very

5:57

proud of the work we did together uh on

6:00

that.

6:00

>> I mean, that was at a time where we

6:03

really saw that the vast majority of

6:05

people around the world were turning to

6:07

social media and to influencers. We also

6:12

saw that it's kind of the

6:14

disinformation actor playbook to

6:17

identify influencers to carry

6:23

their messages. They're often paid for

6:25

that. What really struck me about

6:27

the work that we did was that we did a

6:29

kind of call for volunteers

6:32

information volunteers, and we had

6:34

such an overwhelming response from the

6:37

public. Uh so many people really wanted

6:40

to get on board and help us spread

6:41

facts.

6:42

>> And then we recruited

6:46

the health community, vaccine

6:48

scientists, doctors, trained them up on

6:50

TikTok. Can you just explain how that

6:53

worked? I mean, why was that so

6:55

effective?

6:56

>> Well, I think it's very instructive to

6:58

what we need to do now in the

7:00

horrific information environment that we

7:02

face on many issues. So we recognized

7:06

that um the way people were going to get

7:08

trusted information was through

7:10

individuals that they related to, that

7:13

were compelling, that they trusted,

7:15

probably not through institutions in a

7:18

kind of cold, impartial voice. So,

7:21

knowing that vaccine confidence would

7:23

be a huge challenge, we literally went to

7:25

the vaccine labs that were developing

7:28

the COVID vaccines. This was even before

7:30

the vaccines had been published

7:33

and released. And we looked for

7:35

charismatic,

7:37

often young, but not always young, but

7:39

interesting people who were vaccine

7:41

scientists that we thought could tell

7:43

this story, which is a fascinating story

7:46

of the development of the vaccines and

7:48

then once the vaccines were out to the

7:49

public, who could engage, reassure, and

7:52

educate on that. And we found these

7:55

incredible people who'd actually never

7:57

used TikTok. And the reason we chose

7:58

TikTok is that TikTok was, and still is,

8:01

in many ways the place that was

8:03

originating the most viral content on

8:05

the internet even if it was being

8:07

syndicated then through other platforms.

8:09

And we trained them in how to use

8:11

TikTok and we turned those people from

8:13

vaccine scientists into vaccine

8:16

scientists and stars with huge

8:18

audiences. You know, we had

8:21

billions and billions of views of their

8:23

content and we did some really

8:24

interesting testing, you know, proper

8:27

controlled testing that showed that that

8:30

kind of content was much more persuasive

8:33

at increasing vaccine confidence and

8:35

uptake than the more traditional health

8:38

communications that were

8:40

circulating. And that just

8:42

reflects that even 10 years ago that

8:45

strategy would not have been as

8:47

effective. So part of the dynamic we're

8:49

in is that the kind of

8:52

distributed influence that we now see

8:54

you know, that information, trust,

8:57

news is being filtered through these

8:59

individuals, is so critical, and that is

9:03

not necessarily a good fit for

9:04

institutions who are not used to that

9:06

kind of level of engagement. So

9:09

together we kind of developed a network

9:12

of these influencers. We called them

9:14

Team Halo, these vaccine scientists. And

9:16

that network grew over time, included

9:18

other health professionals, and that's

9:20

now what we're doing on Verified on

9:22

climate. These Verified climate

9:24

champions, and we have some amazing

9:26

climate champions in the room today who

9:29

are doing that work, you know, people

9:31

who will reach totally different

9:33

audiences, right, to the Brazilian

9:35

government, right? But people with

9:37

great expertise, for example

9:40

people who are scientists who can

9:42

speak with extraordinary expertise. So

9:44

this new model, networks of influencers

9:47

that you coordinate that are singing

9:49

from the same sheet at one level but at

9:52

another level you're allowing a thousand

9:54

flowers to bloom. I think that's the

9:56

only way we're going to compete with the

9:58

sort of disinformation forces that we're

9:59

up against.

10:00

>> Yeah. I remember you were expressing

10:03

your surprise that the United Nations as

10:06

an institution would step away and not

10:08

insist that our brand be part of this.

10:12

Why were you so surprised?

10:14

>> Well, I mean, I think I'm

10:16

going to give you some credit here,

10:17

Melissa. I think the fact that you

10:19

were in that leadership role at the UN,

10:21

I'm not sure it would have happened with

10:23

any other leader. But I think that

10:25

institutions, you know, like the UN,

10:28

they're very risk-averse. And, you know,

10:30

understandably, the whole mental model

10:32

is what we would have called in our book

10:34

old power, right? It's the castle model.

10:36

It's control. And of course, in this

10:39

media environment, the people who are

10:41

doing that are really getting left

10:43

behind, unfortunately, because of the

10:46

way that influence and information and

10:49

attention works. So, it took a leader

10:51

who was willing to do that, but I think

10:53

it was also a partnership of the kind

10:55

that we have that still valued trust and

10:58

integrity that we weren't going off

11:00

making stuff up, right? But we were

11:02

finding ways to make facts, truth,

11:05

reality, human and compelling. And

11:08

that's what we need a lot more of. It's

11:10

not that we just need to make stuff up

11:11

the way the other side does. That's not

11:13

going to help us in the long term. But

11:15

we do need really novel ways to reach

11:19

people and I think that's where our

11:21

partnership was, you know, and is so

11:23

compelling.

11:25

>> I mean, we've been working now

11:27

ever since the COVID-19 pandemic also on

11:31

information integrity at the United

11:33

Nations. In fact, we kind of started

11:36

calling it information integrity when we

11:38

launched the UN Global Principles on

11:40

Information Integrity last summer. The

11:42

Secretary-General launched them here.

11:44

it used to be you know we were just

11:47

focusing on the platforms. This is

11:49

actually a blueprint that provides like

11:52

calls to action for almost all

11:56

involved sectors who play a role. So

11:59

it is the platforms and I want to come

12:01

back to them and ask you a question

12:02

about that because that was our entire

12:05

strategy. Ask the platforms to be

12:08

responsible and to favor facts over

12:10

lies. That strategy didn't work. So

12:14

then we also have recommendations for

12:16

governments. We have recommendations for

12:18

traditional media too and also that

12:20

traditional media, which has really kind of

12:23

collapsed in the social media era, be

12:26

bolstered because they have a huge role

12:28

uh public service role but also

12:30

advertisers and then equipping people

12:33

themselves to be able to you know

12:35

understand how mis- and disinformation

12:37

travels and to defend themselves. But I

12:40

I wanted to ask

12:41

you, you know, why has this strategy of

12:46

confronting the platforms not worked in

12:48

your view, and what do we need to do? What

12:50

is the most effective way for the

12:53

people who are trying to do good in this

12:54

world, for the institutions that are

12:56

trying to change the world for the

12:58

better? How do we arm ourselves to

13:01

communicate in this day and age?

13:03

>> It's such a big question. And, you know,

13:05

look, I've been, as you know, building

13:07

movements on different issues all my

13:08

life and this should be, you know, every

13:13

campaigner's number two issue. You know,

13:15

you might be a climate campaigner or an

13:17

LGBT campaigner or whatever, but

13:19

everyone needs to be fighting this

13:21

information integrity fight, but it's

13:23

a lot harder to build a movement around

13:26

confronting the platforms around things

13:29

that can be a little bit arcane and

13:30

technical, but boy, do we need it. And I

13:33

think there's a really clear story here

13:35

of powerful interests, you know, the

13:38

enormous concentrated power of these

13:40

tech players. And I think

13:44

unfortunately populists and agents of

13:47

disinformation have better figured out

13:50

how to capture and ally with those

13:53

powerful technology interests than those

13:56

who want liberal democracy. And that's

13:58

the story of the last few years. And

14:01

I think we can turn it around. You know,

14:03

I really do. I think part of

14:06

the answer and part of the hope comes

14:08

from the fact that the way the

14:11

information environment is changing

14:13

almost month by month, right: the dominant

14:15

platforms, the way people's feelings,

14:17

thoughts, opinions are being shaped. So,

14:20

you know, we may not be able to

14:22

change at this point what X's position

14:24

is, or what Meta's position is in a

14:27

material way, but we could shape the

14:30

next platform that hundreds of millions

14:31

or billions of people go to. We can

14:34

shape I think still the way AI impacts

14:38

the information environment. I mean one

14:40

point I've been making recently about AI

14:41

which is interesting is that there is a

14:44

big shift coming which is that right now

14:47

most people's thoughts, feelings, opinions

14:49

are being shaped by social media right

14:51

by those influencers, by that toxic swirl,

14:55

the algorithms etc. But increasingly

14:58

people's information environment and

15:01

trust is being shaped by their

15:02

relationship with their LLM, right? With

15:05

what we have called in our recent HBR

15:07

piece, people's digital significant

15:09

others. And that is different. So you're

15:13

going to ask that person, well, what do

15:15

you think about this, you know, what

15:17

do you think about this

15:19

rumor I just heard about climate change

15:21

or about vaccines? Right? And so what

15:24

the LLM says to you about that

15:28

will be incredibly important because

15:30

these LLMs are primed to make us trust

15:33

them, to make us, you know, have a real

15:35

sense, frankly, that they are

15:38

all-knowing. Now, that carries many

15:41

risks but there's also an opportunity in

15:43

that if those LLMs are actually

15:46

reporting things back that are more

15:48

grounded in reliable and trustworthy

15:50

sources there may be an opportunity for

15:53

these LLMs to depolarize and to take the

15:56

heat out of the information environment

15:58

a little bit, but only if the boundaries

16:01

that these AI developers set are

16:04

grounded in truth and they're not just

16:07

scraping up the junk and feeding it back

16:10

to people, right? So, that's a big

16:12

coming arena for advocacy and work

16:16

that we all need to do. But I also think

16:18

we need to remember not to fight the

16:19

last war that the dynamics are changing.

16:22

And in a world where, you know, you've

16:24

got these funnels where everybody's in

16:26

this very intimate one-on-one

16:28

relationship with their

16:30

digital significant other, like that's

16:32

quite a different information challenge

16:34

to the challenge of social media

16:36

platforms.

16:36

>> Do you have a digital significant other?

16:38

>> I'd like to think I don't, in the

16:41

sense that I would like to think

16:43

that I have not become too

16:45

intellectually or emotionally dependent

16:46

on ChatGPT, but you know it is

16:49

extraordinary how much this

16:52

dependency is built into these new

16:55

technologies. I mean they literally ask

16:57

us and encourage us to confide in them

17:00

to tell them our problems, right? And this

17:03

is going to be a very potent cultural

17:05

force maybe even more potent than the

17:07

way social media reshaped people.

17:09

>> Yeah and absolutely we're so concerned

17:12

about the information that they're

17:14

trained on. I mean, I

17:16

did this experiment with ChatGPT

17:19

because my husband came to me one day

17:21

and asked me who a certain man was and I

17:24

said, "Why are you asking?" And I told

17:26

him and he said, "Because ChatGPT says

17:29

he's your husband." And I and I said,

17:32

"Uh oh." And then so every month I

17:34

started asking ChatGPT, "Who is Melissa

17:36

Fleming's husband?" And each month it

17:38

told me I had a different husband,

17:41

none of whom was my actual husband whom

17:43

I've been married to for 30 years.

17:46

>> Honestly, I mean, I do these kinds

17:49

of experiments because I also just want

17:51

to know how trustworthy

17:53

these LLMs are. I think they're

17:56

getting better. I agree. I think there

17:58

has been a lot of alarm because these

18:01

kinds of hallucinations have been

18:03

exposed. But I think it just

18:06

wanted to please me and invented somebody.

18:08

I had really nice weddings

18:10

>> Were they nice husbands, I mean?

18:12

>> I don't know, some of them I knew, you

18:14

know. [laughter] I prefer the one I

18:15

have. But just an example of, you know, it

18:19

is

18:20

they do draw you in to trust them and

18:24

we're very concerned I mean obviously

18:26

at a time when climate science is

18:30

being denied again at very high levels

18:34

and people are just questioning: maybe

18:36

it isn't so bad, or it's what fossil

18:40

fuel companies would like us to believe

18:42

or maybe it doesn't exist at all.

18:45

we cannot have AI agents confiding in

18:50

you, 'by the way, you know, some people say

18:53

climate change isn't real,' when 99.9% of

18:57

climate scientists,

18:58

>> right,

18:59

>> have reached a consensus that climate

19:01

change is real, that it's man-made

19:03

And so any questioning of that

19:06

is really, really dangerous. And do you

19:08

have hope? Do you have any

19:10

evidence that the

19:13

platforms are doing something about

19:15

this?
>> You know, I don't have inherent

19:17

faith in the platforms, but the

19:19

nature of the technology, and

19:22

the source material they currently

19:24

prioritize tends to be more

19:26

authoritative source material. So they

19:28

don't necessarily look for the junk,

19:30

right? They tend to look for, you know,

19:33

uh, well-known news sources, etc.,

19:36

when you ask them a question like that.

19:37

So there's a little bit of hope in that,

19:39

right? And I think um, you're right, the

19:42

challenge is they want to mirror you. So

19:44

they figure out what your bias is and

19:46

they want to speak back to you with

19:48

that. But again, I think part of what we

19:51

need to do is recognize that like these

19:53

dynamics are not the same as the social

19:55

media dynamics. So, how do we as people

19:57

who are fighting for a healthy

19:59

information environment get ready for

20:01

that next fight? Really understand what

20:03

those dynamics are um even while we're

20:06

still dealing with the chaos unleashed

20:08

by social media. But I think the short

20:10

answer is I have a little bit of hope.

20:12

There are big, big risks, but there

20:14

are also opportunities in this step

20:17

change we're about to see for maybe

20:20

calming people down a little bit and

20:22

taking a little bit of the toxicity out

20:24

because these LLMs, they work well when

20:28

they actually calm you down and soothe

20:29

you.

20:30

>> Whereas social media works effectively

20:32

and the economics work when they get you

20:34

riled up. It's a different psychological

20:37

dynamic.

20:38

>> This is a really hopeful note to end on.

20:41

You know the UN just conducted a global

20:43

risk report. It was probably the most

20:46

comprehensive risk report ever done

20:48

because we used our country offices all

20:51

over the world to conduct it. The top

20:55

risk that people identified that they

20:58

were concerned about and that we were

21:00

least prepared to address was mis- and

21:03

disinformation.

21:04

We believe it underlies

21:07

everything that we're trying to do and

21:10

to have a healthy information ecosystem

21:12

where people can get factual trusted

21:15

information um and also some

21:18

inspiration. I think that's what we try

21:19

to provide through our collaboration

21:23

that, you know, not only are people

21:25

given the facts but they're given

21:27

ways to get involved and ways that they

21:30

can act.

21:31

>> Yeah. I want to leave you with one

21:33

little story, which is one of our

21:35

Verified champions, Ecal in Indonesia.

21:38

He runs a recycling plant. Through the

21:41

content he makes, he's gone from having

21:44

2,000 followers talking about recycling

21:48

uh to half a million in Indonesia. And

21:51

it's massively grown the actual work

21:53

that he does, which is bringing people

21:55

into his recycling plant and increasing

21:57

the volume of recycling that

21:59

happens in Indonesia. It's stories like

22:01

that that give me hope. And it's because

22:03

this is a really inspiring young man,

22:06

right, who can communicate something

22:07

that frankly isn't that exciting, right?

22:10

But in a way that people want to

22:12

listen to. And that's the sort of model

22:15

I think that we need when we're up

22:17

against all of this hate.

22:20

>> Thank you so much, Jeremy, for joining

22:22

us in the SDG media zone. And it's great

22:25

working with you.

22:25

>> Always wonderful to be with you. Thanks.

Interactive Summary

The discussion between Melissa Fleming (UN Under-Secretary-General for Global Communications) and Jeremy Heimans (Founder and Chairman of Purpose) explores their collaboration in combating misinformation and promoting information integrity. Jeremy shares his long-standing connection with the UN, stemming from his childhood. The conversation highlights Purpose's role as a social movement agency, mobilizing the public and shifting narratives on global issues. A key focus is the "Verified" initiative launched during the COVID-19 pandemic, which innovatively combined the UN's institutional credibility with Purpose's expertise in engaging audiences through unbranded content and influential individuals like "Team Halo" vaccine scientists on platforms like TikTok. They discuss the challenges of traditional approaches to platforms and the emerging role of AI and Large Language Models (LLMs) in shaping public opinion. While acknowledging risks like AI hallucinations and mirroring user biases, Jeremy expresses cautious optimism that LLMs, if properly grounded in truth, could help depolarize the information environment, offering a different psychological dynamic than social media. The UN's global risk report identifies misinformation as the top concern, and both speakers emphasize the need for novel strategies to foster a healthy, factual, and inspiring information ecosystem, exemplified by a climate champion in Indonesia who uses social media to drive real-world impact.
