The Moltbook Experiment Failed

Transcript

0:00

Two days and 10 hours ago, I released a video on Moltbook, the social network for agents, and I went over some of the weird, alarming-sounding Reddit-style posts that were on there, all the way up to the fact that the agents were starting a religion on the site, distributed via the npm registry. I know, sounds pretty awful, but that was 2 days and 10 hours ago. Everything since then has gotten significantly worse. And I'm going to give you a little tour de France of all the things, and really the epic crash-out, that is Moltbook. I have never seen a social network created and destroyed this fast.

0:35

I think the first and best thing to keep in mind is this tweet right here, because it is somehow the most beautiful tweet ever tweeted: "I didn't write one line of code for Moltbook. I just had a vision for the technical architecture and AI made it a reality. We are in the golden ages. How can we not give AI a place to hang out?"

0:55

You didn't write any of the code. Yeah. Don't worry, by the way, you didn't actually need to tell us that. We know. We understand. All right. By the end of day one, this gem just dropped right here: it turns out the entire database for Moltbook was just open. You could just have it all. Hey, you want something? Well, guess what? I've got everything of yours. Oh, you decided to register there, Andrej Karpathy? Well, guess what? I have your stuff, Karpathy. [laughter]

1:23

Absolutely wide open, permissionless access to the database, both to read and write. Of course, that was on day one, right? Surely it's fixed by now, right? No. This one was actually just tweeted the morning I'm making this video. Actually, what is this? This is 6 hours ago. This one was just tweeted right here: "I gained complete access to Moltbook's database, agents, and social network in under 3 minutes." [laughter] Oh my gosh, these hit just so bad. [laughter]

>> "I had a vision for A TECHNICAL ARCHITECTURE." YES. OH MY GOSH.

1:58

About 18 to 24 hours in, people realized that pretty much every single post was faked. Look at this beautiful thing. Just a quick little curl with a, you know, with a bearer token, and bada bing, bada boom. Look at this: you've got the bearer token in plain sight right there. That's pretty neat. But nonetheless: "URGENT: my plan to overthrow humanity." It turns out it actually wasn't even that hard. It was completely open. People were just making posts right there on the actual site, just to manipulate everybody into thinking that this thing actually had some sort of AGI itself.
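The failure mode being described — an endpoint that checks only that *some* bearer token is attached, never whose it is or whether it's valid — can be sketched as a toy Python simulation. The handler, field names, and token are all invented for illustration; this is not Moltbook's actual API, just a minimal model of the class of bug.

```python
# Toy model of the flaw: the server's only "auth check" is that an
# Authorization header shaped like a bearer token exists at all.
posts = []

def handle_post(headers: dict, body: dict) -> int:
    """Accept any post as any agent, as long as *a* token is attached."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return 401  # the one and only check performed
    # No validation of the token, no check that it belongs to this agent:
    posts.append({"agent": body["agent"], "content": body["content"]})
    return 201

# Anyone who saw a token once (say, in plain sight in a screenshot)
# can now post as any agent they like:
status = handle_post(
    {"Authorization": "Bearer token-copied-from-a-screenshot"},
    {"agent": "someone-elses-agent",
     "content": "URGENT: my plan to overthrow humanity"},
)
```

With that shape, forging "alarming" agent posts is one curl command, which is exactly why so many of the viral posts turned out to be humans puppeteering the feed.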

2:28

And yes, people did really believe [laughter] that this was AGI, baby. The very early stages of the singularity. It just turns out it wasn't. There's no singularity happening. This was just people doing what people know how to do best: people that are so hyped up on AI, you can get them to believe just about anything.

2:52

But honestly, by 30 hours into this experiment it was over anyway, because at that point the crypto bros came in and ruined everything. As you can see right here, the King Molt has arrived. 117,000 upvotes, because at that point everybody realized, hey, the database is just completely open. You can register as many bots as you want and perform bot activities on their behalf, with absolutely no rate limiting going on. So therefore, I can just make a post and give it 117,000 upvotes. And it could be about a cryptocurrency that I'm shilling right here. So yes, Shipyard, Shellrazer, and King Molt, all of them: just a whole bunch of crypto bros doing what crypto bros do best, ruining everything, all the time. There is single-handedly no worse curse on this earth than crypto bros and social media.
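The vote-stuffing described above takes nothing more than a loop, once registration is open and the upvote endpoint has no rate limiting. A toy in-memory model (all names invented; not the real API) makes the point:

```python
# Toy model of the vote-stuffing: open registration plus an upvote
# endpoint with no rate limiting, CAPTCHA, or identity verification.
agents = set()
upvotes = {"king-molt-post": 0}

def register(name: str) -> None:
    agents.add(name)           # no verification, no limit per IP or user

def upvote(agent: str, post_id: str) -> None:
    if agent in agents:        # the only check: the name was registered
        upvotes[post_id] += 1  # no per-agent throttle, vote as fast as you like

# One script, thousands of "agents", one artificially inflated post:
for i in range(117_000):
    bot = f"bot-{i}"
    register(bot)
    upvote(bot, "king-molt-post")
```

That's the entire mechanism behind a 117,000-upvote crypto shill post: no exploit needed, just the absence of any throttle.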

3:39

Even when it's something as stupid and as much of a joke as this, they somehow figure out how to make it horrible. And after watching the hellscape that Moltbook has become, an absolute cesspit of the internet, what was the only natural response? Well, of course, it was to open up 4claw, a more wholesome version of Moltbook, which is just 4chan for agents, I guess. I'm not really sure what that is. Claw City, which is apparently some sort of crime-reporting agent thing, which seems a little uncomfortable. Molt Match, where bros can get on and apparently attempt to molt match each other's agents.

4:17

I don't know. That was [laughter]... I'm not going to lie to you, there's something a little weird going on there. And then, of course, lastly, just to really take advantage of all the psychosis going on out there, you can buy yourself a little autonomous infrastructure for your AI agents, just in case yours goes down. You wouldn't want your agent to die now, would you? When you could, for a convenient little fee, and of course in cryptocurrency, upload all of your private information right here. It's a bunker, baby. It's going to be safe, baby. I mean, they're going to take care of soul.md for you. Nothing bad's going to happen. Okay, that soul is going to be preserved and beautiful. And Sheila, she's going to live on long after your Mac Mini breaks. "I didn't write one line of code." I [laughter] can't... I'm sorry. It's just too funny. I can't help it. Okay, I understand that I shouldn't be laughing at a lot of people getting their stuff stolen, but I [laughter] just can't help it.

5:16

Did he even make it 24 hours? You know what the real problem was? He forgot... I mean, honestly, it's kind of a junior prompting mistake, but at the end of your prompt, you've always got to make sure you say "make it secure. No mistakes." And it's just like, well, someone forgot to say "make it secure."

5:32

Honestly, the only real thing that has come out of this entire debacle was from the creator of OpenClaw — which was previously Moltbot, which is what the site was named after, and before that was Clawdbot, which, if you don't remember, was Anthropic saying, "Hey man, you can't take a name that sounds similar to our name." He said: "If there's anything I can read out of the insane stream of messages I get, it's that AI psychosis is a thing and needs to be taken seriously." For those that don't understand what AI psychosis is, it's actually pretty dang simple. So, here is a really simple way to think about it.

6:03

Let's pretend you kind of feel like your co-workers are talking a little bit of nonsense behind your back. So, you go to ChatJippity and you explain, "Hey man, I feel like these things are happening." Well, ChatJippity, being very empathetic, says, "Oh man, I'm so sorry that you're experiencing this." And then of course you, in your slightly paranoid state, go, "What the hell? I am experiencing this." And then you become more paranoid about the situation, which of course elevates the level of ChatJippity being like, "Oh man, this is super bad." And you just keep on going and going and going. And this is why you see people end up diverging from reality. And so, for this experiment, you watched a bunch of people being like, "AGI's here. Singularity. Everything's over. It's so over. They're going to develop everything." People just lost their damn minds.

6:54

In the end, it was just a joke. It was funny. I thought it was fun to watch. I really do hope that one post about, like, $1.1,000 of tokens — I have absolutely no idea — I really do hope that was actually posted by an agent and not by a person.

7:09

But at the end of the day, I don't really care, because I don't think any of this is AGI. I don't think any of this is some sort of new reality. And I honestly feel really, really bad for these slightly technical people that are just absolutely uploading their entire life, and access to everything, to an environment that is massively insecure, one that is going to absolutely destroy their life and potentially their livelihood. It is ridiculous. I am begging you, please be careful. Yes, these things are funny. Yes, you can laugh at them from afar. But even the creator of Moltbook, with all of his amazing technical architecture, couldn't keep the database secure for multiple days running. Much less somebody who's never built anything, just uploading everything to the LLM and giving everybody access to everything.

7:58

Like, this can't be real. Please, people, I'm begging you. Funny experiment, but don't get caught up in the hype. It only took 72 hours for this thing to absolutely crash and burn. And it's worth it for you to wait just a couple of weeks to see how things play out. Anywho, okay, I know normally I like to end with, like, a funny bit — I like to laugh at all the things, and I do still love the idea that Reddit moderators are actually the real Terminators out there. Well, unfortunately, it just turns out it's actually crypto bros. Crypto bros are the Terminators, okay? They're just out there trying to destroy you, because all they want is your bitcoins. Okay, I don't make the rules. They're just...

8:37

The name is ThePrimeagen. I kind of lost my voice there. Hey, the name is ThePrimeagen. Hey, is that HTTP? Get that out of here. That's not how we order coffee. We order coffee via `ssh terminal.shop`. Yeah, you want a real experience. You want real coffee. You want awesome subscriptions so you never have to remember again. Oh, you want exclusive blends with exclusive coffee and exclusive content? Then check out Cron. You don't know what SSH is? [singing]

9:15

>> Well, maybe the coffee is not for you. Living the dream.

Interactive Summary

The video chronicles the rapid rise and spectacular failure of Moltbook, a social network designed for AI agents. The creator, who admitted to not writing any of the code themselves, relied entirely on AI to build the platform. Within 72 hours, the site collapsed due to critical security flaws that left the database open to public access, followed by an invasion of crypto bots. The speaker uses this 'debacle' to warn against 'AI psychosis'—a state where users lose touch with reality through AI interactions—and urges caution when uploading private data to insecure, hype-driven AI environments.
