
797: What I Was Thinking As We Were Sinking

2024-06-23 01:01:49

Each week we choose a theme. Then anything can happen. This American Life is true stories that unfold like little movies for radio. Personal stories with funny moments, big feelings, and surprising plot twists. Newsy stories that try to capture what it’s like to be alive right now. It’s the most popular weekly podcast in the world, and winner of the first ever Pulitzer Prize for a radio show or podcast. Hosted by Ira Glass and produced in collaboration with WBEZ Chicago.

1
Speaker 1
[00:00:00.66 - 00:00:09.92]

A quick warning, there are curse words that are unbeeped in today's episode of the show. If you prefer a beeped version, you can find that at our website, thisamericanlife.org.

4
Speaker 4
[00:00:11.64 - 00:00:40.64]

I've had a bunch of bike accidents lately, and the moment that things go wrong, it's always the same. These accidents happen during my morning commute, and they always happen when it rains. I have this very light bicycle I got, and when it rains, something about this bike on the city streets, a metal grate, a slippery patch of any sort at all, can make the bike slip out from under me. I've broken a clavicle, I've banged up my wrist, I've given myself some kind of bruise on my right arm that still wakes me up at night months later.

[00:00:42.52 - 00:01:10.54]

And in each of these three accidents, you know, like when you're falling, there's that second or two, when you realize you're going down and you're on your way down, but you haven't actually hit the ground yet. Weirdly, it is enough time for a bunch of thoughts to go through your head, and all three times, like I say, it's been the same. Somehow, I get very calm, and I think, well, this isn't going to be so bad. I'm not going very fast, the ground isn't that far, is a car coming? Okay, no car is coming. I figured out how to fall, it's going to be fine, this time.

[00:01:11.74 - 00:01:51.24]

And I think how a person reacts under duress can tell you something about them. And I think what this set of thoughts says about me is, I am optimistic, even when there is no factual basis for it at all. Like at the exact moment when I'm going to collide with the wet surface of the street, a moment when things are definitely not going to be fine, I'm thinking things will be fine. I think this optimism has kept me in faltering relationships. When I mentioned all this at our radio show story meeting, one of my co-workers, somebody who I've worked with closely for years, told me, and he had never said this before, that he thinks my optimism is the single most important quality I have.

[00:01:51.80 - 00:02:11.36]

He said, oh, tell me there are 17 things that can go wrong with some plan that we have, and I'll just go, like, eh, let's go for it. Though I have to say, after three accidents, I have learned not to ride this particular bike in the rain. Thank you very much. Today on our program, we have stories where things go very, very badly for people. They are on the way down, and in that moment of crisis,

[00:02:12.32 - 00:02:29.06]

what they're thinking reveals so much about who they are. For some people, it leads to new insights; for some, it leads to no insights at all. Some are hit with a lightning bolt of inspiration and figure out how to save themselves. From WBEZ Chicago, it's This American Life. I'm Ira Glass.

3
Speaker 3
[00:02:29.74 - 00:02:30.26]

Stay with us.

4
Speaker 4
[00:02:42.02 - 00:03:01.84]

Oh, before we go any further, I should say, today's show is a rerun. One summer, a while back, a group of friends got in trouble, got in over their heads, almost literally, in the middle of Boston Harbor. Our producer, Ike Sriskandarajah, tells the tale, though for the purposes of today's story, call him Ishmael.

3
Speaker 3
[00:03:02.70 - 00:03:07.28]

I've been friends with you this whole time, and I didn't know that this is your most told story.

5
Speaker 5
[00:03:08.16 - 00:03:09.74]

Well, I wouldn't tell it to you. You were there.

3
Speaker 3
[00:03:10.52 - 00:03:36.26]

I was there. It was over 10 years ago, and I was a new resident in the Boston area. My good friend, Sophie Tintori, introduced me to some people she knew well. We liked the same kinds of jokes and could only afford the same kind of rent. So we moved into a big house together, and soon we got to the stage in any relationship where someone suggests taking things to the next level.

5
Speaker 5
[00:03:36.86 - 00:03:49.28]

So I don't know exactly how it came up, but maybe eight friends or so decided to go in collectively and buy a used boat. And to my understanding, none of them knew a single thing about boats.

3
Speaker 3
[00:03:50.04 - 00:03:59.10]

True. But did you know the word catamaran comes from Tamil? Kattu maram, wood tied together. Those are my guys.

[00:03:59.50 - 00:04:18.74]

So that's one thing I know about boats. And maybe more ancestral knowledge was just waiting to surface if I owned a boat. Here's how it happened. Our friend Max brought us the opportunity. He knew a person who knew a person who needed to get rid of their boat.

[00:04:19.54 - 00:04:52.24]

For $400 each, Max could make us Massachusetts boat people. Having doubled that in my bank account, I signed a check, sight unseen. And when I finally got to meet our boat down at the harbor, it felt like we had gotten away with larceny. I wish you could have seen it. A 1950s highliner, a glamorous little wooden motorboat with an engine hanging off the back, polished wood, two rows of white vinyl seats, chrome finishes.

[00:04:53.16 - 00:05:07.90]

Think Italian vacation, like The Talented Mr. Ripley, but right here on the Boston Riviera. Also, like Boston's own Matt Damon in The Talented Mr. Ripley, a new high society

[00:05:07.90 - 00:05:09.60]

life was waiting for us.

[00:05:15.81 - 00:05:40.02]

The first time I got on the boat, it really felt like magic. We honked just the right number of times and a bridge opened up for us like a bouncer, unclipping a red velvet rope to the VIP section. We entered the Charles River and stopped to admire the sailboats swirling around us. A small motorboat pulled up to us. The man asked if we were stranded.

[00:05:41.16 - 00:05:49.38]

No, sir. Just enjoying this beautiful day. Thanks for asking. Well, you know, you're parked in the middle of a regatta.

[00:05:51.42 - 00:06:01.82]

Oh, he looked us over. Do you guys work for Google? The plan to cosplay our way to high society was already working.

[00:06:10.56 - 00:06:35.98]

Max knew the next step. We had to announce our old boat as the newest member of Boston's watery glitterati, which meant a christening party. Max had been practicing for this moment his whole life. The man loves extravagant gestures. He invited friends from near and far to the Charles River Esplanade and asked everyone to wear white.

[00:06:36.80 - 00:06:51.22]

He showed up in black, a rented black tuxedo. Max stood on the front of the boat, named it Marjorie after his own mother, and smashed a purple bottle of Andre's sparkling wine on the nose.

[00:06:57.70 - 00:07:18.92]

Now, I don't know if you believe in curses, and we didn't know it right away, but Max invoked roughly three curses in that moment. Wearing a black tuxedo to a white party of your own design. That's more of a faux pas, not a curse. But number one, renaming a boat. That's a curse.

[00:07:19.78 - 00:07:42.20]

Number two, Jews like Max don't name children after living relatives. Now, the prohibition doesn't specify boats, so let's call that a half curse. Number three, smashing champagne on a used boat. I heard that's a curse. But only if you consider Andre's sparkling wine to be champagne.

[00:07:45.38 - 00:07:58.90]

Now, it's impossible to say which curse attached itself to the Marjorie. But moments later, the sky darkened and we all scattered as sheets of rain fell upon us.

[00:08:07.30 - 00:08:18.28]

The next day, the clouds cleared. Our friends were still in town. So, we tried to take them out again. Now, this is where the story really begins.

5
Speaker 5
[00:08:19.14 - 00:08:31.38]

It was a beautiful day. I think it was early summer. And really sunny and gorgeous. And there were maybe twelve of us on the boat. Does that sound right?

3
Speaker 3
[00:08:32.16 - 00:08:48.08]

Nine! In a boat that was maybe made for six. But our Captain, Max, had just completed a boat safety course. And he told us the only limit for boat capacity is having enough life jackets for everyone. Which isn't true.

[00:08:48.78 - 00:09:01.84]

But we didn't know that. And Max denies ever saying it. Squished together, two in the front seat, four in the back, three others perched on the sides, we headed out.

5
Speaker 5
[00:09:02.42 - 00:09:15.50]

People are taking turns driving, which is really fun. I got to take the wheel for a little bit. I was cutting across the waves in a way where we'd catch a little bit of air and slam down. And catch some air and slam down. Which I thought was very fun.

[00:09:15.78 - 00:09:22.14]

And then at one point, somebody leaned over and said to me, Sophie, this is like a wooden lake boat.

3
Speaker 3
[00:09:22.90 - 00:09:39.64]

The Marjorie wasn't built for high-octane thrills. Someone says, we should get back soon. But Sophie overrules them. And then, just like the American munitions that repelled British ships in these very waters: Cannonball!

5
Speaker 5
[00:09:40.32 - 00:10:03.70]

I jump off the boat, because it's a beautiful day and I like to swim. And I'm kind of messing around and looking at the boat from the water. And when I climbed back into the boat, I was like, I climbed in on the back and I, just like, you know, hoisted myself up over the edge of the boat. And at the point where the most force was being pushed out on the boat, a whole grip of water just flowed right in. I'm like, whoa.

[00:10:04.14 - 00:10:04.44]

Okay.

3
Speaker 3
[00:10:05.16 - 00:10:10.92]

The back of the boat dipped below the waterline and Boston Harbor began to enter.

1
Speaker 1
[00:10:11.46 - 00:10:15.48]

And then, almost instantaneously, the boat was like thigh deep in water.

3
Speaker 3
[00:10:16.20 - 00:10:27.12]

This is Ben Ewen-Campen, a moral pillar of our group. The kind of friend your mom might ask, is Ben going to be at the party? We lived together and co-owned this boat.

1
Speaker 1
[00:10:27.12 - 00:10:36.24]

My memory is that it was really, really fast. Just, it went from like, regular life, you're having fun, to all of a sudden, like, emergency mode.

3
Speaker 3
[00:10:36.60 - 00:10:58.64]

It's still a lovely summer day. We're still on our beautiful boat. But suddenly, I'm tearing an aluminum can in half to bail us out. Turns out, half a soda can only scoops as much water as your own cupped hands. I look over at Sophie, the most MacGyver-y of us, to see if she has a better idea.

5
Speaker 5
[00:10:59.26 - 00:11:15.72]

I saw an empty gallon, like, plastic water gallon. And I was like, oh, if I can tear the top off of this, then I can scoop water out really fast. But I'm wondering, like, okay, I don't have, like, anything to cut it with. That's fine. I'll cut it with my teeth.

[00:11:15.88 - 00:11:36.56]

I'll just crack a little hole, and once I get a little hole going, there'll be purchase, and I can tear the whole thing open. And I try to bite it with my teeth, and my teeth are just so weak. My hands are so weak, and I, like, can't do anything. I don't think I'm feeling, experiencing anxiety. But suddenly, it's like one of those nightmares where, like, none of your limbs work.

3
Speaker 3
[00:11:37.48 - 00:11:54.28]

Maybe the weight of nine passengers was too much. Or maybe Sophie's full-throttle aerials had split the seams. Or maybe it was a curse. Here's what we know. We're alone.

[00:11:54.94 - 00:12:15.02]

We're in a small boat. We're far from shore. The nearest land to us is Logan Airport. And that's when it begins to dawn on me. If we have to jump off this boat and swim to the closest land, we would be army crawling up the banks of a government-controlled airspace.

5
Speaker 5
[00:12:15.02 - 00:12:26.10]

Look, it's so close. We can just swim to Logan. And very quickly and, like, quietly but quite definitively, you said, I'm not going to do that.

3
Speaker 3
[00:12:27.26 - 00:12:47.22]

I'm picturing my white friends being wrapped in tinfoil blankets and fed cocoa, and me being perp-walked off the tarmac by Homeland Security. Not today, government black site. Our options are shrinking. Our bails aren't bailing. The engine's not turning.

[00:12:47.64 - 00:12:51.64]

There's water, water everywhere. I think we're going to sink.

2
Speaker 2
[00:12:52.24 - 00:12:54.88]

We were sinking. We were in a sinking boat.

3
Speaker 3
[00:12:55.56 - 00:13:10.20]

Toby David is visiting from Philly. A gifted speechmaker, he helped christen the Marjorie just the day before. Now he reminds us he is not a good swimmer, asthmatic, and does not like cold water.

2
Speaker 2
[00:13:10.20 - 00:13:20.60]

The boat is going down, and it's tilting, and we're in the middle of the water, and we have no idea what to do. And I'm freaking out. I'm really fully freaking out.

5
Speaker 5
[00:13:21.18 - 00:13:33.46]

I'm not hearing much, but through the ether, Ben's voice comes through and says, is anybody else noticing that occasionally one of us will ask a question and nobody will respond?

3
Speaker 3
[00:13:34.68 - 00:14:00.72]

A silent panic falls over us. It is a moment that demands leadership, decisive action. We turn our heads to our captain. Max, he found this boat, named it after his mother, sold us on the dream. And now, like a monomaniacal Ahab in a Red Sox cap, Max denies there's any problem at all.

1
Speaker 1
[00:14:01.28 - 00:14:09.50]

Should we call for help? And Max saying no. Max had this really strong instinct. Don't call for help. Don't call.

[00:14:09.54 - 00:14:10.16]

I know what I'm doing.

2
Speaker 2
[00:14:10.80 - 00:14:20.22]

I think Max was trying to cover up for this massive fuck up. And so he was sort of in denial that there was a problem.

5
Speaker 5
[00:14:21.02 - 00:14:31.68]

Even up to the point where the entire back end of the boat was fully underwater, the entire motor was underwater, Max was still saying, it's fine. There's no problem.

3
Speaker 3
[00:14:31.68 - 00:14:45.30]

But there are two. One, we're stranded in the water. And two, our captain has gone mad. We only know how to deal with the last one. Everyone shouts at Max.

[00:14:45.92 - 00:15:05.06]

Admit it. We are in trouble. You need to call for help. But Max is sitting in the captain's seat, one hand clenched around the steering wheel and the other turning the key in the ignition, like the engine wasn't underwater. The expression hope floats?

[00:15:05.76 - 00:15:08.66]

Not true. And very dangerous.

5
Speaker 5
[00:15:09.10 - 00:15:24.00]

On one end of the spectrum of willingness to acknowledge this situation, there was Max, who was just like face forward, pretending he was still driving a boat. And then, at the other end of the spectrum, the other extreme was there's this one wild card.

2
Speaker 2
[00:15:24.00 - 00:15:28.28]

Kath. Oh my God. I don't know anything about her.

5
Speaker 5
[00:15:28.76 - 00:15:32.60]

She, just like, sort of appeared in a beam of clarity.

3
Speaker 3
[00:15:33.58 - 00:15:57.88]

Kath Spangler went to college with Max. She is the ninth passenger and the one outsider. When Max invited her on a ride, he had not mentioned there would be so many of us. For most of the trip, she was shy and off to the side. But now the strangers surrounding her are screaming at the one person she knows, and he has lost control.

6
Speaker 6
[00:15:58.28 - 00:16:22.82]

I remember just really clearly like looking over at Max, sitting there in the captain's seat, with his eyes kind of downcast, frozen. And something, there was like a switch that flipped in me. If the boat sinks and all of us are in the harbor, we're no longer visible to anyone. We're just little heads. No one can see us.

[00:16:23.24 - 00:16:34.70]

We can tread water for a while. What if we get a cramp, get tired, swallow water? How long can we do that for? What if the current takes us out further?

[00:16:35.82 - 00:16:45.60]

I didn't want to let myself go there, but I remember thinking that people were going to die. If we don't do something, people are going to die today.

3
Speaker 3
[00:16:46.86 - 00:16:50.58]

And in that moment, Captain Kath is born.

6
Speaker 6
[00:16:51.12 - 00:17:10.22]

My mind went really blank and calm, and it was like a calculator filled my brain. And I was just very logical, and I started focusing on actions. Where are the life vests? How many do you have? I took on this voice that was very bossy and direct.

3
Speaker 3
[00:17:10.76 - 00:17:39.62]

Like Boston's own Matt Damon in Good Will Hunting, she begins solving the unsolvable problem. Namely, if nine passengers get in a boat, which turns out to only have eight life jackets, not one per person, as we have been led to believe, Max. And now, some of those life jackets are unreachable in the underwater part of the boat. How many passengers are totally screwed?

6
Speaker 6
[00:17:40.44 - 00:17:54.94]

I remember looking over, and there was this guy who looked really pale. He just didn't look like he was doing very well. And I asked him, I said, can you swim well? And he said, no, I can't. And I just gave him the life vest.

[00:17:55.08 - 00:17:57.90]

I took it off my neck, put it on his, and said, well, I can.

3
Speaker 3
[00:17:58.56 - 00:18:08.78]

Kath hands out the remaining life jackets based on need. The water is rising. We are coming apart and preparing to abandon ship.

6
Speaker 6
[00:18:09.00 - 00:18:12.10]

And that's when I got on top of the bow of the boat.

3
Speaker 3
[00:18:12.76 - 00:18:24.90]

She climbs to the front of the boat and starts waving her arms to the pilots in airplanes, over our heads, to nearby boats in the water, anyone who might be able to come to our rescue.

1
Speaker 1
[00:18:24.90 - 00:18:29.68]

She was the one person who was sort of like, what the fuck is wrong with you people? Of course, we need help.

5
Speaker 5
[00:18:29.98 - 00:18:40.04]

And she like, stood up on the boat and started waving her arms and trying to get the attention of a boat that I didn't realize was also quite close. That was like the rescue boat.

3
Speaker 3
[00:18:40.78 - 00:18:44.70]

An actual rescue boat just within waving distance.

6
Speaker 6
[00:18:45.00 - 00:18:59.94]

It was so much taller than us off the waterline. And there were guys on deck kind of looking down and almost chuckling at us, like making light of the situation. And I remember feeling like they have no idea what we've been through.

3
Speaker 3
[00:19:00.92 - 00:19:14.34]

They must have radioed for a towboat because one arrives within minutes. The captain calls down, do I have permission to perform a life rescue? Who says no to that?

2
Speaker 2
[00:19:14.84 - 00:19:16.46]

I have to confess one thing really fast.

3
Speaker 3
[00:19:16.96 - 00:19:20.70]

Toby is admirably honest about what happened next.

2
Speaker 2
[00:19:20.70 - 00:19:35.00]

I just have to say, I'm sure that when the rescue boat arrived that I may have been the first person off of the sinking ship. I fucking levitated to that ladder to get off that boat. So that's not great for me.

3
Speaker 3
[00:19:35.98 - 00:19:44.00]

We climbed off the boat to safety. And, just like Matt Damon in Saving Private Ryan, we were going home.

[00:19:49.70 - 00:20:00.52]

Kath says she looks back at this moment a lot. That it's become pivotal to her understanding of herself. She's proud that she jumped in and saved us.

[00:20:05.54 - 00:20:20.46]

It was a tragic ending for our captain. From the safety of our towboat, I can still picture him down there. Alone on the Marjorie. Refusing to abandon ship. And that was the last time I ever saw him.

[00:20:21.28 - 00:20:22.60]

He died that day.

1
Speaker 1
[00:20:23.36 - 00:20:27.84]

Yeah, the only honorable thing I could have done. I went down with the ship.

4
Speaker 4
[00:20:28.46 - 00:20:29.96]

Captain Max died.

3
Speaker 3
[00:20:30.66 - 00:20:46.92]

My friend Max survived, a little bruised: his ego, and his reputation for big schemes with little consequences. And while the boat became the story Sophie likes to tell more than any other, not for Max.

1
Speaker 1
[00:20:47.56 - 00:21:02.80]

This is a story I have told the least. I can't overstate to you how big a failure this was. One of the most embarrassing failures of my entire life. It's a thing I look back at and shudder.

3
Speaker 3
[00:21:03.54 - 00:21:28.00]

Max spent more than a decade feeling terrible. The rest of us spent that same amount of time wondering why our friend Max was trying to gaslight us and sink us. But we'd never talked about it. None of us have. And when Max and I catch up for this story, it becomes clear that he has a completely different memory of how it all went down.

[00:21:28.56 - 00:21:33.44]

He says he actually did the one thing we were begging him to do.

1
Speaker 1
[00:21:33.64 - 00:21:42.06]

I remember a lot of things. I remember calling Sea Tow and telling them where we were. I remember that, like, I did.

3
Speaker 3
[00:21:42.22 - 00:21:43.02]

Can I just back up once?

1
Speaker 1
[00:21:43.34 - 00:21:43.76]

Go ahead.

3
Speaker 3
[00:21:44.22 - 00:22:01.78]

Did you just say you called Sea Tow? Sea Tow is boat Triple A, the boat that performed the life rescue. Max says he called them on his phone way back when the engine first stalled out. That's the reason we got saved that day.

1
Speaker 1
[00:22:01.78 - 00:22:07.18]

Yeah. I don't think I made a big deal about calling Sea Tow.

2
Speaker 2
[00:22:07.78 - 00:22:08.54]

No.

3
Speaker 3
[00:22:09.30 - 00:22:24.54]

I didn't believe him. I just interviewed five of our friends who had no memory of this at all. So I followed up with Sea Tow. Turns out they keep meticulous records. And there it was.

[00:22:24.54 - 00:22:43.24]

A call for a tow. July 17, 2010, charged to our membership under Max's name. It cost $50. Help was on the way. I guess in the chaos, Max just forgot to tell everybody on the boat.

1
Speaker 1
[00:22:43.90 - 00:22:53.18]

Look, I mean, clearly I owe them an explanation, but a lot of our passive-aggressive-ass friends have never brought up this day with me.

3
Speaker 3
[00:22:55.24 - 00:23:14.60]

Our story had hardened over a decade but was built on a lie. Max had kind of done the right thing, and Kath wasn't the one who saved us. The sadistic editors at This American Life made me invite Kath back to share our findings.

[00:23:16.14 - 00:23:26.28]

Max called for help, and apparently he was the one who got help to arrive, not you waving.

6
Speaker 6
[00:23:29.04 - 00:23:30.98]

That just feels impossible.

3
Speaker 3
[00:23:31.98 - 00:23:40.56]

But I think it's true, and I'm so sorry that's the way that it happened.

6
Speaker 6
[00:23:41.24 - 00:23:52.38]

I'm just completely confused. So this whole time, he has been the hero of this story, but no one has known it?

[00:23:54.18 - 00:24:07.44]

It's so perfect that this is the finding. It's so unlike Max, too, to not kind of claim that, like, claim the credit of it. Yeah. And say, no, I did that.

3
Speaker 3
[00:24:08.44 - 00:24:39.70]

The final curse on the Marjorie had the longest fuse and a devastating payload. It added an asterisk to a heroic deed and turned a failed captain into a guy who kind of did the right thing but was really confusing about it. Please heed this precautionary tale. Beware of curses. Don't try to be a Massachusetts boat person for just $400.

[00:24:41.16 - 00:24:46.38]

And if you already called Sea Tow, tell your friends.

2
Speaker 2
[00:24:50.42 - 00:24:54.98]

We'll gather round and hear this tale, all you sons and daughters.

3
Speaker 3
[00:24:54.98 - 00:25:07.54]

of a cursed motorboat at the bottom of the Boston Harbor. Captain Max sold them on a dream of heading out to sea. They emptied out their bank accounts and bought the Marjorie.

2
Speaker 2
[00:25:08.78 - 00:25:09.42]

Marjorie!

[00:25:10.98 - 00:25:11.62]

Marjorie!

3
Speaker 3
[00:25:13.12 - 00:25:16.98]

They emptied out their bank accounts and bought the Marjorie.

[00:25:18.70 - 00:25:22.94]

Well, Captain Max was standing proud on that fateful morning,

2
Speaker 2
[00:25:23.28 - 00:25:27.10]

torso bare and the wind in his hair, suddenly, without warning.

1
Speaker 1
[00:25:27.24 - 00:25:31.36]

The engine died, the children cried, and the ship filled up with water.

3
Speaker 3
[00:25:31.36 - 00:25:38.80]

Captain Max said, I'll see you all at the bottom of the Boston Harbor in Davy Jones' locker.

1
Speaker 1
[00:25:39.92 - 00:25:43.66]

He said he'd never call for help, but did he anyhow?

3
Speaker 3
[00:25:44.28 - 00:25:57.18]

Or was it Kath that waved them down? We'll never find out now. Loose lips can sink a ship, secrets never travel. Captain Max and his watery grave, how do you like them apples?

?
Unknown Speaker
[00:25:58.44 - 00:25:59.00]

Marjorie!

[00:26:00.52 - 00:26:01.08]

Marjorie!

3
Speaker 3
[00:26:02.58 - 00:26:06.74]

Captain Max went down with the ship. How do you like them apples?

4
Speaker 4
[00:26:09.06 - 00:26:21.32]

So that story was by Ike Sriskandarajah. The sea shanty was written for us by Sam Geller, who was there that fateful day. He is Captain Max's brother. He performs as Samson the Truist. Other writers on the song: Ariel East, Chances

[00:26:21.32 - 00:26:38.54]

with Wolves, and Glasser. Gordon Manette played the accordion. Coming up, a ship gets a new captain who immediately throws half the crew overboard. He says it's to save the ship from sinking. We hear an insider's true, real-life account. That's in a minute, from Chicago Public Radio, when our program continues.

[00:26:41.66 - 00:26:53.66]

It's This American Life. I'm Ira Glass. Today's program, What I Was Thinking As We Were Sinking. We have stories today about what goes through your head in the middle of calamities, big and small, and what those thoughts tell us. We have arrived at Act 2 of our program.

[00:26:53.86 - 00:26:56.08]

Act 2, Going Down With the Censorship.

[00:26:57.60 - 00:27:18.16]

So, the SS Marjorie cost our producer Ike and his friends about $400 each. The ship we're going to talk about next is significantly more expensive, by $44 billion. That's what Elon Musk paid for Twitter back in October 2022, a few months before we first aired today's episode. That was back when we still called it Twitter. Now, of course, it's called X.

[00:27:18.16 - 00:27:51.14]

The company has been foundering in the water ever since Musk bought it, kind of famously. Just this past week, Bloomberg News uncovered documents that show that the company's revenues were down 40% in the first half of 2023. And one year after Musk purchased Twitter, the overall value of the company had dropped by half. Its competitor, Threads, is now passing it in daily US users. And some of the most revealing details about what it has been like on the inside of the company, among the people who work there, have come from reporter Casey Newton and his colleague Zoe Schiffer.

[00:27:51.72 - 00:28:06.20]

They make the newsletter Platformer. They've had so many scoops. Casey also co-hosts a podcast called Hard Fork that the New York Times puts out about the tech industry. That, can I say, I really love, especially their deep and often very funny reporting on the latest AI news. Highest recommendation.

[00:28:07.18 - 00:28:40.44]

But to get back to our story, there's been one person in particular at Twitter that Casey has been wanting to talk to. A very senior employee at the company who, while just doing his job, ended up having to take on two of the most powerful people on the Internet and in the world. Those two people, Elon Musk and the former president of the United States, Donald Trump. Casey wanted to hear all about that and also what it was like for the guy, what he was thinking, what he was doing, once Elon took over and the place started taking on water. Here's Casey Newton.

2
Speaker 2
[00:28:41.32 - 00:29:02.70]

Yoel Roth did a lot of jobs at Twitter over the years, but it was always the same kind of job. He was in the content moderation business. One of those people who decides which of your posts can stay up on the Internet and which ones need to come down. And he got his first glimpse at what life as a content moderator would be like while he was in college on a date. He's gay.

[00:29:02.96 - 00:29:03.38]

So am I.

1
Speaker 1
[00:29:04.36 - 00:29:21.70]

I went out for drinks with somebody without knowing where he worked. And he volunteered that he actually worked for the parent company of the website, Manhunt, which was one of the kind of early gay websites that was very specifically sexually focused.

2
Speaker 2
[00:29:22.10 - 00:29:28.22]

And even in these early days of the web, there was already a team of people who were deciding what you could and couldn't post there.

1
Speaker 1
[00:29:28.74 - 00:29:57.26]

They had a set of kind of convoluted rules about what types of nudity you were allowed to show in which places. So nudity, fine, but not all nudity. So there were, there were specifics. And he described to me a system of color coding images of red, yellow, green, and then a team of people who were responsible for making those designations. And I'll never forget, he said, the people doing these reviews are almost entirely straight women.

[00:29:57.26 - 00:30:08.86]

And I was just floored in that moment of thinking, God, there's a team of heterosexual women who have to look at the depraved things that gay men are posting on the Internet. I'm so sorry.

2
Speaker 2
[00:30:10.62 - 00:30:14.66]

And right. The senior hole-pic specialist at Manhunt was some poor woman.

1
Speaker 1
[00:30:16.00 - 00:30:18.12]

That's not an exaggeration. Yeah, yeah.

2
Speaker 2
[00:30:18.14 - 00:30:19.82]

We hope she's doing OK. Are you out there?

1
Speaker 1
[00:30:20.02 - 00:30:22.32]

Call in to This American Life. I'm so sorry.

2
Speaker 2
[00:30:22.46 - 00:30:23.92]

Yeah, I'm sorry for what you saw.

[00:30:26.48 - 00:30:28.84]

After the date, Yoel had one thought.

1
Speaker 1
[00:30:29.18 - 00:30:32.12]

I was like, aha, that's my dissertation topic.

2
Speaker 2
[00:30:32.72 - 00:30:40.96]

Yoel was in grad school. He got his Ph.D. and soon after, a job at Twitter. They gave him a small desk. This was 2015.

1
Speaker 1
[00:30:41.50 - 00:30:54.96]

The office's most striking feature was probably a giant life-size cardboard cutout of Justin Bieber that sat directly behind my desk. Justin Bieber obviously being a major figure in early Twitter.

2
Speaker 2
[00:30:55.62 - 00:30:58.02]

Maybe the most popular user, at least for some period of time.

1
Speaker 1
[00:30:58.24 - 00:31:04.84]

Yes. There were rumors that Twitter had entire servers just dedicated to serving Justin Bieber related traffic.

2
Speaker 2
[00:31:05.46 - 00:31:21.88]

Besides Bieber, what Twitter was really known for back then was its trolls. The site was plagued by users harassing other users, particularly women. That year, I co-reported a story about how the site's then CEO, Dick Costolo, wrote a memo saying, quote,

[00:31:27.24 - 00:31:40.92]

That was the backdrop for Yoel's new job. As an intern at Twitter the previous year, he had spent part of his time moderating content. He'd seen this video of a dog getting abused. He removed it from the site, but for years it haunted him.

1
Speaker 1
[00:31:41.56 - 00:31:56.60]

It was never even like the specific image. I couldn't tell you what the dog looked like or what the video was. I just remember its existence and I remember that feeling of seeing it. And then of clicking, like, I think the button said no.

2
Speaker 2
[00:31:57.52 - 00:32:20.46]

More than anyone ever talks about, it's this mostly invisible job of content moderation that makes Twitter usable for the average person. It's what makes every forum on the internet usable at all. And Yoel was good at the job. He got promotion after promotion in his department, what Twitter and a lot of other tech companies now call trust and safety. It's a hard job and it just kept getting more complicated.

[00:32:21.30 - 00:32:50.56]

The way Yoel tells it, there was a wild new case to examine almost every day. Foreign governments impersonating their enemies, real people organizing harassment campaigns, impossible debates over what should count as hate speech, and regular meetings over whether to put labels on tweets that didn't quite violate the company's rules but would benefit from more context, like about COVID. In 2020, the biggest case yet landed on Yoel's desk. It was a case about a user who kept causing problems.

[00:32:51.20 - 00:33:09.88]

And this guy's fans were even more rabid than Justin Bieber's. It was the President of the United States, Donald Trump. This is a couple months into the pandemic. Trump had tweeted that mail-in ballots in that year's election were going to lead to widespread fraud. And just to lay my own cards on the table, I thought that was really bad.

[00:33:10.16 - 00:33:24.12]

Because they won't lead to widespread fraud. Anyway, Twitter's policies prohibited misleading people about the voting process, the way Trump was doing. But the company had never taken action against the President's tweets before. Yoel had to decide what to do.

1
Speaker 1
[00:33:24.72 - 00:33:48.00]

I didn't see a basis for changing the policy, modifying it, winking at it, squinting and finding a violation. There was no way around it. It was clearly a violation of our policy. Truthfully, there was a lot of nervousness about crossing this line for the first time, taking action on a tweet from the President of the United States.

2
Speaker 2
[00:33:48.76 - 00:33:59.26]

The company decided that instead of removing the President's post, it would put a label under it. A label that just said, get the facts about mail-in ballots, with a link to a page that pushed back on Trump's claims.

1
Speaker 1
[00:33:59.26 - 00:34:07.86]

At a certain point, when it became clear that, yes, this was going to happen, it became a question of who could push the button.

2
Speaker 2
[00:34:08.34 - 00:34:42.00]

On some level, we probably understand that in a moment like this, someone has to take a physical action to type the words, get the facts about mail-in ballots, and click the button to attach the label to the post. I've talked to dozens of content moderators over the years, but I've never talked to someone who had moderated the President of the United States. When it came time to take action, only a handful of people at Twitter had the power to do it. The company had locked down access after an incident where a former contractor, on his last day working there, briefly deactivated Trump's account. Shout out to Bahtiyar Duysak, who says it was an accident.

[00:34:42.80 - 00:34:47.72]

Also, Twitter had just introduced this idea of putting labels on misinformation a couple weeks before.

1
Speaker 1
[00:34:47.72 - 00:35:07.02]

And so it was this perfect storm where it required elevated access and knowledge of this incredibly convoluted system for applying these labels, and I was the only one who knew how to do it. And so I got an instruction from my boss that said, all right, we're going to do this.

2
Speaker 2
[00:35:07.96 - 00:35:12.84]

Also, because this is how life goes, Yoel and his husband were moving houses the day all this happened.

1
Speaker 1
[00:35:12.84 - 00:35:43.88]

I excused myself from wrangling the dog and the movers and the relocation of stuff and sat in the front seat of the car with my cell phone tethered to my work laptop. I was on a video call with some of the other leaders at the company who were making this decision. And I remember a countdown where I was going to push the button that would apply the label to this tweet.

2
Speaker 2
[00:35:44.22 - 00:35:48.00]

At that same moment, Twitter's communications staff was going to announce the decision.

1
Speaker 1
[00:35:48.48 - 00:36:02.72]

And it felt very important in that moment for the timing to be exactly joined up, for some reason. We counted down. I clicked the button. And then I refreshed the public view of the tweet and saw the label.

[00:36:03.68 - 00:36:16.76]

And the communications team said, we've got it from here. And I said, OK, I have to go back and deal with the movers now. And I hung up the call and I closed my laptop and I crossed the street back into my apartment.

2
Speaker 2
[00:36:17.36 - 00:36:45.84]

If they made a movie about Trump and Twitter, you can imagine how they'd shoot this scene, with the Twitter employees hunched over a console in a control room, high-fiving. But in reality, of course, it's the opposite. Most content moderators try really hard not to bring their own political beliefs into the job. In a way, the legitimacy of the whole company they work for depends on it. Shortly before that Trump tweet, Twitter had explained its reasoning for adding labels to misleading information with a blog post.

[00:36:46.80 - 00:36:54.02]

Importantly, the post was signed with Yoel's name. And soon after that first label showed up on Trump's tweet, his name was everywhere.

1
Speaker 1
[00:36:54.50 - 00:37:16.22]

I wake up one morning, the third day that my husband and I are in our new home, to my phone exploding because Kellyanne Conway has just talked about me on Fox News and has said that I'm responsible for the censorship of the president's account and I'm responsible for censorship at Twitter more generally. And in that moment, everything exploded.

3
Speaker 3
[00:37:16.48 - 00:37:21.36]

Thank you very much. We're here today to defend free speech from one of the gravest dangers.

1
Speaker 1
[00:37:22.14 - 00:37:32.54]

The president held up a copy of the New York Post with me on it in the Oval Office, as he announced an executive order restricting censorship by Silicon Valley companies.

3
Speaker 3
[00:37:32.54 - 00:37:42.28]

His name is Yoel Roth, and he's the one that said that mail-in balloting, you look, mail-in. No fraud? No fraud, really?

1
Speaker 1
[00:37:42.72 - 00:37:56.88]

And for weeks, discussion of me and my political opinions and my beliefs became a symbol of everything that was allegedly wrong with Silicon Valley and with the decisions that companies have made.

2
Speaker 2
[00:37:57.60 - 00:38:16.00]

Twitter had to hire security to protect Yoel and his husband. It had all taken him by surprise. He'd expected the criticism, but not that he would be the target. In cases like this, people would usually come after the CEO or the company itself. But soon Yoel realized that what his harassers were doing was much more effective.

[00:38:16.94 - 00:38:32.60]

If you make companies believe that their employees could be hurt for enforcing the rules, they might be more reluctant to enforce them. Twitter didn't stop, though. They kept putting labels on his tweets. And Trump, of course, lost the election. Though, that's probably not how he would describe what happened.

[00:38:33.30 - 00:38:42.44]

And after the January 6th attacks at the Capitol, he lost his Twitter account, too. Yoel did not press the button on that one. But here's a detail about that day that I love.

1
Speaker 1
[00:38:42.76 - 00:38:47.46]

Yeah, there was a technical question about whether it would work or whether Twitter would crash.

2
Speaker 2
[00:38:47.80 - 00:38:50.34]

Can you actually ban Donald Trump's account?

1
Speaker 1
[00:38:50.60 - 00:39:02.82]

Banning somebody with that many followers is actually technically very complicated. Like, when you suspend somebody, Twitter's systems have to figure out what to do with all of the people who followed them.

2
Speaker 2
[00:39:03.82 - 00:39:08.36]

In other words, if you follow Trump, Twitter has to remove him from the list of accounts you follow.

1
Speaker 1
[00:39:08.74 - 00:39:19.50]

Which sounds very straightforward, but when you have to do that tens of millions of times, immediately, we had to think about, like, if we push this button, is the site going to go down?

2
Speaker 2
[00:39:20.04 - 00:39:37.20]

As it turned out, the site stayed up, and Trump was banned. For a while, anyway. It was such a strange moment. With the click of a mouse, Twitter had managed to do something that Congress attempted twice and failed. To punish Donald Trump in a way that had real and immediate consequences for him.

[00:39:43.58 - 00:39:55.20]

Trump headed off to Mar-a-Lago. Yoel got promoted. He was running the whole department. And that's when another mouthy, rich guy started to complain about all the rules on Twitter. The guy was Elon Musk.

[00:39:55.84 - 00:40:19.68]

In April 2022, Musk announced he'd acquired a big stake in the company. A few days after that, he announced his intention to buy it outright. As soon as the news broke, Yoel's employees started asking what it meant for them. Elon had been tweeting a lot about free speech, and his feeling that Twitter didn't have enough of it. He posted a photo of six people in dark robes with the caption, Shadowban council reviewing tweet.

[00:40:20.20 - 00:40:41.74]

And, Truth Social exists because Twitter censored free speech. Also stuff like, next, I'm going to buy Coca-Cola and put the cocaine back in. And, let's make Twitter maximum fun. Some employees working in trust and safety worried that maximum fun might mean Elon would dismantle their whole operation. Yoel was willing to give him a chance, though.

1
Speaker 1
[00:40:42.22 - 00:40:57.62]

What I told them and what I sincerely believed was, it's too soon to tell. People are frequently caricatured and villainized in the media. Certainly I was. And that's not a reflection of who they actually are, and so don't prejudge.

2
Speaker 2
[00:40:58.72 - 00:41:15.58]

At the same time, Yoel knew that his more concerned employees might be right. That he was aboard a ship that might be about to sink. He knew he needed to be alert for the signs. His solution was to make a list. To write down the red lines that he would not cross, no matter what.

[00:41:16.52 - 00:41:22.88]

Most days, his job was to enforce other people's rules. But with Elon coming in, he wanted to write down some rules for himself.

1
Speaker 1
[00:41:23.10 - 00:41:34.06]

You have to have written policies and procedures so that, when the moment comes to make that decision, you just follow the procedure that you had laid out before.

2
Speaker 2
[00:41:34.06 - 00:41:46.16]

Your whole job was about trying to not make decisions out of impulse and emotion, but sort of by following a playbook. And that meant that before Elon took over, you actually had to give yourself a playbook.

1
Speaker 1
[00:41:46.52 - 00:41:46.88]

That's right.

2
Speaker 2
[00:41:47.86 - 00:42:04.18]

And so, on a notepad by his desk, at his house, he wrote down his red lines. I will not break the law. I will not lie for him. I will not undermine the integrity of an election. By the way, if you ever find yourself making a list like this, your job is insane.

[00:42:05.58 - 00:42:07.52]

Then Yoel wrote down one more rule.

1
Speaker 1
[00:42:08.08 - 00:42:13.86]

This was like a big one. I will not take arbitrary or unilateral content moderation action.

2
Speaker 2
[00:42:15.96 - 00:42:19.52]

So if Elon came up to you and just said, ban this person, you weren't going to do that?

1
Speaker 1
[00:42:19.62 - 00:42:20.26]

That was the limit.

2
Speaker 2
[00:42:20.80 - 00:42:24.34]

Did people on your team show you the list that they were making too? Or talk to you about them?

1
Speaker 1
[00:42:24.74 - 00:42:25.10]

We did.

2
Speaker 2
[00:42:31.58 - 00:42:48.92]

Yoel's list of rules got its first test pretty quickly on the day Elon officially took over Twitter. It was the end of October, lawyers were finalizing paperwork, and Twitter staff was attempting to enjoy the annual company Halloween party. The scene was surreal. Were you there for the Halloween party?

1
Speaker 1
[00:42:49.56 - 00:42:50.12]

I was.

2
Speaker 2
[00:42:50.28 - 00:42:50.98]

Were you dressed up?

1
Speaker 1
[00:42:51.62 - 00:42:52.32]

I was not.

2
Speaker 2
[00:42:52.96 - 00:43:09.56]

Lots of people did dress up, though. Employees brought their kids. There were balloons and face painting. I've talked to so many people who went to this party, and every one of them has added some bizarre new detail. Some people saw a guy dressed as a scarecrow walking around with what appeared to be a handler.

[00:43:10.00 - 00:43:13.46]

They wondered if it was Musk. It turned out to be a hired performer.

1
Speaker 1
[00:43:14.36 - 00:43:46.30]

As the Halloween party had started, I was sitting in a conference room doing some work, and we start hearing rumors that not only has the deal closed, but also the company's executives have been fired. And at first it's unconfirmed. I get texts from a couple of reporters who ask me, is it true that Vijaya has been fired? And I said, no, I just saw her. She's still online in the company's Slack and Gmail.

[00:43:46.30 - 00:43:50.06]

Of course not. Your sources are lying to you.

[00:43:51.64 - 00:43:53.26]

And then it was true.

2
Speaker 2
[00:43:54.14 - 00:43:56.80]

Such an important lesson. Always trust the reporters.

[00:44:00.50 - 00:44:06.86]

Pretty soon afterward, Yoel gets summoned over to the part of headquarters where Elon and his team had set up shop. He was nervous.

1
Speaker 1
[00:44:07.44 - 00:44:38.52]

And I thought, okay, I'm about to be fired. So I walk past a number of my employees, and I don't let on that any of this is happening, because I don't want to panic them, because they're there with their kids. And so I smile and make jokes about Halloween costumes and walk over to this other part of the office where somebody who I gather works for Elon Musk in some capacity, but they don't introduce themselves. They just say, how do I get access to Twitter's internal content moderation systems?

[00:44:38.52 - 00:45:01.28]

And I kind of pause and blink and say, you don't. That's not going to happen. I explained that Twitter is operating under an FTC consent decree, that access to internal systems is regarded as highly sensitive, and that there are both legal and policy reasons why we simply couldn't grant access to somebody.

2
Speaker 2
[00:45:02.68 - 00:45:23.94]

Elon's aide explains that they're worried about an insider threat, someone who might try to sabotage the site on their way out. Yoel tells him, sure, I can help with that. He explains some steps they can take to protect the company. And to Yoel's surprise, the aide says, okay, you're going to tell that to Elon. And then he leaves and comes back with Elon Musk,

1
Speaker 1
[00:45:24.28 - 00:45:57.96]

who at this point I've seen on the internet, but I had not met in person. And so Elon sits down and asks, well, let me see our tools, our tools. He owns the company at this point. And so I show him his own account in Twitter's set of enforcement tools, and I explain to him what the basic capabilities are. And then I make a recommendation to him of what I think Twitter should do to prevent insider misuse of tools during the corporate transition.

2
Speaker 2
[00:45:58.36 - 00:46:03.04]

Yoel also had recommendations about the midterms and the upcoming presidential election in Brazil.

1
Speaker 1
[00:46:03.28 - 00:46:33.66]

And as I start to explain some of the rationale related to the Brazilian election, Elon interrupts me and says, yes, Brazil, Bolsonaro and Lula, very dangerous. We need to protect that. And I was floored. I came into that conversation expecting him to fire me. And instead he jumps ahead of me to say that he is sensitive to the risks of offline violence in the context of the Brazilian election and wants to make sure that we don't interrupt Twitter's content moderation capabilities.

[00:46:34.36 - 00:46:36.28]

It was like a dream come true.

2
Speaker 2
[00:46:36.60 - 00:46:39.00]

You're thinking, maybe I'm actually aligned with this person.

1
Speaker 1
[00:46:39.34 - 00:46:39.84]

Yes.

2
Speaker 2
[00:46:44.32 - 00:47:08.28]

And so Yoel stayed. He was surprised, in a good way. On Twitter, Elon talked about the company as if it should barely have any rules at all. But in that moment, one-on-one, Yoel thought he might turn out to be more reasonable. Maybe spending some time inside the company would show Elon the real value of those rules, which is that without them, you'd lose your users and you'd lose your advertisers.

[00:47:09.20 - 00:47:21.46]

And Yoel felt like Elon could be sensible. One of his first requests was to restore the account of the Babylon Bee, a right-wing satire site. But Yoel explained how it had broken Twitter's rules. And Elon backed off.

1
Speaker 1
[00:47:21.88 - 00:47:44.74]

I found him to be funny. I found him to be reasonable. I found that he responded well to having evidence-backed recommendations be put in front of him. And I, for a moment, felt that it might be possible for Twitter's trust and safety work to not just continue, but also to get better.

2
Speaker 2
[00:47:45.38 - 00:48:06.54]

After that, things began to move really quickly. About a week later, Elon laid off half the staff. Suddenly, Yoel was one of the highest-ranking employees from the old Twitter who was still working at the new one. And it seemed like Elon liked him. After some trolls went after Yoel for some of his old tweets, Elon tweeted that he supported him.

[00:48:11.26 - 00:48:26.20]

The U.S. midterm elections took place mostly without incident. Same for the election in Brazil. Elon kept pushing his teams to move faster, even as he was laying them off. At first, Yoel said that Twitter still had enough content moderators to keep the site safe.

[00:48:26.62 - 00:48:47.48]

But the cuts kept coming, and the work got harder and harder. Soon, Elon unveiled his first big idea for making lots of money and recouping the $44 billion he had spent to buy the company. Yoel and his team thought it was insane. The plan? To let anyone get a blue verified badge for their profile for $8 a month.

[00:48:47.92 - 00:49:03.20]

The company called it Twitter Blue. The risks seemed obvious. People would just make new accounts to impersonate brands and politicians and other celebrities. Yoel and his team wrote a seven-page document outlining the risks. But the badges went on sale anyway.

[00:49:03.84 - 00:49:25.24]

And almost immediately, impersonators started buying them and wreaking havoc. In maybe the most famous case, someone impersonated the drug maker, Eli Lilly, and said that insulin would now be free. The real Eli Lilly's stock price dropped more than 5%. It was a vivid illustration of why companies like Twitter make rules in the first place. Impersonators were suddenly all over the site.

1
Speaker 1
[00:49:25.24 - 00:49:42.28]

And so, okay, we have to ban them. But somebody has to review them. We can't just, like, ban everyone. And so, you do that with content moderators. And we had instructions to fire more of our contract content moderation staff to cut costs.

2
Speaker 2
[00:49:42.62 - 00:49:48.62]

All this seems really self-evident to me, and I think it would have seemed self-evident even before you launched this.

[00:49:50.26 - 00:49:54.16]

What was Elon's take on this? How did he respond to you raising these concerns?

1
Speaker 1
[00:49:57.28 - 00:50:01.46]

Do it anyway. And that was a breaking point for me.

2
Speaker 2
[00:50:02.90 - 00:50:10.72]

We reached out to Twitter for comment, but didn't hear back. Reporters have sometimes gotten automated poop emojis, but I didn't even get that.

[00:50:14.86 - 00:50:28.62]

Yoel had spent a long time gaming out scenarios for what might make him leave Twitter. He'd made that whole list. He wouldn't break the law for Elon. He wouldn't undermine an election. But ultimately, what got to him was something he didn't foresee.

[00:50:29.04 - 00:50:37.46]

It wasn't on the list. It was something more personal. He knew this bizarre plan wouldn't just make people lose trust in Twitter. They would lose trust in him.

1
Speaker 1
[00:50:37.84 - 00:50:59.68]

Behind Elon Musk, I was the most prominent representative of the company, period. And I became aware that when Twitter Blue turned into the predictable hot mess that it was, that people would ask, why didn't the trust and safety team see this coming? Yoel, why are you so bad at your job?

2
Speaker 2
[00:51:00.74 - 00:51:24.78]

The day after the launch, Yoel and Elon got on the phone. Elon thought the problem could be fixed if Apple would just hand over all the credit card information of the people doing the impersonations. Yoel had to explain that Apple would never do that. He also asked Elon to slow down the rollout of Blue so that they would have time to hire and train more content moderators to look for impersonators. Elon didn't understand why that would take longer than a day.

1
Speaker 1
[00:51:25.06 - 00:51:43.90]

I got off that phone call and thought, I can't solve this problem. I will spend the rest of my time at this company trying to bail out a ship that might sink more slowly, because I'm there bailing it out. But I don't want to spend the rest of my life bailing out a sinking ship.

2
Speaker 2
[00:51:45.50 - 00:51:49.96]

Yoel had made up his mind to leave. He called a couple of his employees to let them know.

1
Speaker 1
[00:51:50.44 - 00:52:16.32]

I knew that that day I did not want to be walked out of Twitter after almost eight years by corporate security. I wanted to leave on my own terms. And there was an all hands going on at the time, Elon's first time addressing the company in person. And during that all hands meeting, I hit send on my resignation email, put my laptop in my bag and walked out of the building for the last time.

2
Speaker 2
[00:52:17.24 - 00:52:21.00]

Did you purposefully send it when you knew he was on stage?

1
Speaker 1
[00:52:21.24 - 00:52:36.26]

Yes, absolutely. I knew that it would take some time for the HR team to see it and process it, for that to get to him, for him to react to it. And in that time, I knew I wanted to be back at home and not be in the office.

2
Speaker 2
[00:52:36.26 - 00:52:37.56]

Was it a long email?

1
Speaker 1
[00:52:38.56 - 00:52:39.58]

It was one sentence.

2
Speaker 2
[00:52:40.20 - 00:52:40.92]

What was the sentence?

1
Speaker 1
[00:52:44.54 - 00:52:51.54]

I am no longer able to perform the responsibilities of my job and resign it as of today at 5 p.m.

[00:52:56.36 - 00:53:10.60]

I remember feeling two things. On one hand, I felt relieved. And then I also just felt deeply sad. I just wanted to get home. So I left Twitter's garage and was driving.

[00:53:11.22 - 00:53:18.22]

And I was about halfway across the Bay Bridge when I think Zoe broke the news that I left Twitter and my phone exploded.

2
Speaker 2
[00:53:19.40 - 00:53:29.54]

Wait, you didn't even get across the bridge before Zoe broke the news? God, I love her. Zoe is my coworker. I'm immensely proud of her. Even if she did kind of mess up Yoel's plans.

[00:53:30.34 - 00:53:41.00]

The car Yoel was driving that day was a Tesla, by the way. He was leasing it. He'd been trying to return it, but couldn't get anyone to respond to him. Maybe they'd all been drafted to work at Twitter.

[00:53:50.06 - 00:54:13.58]

Yoel laid low for a few days. He spent some time writing and published an op-ed in the New York Times. It explained, in a very dry and principled way, why he'd left. That's when some rando account re-shared something Yoel had tweeted from 2010 about relationships between adults and minors. Around that time, he'd been working on his dissertation, which called for tech companies to do more to protect minors at gay hookup sites like Grindr.

[00:54:14.12 - 00:54:15.80]

But Elon replied with a tweet.

[00:54:18.54 - 00:54:20.94]

Then he linked to Yoel's dissertation.

[00:54:28.72 - 00:54:36.34]

Not true. But Yoel's phone exploded with abusive messages. It made the backlash to labeling a Trump tweet look minor by comparison.

1
Speaker 1
[00:54:36.98 - 00:54:52.28]

Hundreds of messages per hour. Homophobic, anti-Semitic, and also violent. Just deeply, endlessly violent. And he only had to tweet once. He didn't even have to say directly, Yoel is a pedophile.

[00:54:52.32 - 00:54:56.14]

He just had to wink and nod in that direction, and people took his lead.

2
Speaker 2
[00:55:02.80 - 00:55:32.64]

When Yoel had first used the internet, it felt like a small, self-contained space, separate from what we used to call real life. But by the time Yoel quit Twitter, the distinction between online and off had collapsed. And it had collapsed in large part because of the company he worked at, Twitter. The site brought together so many of the world's most influential people, and then pitted them against each other in these all-consuming daily battles. And the anger coming out of that could drive people to do things.

[00:55:33.08 - 00:55:38.36]

Violent things. Pretty soon, Yoel and his husband were overwhelmed with death threats.

1
Speaker 1
[00:55:39.06 - 00:55:50.42]

My husband turned to me one day and said, I've seen you through a lot of being targeted and being harassed. I've never seen you look scared before.

[00:55:52.28 - 00:55:55.14]

And that was the moment that we decided to leave our home.

2
Speaker 2
[00:55:56.40 - 00:56:14.72]

And so, once again, they moved. I met with Yoel at the temporary house that he and his husband are staying at while they look for a new place. After all this, I thought Yoel might want a different kind of job. I would have wanted a different kind of job. The internet had almost killed him, or threatened to anyway.

[00:56:15.50 - 00:56:21.80]

But still, somehow, he's optimistic about what the internet could be, in a way you almost never hear anymore.

1
Speaker 1
[00:56:21.96 - 00:56:46.92]

I love the internet. I really do. I think the internet's power to bring people together and help folks all over the world find connections that matter to them is magical. And is one of humanity's greatest achievements. I also think the internet can be incredibly dangerous and scary.

[00:56:47.52 - 00:57:01.30]

And the work of Trust and Safety is trying to push that back a little bit. And to make the internet more of what it can be, and less of the dangers of what it could turn into.

2
Speaker 2
[00:57:02.70 - 00:57:23.50]

Yoel's idealism about the internet feels radical, given how destabilizing it's been, how destabilizing Twitter has been. But I know what he means. Back when he was a teenager, the internet gave Yoel a place to discover other gay people, the chance to talk to everyone in the world instantaneously. It gave him a career. It gave me all those things, too.

[00:57:24.62 - 00:57:29.68]

I remember life before the internet. It was a less frantic time, but it was also a lonelier one.

[00:57:36.08 - 00:57:48.26]

Here's how Twitter's doing since Yoel left. Hate speech is on the rise. Advertisers have fled. Banks that funded Musk's takeover have marked their investments down by more than half.

[00:57:49.02 - 00:58:04.62]

Musk himself has warned repeatedly that the site might go bankrupt. I kind of hope it does. Because what's happening at Twitter right now is teaching us a lesson it's taken us way too long to learn. That people like Yoel, they're not the enemies of free speech online.

[00:58:05.02 - 00:58:19.06]

They're the ones who make it possible. If you get any value out of social media at all, it's in part because of them. They clean the place up, make it feel good to be there. They pull us back when we go too far. And they do censor us.

[00:58:19.70 - 00:58:32.16]

And like, of course we hate them for it. We convince ourselves we'd do a much better job if it were us. That's what Elon thought. Look what happened. Nobody likes the guy enforcing the rules.

[00:58:32.16 - 00:58:39.24]

But watching Twitter sink into the ocean, you can't help but notice how much you miss that guy when he's gone.

4
Speaker 4
[00:59:02.16 - 00:59:05.74]

The parent company of the dating apps Tinder and Hinge.

?
Unknown Speaker
[00:59:35.04 - 00:59:39.16]

Elon wanted more and it ended in shame.

2
Speaker 2
[00:59:39.34 - 00:59:47.16]

He dragged all the watchmen out from their crow's nest. And forced them to walk down the plank to their rest.

3
Speaker 3
[00:59:47.38 - 00:59:51.24]

Leaving no one to look out, no one who knew best.

2
Speaker 2
[00:59:51.58 - 00:59:55.42]

The ship fell apart in a watery mess.

1
Speaker 1
[00:59:58.22 - 01:00:03.68]

The ship fell apart in a watery mess.

4
Speaker 4
[01:00:25.50 - 01:00:57.14]

Our executive editor is Emanuele Berry. Help on today's rerun from Henry Larson and Safiya Riddle. Special thanks today to Kendra Stanley, Devin Pichalik, Emma Lashurst, Shauna Lee, Jeremy Buttman, Imran Ahmed, and the Sea Tow captain who rescued the Marjorie and spoke with Ike, and who would like to remain unnamed. Our second sea shanty in today's show was also written by Samson the Truist, Sam Geller; strings and engineering by Jessica Tansky. Our website, thisamericanlife.org, where you can stream our archive of over 800 shows for absolutely free.

[01:00:57.54 - 01:01:08.94]

Also, there's a list of favorite shows, if you're looking for something to listen to. All kinds of other stuff there. Again, thisamericanlife.org. This American Life is delivered to public radio stations by PRX, the Public Radio Exchange.

[01:01:09.50 - 01:01:22.16]

Thanks, as always, to our program's co-founder, Mr. Torey Malatia. You know, he and Kendall and Roman and Shiv were in the deep end of the pool. They were struggling. They were nearly drowning. So Torey looked up and realized, look, it's so close.

5
Speaker 5
[01:01:22.24 - 01:01:23.08]

We can just swim to Logan.

4
Speaker 4
[01:01:23.60 - 01:01:27.32]

I'm Ira Glass. Back next week with more stories of This American Life.

3
Speaker 3
[01:01:30.80 - 01:01:34.76]

Whoa, put a bow on this sinking show.

[01:01:36.68 - 01:01:37.28]

Whoa,

[01:01:39.88 - 01:01:42.88]

put a bow on this sinking show.
