1 hour 7 minutes 31 seconds
Speaker 1
00:00:00 - 00:00:01
No white cloth?
Speaker 2
00:00:02 - 00:00:18
Of course you could just do a double split in my life. Do you enjoy seeing Broadway musicals? Yeah. I mean I like it if they don't break into song too often,
Speaker 1
00:00:18 - 00:00:20
you know? Elon's on Twitter.
Speaker 3
00:00:25 - 00:00:27
He's tweeting about this right now.
Speaker 1
00:00:28 - 00:00:31
Yeah, it wasn't until March of 2022 that we, uh...
Speaker 3
00:00:31 - 00:00:32
We got kicked out.
Speaker 1
00:00:32 - 00:00:34
Had a little snack, we got booted.
Speaker 2
00:00:34 - 00:00:35
You made me buy the company.
Speaker 3
00:00:37 - 00:00:39
Well, speaking of, we have the sound bite
Speaker 1
00:00:39 - 00:00:40
we needed right there.
Speaker 2
00:00:41 - 00:00:46
It's the Babylon Bee at Twitter headquarters. Oh, and Elon Musk is there too.
Speaker 3
00:00:48 - 00:00:50
We wanted to give you something, actually.
Speaker 2
00:00:51 - 00:00:54
To restore the liberty of the bee. It was very expensive, guys.
Speaker 3
00:00:56 - 00:00:59
This is a gift. It's an IOU worth $44 billion.
Speaker 4
00:00:59 - 00:00:59
Thank you.
Speaker 2
00:00:59 - 00:01:06
So, uh. Well, you know, pretty soon $44 billion won't be worth that much anyway. That's a good point.
Speaker 3
00:01:06 - 00:01:08
So I might want to hang on to that one.
Speaker 1
00:01:08 - 00:01:09
Yeah, hang on to that one.
Speaker 2
00:01:09 - 00:01:12
Okay, that's a big one. I will treasure this gift.
Speaker 3
00:01:14 - 00:01:29
But it's seriously crazy to us. We come into the Twitter headquarters. We were, you know, banned from Twitter in March 2022, you restored us and a bunch of other accounts in November 2022, and now we're invited and we get to come here, when before we weren't even allowed to go into our account.
Speaker 2
00:01:30 - 00:01:41
Absolutely, it's like barbarians storming the gate. Barbarians not merely at the gate, but through the gate, pillaging the merch. That's what we were doing. Merch pillaging. We're in and we're pillaging the
Speaker 3
00:01:41 - 00:01:44
merch. You're the barbarian.
Speaker 2
00:01:45 - 00:01:48
Yes. Just making sure. But you're my barbarian friend.
Speaker 3
00:01:48 - 00:01:49
Okay.
Speaker 1
00:01:50 - 00:01:53
I want to ask about that. So the restoring of accounts.
Speaker 2
00:01:53 - 00:01:54
Yes.
Speaker 1
00:01:56 - 00:02:04
When you came in, so you took over, you closed the deal, there was the lawsuit, there was all that,
Speaker 4
00:02:04 - 00:02:04
there was the drama. We were like riding a roller coaster wondering what was actually going to happen because we had
Speaker 1
00:02:04 - 00:02:25
no idea if he would ever actually take over the company if we would ever actually get out of Twitter jail because we had vowed to never delete that tweet. So we were watching and just kind of wondering what's really happening. Then you finally take over and you go in on day 1, what are the conversations? Like take us behind the scenes. Like what happened behind the scenes where you're talking with the team, you're trying to figure out what to do with the advertisers and all of that.
Speaker 1
00:02:25 - 00:02:28
What was the conversation behind the scene around restoring accounts?
Speaker 2
00:02:29 - 00:02:31
There were a few questions around why I was carrying a sink.
Speaker 1
00:02:34 - 00:02:36
Where is the sink? Is the sink still here?
Speaker 2
00:02:36 - 00:02:38
I think security has it somewhere.
Speaker 1
00:02:38 - 00:02:39
Do they really?
Speaker 2
00:02:39 - 00:02:49
It's here? Yeah. I still run into people who ask me, why did you walk into the lobby with a sink? Because they didn't understand I was making a pun. Let that sink in. They couldn't help it. I felt like if I showed up with a sink, they'd have to let me in. Because you can't help but let that sink in.
Speaker 2
00:03:03 - 00:03:11
It's impossible. It also seems like the perfect Halloween costume. Presumably you want to get into people's houses.
Speaker 1
00:03:11 - 00:03:25
Yeah, that's hysterical. So you came in, though, and you said bring back the Babylon Bee, bring back these accounts, and then there was pushback, right? Didn't you get pushback internally? That's what was reported anyway, just going off of what was in the news.
Speaker 2
00:03:25 - 00:03:47
Yeah. I mean it was chaos in the beginning. I was trying to figure out how to run this place, what was going on. And it's very difficult when the whole company works from home. So it's like, who do you even Zoom with?
Speaker 2
00:03:47 - 00:04:13
I don't know. It's like, normally you could walk around, introduce yourself to people and have conversations, but Twitter had gone to almost fully work from home. So this building was empty, and Twitter buildings around the world were empty. So it was just very difficult actually just trying to understand what was going on. I'd say the analogy would be like being teleported into a plane that's in a nosedive with its engines on fire and the controls don't work.
Speaker 1
00:04:13 - 00:04:15
And you got to take it out of the nosedive.
Speaker 2
00:04:16 - 00:04:32
Yeah. Yeah. You got to reconnect the controls, you know. Like who's who, what's the org structure, what's the management team? Because literally in hour one of the thing closing, I exited the top 4 people at the company.
Speaker 4
00:04:36 - 00:04:36
And then
Speaker 2
00:04:36 - 00:04:54
a whole bunch of people also quit. So the organizational structure was like Swiss cheese that you put in the microwave. It was full of holes and melted. How's your Twitter feed? Are you finding it to be better these days or what?
Speaker 3
00:04:54 - 00:05:07
Well, I only use the Following tab, because to me that's what I always wanted social media to be: just everybody I follow, in exactly the order that it's posted. So the For You is more the algorithm-driven stuff.
Speaker 2
00:05:07 - 00:05:09
Yeah, but I think you should try the For You. Try
Speaker 4
00:05:09 - 00:05:09
it out.
Speaker 1
00:05:09 - 00:05:24
Yeah. And I use it predominantly. And my time on the app is insane because it's feeding me so much stuff, you know, like the stuff that I'm watching like crazy fight scenes. So I get stuck in there.
Speaker 2
00:05:24 - 00:05:29
You get some weird like no context humans or CCTV video. Exactly.
Speaker 1
00:05:30 - 00:05:42
Yes, uncensored video. I'm getting drawn into these things, you know, CCTV stuff. Yeah, I think it's good. I mean, I'm seeing a mix of opinions too. I'm seeing people that I disagree with pop up that I don't follow.
Speaker 2
00:05:42 - 00:05:42
Sounds good.
Speaker 1
00:05:42 - 00:05:43
Which a lot of people object to.
Speaker 2
00:05:43 - 00:05:44
They don't
Speaker 1
00:05:44 - 00:05:47
want to see that. They only want to see people that they agree with.
Speaker 2
00:05:47 - 00:06:03
Yeah. Well, the system actually does take into account your interactions. So basically if you argue with someone else, it'll assume that you like arguing. Which probably, maybe someone secretly does actually like arguing. They may not want to admit it, but they actually like it.
Speaker 2
00:06:04 - 00:06:16
It's like that Monty Python sketch, have you ever seen that one where... I'm here for an argument. You're here for an argument, but he accidentally goes into the insults room first. Abuse. Abuse.
Speaker 2
00:06:21 - 00:06:48
So if one account argues with another account, the algorithm will assume that that's what you want. You want more arguments, or more of that, you know, because you chose to spend your time that way. So it will then show you other content you might want to interact with. But I think that over time the recommended For You tweets should be extremely compelling. I suspect that you'd probably want
Speaker 1
00:06:48 - 00:06:49
80-20
Speaker 2
00:06:50 - 00:06:59
recommended versus following. The most compelling tweets from your followers are also in For You.
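The interaction-as-interest signal described here, where the ranker reads a reply as engagement whether it was friendly or hostile, can be caricatured in a few lines. Everything below (the weights, the signal names, the `score` helper) is an illustrative toy, not taken from the open-sourced ranker:

```python
# Toy caricature of the signal described above: the ranker doesn't know
# *why* you replied, only that you did, so arguing with an account still
# boosts similar content. Weights and signal names are illustrative
# assumptions, not from the open-sourced ranker.

ENGAGEMENT_WEIGHTS = {"like": 1.0, "repost": 2.0, "reply": 3.0}  # replies weigh most

def score(candidate_topic, interaction_history):
    """Sum engagement weights for past interactions on the same topic."""
    return sum(
        ENGAGEMENT_WEIGHTS[kind]
        for topic, kind in interaction_history
        if topic == candidate_topic
    )

history = [("politics", "reply"), ("politics", "reply"), ("cats", "like")]
# Heated replies count as strong interest, so more arguments get recommended:
print(score("politics", history))   # -> 6.0
print(score("cats", history))       # -> 1.0
```

Under a model like this, two angry replies outrank a single like, which is exactly the "it'll assume you like arguing" behavior described.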
Speaker 3
00:07:00 - 00:07:04
So skip the boring ones that people post and go right to the good stuff.
Speaker 2
00:07:04 - 00:07:39
I mean, there are some accounts that I follow that tweet 20 times a day or more, so they tend to fill up the Following tab. But I'd say the best of the people that you follow should be in For You, or something's wrong. The intent is to maximize unregretted user time. That's, I think, the metric I've got the whole organization focused on: unregretted user time.
Speaker 2
00:07:39 - 00:07:55
I mean, I frequently hear someone tell me they spent a lot of time on TikTok, but they kind of hate themselves afterwards. And so that's, I'll call it, you know, regretted user time.
Speaker 2
00:07:56 - 00:08:12
And you know, we want it to be that you spent the time on X slash Twitter and you're glad you did. Like you were informed and entertained. And I do, I think, get more laughs from Twitter per day than everything else combined, frankly.
Speaker 3
00:08:13 - 00:08:25
Yeah, because everybody's moving towards algorithms. Facebook's algorithm-driven, and YouTube's all algorithm-driven, and TikTok. So what's Twitter doing differently that's getting you that unregretted user time compared with these other apps?
Speaker 2
00:08:26 - 00:09:15
Well, I guess it matters what content is on a system to be recommended in the first place. So TikTok has a lot of teen dance videos; that's a popular topic. Twitter is more intellectual debates and learning things. And humor. It's much more a variety show, but it includes serious information and breaking news. So even if you had identical algorithms between Facebook and Twitter, if Twitter's content is better or more interesting in that way, you'd still prefer Twitter.
Speaker 2
00:09:16 - 00:09:25
And as you've seen, we've open sourced the algorithm and made, at this point, I think, well over 100 changes based on user feedback.
Speaker 1
00:09:26 - 00:09:29
But how will we stay informed if NPR isn't there anymore?
Speaker 2
00:09:31 - 00:09:39
You know, at any given point, I'm not sure if NPR is there or not. I wasn't aware that they were there until they said they were not going to be there.
Speaker 1
00:09:42 - 00:09:46
That's how it was with Jim Carrey. When I found out Jim Carrey was leaving Twitter I'm like, I didn't know Jim Carrey was on Twitter.
Speaker 2
00:09:46 - 00:09:49
Yeah. And maybe he's back. I don't know.
Speaker 3
00:09:50 - 00:09:51
We'll have to find out.
Speaker 2
00:09:51 - 00:10:28
We're trying to apply the rules consistently at Twitter. So the NPR thing is like, well, if we're going to call some media state-affiliated, well, there's plenty of media organizations in the US or in the West that are state-affiliated. So then we should apply the label equally. And then they got very upset about that, and said that state-affiliated implies that the state has editorial authority or influence over the content. And I'm like, so you're saying you don't have that?
Speaker 2
00:10:30 - 00:10:31
How self-aware are you?
Speaker 1
00:10:33 - 00:10:34
It was on their website.
Speaker 2
00:10:34 - 00:10:58
Well yeah, I mean, NPR literally on their own website said government funding is essential to their operation. And so we even changed it from state-affiliated to state-funded. So that's just literally a statement of fact. I mean, we could actually just lift the same text from their website and put it in the label. And they're unhappy with that.
Speaker 2
00:10:58 - 00:11:01
Yeah, they're unhappy with that.
Speaker 1
00:11:01 - 00:11:03
It gives the wrong impression, the true impression.
Speaker 2
00:11:03 - 00:11:15
Well, I mean, I think they're going to pull their punches in critiquing the government at NPR. That's just a guess, you know, based on their extreme dependence on government funding. I mean, you don't generally bite
Speaker 1
00:11:15 - 00:11:26
the hand that feeds you. Let's go real quick, though, back to when you came into Twitter. Okay, so the plane is going down and you're at the controls trying to save it. So you're doing all these things. But you also wanted to restore banned accounts.
Speaker 1
00:11:28 - 00:11:42
And so the trust and safety team and everybody, were they arguing with you about that, pushing back on that? Or was it primarily advertisers who were giving you the most pressure and saying, if you restore all these accounts, we're pulling our ads? What was happening there with the pressure you were getting?
Speaker 2
00:11:42 - 00:12:18
Well, I mean, it was just a totally chaotic situation, because I still had to run Tesla and SpaceX while trying to figure out anything about how Twitter runs. And I exited the top 4 people in the first hour, and then there were a ton of others who either got exited or quit. So really, just trying to keep the wheels on the bus was the top priority. The Twitter code base is actually much more complex than you'd think, and it's not something that just works, you know.
Speaker 2
00:12:18 - 00:12:28
It requires a lot of care and feeding. And there were many people who predicted that the service would go down, that Twitter's not going to exist after this weekend.
Speaker 1
00:12:28 - 00:12:30
World Cup was going to kill it, right?
Speaker 2
00:12:30 - 00:12:59
Yeah, it was going to get crushed by the World Cup. The number of predictions that Twitter was about to die was ridiculous. Almost farcical, frankly. But they did kind of have a point, in that it wasn't super easy to keep things running. There are so many esoteric corners of the Twitter software and the system operations that make it hard to keep running.
Speaker 2
00:13:01 - 00:13:16
I can't emphasize that enough. It's not like some app on the phone that just works. It's a super complicated thing. Financially it was a very tough situation because Twitter in a normal year would do probably $4.5 billion in revenue,
Speaker 1
00:13:16 - 00:13:17
$4.5
Speaker 2
00:13:17 - 00:14:01
billion in cost. They'd probably lose a little bit of money. That's why I call it a really expensive non-profit, because Twitter is, over its lifetime I believe, negative on profitability. It's lost more money than it's made. But because of the high price of the acquisition, which was foolishly high on my part, there's $12.5 billion of debt, and the debt servicing, as I was saying, is $1.5 billion. And then there's a cyclic decline in advertising that seemed to start maybe around May of this year, but was pretty significant.
Speaker 2
00:14:01 - 00:14:23
That's why you saw Facebook and Google and others also do some layoffs as a result. So there's a cyclic decline in advertising, and then a pretty major pause in advertising, where advertisers just put their campaigns on pause. They didn't definitively say no. I mean, in a few cases they did, but most of them.
Speaker 1
00:14:24 - 00:14:28
You mean in response to you taking over and promising free speech? They're like, oh, we don't want free speech.
Speaker 2
00:14:28 - 00:14:56
Yes, well, it's more like they were worried about something that would affect their brand, I suppose. Although it seems like sometimes they worry about the wrong things: they should be worried about some things and aren't, and are worried about other things and shouldn't be. But nonetheless, a lot of the advertisers put their advertising on pause to kind of see where things go.
Speaker 2
00:14:58 - 00:15:29
And that caused almost a 50% reduction in revenue immediately. So in rough terms we were doing about negative $3 billion a year in cash flow, and possibly trending to do worse than that, possibly 4 or 5. It's hard to tell where the bottom was. And a billion dollars in the bank, so roughly 4 months to live.
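The back-of-the-envelope runway math quoted here can be checked with the rough figures given in the conversation. These are conversational approximations, not audited financials:

```python
# Rough runway math using the approximate figures quoted above
# (conversational numbers, not audited financials).

def runway_months(cash_on_hand_b: float, annual_burn_b: float) -> float:
    """Months of runway given cash on hand and annual burn, both in $B."""
    return cash_on_hand_b / annual_burn_b * 12

normal_revenue = 4.5        # $B/yr revenue in a normal year
costs = 4.5                 # $B/yr operating costs
debt_service = 1.5          # $B/yr servicing on the $12.5B of debt
revenue_after_pause = normal_revenue * 0.5   # ~50% of ad revenue paused

# Annual cash flow: revenue minus costs minus debt service
cash_flow = revenue_after_pause - costs - debt_service   # -> -3.75 ($B/yr)

# "Negative $3 billion a year ... and a billion dollars in the bank"
print(round(runway_months(1.0, 3.0), 1))   # -> 4.0, matching "4 months to live"
```

The -$3.75B figure also squares with "possibly trending to do worse than that, possibly 4 or 5."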
Speaker 1
00:15:29 - 00:15:34
What were the options if you ran out? Like you would have to continue to fund it, you'd have to seek funding from outside.
Speaker 2
00:15:34 - 00:15:55
Yeah, I mean it's limited by how much Tesla stock I can sell, and I can't sell it all the time. I mean, the reason I sold stock in December last year, which I regret at this point, was because I wasn't sure how much money Twitter would need. You know, in October, it was a $6 billion cash burn, and we didn't know where revenue was going to bottom out. I had to get the company to financial stability, and we're close to financial stability, roughly, not quite break-even, but close to it. And I think trending positively. I think at least the plane's in level flight. The engine of engineering, software engineering, is in much better shape.
Speaker 2
00:16:23 - 00:16:33
I think we've deployed more features and capabilities in the last 6 months than Twitter has done in the last 6 years. And that's with 20% of the team.
Speaker 1
00:16:33 - 00:16:35
That's crazy. Yeah.
Speaker 3
00:16:36 - 00:16:43
Do you feel like it was worth it? I mean, do you ever regret it and go, why did I do this thing just to make a stand for free speech?
Speaker 2
00:16:44 - 00:17:06
I think it was necessary. And there's nothing in retrospect that has caused me to change my mind. I think it was still the right move to acquire Twitter, even at the outrageously high price. I think it's going to turn out to be important.
Speaker 1
00:17:06 - 00:17:08
Well, the price wasn't just financial either. It's like...
Speaker 2
00:17:08 - 00:17:23
No, I mean, I should say, you know, there are other investors besides me in Twitter. And I'm the majority owner, but I want to make sure that those who invested with me do not suffer a loss.
Speaker 2
00:17:23 - 00:17:54
So I'll make sure that the ultimate outcome is better than their original investment. But certainly, this is a hard way to get rich, that's for sure. This is the mega-pain way to make money. There's a lot of pressure on whoever's running Twitter, you know, to do this or that on the platform, as you might imagine.
Speaker 2
00:17:56 - 00:18:35
At any given point, someone powerful is unhappy, basically. So anyway, I think it was still important to do, because pretty much all of the social media companies and the search companies were acting in unison along with the legacy media. So it was just, where do you find the truth if everyone is in lockstep with a lie?
Speaker 1
00:18:35 - 00:18:59
Well, did you see Obama's comments recently in a CBS interview, I think it was, where he was talking about how the thing that keeps him up at night is how there's, like, diversity in the media now and there didn't used to be? And so now you have different narratives and people basically inhabiting different realities. And that's the thing that's keeping him up at night, like the big problem that we face right now.
Speaker 3
00:18:59 - 00:19:00
He wanted to go back to the glory days.
Speaker 1
00:19:00 - 00:19:03
Yeah, there was 1 narrative. And the media was unified.
Speaker 3
00:19:03 - 00:19:04
3 different channels all telling you the same thing.
Speaker 1
00:19:04 - 00:19:07
So you disagree. You're telling us you disagree.
Speaker 2
00:19:07 - 00:19:12
Well, I haven't seen the full video, so it's possible there may be some contextual elements that mitigate.
Speaker 1
00:19:12 - 00:19:14
Community notes will help us with that 1.
Speaker 2
00:19:14 - 00:19:55
Yeah, Community Notes is being pretty good. I'm watching it closely, because as Community Notes gains credibility, the value of gaming it increases. So we've got to make sure that it is as ungameable as possible. I mean, you may know this, but the essential idea behind Community Notes is that a note is only shown if it is rated highly by people who historically have different opinions. So where do people who would normally disagree agree?
Speaker 2
00:19:56 - 00:20:13
And so even if there's a lot of people from, say, one political point of view, it doesn't have to be political, but one point of view, and only a small number from another point of view, they still have to agree. You can't just brigade the thing with one ideology.
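The mechanism described here, where a note appears only when raters who historically disagree both find it helpful, can be sketched as a toy filter. The real Community Notes system infers viewpoints from rating history via matrix factorization rather than labels; the explicit cluster tags and the 0.6 threshold below are illustrative assumptions:

```python
# Toy sketch of the "bridging" idea described above: a note is shown only
# if raters from historically-disagreeing viewpoint clusters BOTH find it
# helpful. The real Community Notes algorithm infers viewpoints via matrix
# factorization over rating history; the explicit cluster labels and 0.6
# threshold here are illustrative assumptions.

from collections import defaultdict

def note_is_shown(ratings, threshold=0.6):
    """ratings: list of (viewpoint_cluster, helpful: bool) pairs.

    Every represented cluster must independently rate the note helpful
    on average -- one ideology alone can't push a note through.
    """
    by_cluster = defaultdict(list)
    for cluster, helpful in ratings:
        by_cluster[cluster].append(helpful)
    if len(by_cluster) < 2:          # need agreement *across* viewpoints
        return False
    return all(sum(v) / len(v) >= threshold for v in by_cluster.values())

# Many ratings from one side alone aren't enough (no brigading):
print(note_is_shown([("left", True)] * 50))                         # -> False
# A handful from each side, both finding it helpful, is:
print(note_is_shown([("left", True)] * 5 + [("right", True)] * 3))  # -> True
```

This is what makes brigading ineffective in the sketch: piling on more ratings from one cluster never satisfies the cross-cluster requirement.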
Speaker 1
00:20:15 - 00:20:26
I think the value in Community Notes goes so far beyond just fact-checking. It's actually entertaining. When somebody gets noted, it's better than getting ratioed. It's awesome.
Speaker 2
00:20:26 - 00:20:33
It is awesome. Yeah. The humor value of some of the notes is high. I think it's...
Speaker 1
00:20:33 - 00:20:38
When fact-checkers get noted, it's so ironic, you know. It's beautiful.
Speaker 3
00:20:38 - 00:20:56
Well, we have so much experience with fact-checking, because we were getting fact-checked by Snopes all the time for our jokes. But that was just one blue-haired lady or something getting mad at one of our jokes. It's so much more powerful when it's, like you said, people that disagree. The community coming together to fact-check something.
Speaker 2
00:20:56 - 00:21:24
Yeah, I mean, the ones that I see are solid. It's rare to see a wrong one. And anyone can get community noted, including me obviously, as well as advertisers and presidents of countries and whatnot. I think once you get noted a few times, it's an honesty amplifier.
Speaker 3
00:21:24 - 00:21:25
Well, and they're not used to it.
Speaker 2
00:21:25 - 00:21:27
They're not used to it.
Speaker 3
00:21:27 - 00:21:34
People that are on the left or just in their own echo chamber, government leaders, they're not used to that kind of accountability.
Speaker 2
00:21:34 - 00:21:39
They're not. I think it definitely comes as a surprise when they get a community note for the first time.
Speaker 3
00:21:39 - 00:21:41
Have you been noted? How many times have you been noted?
Speaker 2
00:21:42 - 00:21:44
I'm not sure, maybe 3 times, I don't know.
Speaker 3
00:21:45 - 00:21:49
I've never been community noted, I don't think. The Babylon Bee hasn't been community noted. Not yet.
Speaker 1
00:21:49 - 00:21:54
I was wondering about that. If we're gonna get marked, like, this was satire, if people believed it was true or something like that.
Speaker 4
00:21:54 - 00:21:55
But it hasn't happened. No, we
Speaker 3
00:21:55 - 00:21:57
don't make any truth claims really, right?
Speaker 4
00:21:57 - 00:21:57
I mean,
Speaker 3
00:21:57 - 00:21:59
so I don't know. That was the weird thing about the fact checks.
Speaker 1
00:21:59 - 00:22:04
People believe it sometimes. I can't tell if it's real. I can't blame them. No, no, no.
Speaker 2
00:22:04 - 00:22:21
Some of these headlines, or... some of these articles, I can't tell... I mean, I can tell that it doesn't have a Bee logo on it, but the actual text of the headline is indistinguishable from the real thing. So it's very, very common at this point.
Speaker 2
00:22:22 - 00:22:23
It's a crazy world we're living in.
Speaker 1
00:22:23 - 00:22:42
Yeah, that's why, when someone makes fun of people who think the satire is true, I'm like, I don't blame them. You know, the real headlines are so satirical in nature, they seem like a parody. So if you believe a Bee article, it's just... Yeah. Naturally it's going to happen. So you want to
Speaker 2
00:22:42 - 00:22:50
see… You know, you see some of these things where it's like, free speech is bad for free speech, you know. There was one headline to that effect.
Speaker 1
00:22:50 - 00:22:50
Yeah, yeah.
Speaker 3
00:22:52 - 00:22:54
Wait, is the satire real?
Speaker 2
00:22:55 - 00:23:03
Is the satire real? It's like, yeah, I don't know. You know, when the New York Times had, like, critical thinking is bad for our democracy or something.
Speaker 3
00:23:03 - 00:23:04
I remember that.
Speaker 2
00:23:04 - 00:23:06
I'm like, what are you talking about? That's insane.
Speaker 1
00:23:08 - 00:23:09
There's a lot of those.
Speaker 2
00:23:09 - 00:23:09
There's a lot of those.
Speaker 1
00:23:09 - 00:23:27
But you want to see Twitter be a place where truth prevails through debate, right? Where people can actually go back and forth rather than you having somebody from the top down deciding this is what's true and this is what everybody needs to believe and in lockstep unison putting that out there and silencing it and stamping out everybody else.
Speaker 2
00:23:27 - 00:24:11
Yes, I think we definitely want divergent narratives. It doesn't mean that people need to get violent about it or anything, but I think you want to have a marketplace of ideas where people who have different viewpoints argue their viewpoints, and maybe some minds are changed along the way. So there's both: is the narrative true, and who's choosing the narrative? Who's choosing the narrative is a bigger deal than whether everything in the article is accurate. Because there's a lot of things you can write about on Earth; why write about that thing?
Speaker 2
00:24:15 - 00:24:30
So the aspiration here for X slash Twitter is to be, I would call it the least untruthful place on the internet. Like we should acknowledge that there will be things that are untrue on the platform, but we'd like to be the least untrue of any place.
Speaker 1
00:24:30 - 00:24:33
And people have a right to be wrong anyway. They have a
Speaker 2
00:24:33 - 00:24:52
right to be wrong. But I think we're succeeding if Twitter is the place where you can get the most accurate, the closest to the truth, and different perspectives on the truth to the degree that it's subjective. That's the goal. Pretty straightforward.
Speaker 3
00:24:53 - 00:24:56
And Elon even thinks that he has the right to share his opinion.
Speaker 1
00:24:56 - 00:25:10
Yeah, it's the audacity. Which we found out. The audacity to share your own opinions. I was thinking the opposite. So this is a reference to the interview that you had recently with CNBC, where you were asked why you tweet certain things.
Speaker 1
00:25:11 - 00:25:16
Why do you share that publicly? Why do you share it widely? Why don't you keep it to yourself? That was the question.
Speaker 2
00:25:16 - 00:25:22
They're like sources Magneto? Yes, sorry Magneto, I shouldn't have said that.
Speaker 4
00:25:22 - 00:25:23
It's unfair
Speaker 2
00:25:23 - 00:25:23
to Magneto.
Speaker 4
00:25:24 - 00:25:25
It's unfair
Speaker 2
00:25:25 - 00:25:26
to Magneto, exactly.
Speaker 1
00:25:29 - 00:25:35
My mind goes the opposite direction. I'm like, what's in your unsent drafts? I want to know what you're holding back.
Speaker 3
00:25:35 - 00:25:36
We promise we won't
Speaker 1
00:25:37 - 00:25:39
publish this. Yeah, yeah, yeah. We'll turn the cameras off.
Speaker 2
00:25:40 - 00:26:09
Yeah, there have been a few cases where I thought, you know, if I still want to send this tweet in the morning, I will do so. And there have been a few cases where doing that would have saved a lot of grief. So there is some truth to that. A good friend of mine said, you know, I'm not going to suggest that you stop tweeting, but you might want to consider saving it as a draft and seeing if you still want to tweet it in the morning.
Speaker 3
00:26:09 - 00:26:11
That's good advice. That should be a Bible verse.
Speaker 2
00:26:12 - 00:26:24
I think Lincoln did something like that, where he used to write these angry letters to people, and then he came to the conclusion that it really wasn't doing any good. And so he'd write the angry letter and then see if he wanted to mail it the next day.
Speaker 1
00:26:24 - 00:26:25
put in a drawer
Speaker 2
00:26:25 - 00:26:29
Stick it in a drawer, I believe it was. And then he almost never mailed it.
Speaker 1
00:26:29 - 00:26:31
But you don't write angry tweets
Speaker 2
00:26:33 - 00:26:47
Once in a while, you know. And then sometimes my tweets are seen as angry when they're not. It's difficult to convey tone, you know, sarcasm, in a tweet.
Speaker 3
00:26:48 - 00:26:50
We need that sarcasm font. That's a feature there.
Speaker 2
00:26:50 - 00:26:55
Yeah. What is the sarcasm font?
Speaker 3
00:26:56 - 00:27:00
We've got to make it up. It's like the SpongeBob thing, you know, the alternating capital letters and
Speaker 2
00:27:00 - 00:27:02
the capital-y lettery thing. Yeah,
Speaker 3
00:27:02 - 00:27:05
make a sarcasm font on Twitter. We can use it.
Speaker 2
00:27:05 - 00:27:05
Okay,
Speaker 1
00:27:05 - 00:27:07
You seem taken aback by that question, though.
Speaker 2
00:27:07 - 00:27:10
Like, I'm just trying to like, how would you visually show that something is sarcasm?
Speaker 1
00:27:10 - 00:27:14
Oh, yeah. I don't know. There's no way to do that. You need a disclaimer.
Speaker 2
00:27:15 - 00:27:16
Yeah.
Speaker 1
00:27:19 - 00:27:21
He's thinking about it. He's trying
Speaker 4
00:27:21 - 00:27:21
to come
Speaker 1
00:27:21 - 00:27:23
up with something. Now he's calculating. Now it's a problem he's got
Speaker 2
00:27:23 - 00:27:27
to solve. You can animate the font. Yeah. In a sarcastic way. Right.
Speaker 2
00:27:27 - 00:27:33
Sarcastic animation. Can't you tell?
Speaker 3
00:27:33 - 00:27:35
We've lost him. Now he's just trying to solve this
Speaker 2
00:27:35 - 00:27:42
problem. You call that animation? Of course not. It's sarcastic animation.
Speaker 4
00:27:42 - 00:27:42
I'm
Speaker 1
00:27:42 - 00:27:51
seeing a lot of people who are saying they want to put their video on Twitter now. They can now. Tucker Carlson is going to do his show on Twitter, right? And that wasn't, like, a deal you worked out; you said it wasn't.
Speaker 2
00:27:51 - 00:27:52
There's no deal
Speaker 1
00:27:53 - 00:27:58
at all. You recommended it to Don Lemon too. Did he respond? Did he say he's going to do it?
Speaker 2
00:27:58 - 00:28:22
I don't know, I actually haven't looked it up. But in general I think it would be cool. I mean, we did talk, and he just asked me, if he does something on Twitter, will we censor it? And I was like, well, no, we believe in the First Amendment, and the Second Amendment too. For protecting the First Amendment.
Speaker 2
00:28:24 - 00:28:45
Amen. Exactly. Amen. So, he just wanted to confirm that we wouldn't suspend his account or whatever. And I said, as long as it's lawful, then we will not suspend the account.
Speaker 1
00:28:46 - 00:28:48
That was enough for him to say,
Speaker 4
00:28:48 - 00:28:48
this is what
Speaker 2
00:28:48 - 00:28:52
I want to do. Yeah, he wanted to confirm that, as a free speech thing.
Speaker 1
00:28:52 - 00:28:57
He wasn't concerned about monetization. We want monetization. How can we make money?
Speaker 2
00:28:58 - 00:29:32
I think Tucker's actually in a pretty good financial position. I think he's not struggling to pay the mortgage. I did say that we've got the subscriptions thing, and that we will also be sharing ad revenue with creators. Kind of normal stuff, really. In order for someone to put their video on Twitter as well as, say, YouTube, they need to at least make an equivalent amount of money, or maybe more, on Twitter, ideally.
Speaker 3
00:29:32 - 00:29:38
Yeah, we make about $27 a month on Facebook video. Nice. YouTube's only a little bit better.
Speaker 2
00:29:39 - 00:29:40
Are you serious?
Speaker 1
00:29:40 - 00:29:42
We don't make much on Facebook. Facebook's very little.
Speaker 2
00:29:42 - 00:29:42
It's very little.
Speaker 3
00:29:42 - 00:29:51
Yeah. Facebook's really bad on monetization. Yeah. YouTube's a little bit better, but still not much.
Speaker 3
00:29:51 - 00:30:00
You'll get millions and millions of views, especially on these shorts videos, and they throw pennies your way. So what is the plan for Twitter video monetization for users?
Speaker 2
00:30:00 - 00:30:04
Well, we obviously have subscriptions. And subscribe to the Babylon Bee, of course.
Speaker 1
00:30:04 - 00:30:05
Thank you.
Speaker 2
00:30:05 - 00:30:05
You're welcome.
Speaker 3
00:30:05 - 00:30:08
Can you look at that camera and say, subscribe to the Babylon Bee here?
Speaker 2
00:30:08 - 00:30:09
Subscribe to the Babylon Bee here.
Speaker 4
00:30:09 - 00:30:09
There you go. You
Speaker 2
00:30:09 - 00:30:14
won't regret it. All right, the interview is done. We
Speaker 3
00:30:15 - 00:30:16
got what we needed. We have
Speaker 2
00:30:17 - 00:30:25
subscriptions obviously. We're also going to be surfacing tweets that are subscription tweets where you see the first line.
Speaker 3
00:30:25 - 00:30:26
We saw that. We've seen
Speaker 1
00:30:26 - 00:30:26
it already.
Speaker 2
00:30:30 - 00:30:47
Yeah. So that's going to be really a big deal for growing the subscriber base, because people want to kind of see, like, what is this tweet, you know. So we'll keep, you know, providing tantalizing content that is for subscribers only. I think it'll take a moment for users to get used to it, but the rate of growth of subscriptions is crazy.
Speaker 4
00:30:50 - 00:30:50
This
Speaker 2
00:30:54 - 00:31:30
is the fastest I've seen anything grow. It's like wildfire. I think it's going to be good. So there's subscriptions, and then there's a share of advertising revenue. We just really need to complete writing the software for figuring out what ads were associated with what content, and then figure out some reasonable revenue share. And from the date that I said we would compensate creators, we will backdate it to that and send people checks.
Speaker 1
00:31:30 - 00:31:55
So with the advertisers, are the advertisers that paused back for the most part, are you continuing to see problems with them being concerned about your free speech stance? And the new hire that you have coming in as CEO, is she gonna be addressing some of those issues and trying to work with advertisers to make sure that they feel comfortable on the platform? But then also how do you balance that with a commitment to free speech? It's kind of a
Speaker 2
00:31:55 - 00:32:31
yeah, so I mean, I think it's reasonable for an advertiser to decide what content they appear next to, you know? We'll make sure that their ad doesn't appear next to something controversial, or whatever they consider controversial. Some things are fairly obvious, like you don't want a kids' movie showing up next to some racy content of some kind, like NSFW stuff. Or, you know, an ad for Disneyland next to dead bodies in Ukraine.
Speaker 2
00:32:32 - 00:33:05
You know, it's like, it's reasonable to have these adjacency controls. Just like, you know, advertisers can pick a particular time of day to advertise. If they're gonna advertise, you know, up through, I don't know, whatever it is, 8 or 9 p.m., versus, like, a midnight advertisement, it's gonna be different. So, you know, we have those controls in place for advertisers. And we said, look, there's freedom of speech, but not freedom of reach necessarily.
Speaker 2
00:33:05 - 00:33:30
So it's like, I mean, we have to be careful that that doesn't become dystopian, but people can basically say what they want to say. But if a lot of verified accounts flag it as questionable content, then it will not get amplified. So we're doing it, but like I said, we need to make sure this doesn't have bad effects.
Speaker 1
00:33:30 - 00:33:48
This new hire, though. You made kind of both sides mad about it, right? You got people on the left who are mad for their reasons, you got people on the right who are mad for their reasons. You seem satisfied with that outcome, because if you've got people on the fringes on both sides upset, then you're doing something right, right? Yeah. But when it comes to...
Speaker 4
00:33:48 - 00:33:48
I have to go back
Speaker 2
00:33:48 - 00:33:52
to LGBTQ and QAnon, simultaneously upset.
Speaker 1
00:33:53 - 00:33:54
I was thinking...
Speaker 2
00:33:54 - 00:34:01
That's why I think things go full circle: LGBTQAnon. It'll probably get me in trouble.
Speaker 1
00:34:02 - 00:34:19
It probably will. You were saying you want to speak your mind even if it costs you something. You want to be able to say what you want even if it costs you something. You said that very emphatically in that interview you just did. And so that matters to you a lot.
Speaker 1
00:34:20 - 00:34:50
And so in a context where you've got a new CEO coming in who's gonna be taking over, and you want everybody else to have that same freedom, to be able to say what they want even if it costs them something personally, in their lives or whatever, it would seem to me that the number one thing you'd be looking for is someone who's going to come in and be as committed to free speech as you are. That trumps even advertising revenue, in your view. If you're willing to lose money to be able to personally speak freely, then …
Speaker 2
00:34:50 - 00:35:02
Yeah, we've also lost money, advertising money, because some advertisers got community noted and we wouldn't take the community note down. So we lost $40 million in advertising because
Speaker 1
00:35:02 - 00:35:04
of that. Because some big ones pulled out?
Speaker 2
00:35:05 - 00:35:08
Because of community notes and refusing to pull community notes.
Speaker 1
00:35:08 - 00:35:10
There was a willingness to lose money.
Speaker 2
00:35:10 - 00:35:15
Yes, we literally just lost $40 million. It's not an approximation.
Speaker 1
00:35:16 - 00:35:25
Will things change along those lines? Will there be more willingness to compromise to keep the revenue coming in from these big advertisers? Is that a concern of yours?
Speaker 2
00:35:26 - 00:35:55
No, I think we've actually largely addressed the concerns of the advertisers. A lot of them weren't necessarily against me taking over Twitter. They just weren't sure what it would look like, you know. Are there sharks in the water or what, you know? Is there gonna be rampant disinformation? Is it gonna be some, like, massive right-wing takeover or what? So they were just uncertain.
Speaker 2
00:35:56 - 00:36:06
I think we're getting to the point here where they're like, you know, okay. And a lot of them use Twitter every day. And it's like, it's actually obviously not filled with hate speech. Like, anyone who uses it knows that.
Speaker 1
00:36:06 - 00:36:11
Well, the media keeps trying to say that. They keep publishing these articles: hate speech is on the rise. Hate speech
Speaker 3
00:36:11 - 00:36:11
is on the rise.
Speaker 1
00:36:11 - 00:36:11
2,000%
Speaker 2
00:36:11 - 00:36:24
increase in hate speech. Yeah, it's totally false. Yeah, it's completely false. The view counts are down by a third, maybe a half, and that's despite, you know, all-time record usage of Twitter.
Speaker 1
00:36:24 - 00:36:27
And how do they define hate speech is a big question mark.
Speaker 2
00:36:27 - 00:36:31
Well, it turned out one of the hate speech terms was literally George Soros.
Speaker 1
00:36:31 - 00:36:32
His name.
Speaker 2
00:36:34 - 00:36:41
Yeah, just saying it. It got counted as a hate speech thing. So I thought, well, you know, maybe the definition is a little broad.
Speaker 3
00:36:41 - 00:36:46
Yeah. So calling him Magneto is considered hate speech.
Speaker 2
00:36:46 - 00:36:48
I mean, you know. It's like,
Speaker 4
00:36:49 - 00:36:49
it's
Speaker 2
00:36:49 - 00:36:54
a comic book, you know. It's like, let's relax, it's not the end of the world here.
Speaker 3
00:36:54 - 00:36:59
Does that make you Professor X? Who's Professor X in this?
Speaker 2
00:37:02 - 00:37:38
Well, I do know a lot of smart people. I think we'll be fine. I think the advertisers at this point are comfortable with the fact that Twitter is going to be fair, and not a haven of hate type of thing. And they've used it enough themselves at this point that they know from their own usage it's actually fine. We've vastly reduced the amount of spam and scam stuff that happens on Twitter.
Speaker 2
00:37:39 - 00:37:59
It's quite rare at this point to see spammers, whereas my feed used to be filled with them before. We're not going to make all advertisers happy, but I think we'll make most of them happy. There'll be enough that are happy to support this platform. But we're not going to compromise on free speech.
Speaker 3
00:38:01 - 00:38:09
You see Twitter as more than social media. You know, you've got X and the everything app. Yeah. What is the future vision for that?
Speaker 2
00:38:09 - 00:39:08
You know, some of that will be fulfillment of the vision I had for X.com 24 years ago, which was to be an all-encompassing financial services provider. And that's actually an important part of freedom as well. If those doing the money exchange or running the money system can stop people who disagree with them politically, that's a huge problem. I mean, I saw that happen in Canada with the trucker strike, where people were being basically financially ostracized from society for just being in a peaceful protest. And I saw some strange things with PayPal, where they were suspending accounts for what appeared to be political reasons.
Speaker 2
00:39:08 - 00:39:09
I don't know if you saw any of that.
Speaker 1
00:39:09 - 00:39:10
Yeah,
Speaker 2
00:39:10 - 00:39:33
yeah. So, like, you know, there has to be freedom of information. But money is a form of information. So I think it's important that, within the bounds of the law, we enable the free flow of money, which is a form of information.
Speaker 1
00:39:33 - 00:39:37
And you see that living on Twitter eventually as part of X.
Speaker 2
00:39:38 - 00:39:56
Yeah, yeah. I think we need to broaden the branding. Like, Twitter made sense as a name if people are sending, you know, basically group texts at 140 characters apiece between each other. That's kind of a short thing, a tweet, you know, not a long thing.
Speaker 2
00:39:58 - 00:40:32
But at the point at which you've got not just text but pictures, video, live interaction, a full array of financial services, a full array of communications, encrypted communications, voice, video, everything, Twitter, I think, is the wrong branding for that. I believe in descriptive branding. Whereas X, X can mean anything. So X marks the spot.
Speaker 2
00:40:32 - 00:40:35
X is where the treasure is. XXX is where the coin is.
Speaker 3
00:40:38 - 00:40:41
We're Christians, so we wouldn't know.
Speaker 2
00:40:45 - 00:41:23
X-rated. But I think in general the goal here is, like, look, let's just make sure we take the actions that strengthen the pillars of democracy and further civilization. You know, we want to have a future we can look forward to, that we're excited about, not one we're sad about. And civilization is more fragile than people realize. You study the rise and fall of civilizations in history, you know, when they're at the top they never think they're going to fall.
Speaker 1
00:41:24 - 00:41:25
But they always do, eventually.
Speaker 2
00:41:26 - 00:41:29
Every civilization has a lifespan.
Speaker 3
00:41:29 - 00:41:32
About how many years do you think we've got left?
Speaker 2
00:41:33 - 00:42:10
Well, I'm seeing a lot of late-stage-civilization vibes these days. Seriously. I don't know, man, there's so many wild cards. I mean, in the short term we've got, we had the financial crisis, probably some economic thing, but whatever, those things happen from time to time. But we've got some geopolitical wild cards with Ukraine and Taiwan.
Speaker 2
00:42:15 - 00:42:31
And then AI, which is called the singularity for a reason. Because you don't know what's gonna happen. It's like a black hole. You go in a black hole, what happens? Don't know.
Speaker 2
00:42:32 - 00:42:40
We're on the AI singularity event horizon, circling the black hole.
Speaker 3
00:42:41 - 00:42:59
And does AI concern you more than those other things like geopolitical nuclear war or whatever? And where does that fall on the scale for you? I know you've said some cautionary things about it, but I don't know if you're as much of a doomer as some people are about AI.
Speaker 2
00:43:00 - 00:43:03
I mean, the good news about Russian roulette is 5 of the barrels are unloaded.
Speaker 3
00:43:04 - 00:43:06
That's encouraging. Yeah. That's good.
Speaker 2
00:43:08 - 00:43:10
Look on the bright side.
Speaker 3
00:43:10 - 00:43:22
Yeah, it just seems like a lot of times people that are in tech are pushing these things forward without any guardrails or consideration, you know, for what could go wrong, despite all the cautionary tales.
Speaker 1
00:43:23 - 00:43:23
What is that?
Speaker 3
00:43:23 - 00:43:25
That's what sci-fi is all about.
Speaker 2
00:43:25 - 00:43:25
You think
Speaker 1
00:43:25 - 00:43:30
there should be regulation and stuff like that, right? But what does that look like? How do you regulate AI, for example?
Speaker 2
00:43:31 - 00:44:14
Well, I think you start off with an insight committee that has people that are independent from the leading players, as well as maybe some representatives from the leading players. And that's an insight committee. The goal is simply to learn things. And then consult with industry and propose rules. That's basically how it's worked for food and drugs, or for aircraft, or cars. So when there's something that is a danger to the public, there's some regulatory body to, kind of like a referee, make sure companies don't cut corners.
Speaker 1
00:44:15 - 00:44:50
When it comes to, just coming back to speech real quick, and Twitter, I've often wondered, in our case with you, you know, we had a situation where this benevolent billionaire comes in and decides, I'm going to make sure free speech exists for people because there's no place where they can speak freely right now. So I'm gonna buy Twitter and restore free speech on this platform. Great, we're glad that you did that, thank you. But should we have to rely on benevolent billionaires to solve these problems for us, or should there be legal changes? Should there be, like right now the law currently protects us from government censorship, obviously, right?
Speaker 1
00:44:50 - 00:45:08
The First Amendment does that. But it doesn't protect us from private companies that want to muzzle our speech. So should there be legal changes that prevent viewpoint discrimination? And do you support laws like that? Are you aware of some of the legal, the current bills like in different states that are being pushed
Speaker 2
00:45:08 - 00:45:44
along those lines? There are various bills. I mean, we have to be careful that a bill that is intended to do good does not pave the road to hell. So something that I think would be very powerful is to say that all social media companies have to open source their algorithms, so that at least it's known what they're suppressing and they can't secretly suppress things. There's a massive amount of secret suppression going on on Facebook, Google, Instagram. Massive.
Speaker 2
00:45:44 - 00:45:46
It's just nonstop secret suppression.
Speaker 1
00:45:46 - 00:45:47
And there was at Twitter.
Speaker 2
00:45:47 - 00:45:48
And yeah, absolutely.
Speaker 1
00:45:49 - 00:45:50
But not anymore, right?
Speaker 2
00:45:50 - 00:46:04
Right. And people call it shadow banning, correctly. It's like, there's nothing that says that you're banned, but you are effectively stuck in the basement.
Speaker 4
00:46:05 - 00:46:05
Well, that
Speaker 1
00:46:05 - 00:46:11
would be a legal change. That would be a law that requires open source, that if you're going to censor, you better do it transparently.
Speaker 2
00:46:11 - 00:46:22
Yeah, the thing is that it really doesn't take much to censor something. So there's a lot of censorship that goes on at Google that people don't realize because all that needs to happen is just move that link to page 5.
Speaker 1
00:46:22 - 00:46:23
Right.
Speaker 2
00:46:23 - 00:46:43
You know, I mean, the little joke about Google is, like, what's the best place to hide a dead body? The second page of Google search results. Nobody ever looks there. So if you just nudge something even to page 2, it's going to drop the visibility a hundredfold.
Speaker 1
00:46:43 - 00:46:52
But if you apply that to the principle of freedom of speech but not of reach, isn't that the same thing? Is there a difference? Potentially, yeah. That's why you're saying you've got to be careful with how you handle that.
Speaker 3
00:46:52 - 00:47:03
Yeah. Well, one of the craziest things you did when you took over Twitter was start releasing the Twitter Files. Like, who takes over a company and then says, look how horrible, all this stuff was going on?
Speaker 2
00:47:03 - 00:47:17
We need to have truth and reconciliation here. If we're not going to expose the things that were done wrong, why should people believe us in the future? So that's why we're trying to be as transparent as possible. So it's like, don't take my word for it.
Speaker 2
00:47:17 - 00:47:41
Literally look at the algorithm. You should be able to recreate the results that you see on Twitter using that algorithm. So we're trying to make sure that everything is brought to light, that there are no hidden layers or anything. We just discovered, you know, last week, some hidden layer of censorship that was written in
Speaker 1
00:47:41 - 00:47:42
2012.
Speaker 2
00:47:43 - 00:48:12
And, like, censorship is maybe the wrong word. It would basically suppress. It had a list of words, and any of those words, some of them were like suck. If you put suck, just S-U-C-K, actually even with other words, it gets massively deamplified. And that was literally code from, like, 2012. We found this relic of code last week.
Speaker 1
00:48:12 - 00:48:14
And it was being applied to like all tweets?
Speaker 2
00:48:14 - 00:48:28
Yeah, all tweets. So there's, like, the guy from Japan who bought a flight on Starship. His Twitter handle is yousuck
Speaker 1
00:48:28 - 00:48:29
2020.
Speaker 2
00:48:32 - 00:48:35
He was like, listen, there's something wrong with my account, you know.
Speaker 1
00:48:36 - 00:48:38
I think I'm shadow banned. Yeah.
Speaker 2
00:48:40 - 00:49:08
And sure enough, it's because his Twitter handle had suck in it. So it turned out there were about a thousand words on some ancient list, and tweets containing them were being suppressed. And then we just found a list of URLs that are being suppressed. Some of them are obviously, you know, scam URLs, but some of them aren't.
Speaker 3
00:49:08 - 00:49:09
Like Babylonbee.com.
Speaker 4
00:49:11 - 00:49:12
Yeah.
Speaker 2
00:49:14 - 00:49:17
So, it's like an archeological dig, frankly.
Speaker 3
00:49:19 - 00:49:26
You talk a lot about wokeness and the woke mind virus and all that. Do you feel like that's what was driving those kinds of decisions here?
Speaker 2
00:49:26 - 00:49:41
Yeah, absolutely. It's like, you know, around here it's not about whether you had some of the woke Kool-Aid. You're swimming in the woke Kool-Aid. It's like a fish doesn't see the water, it's swimming in
Speaker 1
00:49:41 - 00:49:42
it.
Speaker 2
00:49:43 - 00:50:05
It just seems normal. Like, if you're in the cult, it's normal. The thing is, there's actually so much material. Like, there's God knows how many lines of Slack, for example. This company ran on Slack, so it was like one massive group text situation. So it's a lot of material.
Speaker 2
00:50:07 - 00:50:21
Not that many emails. Emails were mostly for people coming from outside, you know, outside communications. And then there was that weird government portal that destroyed everything after 2 weeks, which doesn't sound legal, but yeah.
Speaker 1
00:50:21 - 00:50:26
I thought it was amazing how you were, from day 1, basically handling customer service.
Speaker 4
00:50:26 - 00:50:27
Yeah.
Speaker 1
00:50:27 - 00:50:32
You were like, personally, looking at reports from people, trying to dig into things.
Speaker 2
00:50:32 - 00:50:33
Yeah.
Speaker 1
00:50:33 - 00:50:36
And that was like around the clock for you for a little while.
Speaker 2
00:50:36 - 00:51:07
I mean, yeah, the sheer layers. Like, this was sort of a sea of bullshit, basically. So many layers. I mean, really, there needs to be a ground-up rewrite. Otherwise we're going through this ghost mansion, trying to get the ghosts out one at a time. Still finding ghosts as recently as last week.
Speaker 2
00:51:10 - 00:51:24
So, but we are actually slimming down the code base. We are rewriting a lot of it. The main sort of home timeline mixer was
Speaker 1
00:51:26 - 00:51:26
700,000
Speaker 2
00:51:26 - 00:51:34
lines of code. Now it's 70,000. So it's down by a factor of 10. And it does a better job and is faster.
Speaker 1
00:51:35 - 00:51:40
So... And the side of the business you're going to stay involved in and personally invested in is the tech side.
Speaker 2
00:51:40 - 00:52:11
Yeah, yeah. I mean, I'm setting the overall ground rules, like the Constitution type of thing, which is very clearly free speech. And that is not a revenue-optimizing strategy. But we don't need to make a ton of money, we just need to not go bankrupt. That's part of the reason for having some amount of subscription revenue.
Speaker 2
00:52:15 - 00:52:18
We respect the advertisers but we don't want to be too dependent on them.
Speaker 1
00:52:18 - 00:52:21
Not go bankrupt, that's the business, that's the end goal.
Speaker 2
00:52:22 - 00:52:35
Not go bankrupt, you know, hopefully make at least some reasonable return for investors. But it's not, you know, just some sort of mercantile optimization.
Speaker 1
00:52:36 - 00:52:40
Do you think it's possible to run a free speech platform that is profitable?
Speaker 2
00:52:40 - 00:52:52
We're going to find out. I think so. I mean, like I'm not sure. You can definitely sell your soul to the devil here and increase revenue. That's for sure.
Speaker 2
00:52:52 - 00:53:30
Yeah. So that's, you know... And I think it becomes tough, especially for a publicly traded company. You can be sued for not taking the actions that maximize profitability. So, you know, Jack was right that it was impossible to reform as a public company, because you'd just get excoriated in the market and have a zillion lawsuits. Anyway, I'm hopeful that the platform can be, like I said, the least untrue, closest to true and complete.
Speaker 2
00:53:32 - 00:53:55
It's going to be, you know, hopefully the whole truth, probably not nothing but the truth, because there'll be some things that aren't true. But the whole bit, with diversity, diversity of perspectives, and diversity of narratives. So people, you know, can decide what they want to focus on and not be force-fed things. I also recommend Twitter lists.
Speaker 2
00:53:55 - 00:54:19
Twitter lists are actually pretty great. Yeah. You know, and we've now made it easy to find lists created by experts on various subjects. So that's another way to stay informed on any subject, whether it's sports or politics or video games or whatever.
Speaker 1
00:54:19 - 00:54:22
How worried are you about Mastodon and Tribel?
Speaker 2
00:54:26 - 00:54:34
Well, once I tried using them, I was like, it's not a threat. That was my conclusion.
Speaker 3
00:54:35 - 00:54:39
Most importantly, you brought up video games. Have you played the new Zelda game yet?
Speaker 2
00:54:40 - 00:54:43
Is it great? It is. You know, I've never played a Zelda game before.
Speaker 3
00:54:43 - 00:54:45
You haven't? Wow.
Speaker 2
00:54:46 - 00:54:51
That's right. With rare exceptions, I've only played PC games. Is Zelda on PC?
Speaker 2
00:54:51 - 00:54:53
Nintendo only. Not legally. Okay.
Speaker 4
00:54:53 - 00:54:54
Can't condone
Speaker 1
00:54:54 - 00:54:55
on that, but you
Speaker 3
00:54:55 - 00:54:55
know.
Speaker 2
00:54:56 - 00:54:57
Halo was
Speaker 4
00:54:57 - 00:54:57
the first game I played. I played it. I played it. I played it.
Speaker 3
00:54:57 - 00:54:58
I only with works.
Speaker 4
00:54:58 - 00:54:58
I've only I played just PC
Speaker 3
00:54:58 - 00:54:59
games It's all done PC Nintendo
Speaker 4
00:54:59 - 00:54:59
not legally
Speaker 2
00:54:59 - 00:55:01
Halo was the only game I ever played on a console.
Speaker 3
00:55:01 - 00:55:03
Okay, what are you playing right now?
Speaker 2
00:55:03 - 00:55:14
I'm actually looking for something good to play. So I finished Dead Space. The Dead Space remake is really good.
Speaker 4
00:55:14 - 00:55:14
Is it?
Speaker 3
00:55:14 - 00:55:16
Yeah. I heard it was woke. Is it woke?
Speaker 2
00:55:17 - 00:55:19
The Dead Space remake I would not
Speaker 3
00:55:19 - 00:55:20
say. Not woke, okay.
Speaker 2
00:55:21 - 00:55:34
No, I mean, it's literally trying to make things as scary as possible. It's like, it's maximizing fear. Yeah, it's gruesome.
Speaker 3
00:55:35 - 00:55:37
I played the original, I haven't played the remake.
Speaker 2
00:55:37 - 00:55:53
No, it's gruesome. Yeah. I mean, it's nightmares, really the stuff of nightmares. Yet I'd prefer doing that to email.
Speaker 1
00:55:56 - 00:56:00
Or Slack. Are you in the Slack channels? Do you go in there?
Speaker 2
00:56:00 - 00:56:01
I don't use Slack because there's just
Speaker 1
00:56:01 - 00:56:02
too much. It's overwhelming.
Speaker 2
00:56:03 - 00:56:06
Yeah, exactly. It's like being in a million group chats.
Speaker 3
00:56:06 - 00:56:09
Dead space, not as terrifying as a group chat.
Speaker 2
00:56:10 - 00:56:11
Yeah, better than email.
Speaker 1
00:56:11 - 00:56:12
Better than email.
Speaker 2
00:56:15 - 00:56:19
Elden Ring was really good. Did you play Elden Ring?
Speaker 3
00:56:20 - 00:56:25
He beat it. I haven't beaten it yet. I got a horse, but I didn't get much further.
Speaker 2
00:56:25 - 00:56:29
There's a long way to go after the horse, to say the least.
Speaker 1
00:56:31 - 00:56:34
You played Halo. Did you ever play multiplayer or did you just do the campaign?
Speaker 2
00:56:34 - 00:56:46
Well, there's the recent one, which is... I finished... To brag a little, I completed the campaign on Legendary. Wow. Which is hard.
Speaker 2
00:56:48 - 00:56:55
Thank you. I mean, you know, that last battle was really difficult.
Speaker 1
00:56:57 - 00:57:03
But you don't go online and play against random people on the internet, just multiplayer, like team deathmatch?
Speaker 2
00:57:04 - 00:57:12
No. Well, I did that for a split second with a friend. It's tough. So, I actually...
Speaker 3
00:57:12 - 00:57:15
He's trying to ask if you'll be his friend on Xbox.
Speaker 1
00:57:15 - 00:57:19
I'm just wondering if every now and then I'm maybe playing against you under some...
Speaker 2
00:57:20 - 00:57:43
I used to play a lot of video games, a lot of competitive video games. Way back in the day, I was quite competitive at Quake. And on my team of 4, I was the second-best guy on the team. We came second in, I think, the first paid e-games tournament in the US.
Speaker 1
00:57:43 - 00:57:45
That's where you got your money, your start.
Speaker 2
00:57:46 - 00:57:48
Yeah, I think we got like $3,000 or something.
Speaker 3
00:57:48 - 00:57:49
We need community notes.
Speaker 1
00:57:49 - 00:57:52
It wasn't the Emerald Mines. It was a Quake tournament.
Speaker 3
00:57:52 - 00:57:56
We need community notes to fact check this claim. Get on it, community.
Speaker 2
00:57:57 - 00:58:15
Yeah. So yeah, we almost won. But the guy who was the best on the team, Brandon Spikes, his computer crashed, so we came second instead. We still got money and stuff. I've made money playing video games.
Speaker 1
00:58:15 - 00:58:17
That's amazing. I have not.
Speaker 2
00:58:20 - 00:58:31
These days I kind of like a campaign with a good story. It's hard to match reflexes with, like, you know, 16-year-olds. Yeah. There's got to be some strategy element to it.
Speaker 3
00:58:31 - 00:58:36
They're always saying mean things about your mom and stuff on there. It's kind
Speaker 2
00:58:37 - 00:58:41
of harsh. Yeah. I have some friends who play League of Legends.
Speaker 3
00:58:41 - 00:58:43
Oh yeah. They're pretty mean
Speaker 2
00:58:43 - 00:58:45
in those communities. Yeah, they are.
Speaker 3
00:58:46 - 00:58:47
Toxic. They call it toxic community.
Speaker 2
00:58:48 - 00:58:52
Yeah. League is intense. League is a lifestyle.
Speaker 3
00:58:53 - 00:58:55
Yeah. One of those lifestyle games.
Speaker 2
00:58:55 - 00:59:16
Yeah. I think my son Griffin put more time into League than he did into his college applications. I mean, if he was awake at, like, 5 in the morning, he was playing League. So yeah, I'm kind of looking for a game to take your mind off things for half an hour or whatever.
Speaker 3
00:59:16 - 00:59:24
Well, I'll try to come up with some recommendations for you. The new Jedi game, I'm going to get that soon. Can't recommend it yet because I haven't played it yet.
Speaker 2
00:59:24 - 00:59:27
There's a fun game for phones called Vampire Survivors.
Speaker 1
00:59:29 - 00:59:29
I'll check that out.
Speaker 2
00:59:29 - 00:59:30
Yeah, that's good.
Speaker 3
00:59:31 - 00:59:56
My recommendation. Well, what do you like to watch in terms of comedy and entertainment? I know you share a lot of memes and obviously comment on some Babylon Bee articles and stuff like that. You tweeted one time about how wokeness is destroying comedy. You can't write comedy from a leftist place of wokeness, because there's no truth claim, or the truth claim is wrong.
Speaker 3
00:59:57 - 00:59:57
I'm butchering it.
Speaker 2
00:59:57 - 01:00:59
I mean, you know, I hardly need to tell you guys, but the essence of a lot of comedy is a revealed truth, like a hidden truth that people understand intuitively or explicitly. And there's that moment of reveal, you know, a kernel of truth, of often unacknowledged truth, and in that unacknowledged truth is the humor. So if you're premised on a lie, you can no longer be funny, because there's no revealed truth. And this is, you know, why a lot of people on the left have no sense of humor. They're not funny. And if there are so many no-fly zones that you have to avoid all the time, then what is there left to have fun about, you know?
Speaker 1
01:01:01 - 01:01:24
Well, Rogan was saying, like, wokeness is funny. Like, a lot of the woke ideas are so outrageous, they are funny. And so when you're not allowed to make fun of the funniest things... That's why sites like ours do well. It's because we are willing to do that. There's a lot of people who are just holding back on the things that are ripe for comedy. Have you been to the Mothership Club yet in Austin where he's...
Speaker 1
01:01:24 - 01:01:33
No? I'd love to go there. He's trying to make it so that comedians can come there, a cancel-free environment, make the jokes you're not supposed to make.
Speaker 2
01:01:33 - 01:02:13
I mean, if you see any comedies from times past, they're so verboten at this point, it's insane. You know, they're making fun of all sorts of things that would be totally unacceptable these days. That's what we want for Twitter, like, just, you know, legalized comedy. If you keep saying that you can't make fun of things, or something's always punching down, you know, well, there's not gonna be anything left to make humor about. I don't think we want a humorless future.
Speaker 1
01:02:16 - 01:02:35
It's the same thing that you have with speech, where it's like, you know, if only the popular narrative is being promoted and it's never being challenged, then that's a problem. That's a problem for comedy, too. You can't be funny just propping up the popular narrative and trying to get people to clap for you. Yeah. It's applause; it's not laughter.
Speaker 2
01:02:36 - 01:02:37
Yes, true.
Speaker 1
01:02:37 - 01:02:39
Clapter, somebody called it.
Speaker 2
01:02:39 - 01:02:54
Yeah. Yeah, it's funny seeing some of the snippets from the White House Correspondents' Dinner, you know, where Biden was getting a lot of claps, but it was like, is this your cheering squad or what? You know?
Speaker 1
01:02:55 - 01:02:57
A lot of applause. A lot of applause.
Speaker 2
01:03:01 - 01:03:06
You know, yeah, whoever stopped clapping last is out of the, out of the bottle.
Speaker 1
01:03:08 - 01:03:13
You just want someone normal to be president, you said recently. Someone normal, a normal human being.
Speaker 2
01:03:13 - 01:03:23
I mean, I think so. No, no, the thing is that it is contrary to my prediction that the most entertaining outcome is the most likely.
Speaker 1
01:03:23 - 01:03:23
Yeah.
Speaker 2
01:03:25 - 01:04:03
You may have heard me say, like, this awkward phrase: the simplest explanation is most likely. My friend Jonah has this theory that the most ironic explanation is most likely, and then my variant is that the most entertaining outcome is the most likely, as viewed from a third party who's not in the show. So you could be watching a World War I movie where people are getting blown to bits, and you're just having a soda and popcorn, you know, it's fine. It's rough for the people getting blown up by cannon shells. If we're in there getting blown up by cannon shells, or a very good thing happening.
Speaker 2
01:04:04 - 01:04:22
But my theory is that, more often than not, the most entertaining outcome, as though we were an alien soap opera, is the most likely. So if you could say, what is the most entertaining outcome for the election next year? That's probably what's going to happen.
Speaker 3
01:04:22 - 01:04:26
Is that because God is a jokester and is messing with us?
Speaker 1
01:04:27 - 01:04:28
Or because we're in a simulation?
Speaker 3
01:04:28 - 01:04:29
Or is it because of simulation?
Speaker 2
01:04:30 - 01:04:32
I mean ratings, man. Just for the ratings.
Speaker 3
01:04:32 - 01:04:34
God is doing it for the ratings.
Speaker 2
01:04:34 - 01:04:49
Well, I'm saying that's often the case. The most entertaining outcome, as viewed by a third party, like it's an alien soap opera, is the most likely.
Speaker 1
01:04:50 - 01:04:53
The whole Twitter thing has been very entertaining from the outside.
Speaker 4
01:04:53 - 01:04:54
Yeah.
Speaker 1
01:04:54 - 01:04:57
On the inside, it's been hell? Or has it
Speaker 2
01:04:57 - 01:05:07
been fun? I mean, definitely some hellish moments. Yeah. Yeah, definitely been some hellish moments. So it's been, I would say, it's been a rough six months, but at this point, trending positively.
Speaker 2
01:05:07 - 01:05:19
It wasn't easy. A lot of open-heart surgery. But things seem to be, you know, knock on wood, headed in a good direction.
Speaker 1
01:05:19 - 01:05:23
And you're chief twit still right now, but you hand that off when?
Speaker 2
01:05:24 - 01:05:51
I'm still figuring that out. Linda's got to exit her obligations at NBC, so a month or so, I suppose. But, you know, I'll still be responsible for software development and the core principles, you know, the constitution of the company being free speech. Linda understands that; she supports that.
Speaker 1
01:05:51 - 01:05:52
Good.
Speaker 2
01:05:52 - 01:05:52
Yeah.
Speaker 1
01:05:54 - 01:05:57
So she signed something like it's in
Speaker 2
01:05:57 - 01:06:25
writing. Yeah. I mean, a lot of things that need to get done for a company are, you know, there's a lot of chores. There's things that need to be done in legal, HR, finance, you know, solving interpersonal arguments that people have. You know, there's a lot of sort of just general management stuff that takes a lot of time.
Speaker 2
01:06:25 - 01:06:33
So I'm hopeful Linda can handle that and I will manage the software team.
Speaker 1
01:06:33 - 01:06:35
Well, do you have any questions for us?
Speaker 2
01:06:38 - 01:06:41
No, just, you know, cheers, I suppose. Cheers.
Speaker 1
01:06:43 - 01:06:44
Cheers, man.
Speaker 3
01:06:44 - 01:06:46
Cheers. Thank you for taking the time. This is awesome.
Speaker 2
01:06:47 - 01:06:48
Well, you're welcome here.
Speaker 1
01:06:48 - 01:06:53
Yeah. It's pretty crazy to go from Twitter jail to sitting down at Twitter HQ.
Speaker 2
01:06:54 - 01:07:00
The barbarians burst through the gate and are having a drink. Yeah. In the enemy palace.
Speaker 3
01:07:03 - 01:07:04
Drinking from the skulls of
Speaker 4
01:07:04 - 01:07:06
our, well.
Speaker 1
01:07:06 - 01:07:07
What is best in life?
Speaker 2
01:07:07 - 01:07:11
What is best in life? Exactly. Conan the Barbarian. Underrated.
Speaker 4
01:07:12 - 01:07:12
Ha ha ha ha ha.
Speaker 2
01:07:12 - 01:07:14
It does feel like, it feels like victory.
Speaker 1
01:07:14 - 01:07:16
Yeah, it does. Yeah.
Speaker 4
01:07:16 - 01:07:17
To
Speaker 1
01:07:17 - 01:07:19
us, anyway. Yeah. Yeah. Yeah. Feels great.
Speaker 1
01:07:19 - 01:07:20
Didn't cost us much.
Speaker 3
01:07:20 - 01:07:21
Didn't cost us
Speaker 1
01:07:21 - 01:07:21
40.
Speaker 3
01:07:21 - 01:07:23
Well, we did give him an IOU.
Speaker 1
01:07:23 - 01:07:25
Oh, yeah. Yeah, we're on the hook for that IOU.