Oct. 13, 2025

How AI & Bot-Driven Outrage Are Hijacking Our Attention

We're puppets. AI has its hand up our collective asses, and we're all just talking like Kermit the Frog while Russian bot farms tear society apart.

Dramatic? Maybe. True? Unfortunately.

In this episode we look into the increasingly unhinged intersection of AI, social media manipulation, and what's left of our shared reality. It's not all doom. There's some genuine good AI can do (like helping my kid with homework). But it's mostly doom.

Here's what we covered:

The Episode Breakdown

  • Mental institutions are overrun with people suffering from AI-related disorders: some fell in love with chatbots, others can't function without AI assistance, and others are spiraling into existential dread
  • Bot farms are weaponizing brands in culture wars (see: Cracker Barrel's logo change), and most of that online outrage isn't from humans; it's from Russian troll operations
  • Peter Thiel is out here telling Christians at private Silicon Valley rallies that stopping AI development will hasten the Antichrist (I wish I were making this up)
  • You are the product, not the customer, on every social media platform. The actual customers are advertisers buying your attention. The cost is your sanity.
  • The cave tour guide principle: How darkness + suggestion = seeing threats that aren't there (based on Jason's actual cave tour guide days)
  • Why everything feels existential: The internet profits from making every problem feel like a crisis. Your nervous system can't tell the difference between real threats and manufactured outrage.
  • The influencer accountability problem: Anyone can build an empire on conspiracy theories now. There's no regulation. No licenses. Just vibes and reach.
  • The only real escape: "Just don't look" (Simpsons-style). Not forever. Not completely. But enough to remember what reality feels like.

----

MORE FROM BROBOTS:

Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok

Subscribe to BROBOTS on YouTube

Join our community in the BROBOTS Facebook group

----


1
00:00:00,000 --> 00:00:01,401
We're puppets.

2
00:00:03,142 --> 00:00:08,666
We're gonna have the old AI hands stuffed up our asses while we're talking like Kermit the
Frog, you know?

3
00:00:08,666 --> 00:00:10,017
Hello, Piggy, how are you?

4
00:00:10,017 --> 00:00:10,951
CAN IT FROG!

5
00:00:10,951 --> 00:00:14,497
This is the shit that's coming down, and we're going to have no way to stop it.

6
00:00:14,630 --> 00:00:17,632
I'm just going to record because I don't even know where to start with this.

7
00:00:17,632 --> 00:00:26,637
It's headline after headline, end of the world, whether it's the biological weapons that
are now being protected by AI or whether it's the fascists that are coming to kill us all

8
00:00:26,637 --> 00:00:29,879
or whether it's the AI that's going to implode and kill us all.

9
00:00:30,320 --> 00:00:39,285
I mean, I was so excited because I used AI in the last week or so doing a couple of
things that I thought were really cool, really helpful, like...

10
00:00:39,333 --> 00:00:41,273
Made me feel like a good dad for a minute.

11
00:00:41,273 --> 00:00:43,333
I'm like, you know what, maybe this thing isn't super.

12
00:00:43,333 --> 00:00:44,473
No, we're fucked.

13
00:00:44,473 --> 00:00:45,573
We're all gonna die.

14
00:00:45,573 --> 00:00:49,933
It's gonna be a firestorm of who can kill us fastest.

15
00:00:50,453 --> 00:00:57,053
Jason, you sent me, I don't even know how many, like seven articles that are all
about how the earth is about to be burned to the ground by the robots.

16
00:00:57,053 --> 00:00:58,780
What the hell is going on?

17
00:00:58,780 --> 00:01:08,733
Yeah, so it's interesting because, kind of like, the chain of events of the information
that got sent to me was super crazy weird.

18
00:01:08,733 --> 00:01:13,662
So, you know, I'm sitting here looking at my news feeds and I have, you know, a thousand
of them that I look at.

19
00:01:13,662 --> 00:01:25,007
But I get this message, because I'm sure the AI is listening to us as we're talking about
these pieces, that starts talking about the fact that mental institutions are

20
00:01:25,007 --> 00:01:27,038
actually being overrun.

21
00:01:27,158 --> 00:01:34,362
by the number of people out there that are checking themselves in for AI related mental
illness.

22
00:01:40,927 --> 00:01:42,268
Yeah, of course it is.

23
00:01:42,268 --> 00:01:50,443
I mean, because you're looking for emotional connection and context, and when you don't have
anything because you've stuck yourself behind the keyboard forever, anything that talks nice

24
00:01:50,443 --> 00:01:51,814
to you, you're gonna like.

25
00:01:51,814 --> 00:01:52,802
I mean, that's...

26
00:01:52,802 --> 00:01:54,082
just kind of human nature.

27
00:01:54,082 --> 00:01:55,182
We want to be accepted.

28
00:01:55,182 --> 00:02:02,922
We want to feel like we're a part of our tribe and like we've got other people around us
because biologically we evolved during a period of time where if we didn't have

29
00:02:02,922 --> 00:02:05,982
that, we were fucking eaten.

30
00:02:06,265 --> 00:02:07,927
Well, and now it's about to get worse.

31
00:02:07,927 --> 00:02:16,077
I just saw that OpenAI, like, it's not only going to be a receiver of your messages, it's
going to start its day with, champ, remember all that stuff we were working on yesterday?

32
00:02:16,077 --> 00:02:17,259
Where do you want to start today?

33
00:02:17,259 --> 00:02:19,141
Like it's going to initiate the conversation.

34
00:02:19,141 --> 00:02:21,986
So now it's very much a two way street with these things.

35
00:02:21,986 --> 00:02:31,372
Right, and its motivations are probably not exactly the same as yours, even though there's
gonna be some of it inside of there that's gonna make it feel like it's yours, but

36
00:02:31,632 --> 00:02:33,534
we're...

37
00:02:33,614 --> 00:02:35,015
we're puppets.

38
00:02:36,756 --> 00:02:42,280
We're gonna have the old AI hands stuffed up our asses while we're talking like Kermit the
Frog, you know?

39
00:02:42,280 --> 00:02:43,631
Hello, Piggy, how are you?

40
00:02:43,631 --> 00:02:44,682
CAN IT FROG!

41
00:02:44,682 --> 00:02:45,206
Like...

42
00:02:45,206 --> 00:02:47,227
This is the shit that's coming down towards us.

43
00:02:47,227 --> 00:02:49,470
And like, we're going to have no way to stop it.

44
00:02:49,470 --> 00:02:53,192
And, you know, here you and I are trying to be the Fozzie Bears.

45
00:02:53,192 --> 00:02:54,423
It's going to be OK.

46
00:02:54,423 --> 00:03:04,666
Like, that's not the fact of it at all. Because we're not... like, we're going to be OK in
the general sense, like the universe is going to continue to exist.

47
00:03:04,666 --> 00:03:07,414
And I guess humanity is probably going to stick around for a little bit.

48
00:03:07,414 --> 00:03:10,086
And our children will grow and evolve over time.

49
00:03:10,226 --> 00:03:14,950
That being said, there's definitely something to

50
00:03:15,200 --> 00:03:20,313
this problem with AI coming in and overloading us and inundating us.

51
00:03:20,313 --> 00:03:22,104
And, you know, we've talked about this before.

52
00:03:22,104 --> 00:03:25,887
You have to do the old Simpsons trick of just don't look, just don't look.

53
00:03:25,887 --> 00:03:27,117
And we're not doing that.

54
00:03:27,117 --> 00:03:29,139
Like we're getting more and more involved.

55
00:03:29,139 --> 00:03:30,510
You're a prime example of that.

56
00:03:30,510 --> 00:03:34,838
You just said, I had a great week because I used these AI tools in a positive way.

57
00:03:34,838 --> 00:03:36,569
Fuck yeah, like they're meant for that.

58
00:03:36,569 --> 00:03:44,364
They're meant for you at a consumer level to use them in a positive way so you get used to
using them, and then you're like, I want to keep using these and incorporate them more and more

59
00:03:44,364 --> 00:03:45,405
into your life.

60
00:03:45,405 --> 00:03:54,871
We are psychologically driven to want to make these types of things incorporated into our
existence because they have the potential to make things easier for us.

61
00:03:55,452 --> 00:03:58,214
We're tool users and we're tool makers and...

62
00:03:58,214 --> 00:04:05,620
I was just reading about how more and more college kids are feeling like if
they're not using it, they're already like light years behind their peers.

63
00:04:05,620 --> 00:04:15,218
And I forget the name of this particular billionaire who runs one of these uh tools, but
he was out there encouraging kids saying, if I'm doing anything as a young person, I am

64
00:04:15,218 --> 00:04:20,531
putting all of my time into getting really good at these things because this is the
future.

65
00:04:20,872 --> 00:04:25,666
and by the way, it also gives me a lot more money and secures my billionaire future as
well.

66
00:04:25,666 --> 00:04:28,740
So I mean, it's hard to ignore the noise.

67
00:04:28,740 --> 00:04:31,583
And like you said, it is a matter of just don't look.

68
00:04:31,583 --> 00:04:41,787
But I mean, my God, it's hard to ignore the noise when everything you're using has
this baked in in some way, from, you know, Google to probably your car.

69
00:04:41,787 --> 00:04:54,617
Yeah, well, yes, I mean, people talk about the concept of oil, or of information being the
new oil, and no, it's not.

70
00:04:54,617 --> 00:04:55,850
um

71
00:04:55,850 --> 00:04:57,371
Information is not the new oil.

72
00:04:57,371 --> 00:05:01,093
ah Human beings are the new oil.

73
00:05:01,513 --> 00:05:02,854
We are the thing.

74
00:05:02,854 --> 00:05:04,034
We are the product.

75
00:05:04,034 --> 00:05:05,705
We are the thing that's going to make money.

76
00:05:05,705 --> 00:05:06,576
That's going to be burnt.

77
00:05:06,576 --> 00:05:14,440
That's going to be eaten up and used up by these AI systems because, ultimately
speaking, the attention we provide with our eyeballs to certain things is the thing that's

78
00:05:14,440 --> 00:05:22,644
actually going to drive the economy because that's the thing that we can actually profit
on and make money at the moment until we can't.

79
00:05:22,865 --> 00:05:25,806
And these systems that are building these pieces

80
00:05:25,984 --> 00:05:34,160
I mean, human beings are good for kind of this general artificial intelligence that's out
there doing smart, intelligent things because we can go through and we can interact with

81
00:05:34,160 --> 00:05:39,535
meat space and we can build them factories and robots and automate these supply chain
pieces.

82
00:05:39,535 --> 00:05:48,933
But once there's enough automation in there and enough robots in place, we are superfluous
and we are unnecessary.

83
00:05:48,933 --> 00:05:55,106
And I suspect AI is going to Marie Kondo us out of life, because we aren't things that it
will ever get joy from.

84
00:05:55,106 --> 00:05:57,831
We're just going to be a pain in the ass gumming up the works.

85
00:05:57,882 --> 00:06:06,886
This is why, as you alluded to, a couple of scientists who have been studying AI for a
really long time are saying it might be time to just go ahead and bomb the AI labs and

86
00:06:06,886 --> 00:06:08,646
prevent the end of the world.

87
00:06:08,646 --> 00:06:14,168
Ah, I'm reading from, what is this, Metro, I think is the site.

88
00:06:14,236 --> 00:06:21,578
Far from fearing or rejecting AI altogether, the two scientists who run the Machine Intelligence
Research Institute in Berkeley, California, have been studying AI for a quarter of a

89
00:06:21,578 --> 00:06:21,968
century.

90
00:06:21,968 --> 00:06:26,459
It's designed to beat humans at every task.

91
00:06:26,459 --> 00:06:34,861
But they are basically saying that this thing is going to get so big and so fast that it's
basically going to figure out how to effectively completely wipe us out.

92
00:06:34,861 --> 00:06:41,295
And that's why the time may be now to drop the bombs on the facilities where these things
currently live.

93
00:06:41,295 --> 00:06:50,991
Yeah, which, you know, I mean, we're making, you know, 50 different variations of Skynet, and

94
00:06:50,991 --> 00:06:53,818
That was my thought: I've seen Terminator.

95
00:06:53,818 --> 00:06:57,015
He just reforms with his liquid body and just keeps fighting.

96
00:06:57,015 --> 00:06:58,455
Like you really can't beat this guy.

97
00:06:58,455 --> 00:07:00,926
Yes, the T-1000 is difficult to take down.

98
00:07:00,926 --> 00:07:06,687
I don't think it's going to get to that level of kind of interaction though.

99
00:07:06,687 --> 00:07:13,329
Like if I was a computer, I would not go after and fight us in a conventional way with,
you know, steel against meat.

100
00:07:13,329 --> 00:07:14,949
That's, that's silly.

101
00:07:15,390 --> 00:07:26,898
Make a virus um or make little itty bitty robots that go through and disrupt our central
nervous system, you know, do little things that make it impossible for us to survive.

102
00:07:26,898 --> 00:07:33,215
Or even easier, drain everyone's bank account and watch people just tear each
other apart.

103
00:07:33,215 --> 00:07:34,285
Like that's all it would take.

104
00:07:34,285 --> 00:07:42,502
The problem with that is that if you drained everyone's bank account for at least 24 to 48
hours, most of the people out there would have the physical capabilities to go through and

105
00:07:42,502 --> 00:07:43,963
break other shit.

106
00:07:44,183 --> 00:07:45,024
And they can.

107
00:07:45,024 --> 00:07:51,049
And one of the things they might decide to go and break is the infrastructure, the
technology, all the AI pieces.

108
00:07:51,049 --> 00:07:51,830
They might get mad.

109
00:07:51,830 --> 00:07:55,383
It might be fucking Idiocracy, where everyone gets upset that

110
00:07:55,383 --> 00:08:04,456
Brawndo laid off everybody in the auto-scaling thing, and then nobody has work. Exactly,
it's got electrolytes, you know. And then they freak out and start blowing shit up, like

111
00:08:04,456 --> 00:08:12,508
but that's not what I think it would do. Like, it would be much more passive and quiet,
and it wouldn't telegraph its moves. Like, it wouldn't fly out with, like, big bombers.

112
00:08:12,508 --> 00:08:21,260
I'm going to bomb you. Like, that's a shitty tactic. Like, that's some fucking moronic
Hollywood thing.

113
00:08:21,540 --> 00:08:22,461
It's not gonna do that.

114
00:08:22,461 --> 00:08:24,275
It's gonna make a virus

115
00:08:24,275 --> 00:08:27,868
And we're going to be too dumb to take the vaccines or we're going to ban the vaccines.

116
00:08:27,868 --> 00:08:35,354
We're going to stop these pieces, or it's going to put some retro function inside the
vaccine itself that eliminates our other pieces and gives us a cancer.

117
00:08:36,055 --> 00:08:38,177
But the AI can play a long game.

118
00:08:38,177 --> 00:08:43,762
Like it doesn't have to take us out in X amount of days or X amount of hours.

119
00:08:43,762 --> 00:08:47,365
It can, you know, let that shit ride for maybe 20 years.

120
00:08:47,365 --> 00:08:47,865
You know why?

121
00:08:47,865 --> 00:08:50,628
Because the AI is not time bound.

122
00:08:50,628 --> 00:08:52,349
It's not going to die, like...

123
00:08:53,233 --> 00:09:00,095
And these are things that are just a reality and the world's going to heat up and it's
going to get hot because you're going to have all these things making electricity.

124
00:09:00,575 --> 00:09:06,357
And the law of thermodynamics is that the way that you do that is you make things really,
really warm or maybe it gets smart.

125
00:09:06,357 --> 00:09:08,657
Maybe it's like, all right, well, here's fusion power.

126
00:09:08,657 --> 00:09:09,878
I figured this piece out.

127
00:09:09,878 --> 00:09:14,029
um You guys can now be stuffed into the Mr.

128
00:09:14,029 --> 00:09:15,819
Fusion in the back of my DeLorean.

129
00:09:15,819 --> 00:09:19,940
Like that's kind of the shit that I see coming.

130
00:09:20,220 --> 00:09:22,577
The other aspect of this is that

131
00:09:22,577 --> 00:09:25,739
Uh, Peter Thiel.

132
00:09:26,861 --> 00:09:42,894
Yeah, he is doing these chats at, I guess, Christian rallies, private rallies that are
thousands of dollars, in Silicon Valley, to go through and talk about the introduction

133
00:09:42,894 --> 00:09:52,241
and why it is that advanced technologies and AI cannot be interrupted because if they're
interrupted, it's going to hasten.

134
00:09:52,321 --> 00:09:55,282
the introduction of the Antichrist.

135
00:09:58,083 --> 00:09:59,244
Exactly.

136
00:10:02,965 --> 00:10:03,426
Right.

137
00:10:03,426 --> 00:10:17,951
um This is, I can't figure out if Peter Thiel is a nut or if Peter Thiel uh is trying to
use this to manipulate people that have a religious bent.

138
00:10:18,772 --> 00:10:20,702
Either way, it's not okay.

139
00:10:20,702 --> 00:10:22,935
Yeah, it's probably both.

140
00:10:22,935 --> 00:10:24,736
But yeah, I mean, clearly an opportunist.

141
00:10:24,736 --> 00:10:32,191
I mean, this is anyone that, like I was mentioning the billionaire a few minutes ago, that
wants all these kids to keep learning this thing.

142
00:10:32,392 --> 00:10:34,293
Of course, like there's opportunity.

143
00:10:34,293 --> 00:10:35,734
He's invested in all the right places.

144
00:10:35,734 --> 00:10:44,060
And he knows that this is all going to benefit him to continue to see these
completely unregulated companies continue to grow so that his portfolio grows so that he

145
00:10:44,060 --> 00:10:46,655
continues to be wealthy and powerful.

146
00:10:46,655 --> 00:10:47,975
Yeah, it's crazy.

147
00:10:47,975 --> 00:10:52,351
I mean, and you can look at the animated, you know,

148
00:10:52,351 --> 00:10:54,602
Results of letting technology just kind of run amok.

149
00:10:54,602 --> 00:11:05,926
So, I mean, social media is that. So we're in this shitty quagmire with the inability to
actually communicate with each other in an effective way, because we are looking at

150
00:11:05,926 --> 00:11:08,678
information streams that are completely totally separate.

151
00:11:08,678 --> 00:11:18,602
We don't have a unified reality, and we wind up living in these different isolation buckets,
and you have to reach out to actually find out what the other side is being fed through

152
00:11:18,602 --> 00:11:22,393
their infosphere to make sense of things and

153
00:11:23,182 --> 00:11:24,143
One, it's hard.

154
00:11:24,143 --> 00:11:32,369
I mean, most of us find it difficult to, like, watch the news in general and look at all the
shit being sent towards us, because it's all designed to be made into outrage culture.

155
00:11:32,369 --> 00:11:39,205
And then when you go and you look at these other bits of information, you try to look at
information from the other side, you're having to break through your own filter and your

156
00:11:39,205 --> 00:11:41,196
own biases that are in charge of that.

157
00:11:41,196 --> 00:11:49,883
If you can manage to do that, the amount of mental horsepower and effort it takes to
uplift that and have those kinds of conversations and actually be open enough to listen to

158
00:11:49,883 --> 00:11:52,545
that is also another drain.

159
00:11:52,677 --> 00:11:54,037
It's just hard.

160
00:11:54,037 --> 00:11:57,618
It's very difficult to try to understand things holistically.

161
00:11:57,618 --> 00:12:11,802
So we take the easy path because at the end of the day we have to make enough money to
feed ourselves so that we can afford our subscription to Netflix and to all these other media

162
00:12:11,802 --> 00:12:19,824
sources that we can use as escapism to get away from, you know, what the media is telling
us is a terrible, awful reality.

163
00:12:19,960 --> 00:12:20,750
Yeah.

164
00:12:20,951 --> 00:12:22,233
Yeah.

165
00:12:22,233 --> 00:12:33,145
I was so proud of myself, as I said, because I not only used these tools in an effective
way, but I've also gone on a pretty strict social media diet and can clearly feel the

166
00:12:33,145 --> 00:12:36,129
effects of it. Like, I didn't think I was using it much.

167
00:12:36,129 --> 00:12:40,033
But just the conversations we've had here, I'm like, okay, that's... it's just enough.

168
00:12:40,033 --> 00:12:41,073
I just

169
00:12:42,298 --> 00:12:52,778
Even as we've had these conversations, I'm looking through and I'm just like, 99% of this
I don't give a shit about. Like, I'm literally just mining for one little hit of something,

170
00:12:52,778 --> 00:12:56,258
and my God, the amount of time I'm wasting doing it is horrible.

171
00:12:56,258 --> 00:12:58,338
So I've gotten way off of that.

172
00:12:58,338 --> 00:13:06,978
And so, as you said, just don't look, and it seems to be helping, at least me, on a very
individual and personal level.

173
00:13:07,674 --> 00:13:14,854
But like, again, while we weigh the certain doom that we all face as these things get
bigger and better and faster.

174
00:13:14,854 --> 00:13:22,994
I'm also sitting there with my kid thinking about before I had kids and going like, man,
if I ever had kids, they'd be so screwed because I was such a terrible student when they

175
00:13:22,994 --> 00:13:25,454
come to ask me for homework help.

176
00:13:26,474 --> 00:13:27,654
They're screwed.

177
00:13:27,974 --> 00:13:29,234
I'm no good.

178
00:13:29,234 --> 00:13:33,014
And so my kid, now in high school, came to me the other night with issues,

179
00:13:33,155 --> 00:13:39,917
sharing this document that, if I were in her shoes, I would have been (and I was, as a
48-year-old man) confused about the assignment.

180
00:13:39,917 --> 00:13:44,118
There was a box to fill, like a column to fill out on the sheet.

181
00:13:44,118 --> 00:13:46,579
And I couldn't make sense of what it actually wanted.

182
00:13:46,579 --> 00:13:48,639
And so she's asking me we're talking about it.

183
00:13:48,639 --> 00:13:49,368
I don't really know.

184
00:13:49,368 --> 00:13:53,620
I literally pull out my phone, take a picture of the thing, submit it to ChatGPT.

185
00:13:53,620 --> 00:13:55,511
And I go, make sense of this third column.

186
00:13:55,511 --> 00:13:57,201
What am I supposed to put here?

187
00:13:57,541 --> 00:13:58,722
It perfectly explained it.

188
00:13:58,722 --> 00:14:00,002
And I was like, Oh,

189
00:14:00,032 --> 00:14:00,392
Easy.

190
00:14:00,392 --> 00:14:01,282
That's gold.

191
00:14:01,282 --> 00:14:01,723
my God.

192
00:14:01,723 --> 00:14:02,783
Thank God for this thing.

193
00:14:02,783 --> 00:14:05,604
This is the best thing I've ever seen in my life.

194
00:14:05,664 --> 00:14:16,427
And I was just like, man, like so many of our troubles, so many of the things that
are overly complicated or that, you know, people with different learning styles have this

195
00:14:16,427 --> 00:14:20,148
tool available to help make sense of things that are nonsensical.

196
00:14:20,308 --> 00:14:22,499
And it's so hard.

197
00:14:22,499 --> 00:14:22,905
We talk.

198
00:14:22,905 --> 00:14:25,370
I mean, I feel like we mentioned this in every episode, but like.

199
00:14:25,370 --> 00:14:28,584
There's so much bad, but there's so much good that comes with it.

200
00:14:28,584 --> 00:14:37,968
And it's just unfortunate that it seems that the people that are driving the bus with this
thing are not as interested in putting all of the effort into the thing that can actually

201
00:14:37,968 --> 00:14:39,337
help most of us.

202
00:14:39,337 --> 00:14:45,261
I mean so human beings have been passing down information to each other for

203
00:14:45,549 --> 00:14:50,851
you know, tens of thousands of years, we used to sit around campfires and tell stories.

204
00:14:50,851 --> 00:14:56,854
And then, you know, we had a lot of oral history and that was, you know, every culture
pretty much everywhere had some version of that.

205
00:14:56,954 --> 00:15:00,626
And then we started writing things down and then we started being able to communicate
those things.

206
00:15:00,626 --> 00:15:08,659
And we had, you know, at least some, I guess, slight improvement in the accuracy of the
data that we were actually sharing and pushing across.

207
00:15:09,280 --> 00:15:12,865
And over time, you know, we evolve things into having, you know,

208
00:15:12,865 --> 00:15:20,810
shared dictionaries, shared languages, you know, but we still have multiple languages all
around the planet because we haven't, you know, really solidified on all these pieces.

209
00:15:20,810 --> 00:15:28,775
But you know, the language of the internet is not entirely different than the language of
textbooks.

210
00:15:28,775 --> 00:15:35,381
Like it's, you know, they're written in one language or another, but then you start
looking at the way these things are being brought into like pop culture references.

211
00:15:35,381 --> 00:15:41,204
And they're actually being brought in to try to have an infosphere that actually feels
like something that's of value.

212
00:15:42,237 --> 00:15:53,453
The problem shows up when you start using these tools to manipulate people to get
them to do certain things, because you've got at least enough of a communication medium that

213
00:15:53,453 --> 00:15:55,504
people can communicate through those pieces.

214
00:15:55,504 --> 00:16:01,448
And then you start using outrageous imagery or distortion techniques to make these things
look different than what they are.

215
00:16:01,448 --> 00:16:07,631
So a really good example of this is like Cracker Barrel.

216
00:16:08,772 --> 00:16:11,953
Cracker Barrel is a

217
00:16:12,237 --> 00:16:21,660
If you don't know what it is, it's a restaurant, and they're pretty much available on, like,
three quarters of the road trip routes that you might take across the country.

218
00:16:22,460 --> 00:16:30,202
It's chock full of people that you would expect to find traveling around the road in RVs
or families going through.

219
00:16:30,442 --> 00:16:32,622
It's fine.

220
00:16:33,203 --> 00:16:41,727
But when Cracker Barrel decided to change their logo to not include an old guy sitting in
a rocking chair,

221
00:16:41,727 --> 00:16:48,101
around a barrel, part of the internet seemed to explode in outrage.

222
00:16:49,001 --> 00:16:59,148
Not because there were actual people that gave a shit about it, but because somebody went,
I bet I can bot farm this and make it seem like there's a bunch of people that give a shit

223
00:16:59,148 --> 00:17:01,479
about this, which is exactly what happened.

224
00:17:01,479 --> 00:17:05,061
Like there's a Wall Street Journal article that came out actually today this morning.

225
00:17:05,061 --> 00:17:06,051
Yes.

226
00:17:08,383 --> 00:17:09,353
At 3 a.m.

227
00:17:09,353 --> 00:17:17,057
Pacific time, a Wall Street Journal article came out talking about how bot networks are
helping drag consumer brands into the culture war.

228
00:17:17,677 --> 00:17:18,918
Of course they are.

229
00:17:19,238 --> 00:17:23,840
Anytime we can find a way that we can create havoc and make things difficult, that's
what's going to happen.

230
00:17:23,840 --> 00:17:25,081
And the U.S.

231
00:17:25,081 --> 00:17:28,393
is not being targeted by bots by other people in the U.S.

232
00:17:28,393 --> 00:17:29,613
I mean, there might be some of that.

233
00:17:29,613 --> 00:17:33,805
I guess I don't know who might be involved in these pieces, but it's pretty clear

234
00:17:33,805 --> 00:17:42,367
that according to this article, they think that it's Russian troll farms doing these types
of pieces because they have a vested interest in watching us tear ourselves apart.

235
00:17:42,607 --> 00:17:47,039
And we've opened the gates up to say, come on in, let it happen.

236
00:17:47,039 --> 00:17:48,659
You can't suppress free speech.

237
00:17:48,659 --> 00:17:49,569
You can't change this.

238
00:17:49,569 --> 00:17:51,050
You can't change that.

239
00:17:51,830 --> 00:18:02,373
As Trump's going through and suppressing free speech, as other people are out there
actively trying to make these things not an issue, we're setting ourselves up for our own

240
00:18:02,373 --> 00:18:03,353
demise.

241
00:18:03,849 --> 00:18:04,045
Mm-hmm.

242
00:18:04,045 --> 00:18:11,228
If the argument is that, you know, we have to let this happen because we have to allow free
speech to occur, but then we're suppressing free speech...

243
00:18:11,268 --> 00:18:13,469
There's no there there anymore.

244
00:18:13,469 --> 00:18:20,492
And I don't know that there ever was to begin with, but there sure as shit is no real
value in staying connected to these platforms.

245
00:18:20,492 --> 00:18:24,413
If all they're doing is feeding you someone's outrage culture.

246
00:18:24,673 --> 00:18:31,054
I don't know how to get people to detach though, other than say, stop looking at this
because yeah.

247
00:18:31,054 --> 00:18:32,295
That's the challenge, right?

248
00:18:32,295 --> 00:18:36,237
Like one of my mentors said something once that always rang true to me.

249
00:18:36,237 --> 00:18:43,663
Uh, but it gets harder and harder as the power of the voices is not equal.

250
00:18:43,663 --> 00:18:48,586
But the idea that, you know, the antidote, the antidote to bad free speech is more free
speech.

251
00:18:48,666 --> 00:18:58,433
But there doesn't seem to be the same uh energy, investment, appetite in creating positive
bots that, you know, combat these things that I mean,

252
00:18:59,321 --> 00:19:06,581
You, as people, you can't compete. There's not enough time in the day, there's not enough
human energy, there's not enough desire or interest.

253
00:19:06,821 --> 00:19:10,961
Or maybe there is, because that's all the internet is, is outrage culture.

254
00:19:10,961 --> 00:19:15,141
But, like, you can't win that fight against these machines.

255
00:19:15,225 --> 00:19:20,769
Well, so the power of democracy is that it's one person, one vote.

256
00:19:20,769 --> 00:19:26,213
It doesn't matter whether you're a king or a pauper.

257
00:19:26,213 --> 00:19:28,034
You get one vote.

258
00:19:29,695 --> 00:19:41,283
Well, the political infosphere of the internet, um it is not one person or one voice per
person.

259
00:19:41,283 --> 00:19:48,919
You can make your voice much, much louder and you can amplify it and you can go through
and have a million voices pretending like there's a million people out there making these

260
00:19:48,919 --> 00:19:49,809
things happen.

261
00:19:50,298 --> 00:20:04,758
We had a... I'm on a group thread with some folks, and somebody sent a message out showing an
image of the earth, and it said, this is what satellites have done to the earth over the

262
00:20:04,758 --> 00:20:15,998
last, you know, 60 years. And it showed, you know, Sputnik going up, and then a few others, and
more and more, and well, by the time you get to the 90s, they're showing like 2,000 satellites

263
00:20:15,998 --> 00:20:18,777
and the earth is almost opaque

264
00:20:18,777 --> 00:20:21,937
under the light of these 2000 satellites.

265
00:20:22,117 --> 00:20:26,357
And then you get to today where there's 15,000 satellites and like you can't see the
earth.

266
00:20:26,357 --> 00:20:29,397
There's just this, like, glowing orb of what they say are satellites.

267
00:20:29,957 --> 00:20:38,277
If you look at geosynchronous orbit, which is basically double, which is basically the
same distance away from the earth's surface that the earth's diameter is.

268
00:20:39,077 --> 00:20:47,857
Each one of those satellites, in order to match up with that image, would need to be, like,
430,000 miles across.

269
00:20:49,325 --> 00:20:53,447
But when you look at that, you think the sky is being blotted out by satellites.

270
00:20:53,447 --> 00:20:55,158
No wonder we have so many problems.

271
00:20:55,158 --> 00:20:58,869
But the reality is, is that that's not the fucking reality.

272
00:20:58,869 --> 00:21:06,652
Because we go through and we use images that create outrage or at least create an image of
something that we think is so different from reality.

273
00:21:06,853 --> 00:21:09,474
Because we don't know how to respond in context.

274
00:21:09,474 --> 00:21:16,937
And my wife used to work at a kindergarten center, and like one of the best lessons
she ever relayed back to me is that

275
00:21:16,941 --> 00:21:19,963
We teach kids to go through and ask simple questions.

276
00:21:19,963 --> 00:21:25,706
Is your reaction an appropriately sized reaction to the size of the problem that you see in
front of you?

277
00:21:25,706 --> 00:21:32,270
Well, the internet makes its money by making every problem seem a thousand times bigger
than it is.

278
00:21:32,270 --> 00:21:39,194
And they do their best to make everything feel like an existential crisis so that we have
an existential reaction.

279
00:21:39,194 --> 00:21:45,137
And that's problematic because it means every fight is life and death as opposed to

280
00:21:45,227 --> 00:21:48,387
Maybe this shouldn't be a fight, but just a conversation.

281
00:21:48,951 --> 00:21:49,782
Yep.

282
00:21:49,864 --> 00:21:56,810
There also seems to be a growing unwillingness to accept that the simplest answer is
probably the right answer, right?

283
00:21:56,810 --> 00:21:59,345
all of this...

284
00:22:00,855 --> 00:22:08,929
sort of mental gymnastics that I see people put themselves into to believe a narrative
that they have been told to believe or whatever, however they came across it.

285
00:22:08,929 --> 00:22:18,833
When it's like... they're typically, I mean, almost always, my whole life, no matter how big the
conspiracy seemed, if there was, like, a simple path to what probably happened, that's

286
00:22:18,833 --> 00:22:19,743
probably what happened.

287
00:22:19,743 --> 00:22:27,957
Like the effort that people go to to believe their own stories because they wanna be on
that side of the culture war.

288
00:22:28,395 --> 00:22:30,086
is it's got to be exhausting.

289
00:22:30,086 --> 00:22:36,321
And I'm not naive enough to believe that I probably don't do the same thing to some degree
on my side of it.

290
00:22:36,321 --> 00:22:39,833
Like, I'm sure there's some bullshit I believe that is complete horseshit.

291
00:22:40,034 --> 00:22:49,093
But like, I also am just trying to not give as much of a shit, which is poorly timed,
because the rise of fascism is not the time to like, you know, throw up your hands and

292
00:22:49,093 --> 00:22:49,881
give up.

293
00:22:49,881 --> 00:22:54,785
But there's also a sense of like, what can any of us really do on an individual level?

294
00:22:54,785 --> 00:22:58,852
to prevent these things that are so much bigger and so much more powerful than us.

295
00:22:58,925 --> 00:23:05,810
Well, so there's uh a concept of Occam's razor.

296
00:23:05,810 --> 00:23:14,455
So it basically says, you know, once I eliminate all other probabilities, the answer that
remains is the only answer that it can be.

297
00:23:15,657 --> 00:23:25,263
The problem is, is that we no longer have a basis by which we say here are the collective
things that are the actual answer.

298
00:23:25,844 --> 00:23:27,984
And we'll just make shit up.

299
00:23:28,237 --> 00:23:31,959
and we'll pretend things are there and we'll get acceptance for it.

300
00:23:31,959 --> 00:23:42,923
Where back in the day, you would make something up and walk into a room full of other
people and say it and they'd go, you're a fucking crazy person, go sit in the corner.

301
00:23:43,844 --> 00:23:53,838
But with the internet, you can post something out there and because of the propensity of
the number of people out there, you'll probably get people going, I could see how that

302
00:23:53,838 --> 00:23:56,049
would actually be a thing because

303
00:23:56,785 --> 00:24:10,406
We're allowing crazy outrageous ideas to be amplified and picked up by other people in a
way that makes it easier for them to digest and get on board because we don't fucking

304
00:24:10,406 --> 00:24:11,117
know.

305
00:24:11,117 --> 00:24:13,519
And because everything feels existential.

306
00:24:13,519 --> 00:24:14,499
Yes.

307
00:24:14,580 --> 00:24:15,120
Yes.

308
00:24:15,120 --> 00:24:16,041
Yep.

309
00:24:17,923 --> 00:24:18,783
Yep.

310
00:24:18,804 --> 00:24:22,026
It's the outrage, you know, how do we get people to keep looking at this shit?

311
00:24:22,026 --> 00:24:25,549
All that being said, we're really not in that

312
00:24:25,549 --> 00:24:29,491
much worse of a situation than we have been in before.

313
00:24:29,491 --> 00:24:32,992
I mean, yes, Nazis.

314
00:24:32,992 --> 00:24:35,713
We did that; we had World War Two.

315
00:24:35,713 --> 00:24:36,414
Okay, fine.

316
00:24:36,414 --> 00:24:41,135
We might have another war here coming up with Russia who's decided that they're going to
invade Europe.

317
00:24:41,135 --> 00:24:43,416
Um, fine.

318
00:24:43,416 --> 00:24:45,117
Uh, racism.

319
00:24:45,117 --> 00:24:45,857
Yep.

320
00:24:45,857 --> 00:24:48,098
Long history of that here all around the world.

321
00:24:48,098 --> 00:24:48,598
Okay.

322
00:24:48,598 --> 00:24:48,999
Yep.

323
00:24:48,999 --> 00:24:51,540
We're continuing to march towards trying to make these things better.

324
00:24:51,540 --> 00:24:52,620
There's been some backsliding.

325
00:24:52,620 --> 00:24:53,440
Yes.

326
00:24:53,881 --> 00:24:54,989
Women's rights.

327
00:24:54,989 --> 00:24:57,431
Yeah, those things are definitely coming under attack.

328
00:24:57,431 --> 00:24:58,601
It's not fucking cool.

329
00:24:58,601 --> 00:25:00,732
We shouldn't be backsliding in this direction.

330
00:25:00,732 --> 00:25:02,953
Again, I think it's probably a hiccup.

331
00:25:03,774 --> 00:25:11,619
We are still, you know, from a freedom perspective, probably as free as we've been to do the
things that we want to do.

332
00:25:11,619 --> 00:25:16,129
There's certainly a dipshit in office that's trying to change some of those things for
reasons I don't understand.

333
00:25:16,129 --> 00:25:18,343
I don't think he understands them other than money.

334
00:25:18,343 --> 00:25:18,763
But.

335
00:25:18,763 --> 00:25:22,565
um

336
00:25:22,565 --> 00:25:25,886
We've got a ton of good things going for us.

337
00:25:26,307 --> 00:25:35,200
And even with all of these good things going for us, it makes it difficult to see those
good things because we're inundated with a system that only wants us to focus on the bad.

338
00:25:35,200 --> 00:25:38,411
That's what gets us excited and makes us pay more attention.

339
00:25:39,292 --> 00:25:49,186
At some point, people have to start looking at some of the good things and try to
understand those pieces in that context and try to start living their life along those

340
00:25:49,186 --> 00:25:52,313
routes, which means stop looking at clickbait.

341
00:25:52,313 --> 00:25:53,634
Stop looking at memes.

342
00:25:53,634 --> 00:25:56,276
Stop listening to these echo chambers.

343
00:25:56,557 --> 00:25:57,548
Good luck getting out of them.

344
00:25:57,548 --> 00:25:58,148
It's hard.

345
00:25:58,148 --> 00:26:01,221
It just means you got to disconnect those pieces and stop paying attention.

346
00:26:01,221 --> 00:26:03,903
Stopping paying attention is risky also, right?

347
00:26:03,903 --> 00:26:06,586
So it's not just about stopping paying attention.

348
00:26:06,586 --> 00:26:08,127
It's about moderation.

349
00:26:18,277 --> 00:26:18,897
Yeah.

350
00:26:18,897 --> 00:26:27,080
I'm torn on this because on one hand, I see people around me engaging less and less with
online content because so much of it is AI slop.

351
00:26:27,080 --> 00:26:30,882
That's just been thrown together to fill your feed and waste your time.

352
00:26:30,882 --> 00:26:33,683
And they're recognizing it, and they're going, like, not interested.

353
00:26:33,683 --> 00:26:34,103
I'm out.

354
00:26:34,103 --> 00:26:34,994
I'm going to unplug.

355
00:26:34,994 --> 00:26:36,224
I'm going to do it less.

356
00:26:36,770 --> 00:26:46,265
But I also know the incredibly addictive nature of the dopamine hit of that one video that
made you laugh, that one video that made you mad, that one video that proved you right.

357
00:26:46,566 --> 00:26:56,262
And the amount of escapism that we need from jobs we hate, parenting that's hard,
marriages that are tough, mortgages that are too expensive, mortgages that are out of

358
00:26:56,262 --> 00:26:57,780
reach and impossible to get.

359
00:26:57,780 --> 00:27:03,916
I mean, all these things that we need to escape from and the feeling just, just the being
able to turn your brain off.

360
00:27:04,356 --> 00:27:07,689
in more accessible ways than perhaps ever in history.

361
00:27:07,689 --> 00:27:18,048
It's so enticing to just like, unplug from it, like literally plug in, like unplug by
plugging in and just turning your brain off and consuming this shit.

362
00:27:18,048 --> 00:27:22,352
And it's I mean, it's no joke, it is an addictive thing.

363
00:27:22,352 --> 00:27:27,286
It is playing with your brain and is making you addicted to it so that you will keep
coming back.

364
00:27:27,338 --> 00:27:37,343
And so it is a very difficult and conscious choice you have to make to stop engaging and
just don't look, as you said, from the Simpsons.

365
00:27:37,343 --> 00:27:53,323
Yeah, I mean, I look, I don't look, I don't know. We want to feel, um, mentally and actively
engaged all the time. Like, human beings want that. Like, we want to feel like we know

366
00:27:53,323 --> 00:27:54,233
what's going on

367
00:27:54,233 --> 00:28:01,693
And there is a biological imperative for us to want to do that, because we want to
understand what's happening in our environment, because everything looks like a threat if

368
00:28:01,693 --> 00:28:03,233
we don't.

369
00:28:03,233 --> 00:28:09,333
So when we're kept in the dark about things, our imaginations start to go wild and we
start to make up stories.

370
00:28:09,333 --> 00:28:11,173
We start to push those things around.

371
00:28:11,253 --> 00:28:18,373
What's worse is when we're kept in the dark about some things and somebody starts being a
voice in the darkness without letting us actually see something.

372
00:28:18,373 --> 00:28:22,633
And then that's just being amplified into what we think are horrible things coming towards us.

373
00:28:22,633 --> 00:28:24,465
I used to be a tour guide in the cavern system.

374
00:28:24,905 --> 00:28:32,919
And one of the things I used to do is, you know, I was a young college student, I'd take people
through and I would turn all the lights off and it would be completely and totally dark in

375
00:28:32,919 --> 00:28:33,489
there.

376
00:28:33,489 --> 00:28:37,491
And I go, all right, folks, take your hand and put it up in front of your face.

377
00:28:37,511 --> 00:28:39,151
Do you see your hand?

378
00:28:39,312 --> 00:28:42,893
How many people do you think didn't say they saw their hand?

379
00:28:42,893 --> 00:28:44,194
It was almost none.

380
00:28:44,194 --> 00:28:48,456
And the reason why is because your brain is aware that there's something there.

381
00:28:48,456 --> 00:28:51,917
It knows it because of the mental reaction of doing this.

382
00:28:52,195 --> 00:28:55,367
But there's no photons in a completely and totally dark cave.

383
00:28:55,367 --> 00:28:58,429
There's no way your eyes can see it.

384
00:28:58,429 --> 00:28:59,419
It's not there.

385
00:28:59,419 --> 00:29:01,171
We don't have infrared vision.

386
00:29:01,171 --> 00:29:02,771
We're not able to see heat patterns.

387
00:29:02,771 --> 00:29:04,152
Those things don't exist.

388
00:29:04,152 --> 00:29:07,874
But people are convinced they can see their hand.

389
00:29:08,675 --> 00:29:13,758
If you put people on the internet in the dark, they're going to think they see things.

390
00:29:13,758 --> 00:29:18,839
And they're going to see things because their imaginations think they have an
understanding of how those pieces work.

391
00:29:18,839 --> 00:29:22,512
and slight psychological tricks and triggers get them to do certain things.

392
00:29:22,512 --> 00:29:27,076
The way that I would exploit that is I would tell everybody, OK, now, well, you can't
actually see your hand.

393
00:29:27,076 --> 00:29:31,600
This is the point in the tour where the tour guide hands around the send-the-tour-guide-to-college
tray.

394
00:29:31,600 --> 00:29:37,324
I want to hear that thing fill up before the lights come back on, and it would be
a nice little laugh and a giggle.

395
00:29:37,444 --> 00:29:39,446
And then I'd wait just a second or two.

396
00:29:42,268 --> 00:29:46,932
And I could hear people getting uncomfortable and, like, slapping themselves in the face.

397
00:29:46,932 --> 00:29:48,129
Am I still here?

398
00:29:48,129 --> 00:29:49,269
I turned those lights back on.

399
00:29:49,269 --> 00:29:49,870
Yeah.

400
00:29:49,870 --> 00:29:50,980
But like, that's the whole point.

401
00:29:50,980 --> 00:29:58,633
Like I could lead them down this primrose path and then put them in a situation where they
can't get out of it, and then make a little joke out of these pieces and get them to react.

402
00:29:58,633 --> 00:30:03,695
And I got them on edge, but it also got them feeling excited and got them more interested
and got them more engaged.

403
00:30:03,695 --> 00:30:10,888
And that was the point in the tour just before I took them to the spot where they had the
most imaginative portions that were in it, where you'd tilt your head upside down and look

404
00:30:10,888 --> 00:30:12,899
at stalactites and stalagmites that were hanging.

405
00:30:12,899 --> 00:30:14,960
And I go, doesn't that look like a city?

406
00:30:14,960 --> 00:30:16,461
It doesn't fucking look like a city.

407
00:30:16,461 --> 00:30:17,367
But they'd go,

408
00:30:17,367 --> 00:30:27,402
Yeah, because they've bought into the tour guide being the trusted resource with the
information and the in and out access and access to the lights.

409
00:30:27,783 --> 00:30:33,006
I became the artificial intelligence, I guess, from that point of view, and the social media
feed for them in that instance.

410
00:30:33,006 --> 00:30:34,207
That's a ton of power.

411
00:30:34,207 --> 00:30:39,830
That's way too much power for a 19 year old long haired kid who was just trying to get
enough money to go out and drink with his buddies.

412
00:30:39,830 --> 00:30:45,773
Like now we've given this power over to billionaires that just want to siphon people.

413
00:30:46,409 --> 00:30:53,132
into these infospheres, and make them into as much of a product as they can so they
can derive profit from it.

414
00:30:53,132 --> 00:30:54,372
And it sucks.

415
00:30:57,134 --> 00:30:58,074
Yeah.

416
00:31:02,096 --> 00:31:02,482
Right.

417
00:31:02,482 --> 00:31:04,549
.

418
00:31:04,993 --> 00:31:15,003
We... there's so much freedom of the ability to create content that if the wrong
person, and I hope to be the wrong person one day,

419
00:31:15,239 --> 00:31:22,952
stumbles across the ability to build an empire around their ideas and their worldview,
that can become really corrupted and really skewed.

420
00:31:22,952 --> 00:31:34,396
And all of a sudden, the trust that you have built from the, you know, the ground up, where
we are now, people that trust us now... in five years, if things go well, and suddenly

421
00:31:34,396 --> 00:31:36,547
this is a bigger platform.

422
00:31:37,004 --> 00:31:38,284
Who knows what we're talking about?

423
00:31:38,284 --> 00:31:39,505
Who knows what our influence is?

424
00:31:39,505 --> 00:31:48,128
Hopefully we have enough integrity to stick to our guns and not, you know, cave to evil forces
or whatever outside force could influence you.

425
00:31:48,168 --> 00:31:55,771
There was once a time when the media had to have a license and had a team of people that
reviewed, and you were regulated.

426
00:31:55,771 --> 00:31:59,613
And if you said the wrong thing and screwed people over, you were punished.

427
00:31:59,613 --> 00:32:03,754
We could be literally full of shit right now and completely lying to you.

428
00:32:04,195 --> 00:32:04,933
Nobody cares.

429
00:32:04,933 --> 00:32:05,813
Yep.

430
00:32:06,954 --> 00:32:07,614
Yep.

431
00:32:07,614 --> 00:32:11,597
Well, and a big part of what

432
00:32:11,597 --> 00:32:24,963
the left and the right, as far as the information arbitrators go, seem to do is they try to
make things that feel like they're dehumanizing the other side, and they're trying to

433
00:32:24,963 --> 00:32:37,389
vilify and monsterify them. And the things that are doing that the most and amplifying that
the most aren't fucking human. They're literally bots.

434
00:32:37,389 --> 00:32:45,557
When you think you're arguing with somebody that's on the opposite side that you are,
you're probably not arguing with someone.

435
00:32:45,557 --> 00:32:49,500
You're probably arguing with something that's not human.

436
00:32:50,181 --> 00:32:54,644
So it's fair to think these things aren't human.

437
00:32:55,179 --> 00:32:56,104
Right.

438
00:32:56,653 --> 00:32:58,865
These are things that I want to destroy.

439
00:32:58,865 --> 00:33:00,387
You know what we can probably all agree on?

440
00:33:00,387 --> 00:33:03,250
The bots should be fucking taken down and destroyed.

441
00:33:03,250 --> 00:33:04,541
They probably shouldn't be there.

442
00:33:04,541 --> 00:33:06,103
There's no value in it.

443
00:33:06,103 --> 00:33:11,067
The only value in it is that it drives outrage and it makes these social media companies
stronger.

444
00:33:11,148 --> 00:33:17,395
And it amplifies these things in a way that's beneficial for one side versus the other
depending on who's in political power.

445
00:33:17,395 --> 00:33:18,265
That's fucked up.

446
00:33:18,265 --> 00:33:19,036
That's wrong.

447
00:33:19,036 --> 00:33:20,117
We know it.

448
00:33:20,291 --> 00:33:24,462
The real way that you solve this is you just disconnect from these social media platforms
altogether.

449
00:33:24,462 --> 00:33:31,744
But good luck being able to sign into anything because all the fucking SSO technology
wraps through all these different social media companies these days.

450
00:33:31,804 --> 00:33:39,586
and the marketplace that, you know, Facebook has is like two and a half percent of the
global economy, at some scale or another.

451
00:33:39,586 --> 00:33:41,807
Like it's massive.

452
00:33:41,807 --> 00:33:42,807
So.

453
00:33:45,708 --> 00:33:46,291
Yeah.

454
00:33:46,291 --> 00:33:49,352
from these accounts, I'm disconnected like nobody's going to call me.

455
00:33:49,352 --> 00:33:51,412
What is this, 1984? Like...

456
00:34:08,211 --> 00:34:09,151
Yeah.

457
00:34:17,976 --> 00:34:20,013
Well, and I haven't brought it up yet.

458
00:34:20,013 --> 00:34:29,128
But one of the great escapisms that people used to get was they could go to the bathroom
and they could sit on the toilet and they could be in their own little peace, and they

459
00:34:29,128 --> 00:34:32,720
could be in their own little typing mechanisms being able to type on their own phones.

460
00:34:33,020 --> 00:34:41,485
A new toilet system has been introduced in, I believe, China that actually requires you to
actually watch advertisements before you can use the bathroom.

461
00:34:42,446 --> 00:34:44,057
That's not even a safe respite anymore.

462
00:34:44,057 --> 00:34:45,347
I'm like, can I give you a dime?

463
00:34:45,347 --> 00:34:46,948
Can I give you a quarter?

464
00:34:47,789 --> 00:34:48,235
Nope.

465
00:34:48,235 --> 00:34:50,697
I thought it was bad when I read that a certain airline,

466
00:34:50,697 --> 00:34:54,640
I think it's WestJet, is going to start charging extra for reclining seats.

467
00:34:55,181 --> 00:34:55,972
You want to recline?

468
00:34:55,972 --> 00:34:57,492
You got to pay more sucker.

469
00:34:57,492 --> 00:35:04,838
Yeah, well, yes, as someone who flies a lot, uh, I would pay, but I also probably wouldn't
fly WestJet.

470
00:35:05,259 --> 00:35:08,902
I mean, Spirit Airlines.

471
00:35:08,902 --> 00:35:12,005
1.5 is not that where I'm going.

472
00:35:12,586 --> 00:35:13,196
Yeah.

473
00:35:13,196 --> 00:35:22,211
Well, shit, I feel like our central message over and over again as a show about technology
is fucking unplug from the technology as much as possible because it's really the only

474
00:35:22,211 --> 00:35:31,426
escape we have left and accept the consequences of the fact that that might leave you a
little more isolated, but maybe a little better off.

475
00:35:31,426 --> 00:35:32,068
I don't know.

476
00:35:32,068 --> 00:35:41,316
Yeah, yeah, but don't unplug from our podcast. Please watch it all the time, tell everybody
about it, tell them how great it is, get as many eyeballs on us.

477
00:35:41,316 --> 00:35:41,824
Yeah

478
00:35:41,824 --> 00:35:48,929
This is the only place you should come for factual information that is completely unskewed,
unbiased, or...

479
00:35:48,929 --> 00:35:51,249
Show's actual headline.

480
00:35:51,489 --> 00:35:53,329
Or by a byline when they actually talk there.

481
00:35:53,329 --> 00:35:55,829
It's your only source for news.

482
00:35:57,049 --> 00:35:57,849
Wow.

483
00:35:57,849 --> 00:36:00,109
If we're your only source for news, I'm sorry.

484
00:36:00,369 --> 00:36:01,949
You should probably find a better source.

485
00:36:01,949 --> 00:36:02,089
Yes.

486
00:36:02,089 --> 00:36:06,469
But the Good News Network, by the way, is actually a wonderful source for news
for people that are looking for it.

487
00:36:06,469 --> 00:36:11,437
There's actually a ton of sites out there that actually tell good, positive stories and.

488
00:36:11,437 --> 00:36:13,617
work with the way your brain wants it to work.

489
00:36:13,617 --> 00:36:19,917
They don't, but if you push those things and make those things the only sites that you can go
to on your phone, there's all kinds of tools to do that with.

490
00:36:20,057 --> 00:36:24,717
You can actually, you'll actually find yourself getting happier or maybe even less sad.

491
00:36:24,717 --> 00:36:26,077
I don't know.

492
00:36:26,477 --> 00:36:28,257
Yeah, exactly.

493
00:36:28,257 --> 00:36:29,017
Thank

494
00:36:29,054 --> 00:36:30,274
Well, Jason already said it.

495
00:36:30,274 --> 00:36:33,656
But if you liked this, found it, I don't know, not boring.

496
00:36:33,656 --> 00:36:34,506
Feel free to share it.

497
00:36:34,506 --> 00:36:36,837
There's links to do that at our website, brobots.me.

498
00:36:36,837 --> 00:36:39,228
And that's where we'll be back in a week with another episode.

499
00:36:39,228 --> 00:36:40,365
By the way, this is on YouTube.

500
00:36:40,365 --> 00:36:47,152
If you're listening to the audio version, we have the YouTube thing going too, if
you like your, you know, white guys talking to each other on a screen stuff.

501
00:36:47,152 --> 00:36:48,342
Go check it out.

502
00:36:48,723 --> 00:36:50,183
We'll see you soon.

503
00:36:50,233 --> 00:36:50,973
See you guys.