How You Can Stop Radicalization Before It Starts

Why do complete strangers online make you angrier than people you actually know?
We're living in algorithmic echo chambers where billion-dollar tech companies profit from our rage while we argue about shit that doesn't matter. Meanwhile, Charlie Kirk's death spawned government censorship, proving we've lost the ability to tell minor grievances from existential crises.
In this episode, learn how to escape the digital outrage machine that's hijacking your mental health. Discover why real-world conversations feel so different from online battles. Get practical tools to recognize bot manipulation and algorithmic targeting before they radicalize your thinking.
Topics Discussed
- Charlie Kirk's assassination and the different realities people experienced based on their algorithmic feeds
- Government censorship threats against Jimmy Kimmel and other TV commentators following the shooting
- The bot epidemic: Over half of internet traffic isn't human, and you're arguing with machines
- Tech company manipulation: How the top 10 companies (worth trillions) profit from keeping you divided and angry
- Algorithm-driven radicalization: The pathway from loneliness to extremist action through social media rabbit holes
- Media consolidation myths: Why blaming "mainstream media" misses the real culprits making billions from outrage
- The kindergarten playground effect: How we treat minor disagreements as cosmic battles between good and evil
- Real vs. digital relationships: Why face-to-face conversations with political opposites feel completely different
- State actor interference: Foreign governments amplifying American outrage through fake accounts and bot networks
- Practical escape strategies: How to tune algorithms away from rage-bait and toward content that doesn't make you crazy
----
MORE FROM BROBOTS:
Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok
Subscribe to BROBOTS on YouTube
Join our community in the BROBOTS Facebook group
----
LINKS TO OUR PARTNERS:
-
Explore the many benefits of cold therapy for your body with Nurecover
-
Muse's Brain Sensing Headbands Improve Your Meditation Practice.
-
Get a Free One Year Supply of AG1 Vitamin D3+K2, 5 Travel Packs
-
You Need a Budget helps you quickly get out of debt, and save money faster!
1
00:00:00,001 --> 00:00:06,044
we're all arguing over whether or not Charlie, Charlie Kirk was a good guy, or whether or
not Jimmy Kimmel should have been pulled off the air.
2
00:00:06,044 --> 00:00:10,787
We're not arguing about the fact that there are 10 companies running everything and
fucking over the rest of us.
3
00:00:10,787 --> 00:00:13,840
But the power and the control is what keeps
4
00:00:13,840 --> 00:00:24,606
the money coming in because if all of us actually banded together and rose up against the
very few billionaires that are causing all of this or at least making it possible for us
5
00:00:24,606 --> 00:00:28,988
to live this way, it would be a quick turnaround, right?
6
00:00:28,988 --> 00:00:37,416
Like if we realized who the real bad guy is here, it would be a very different world
rather than the one where we're sitting here arguing about whose idea about, you know, Charlie
7
00:00:37,416 --> 00:00:39,028
Kirk or Jimmy Kimmel is right.
8
00:00:44,279 --> 00:00:52,279
this is BroBots, the podcast that tries to help you use AI and technology to safely
manage your mental health, your well-being, all the things that we're trying to manage on a
9
00:00:52,279 --> 00:00:54,299
day-to-day basis.
10
00:00:54,719 --> 00:00:57,319
And this is gonna be a tough conversation.
11
00:00:57,319 --> 00:01:03,007
This is gonna be hard because there's some lines that we want to walk here to make
sure that we don't
12
00:01:04,835 --> 00:01:13,601
mischaracterize any part of this, but obviously we're talking about what everybody's
talking about: the death of Charlie Kirk and the blowback that has followed since then.
13
00:01:13,601 --> 00:01:21,417
It goes without saying, I almost didn't even want to say this, but it goes without saying
that this is a horrible tragedy. Nobody should ever stand for this. This should not happen
14
00:01:21,417 --> 00:01:32,494
to anybody. I don't care what kind of a piece of shit you are, you shouldn't be gunned down
for your ideas. There, we've done the disclaimer, now we can have the conversation. So this is
15
00:01:32,494 --> 00:01:33,895
tough because
16
00:01:34,247 --> 00:01:40,172
I completely disagree with pretty much everything I ever heard fall out of his mouth.
17
00:01:40,172 --> 00:01:49,039
The only thing I respect about Charlie Kirk is the way that he took on his issues face to
face and took his debate out in public and challenged people's ideas.
18
00:01:49,180 --> 00:01:50,301
I think we need more of that.
19
00:01:50,301 --> 00:01:57,817
You even made a statement on the show, Jason, recently about like, you know, maybe more
and more people need to be afraid of being punched in the mouth for saying something
20
00:01:57,817 --> 00:01:58,287
stupid, right?
21
00:01:58,287 --> 00:02:02,893
Like, because we all hide behind our keyboards and we all get into these wars online.
22
00:02:02,893 --> 00:02:05,946
Yeah, we're super, exactly.
23
00:02:05,946 --> 00:02:09,088
And we're super brave and we're at a distance and people can't touch us.
24
00:02:09,088 --> 00:02:17,447
But when we're up close and personal, we tend to back away from our opinions
or we tend to be less vitriolic in what we say.
25
00:02:17,447 --> 00:02:18,247
And
26
00:02:18,609 --> 00:02:21,521
Yeah, I actually do think that's the line.
27
00:02:21,521 --> 00:02:30,866
Like being punched in the face, that's a level of violence that, you know, is not super
acceptable, but...
28
00:02:32,847 --> 00:02:36,579
Right, like, yeah, the founding fathers did that, right?
29
00:02:36,579 --> 00:02:42,563
Like, they would get into fistfights, and they'd get into arguments, and they'd go back
and forth, but they wouldn't fucking shoot each other.
30
00:02:42,563 --> 00:02:46,635
Like, when they would get to the point where we have irreconcilable differences,
31
00:02:46,659 --> 00:02:52,623
they wouldn't stand at a distance and assassinate each other, they'd get pistols and do a
fucking duel.
32
00:02:52,864 --> 00:02:55,706
Like, and it was straight up.
33
00:02:56,908 --> 00:03:01,952
No, right, it wasn't a cowardly act from a distance.
34
00:03:01,952 --> 00:03:14,281
I mean, sniping somebody from 200 yards away, at an elevated position on a college campus,
that's cowardice.
35
00:03:14,562 --> 00:03:16,145
You know, I mean, straight up.
36
00:03:16,145 --> 00:03:17,438
And like, that's not okay.
37
00:03:17,438 --> 00:03:20,850
And we have to look, yeah, so good.
38
00:03:20,850 --> 00:03:23,110
I was saying, the reaction to it as well, right?
39
00:03:23,110 --> 00:03:27,130
We're now a few days into the wake of Jimmy Kimmel being pulled off the air.
40
00:03:27,130 --> 00:03:34,590
He's one of a number of TV commentators who have now been axed because they said something that
made, you know, Mr.
41
00:03:34,590 --> 00:03:38,070
grumpy pants unhappy and he wanted something done about it.
42
00:03:38,070 --> 00:03:39,038
So
43
00:03:40,538 --> 00:03:44,399
This just continues to spread in awful ways.
44
00:03:44,399 --> 00:03:50,041
And so much of it comes back to things that you and I have been saying here for months,
things I've been saying on this show for years.
45
00:03:50,381 --> 00:03:52,852
And there's a couple of instances I want to share.
46
00:03:52,852 --> 00:03:55,062
And one, actually, ironically, is from Facebook.
47
00:03:55,062 --> 00:04:00,014
And it is somebody who shared a conversation that someone else shared, and they just
reshared it.
48
00:04:00,014 --> 00:04:05,305
And it so brilliantly highlighted what I think is a huge root of this problem.
49
00:04:05,306 --> 00:04:07,426
And it basically describes
50
00:04:07,431 --> 00:04:11,211
two different points of view about who Charlie Kirk was, based on people's algorithms.
51
00:04:11,211 --> 00:04:13,151
And so I'm just going to read a little bit of this.
52
00:04:13,151 --> 00:04:14,771
Stop me when it gets boring.
53
00:04:15,291 --> 00:04:22,031
But it says, one thing that has become really clear since yesterday is that we live in at
least two different realities.
54
00:04:22,031 --> 00:04:30,291
Talking to a friend who only knew Charlie as a Christian motivational speaker, because
that's all that ever came across her feed, they showed me videos I'd never seen of him
55
00:04:30,291 --> 00:04:32,951
saying perfectly reasonable and empowering things.
56
00:04:32,951 --> 00:04:34,692
I showed her videos.
57
00:04:34,692 --> 00:04:40,946
She'd never seen before of his racism, misogyny, homophobia, advocating for violence
against specific groups of people.
58
00:04:40,946 --> 00:04:48,490
She was horrified by his remarks about Nancy Pelosi's husband's attacker being bailed out
and celebrated for his violent act.
59
00:04:48,490 --> 00:04:53,614
She was horrified by the number of things he'd said, but she had never seen them or heard
them before.
60
00:04:53,614 --> 00:05:01,388
The same as I had never heard or seen the generalized clips of him sounding like a
personally nice, like a perfectly nice, loving man and father.
61
00:05:01,838 --> 00:05:03,769
Neither of us had the whole picture of this man.
62
00:05:03,769 --> 00:05:07,541
I mentioned he was a known white supremacist and she thought I was joking.
63
00:05:07,541 --> 00:05:13,003
She talked about him giving a speech about finding your purpose and doing good in the
world and I thought she was joking.
64
00:05:13,304 --> 00:05:17,225
I saw why this friend was mourning the loss of a person she thought was a good person.
65
00:05:17,225 --> 00:05:20,827
My friend bless her saw why I feel the way I do about him.
66
00:05:20,827 --> 00:05:27,706
We understood each other better in spite of a billion dollar internet machine specifically
focused on keeping us apart because we talked to each other.
67
00:05:27,706 --> 00:05:33,732
with the desire to listen and learn rather than the desire to change someone else's mind
or be right.
68
00:05:33,732 --> 00:05:34,692
I mean, that's it.
69
00:05:34,692 --> 00:05:36,899
That's it in a nutshell, right?
70
00:05:36,899 --> 00:05:42,421
The very salient point in that is that Charlie Kirk was a people.
71
00:05:42,902 --> 00:05:50,725
And people as individuals, persons, are not entirely good and are not entirely bad.
72
00:05:50,725 --> 00:05:53,246
You can find fault in Mother Teresa.
73
00:05:53,246 --> 00:05:55,787
You can find fault all over the place.
74
00:05:55,787 --> 00:05:59,408
Like, we're not perfect.
75
00:05:59,869 --> 00:06:02,409
And thinking that we are is foolish.
76
00:06:02,730 --> 00:06:05,070
But we're also not entirely evil.
77
00:06:05,171 --> 00:06:06,031
Like,
78
00:06:06,431 --> 00:06:09,053
and unworthy of sympathy and empathy.
79
00:06:10,894 --> 00:06:22,362
There is so much uh to unpack in that statement because the way that most of us consume
information these days is via a few media sources.
80
00:06:23,042 --> 00:06:25,204
Those media sources are being consolidated.
81
00:06:25,204 --> 00:06:32,569
Most of them are being consolidated towards the right right now, and they've gone
through their own path of being consolidated by the left.
82
00:06:32,569 --> 00:06:36,231
And really what they're being consolidated to is people with money.
83
00:06:36,323 --> 00:06:41,093
And we have to stop thinking about media in terms of political left and political right.
84
00:06:41,093 --> 00:06:45,416
We just have to start thinking about media in terms of the people that are motivated to make money off of
it, which is what they're doing.
85
00:06:45,416 --> 00:06:47,406
Like they're not doing it for the public good.
86
00:06:47,406 --> 00:06:56,569
And the FCC coming out and saying that we're going to revoke ABC's licenses if they don't
do something about Jimmy Kimmel is government censorship.
87
00:06:56,569 --> 00:06:58,649
This is not cancel culture.
88
00:06:58,749 --> 00:07:00,900
This is government censorship.
89
00:07:00,900 --> 00:07:03,481
This is against the Constitution.
90
00:07:03,481 --> 00:07:04,774
It should be against the law.
91
00:07:04,774 --> 00:07:07,438
This is more than internet outrage about somebody who said something stupid.
92
00:07:07,438 --> 00:07:14,420
This is somebody who made a joke that made, what do they call him, Tangerine Palpatine,
very upset.
93
00:07:14,420 --> 00:07:16,512
And so they had to do something about it.
94
00:07:16,721 --> 00:07:18,883
Right, and it's...
95
00:07:18,888 --> 00:07:19,750
Look.
96
00:07:21,509 --> 00:07:32,106
The fact that the right is getting upset about this uh at a level where they're like
cancel everybody, cancel everybody, fuck this.
97
00:07:32,106 --> 00:07:38,150
you don't, if you say anything negative against people that we like and love, we're going
to cancel you.
98
00:07:38,150 --> 00:07:41,832
This is exactly what they were complaining that the left was doing to people.
99
00:07:42,233 --> 00:07:46,003
So this blade cuts both ways. And
100
00:07:46,003 --> 00:07:51,187
there are definitely people on the right that are standing up and saying, no, no, no, this
is not okay.
101
00:07:51,187 --> 00:07:53,648
Like Fox News, people on Fox News are saying this.
102
00:07:53,648 --> 00:07:59,652
Like cancel culture should not be a thing because the next party that comes into power
could cancel us.
103
00:07:59,733 --> 00:08:02,554
Whether or not we have another election is still up for debate.
104
00:08:02,554 --> 00:08:15,013
But this feels very, very much like stories I've heard of authoritarians and
dictators coming in and
105
00:08:15,013 --> 00:08:21,450
forcing propaganda and messages to be scripted a certain way, which is exactly what the
right was accusing the left of.
106
00:08:21,450 --> 00:08:23,191
um
107
00:08:24,691 --> 00:08:33,064
But I've never seen the left make the threat that we're going to cancel Fox's licenses
from the FCC.
108
00:08:33,064 --> 00:08:38,436
There is a chorus on the left saying cancel their license, like what they're doing is not good, what
they're doing is harmful.
109
00:08:38,436 --> 00:08:41,857
Like they are pressured by the population to do that.
110
00:08:41,857 --> 00:08:46,769
And they're like, no, no, because we understand that when the shoe is on the other foot,
we would be screwed.
111
00:08:46,769 --> 00:08:47,920
We know better.
112
00:08:47,920 --> 00:08:51,761
But that's kind of, yeah, I mean, and I don't, I don't mean for this to get political and
that's what I was afraid of.
113
00:08:51,761 --> 00:08:58,104
but it is the actions of someone who is behaving as though they aren't going
anywhere.
114
00:08:58,104 --> 00:09:00,633
Like they are intending to make rules as though
115
00:09:00,633 --> 00:09:03,914
they don't ever have to worry about blowback from those decisions.
116
00:09:04,974 --> 00:09:08,625
So that's uh at the risk of getting political.
117
00:09:08,625 --> 00:09:21,239
I want to step back a little bit and talk a little bit more about the perpetrators of
this violence and the information that tends to lead to the actions of, you know,
118
00:09:21,239 --> 00:09:23,479
assassinating somebody you disagree with.
119
00:09:23,659 --> 00:09:24,820
There's a guy that I'm a fan of.
120
00:09:24,820 --> 00:09:25,800
His name is Scott Galloway.
121
00:09:25,800 --> 00:09:29,841
He was on CNN just within a couple of days of the shooting.
122
00:09:29,917 --> 00:09:40,008
And really, in just a few minutes, articulated something that, again, we have been saying
here for a long time. I wanted to just share this quick conversation that he had with
123
00:09:40,008 --> 00:09:41,629
Anderson Cooper on CNN.
124
00:15:11,582 --> 00:15:16,793
First comment, in the US corporations are people.
125
00:15:17,654 --> 00:15:21,875
Because they're people, they're given the inalienable rights of people.
126
00:15:22,835 --> 00:15:40,700
Also, the top 10 companies are Nvidia, Microsoft, Apple, Alphabet, which is Google, Amazon, Meta
slash Facebook, Broadcom, Saudi Aramco, Tesla,
127
00:15:41,202 --> 00:15:43,413
and TSMC Taiwan Semiconductor.
128
00:15:43,534 --> 00:15:54,961
Of those top 10 companies, the only one that isn't making money directly off of
computational metrics is Saudi Aramco.
129
00:15:55,162 --> 00:15:59,805
Everyone else is making money off of these processing functions.
130
00:15:59,805 --> 00:16:03,207
And for those of you saying, Tesla, no, that's not what's happening.
131
00:16:03,207 --> 00:16:04,448
You bet your ass it is.
132
00:16:04,448 --> 00:16:10,930
You bet your ass that those cars are giant walking, traveling computers that are
collecting huge amounts of metadata, watching people's tracking flow.
133
00:16:10,930 --> 00:16:17,863
And if you think that's all Tesla is involved in, remember that Tesla funds the rest of Musk's
projects.
134
00:16:18,223 --> 00:16:20,084
These people aren't your friends.
135
00:16:20,144 --> 00:16:22,205
They don't have your best interest at heart.
136
00:16:22,465 --> 00:16:24,646
You are the product for them.
137
00:16:24,906 --> 00:16:32,630
You make them money and they make their money by bifurcating you and putting you into
classifications that are easy to market to.
138
00:16:32,630 --> 00:16:38,812
And the best way to do that is to control your information feed and only give you bits of
information.
139
00:16:38,894 --> 00:16:47,070
that make it plausible for you to believe and amplify whatever preconceived biases and
notions they think you have.
140
00:16:47,070 --> 00:16:56,056
And if you don't have those biases already? Just keep watching that shit for
another 10 weeks and you will, because they will figure it out.
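That feedback loop, serve whatever got clicked, while clicks skew toward outrage, can be sketched in a few lines. This is a toy illustration, not anything from the episode: the topics, click rates, and the always-serve-the-leader rule are all invented for demonstration.

```python
import random

# Toy sketch of an engagement-optimizing feed (all topics, click rates,
# and the serving rule here are invented for illustration).
# Assumption: outrage content gets clicked more often, and the feed
# always serves whichever topic has earned the most clicks so far.

topics = {"local news": 0.2, "sports": 0.3, "rage-bait": 0.9}  # click probability
clicks = {t: 0 for t in topics}

def serve(step: int) -> str:
    """Show each topic once, then exploit the current click leader."""
    names = list(topics)
    if step < len(names):
        return names[step]
    return max(clicks, key=clicks.get)

random.seed(0)
served = []
for step in range(200):
    topic = serve(step)
    served.append(topic)
    if random.random() < topics[topic]:  # did the user click?
        clicks[topic] += 1

# After a brief exploration phase the feed locks onto the outrage topic,
# even though the user never asked for it.
print(served.count("rage-bait"), "of", len(served), "impressions were rage-bait")
```

The point of the sketch: nothing in the loop cares about outrage per se; it only chases clicks, and the drift toward rage-bait falls out of that objective on its own.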
141
00:16:56,454 --> 00:17:00,595
I was going to share a little perspective I had on that.
142
00:17:00,996 --> 00:17:10,498
I haven't gone down the path of political radicalization thanks to the internet, but I did
have an experience a few years ago when attempting to grow this podcast in its previous
143
00:17:10,498 --> 00:17:11,378
form.
144
00:17:11,458 --> 00:17:20,881
Some advice I was given was to, in order to grow the podcast, go into these social media
groups, go into these places where your potential listener is hanging out, help them,
145
00:17:20,881 --> 00:17:22,189
offer them advice.
146
00:17:22,189 --> 00:17:24,872
Do it, and basically become online friends with them.
147
00:17:24,872 --> 00:17:29,016
They discover your podcast, your podcast grows because they had this personal connection
with you.
148
00:17:29,016 --> 00:17:38,166
So as a show focused on men's mental health, I found myself spending a lot of time in a
lot of groups where a lot of really miserable, sad fucking men just wanted to vent.
149
00:17:38,166 --> 00:17:39,847
They didn't want help.
150
00:17:39,848 --> 00:17:44,312
And, you know, day after day, these 10-15 minute sessions that I would spend
151
00:17:44,813 --> 00:17:51,837
really started to erode my own mental health because I was just going like these guys,
like I'm trying to help these guys and they're all just these sad fucking sacks that
152
00:17:51,837 --> 00:17:52,518
didn't want help.
153
00:17:52,518 --> 00:17:54,318
They just wanted to be heard.
154
00:17:54,559 --> 00:17:57,991
And it it fucked with me like in bad ways.
155
00:17:57,991 --> 00:18:04,795
Like my mental health was probably at the lowest it's ever been because I was spending so
much time with people whose mental health was so emotionally fucked up that that's where
156
00:18:04,795 --> 00:18:08,467
they went; the only place they felt they had to go to get help.
157
00:18:08,767 --> 00:18:11,897
So pivot that to I don't have
158
00:18:11,897 --> 00:18:12,387
community.
159
00:18:12,387 --> 00:18:13,508
I don't have connection.
160
00:18:13,508 --> 00:18:15,479
I don't belong anywhere.
161
00:18:15,479 --> 00:18:20,561
You find yourself in a political debate with somebody, somebody else suddenly makes a
little bit of sense.
162
00:18:20,561 --> 00:18:23,962
They share a group with you that you take part in, you join that group.
163
00:18:23,962 --> 00:18:34,076
And that just opens up the pathway to deeper and deeper rabbit holes that ultimately lead
to these radicalized behaviors where you're suddenly feeling like it makes completely
164
00:18:34,076 --> 00:18:38,468
rational sense to go murder somebody because they said something you don't like.
165
00:18:39,488 --> 00:18:41,189
That's how it works.
166
00:18:41,844 --> 00:18:42,844
Yep.
167
00:18:43,521 --> 00:18:51,417
Now, the thing I want to ask you about, I know you have a lot of experience uh in dealing
with bots, because that's the other part about this that people don't talk enough about is
168
00:18:51,417 --> 00:18:57,531
that all those people you're fighting with or agreeing with or having conversations with
online, they ain't people.
169
00:18:57,531 --> 00:19:05,016
And if you saw the latest Superman movie, its interpretation of that as monkeys in space was
fucking fantastic.
170
00:19:05,016 --> 00:19:10,520
But you've done a lot of work to try to prevent bots from doing what bots do.
171
00:19:10,841 --> 00:19:12,081
Tell me more about that.
172
00:19:12,165 --> 00:19:13,145
Yeah.
173
00:19:13,566 --> 00:19:17,848
So more than half the traffic on the internet is bots.
174
00:19:17,848 --> 00:19:24,091
I've worked at several companies building web application firewalls.
175
00:19:24,552 --> 00:19:31,395
And RSA, one of the big security conferences out there, probably the biggest one, happens every year in the Bay Area.
176
00:19:31,395 --> 00:19:32,446
uh
177
00:19:32,446 --> 00:19:39,539
Their topics for their main stages the last two or three years have revolved a lot around
preventing bots.
178
00:19:39,539 --> 00:19:40,979
And there's different kinds of bots.
179
00:19:40,979 --> 00:19:47,502
There's good bots, things that are out there trying to do good things, trying to index
information for easier searching.
180
00:19:47,502 --> 00:19:49,443
Their intent is not to harm.
181
00:19:49,443 --> 00:19:57,306
There's bad bots whose intent is to harm and to take things down or to grab information or
to do things they're not supposed to do.
182
00:19:58,026 --> 00:20:02,260
And then you have this other category of
183
00:20:02,260 --> 00:20:07,083
uh just poorly configured, costly bots.
184
00:20:07,083 --> 00:20:14,187
So you might have good bots, but they cost website owners a lot of money or they cost
companies a lot of money because they're too aggressive and they do something, you know,
185
00:20:14,187 --> 00:20:16,118
they're misconfigured, do something bad.
186
00:20:16,118 --> 00:20:22,351
And then you also have bad bots that aren't expensive to let run and do their thing.
187
00:20:22,351 --> 00:20:23,268
um
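The taxonomy Jason lays out, intent on one axis and cost on the other, is easy to picture as a quadrant. Here's a toy sketch of that classification; the bot names, fields, and request-rate threshold are made up for illustration, not taken from any real firewall product.

```python
from dataclasses import dataclass

@dataclass
class Bot:
    name: str
    harmful: bool       # is the bot's intent malicious?
    req_per_min: float  # request rate, a rough proxy for cost to the site

def quadrant(bot: Bot, costly_threshold: float = 100.0) -> str:
    """Place a bot in one of the four intent/cost quadrants."""
    intent = "bad" if bot.harmful else "good"
    cost = "costly" if bot.req_per_min > costly_threshold else "cheap"
    return f"{intent}/{cost}"

# Hypothetical examples covering all four quadrants.
bots = [
    Bot("search-indexer", harmful=False, req_per_min=20),          # good, cheap
    Bot("misconfigured-crawler", harmful=False, req_per_min=500),  # good intent, costly
    Bot("credential-stuffer", harmful=True, req_per_min=900),      # bad and costly
    Bot("outrage-reply-bot", harmful=True, req_per_min=5),         # bad but cheap to run
]

for b in bots:
    print(b.name, "->", quadrant(b))
```

The "outrage-reply-bot" row is the one relevant to the rest of the conversation: bad intent, but so cheap to run that there's no economic pressure to stop it.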
188
00:20:23,268 --> 00:20:23,829
the BroBots?
189
00:20:23,829 --> 00:20:25,462
What do we do about the BroBots?
190
00:20:25,778 --> 00:20:26,428
good question.
191
00:20:26,428 --> 00:20:36,901
I would say the BroBots are in the upper left-hand quadrant of the good and inexpensive,
because if any of you are paying extra money for this podcast, we're fucking up because
192
00:20:36,901 --> 00:20:38,575
we're not getting any money from that.
193
00:20:38,675 --> 00:20:39,535
right.
194
00:20:39,756 --> 00:20:47,120
But if you look at that, the companies that are actually responsible for the
most bots are these top 10.
195
00:20:47,120 --> 00:20:49,861
I mean, it's it's Microsoft.
196
00:20:49,861 --> 00:20:53,293
um It's Apple.
197
00:20:53,293 --> 00:20:54,074
It's Alphabet.
198
00:20:54,074 --> 00:20:55,096
It's Amazon.
199
00:20:55,096 --> 00:20:56,506
It's definitely Facebook.
200
00:20:56,506 --> 00:21:03,941
When you look at the people that are actually generating traffic to go through and feed
their AI models, it's these folks, on top of that.
201
00:21:03,941 --> 00:21:14,677
And it's not just companies making money; you have more than a few state actors going out
there doing things that are relatively free for them, where they create their own versions
202
00:21:14,677 --> 00:21:23,112
of chat bots that scrape and look at content that's going across the Internet and find
ways to provide outrage messages to those pieces.
203
00:21:23,144 --> 00:21:26,945
because they're trying to amplify outrage and anger.
204
00:21:27,085 --> 00:21:32,617
It's good economically to do this because it gets people fired up and keeps them more
engaged.
205
00:21:32,617 --> 00:21:33,988
There's no doubt about that.
206
00:21:33,988 --> 00:21:43,771
And we talk about these things like, you hear a lot about the mainstream media being a
problem, and your mainstream media are basically your four commonly found broadcast
207
00:21:43,771 --> 00:21:44,131
channels.
208
00:21:44,131 --> 00:21:47,252
It's ABC, NBC, CBS and Fox.
209
00:21:47,600 --> 00:21:48,690
in the Seattle area.
210
00:21:48,690 --> 00:21:51,261
It's channel four, channel five, channel seven, channel 13.
211
00:21:51,261 --> 00:21:54,983
And there's other ones like the CW and other things, but they don't do anything for media.
212
00:21:54,983 --> 00:21:57,984
PBS used to kind of be called mainstream media, but it's not.
213
00:21:57,984 --> 00:21:59,965
It was public broadcasting.
214
00:22:01,166 --> 00:22:06,588
You have to get down to company number.
215
00:22:07,648 --> 00:22:08,499
Let me see here.
216
00:22:08,499 --> 00:22:12,030
Oh, I'm still going.
217
00:22:12,030 --> 00:22:12,640
I'm still going.
218
00:22:12,640 --> 00:22:14,311
I'm in the 50s.
219
00:22:14,311 --> 00:22:16,971
I'm in the 60s.
220
00:22:16,971 --> 00:22:19,476
the media money is not really there.
221
00:22:19,476 --> 00:22:21,736
I'm in the 70s, okay!
222
00:22:21,736 --> 00:22:28,812
I get down to the Walt Disney Corporation at spot number 79, at $204 billion.
223
00:22:30,470 --> 00:22:36,905
In contrast, the real media companies, the ones actually doing real money things.
224
00:22:38,206 --> 00:22:49,215
They have market caps of, let's see, Microsoft 3.85 trillion.
225
00:22:49,856 --> 00:22:52,298
Google 3 trillion.
226
00:22:52,298 --> 00:22:54,580
Amazon 2.5 trillion.
227
00:22:54,580 --> 00:22:56,941
Facebook almost 2 trillion.
228
00:22:57,242 --> 00:22:58,443
It's not even close.
229
00:22:58,443 --> 00:23:00,496
Like these are orders of magnitude
230
00:23:00,496 --> 00:23:02,457
more than the mainstream media.
231
00:23:02,457 --> 00:23:12,283
They make way more money, they do way more things, and their broadcast media coming over
the airwaves is not a fucking thing anymore.
232
00:23:12,283 --> 00:23:16,320
If you look at it, yes, you can still get broadcast media sent to your house.
233
00:23:16,320 --> 00:23:24,349
You can put up a high definition antenna and you might be able to pick up some channels,
but almost all of them don't spend any money on critical range addressing and they force
234
00:23:24,349 --> 00:23:27,211
you to go through a cable operator and look at these things online.
235
00:23:27,211 --> 00:23:29,592
So these companies really are not
236
00:23:29,662 --> 00:23:31,293
broadcast mainstream media.
237
00:23:31,293 --> 00:23:35,424
If you look at the newspapers, the companies that own newspapers are in this.
238
00:23:35,424 --> 00:23:38,095
Most of these are billionaires that own those newspapers.
239
00:23:38,095 --> 00:23:46,839
Like I remember back in the day, everyone was worried that Knight Ridder was going to
consolidate all these media sources and try to have one homogenous voice, back in the 90s.
240
00:23:46,839 --> 00:23:49,830
That was one of the theses for my senior project when I was in college.
241
00:23:51,668 --> 00:23:53,728
That shit doesn't matter anymore.
242
00:23:54,388 --> 00:23:55,908
Nobody gives a fuck.
243
00:23:55,908 --> 00:23:57,528
Like people don't read newspapers.
244
00:23:57,528 --> 00:24:00,388
My wife loves to read newspapers.
245
00:24:00,388 --> 00:24:02,868
We get the Seattle Times every Sunday.
246
00:24:02,868 --> 00:24:11,168
Do you know the last time they got out of that plastic rain bag and actually got opened up,
instead of just getting tossed in the garbage on like Tuesday or Thursday? It's been a long
247
00:24:11,168 --> 00:24:12,268
fucking time.
248
00:24:12,288 --> 00:24:21,874
Not because we don't enjoy opening up the paper and holding it in our hands, but the
convenience of having these things fed to you and pushed to you.
249
00:24:21,874 --> 00:24:23,124
It's just too high.
250
00:24:23,124 --> 00:24:29,136
Why am I going to spend more cycles and time going deep on these kinds of stories when I
can just dummy box this thing right on my screen right in front of me?
251
00:24:29,136 --> 00:24:38,736
And who's going to read the three-page article in the paper when there is a nice
four paragraph summary from somebody's perspective that is just fed to you while you sit
252
00:24:38,736 --> 00:24:42,846
there drinking your coffee on your couch with, you know, the phone staring you in the face.
253
00:24:42,846 --> 00:24:46,618
Yes, now imagine you're 20-something.
254
00:24:46,618 --> 00:24:50,479
Like, you've probably never picked up a fucking newspaper in your life.
255
00:24:50,479 --> 00:24:56,302
Or if you have, it's been for, like, nostalgic reasons, because your parents told you to
go get it.
256
00:24:57,522 --> 00:24:58,423
Right.
257
00:24:58,823 --> 00:25:00,144
Yeah, exactly.
258
00:25:00,144 --> 00:25:01,065
And more than likely,
259
00:25:01,065 --> 00:25:01,776
These exist?
260
00:25:01,776 --> 00:25:02,156
Why?
261
00:25:02,156 --> 00:25:03,164
That's ridiculous.
262
00:25:03,164 --> 00:25:10,469
And more than likely, you printed it off on your computer and didn't actually get a
newspaper or actually go to the library and look at microfilm reels.
263
00:25:10,469 --> 00:25:12,031
Like it's just not practical.
264
00:25:12,031 --> 00:25:14,472
We've changed the way that we consume media.
265
00:25:14,532 --> 00:25:19,606
And people want to come back and say, you know, Fox News is the problem.
266
00:25:19,606 --> 00:25:20,647
CNN is the problem.
267
00:25:20,647 --> 00:25:21,808
MSNBC is the problem.
268
00:25:21,808 --> 00:25:22,248
Bullshit.
269
00:25:22,248 --> 00:25:27,643
Like at any given point in time, collectively, they don't have more than five percent of
the viewing audience.
270
00:25:27,643 --> 00:25:30,724
If you look at the ratings on them, like they're just not there.
271
00:25:30,792 --> 00:25:41,795
Well, also, as someone who spent 20 years in the media, the fallacy that there is some,
you know, somebody sitting in an office somewhere dictating all of the content that must
272
00:25:41,795 --> 00:25:44,936
be fed through all of these channels is just not real.
273
00:25:44,936 --> 00:25:45,976
It's just not a thing.
274
00:25:45,976 --> 00:25:49,097
I've worked in enough newsrooms to know that's not a thing.
275
00:25:49,097 --> 00:25:59,780
I will say when I did work for one company, Sinclair Broadcast Group, by the way, that is
the only company I ever worked with that mandated content to be fed across all of their
276
00:25:59,780 --> 00:26:00,860
affiliates.
277
00:26:00,876 --> 00:26:07,799
It was conservative, hard-right propaganda that we were required to play, and people lost their
jobs because they resisted.
278
00:26:07,799 --> 00:26:09,569
I never heard that shit coming from the left.
279
00:26:09,569 --> 00:26:14,341
I'm not saying it didn't happen, but I spent enough time there to know if it did, I would
have heard it.
280
00:26:14,341 --> 00:26:20,443
If somebody listening to this or watching this knows of something, please, please educate
me because I would love to see it.
281
00:26:20,443 --> 00:26:22,230
But that shit is unacceptable.
282
00:26:22,230 --> 00:26:36,071
And I will say that there have been lawsuits at the Washington Post, at the New York
Times, at multiple different print outlets, by conservative editors or writers, where
283
00:26:36,071 --> 00:26:46,450
they've come back and said, my way of presenting this or my way of structuring this
editorial content was taken down or removed, or I was fired because of my opinions.
284
00:26:46,450 --> 00:26:49,032
So I think it does happen on both sides.
285
00:26:49,032 --> 00:26:49,393
I do.
286
00:26:49,393 --> 00:27:00,142
I'm sure it does, but, right, I think there's a difference between "I am an individual
contributor and my voice is being silenced" versus "we are the company mandating that everyone share
287
00:27:00,142 --> 00:27:00,693
this content.
288
00:27:00,693 --> 00:27:02,794
There's a very distinct difference there.
289
00:27:04,028 --> 00:27:05,568
Well, OK.
290
00:27:06,328 --> 00:27:09,088
If the effect is the same, though...
291
00:27:09,108 --> 00:27:15,508
I mean, if the effect is the same, then it's editorializing content for a particular
brand, whether it be more conservative or progressive.
292
00:27:15,748 --> 00:27:29,468
I think those things still happen, and I definitely think that Hollywood and the people
that produce content for entertainment purposes tend to skew more left than right,
293
00:27:29,468 --> 00:27:32,428
but not entirely, right? Like...
294
00:27:32,620 --> 00:27:42,447
There definitely does tend to be a narrative that gets put in from Hollywood by the people
that produce content to try to make things look and feel a certain way.
295
00:27:43,888 --> 00:27:49,933
We've seen the scope of military dramas change, police dramas change.
296
00:27:49,933 --> 00:27:54,296
They're definitely more progressive in the way that they talk about things.
297
00:27:54,296 --> 00:28:00,320
They definitely present things from that perspective, but they try to be holistic and they
try to be more inclusive.
298
00:28:00,360 --> 00:28:01,881
compared to what they were 20 years ago.
299
00:28:01,881 --> 00:28:04,933
Like, we just watched, uh, Starship Troopers.
300
00:28:04,933 --> 00:28:06,344
My kids were watching it downstairs.
301
00:28:06,344 --> 00:28:07,982
My daughter's boyfriend had not seen it before.
302
00:28:07,982 --> 00:28:09,906
And they're like, we're going to watch Starship Troopers.
303
00:28:10,207 --> 00:28:10,627
It is.
304
00:28:10,627 --> 00:28:13,139
And it's a total war propaganda film, right?
305
00:28:13,139 --> 00:28:21,935
Like, it's this thing where it's human fascists against bugs, because we've decided to
take fascism out to the universe and not just keep it within the local country.
306
00:28:21,935 --> 00:28:30,010
But the nice part is that the humans were all unified, and there were men and women and
minorities all fighting together against the bad enemy, you know.
307
00:28:31,874 --> 00:28:41,728
In other words, we are highly, highly susceptible to having our minds changed by signal
inundation.
308
00:28:41,728 --> 00:28:46,413
And the more signals you have coming towards you, the more you're going to be affected by
this.
309
00:28:46,575 --> 00:28:48,046
At the same time,
310
00:28:49,646 --> 00:28:59,583
You can program automation in such a way to amplify negative signals or positive signals
however you want.
311
00:28:59,583 --> 00:29:03,406
And understanding the intent of those pieces is not particularly difficult.
312
00:29:03,746 --> 00:29:06,709
And creating fake Facebook accounts is super easy.
313
00:29:06,709 --> 00:29:11,322
And Facebook does not at all want to stop this from happening.
314
00:29:11,322 --> 00:29:12,813
It's how they make their money.
315
00:29:12,813 --> 00:29:15,715
They are not incentivized one bit to make this change.
316
00:29:15,715 --> 00:29:17,256
Neither is TikTok.
317
00:29:17,554 --> 00:29:19,985
Neither is Google slash YouTube.
318
00:29:20,545 --> 00:29:23,185
None of these big platforms are set up that way.
319
00:29:23,526 --> 00:29:28,367
Apple makes their money reselling multiple different layers of these pieces and creating
their own content.
320
00:29:29,087 --> 00:29:37,580
I mean, Amazon has its finger in so many cookie jars that, you know, there's no way that
they're not making money.
321
00:29:37,580 --> 00:29:38,850
They have their own ad tech platform.
322
00:29:38,850 --> 00:29:43,051
They have their own LLMs and machine learning systems. Like, these are all there.
323
00:29:43,051 --> 00:29:46,276
And then your chip manufacturers, your Nvidia, your Broadcom.
324
00:29:46,276 --> 00:29:56,220
your TSMC, like all these places make money off of trying to crunch data at large volumes
inside their data centers.
325
00:29:56,220 --> 00:29:58,991
And Oracle is hugely on the rise right now.
326
00:29:58,991 --> 00:30:02,983
Larry Ellison's son now owns CBS.
327
00:30:02,983 --> 00:30:09,806
uh Larry Ellison is not known for being super progressive.
328
00:30:09,806 --> 00:30:13,127
But honestly, he's not known for being super political either.
329
00:30:13,127 --> 00:30:15,720
He tries to figure out a way to make money.
330
00:30:15,720 --> 00:30:24,471
Because it's not about politics, it's about deregulation, and it's about getting access to
wring dollars out of people.
331
00:30:24,471 --> 00:30:33,091
But I think that's something that is interesting, because it's not just money. Like, the
political impact is also very, very strong.
332
00:30:33,091 --> 00:30:41,071
Because if you talk about the fact that we're all arguing over whether or not Charlie Kirk
was a good guy, or whether or not Jimmy Kimmel should have been pulled off
333
00:30:41,071 --> 00:30:41,771
the air.
334
00:30:41,771 --> 00:30:46,651
We're not arguing about the fact that there are 10 companies running everything and
fucking over the rest of us.
335
00:30:46,651 --> 00:30:49,631
And that's like, okay, the money's great.
336
00:30:49,631 --> 00:30:52,684
But the power and the control is what keeps
337
00:30:52,684 --> 00:31:03,450
the money coming in because if all of us actually banded together and rose up against the
very few billionaires that are causing all of this or at least making it possible for us
338
00:31:03,450 --> 00:31:07,832
to live this way, it would be a quick turnaround, right?
339
00:31:07,832 --> 00:31:19,659
Like, if we realized who the real bad guy is here, it would be a very different world
rather than the one where we're sitting here arguing about whose idea about, you know, Charlie
340
00:31:19,659 --> 00:31:21,740
Kirk or Jimmy Kimmel is right.
341
00:31:21,740 --> 00:31:28,040
I mean, as of 2024, there are 813 billionaires in the US.
342
00:31:28,500 --> 00:31:29,700
813 people.
343
00:31:29,700 --> 00:31:41,040
And if you just collect them all and turn them into a monolith and keep them as one group
and say these 813 people are the ones doing these things, we need to get rid of these 813
344
00:31:41,040 --> 00:31:41,960
people.
345
00:31:42,100 --> 00:31:43,960
That's not the answer.
346
00:31:44,140 --> 00:31:48,880
Because there are a few million millionaires.
347
00:31:49,108 --> 00:31:53,788
I mean, how many US... I'm just Googling this live.
348
00:31:55,248 --> 00:32:01,868
There are almost 22 million millionaires in the US.
349
00:32:01,868 --> 00:32:10,448
I guarantee you that those people are more than willing to step up and take over the jobs
of those billionaires, because we all want money.
350
00:32:10,828 --> 00:32:18,128
The way that you solve this problem is you get money out of the public good and stop
deciding that you're going to have
351
00:32:19,720 --> 00:32:21,481
capitalism be the driving force.
352
00:32:21,481 --> 00:32:22,342
Not just capitalism.
353
00:32:22,342 --> 00:32:23,292
It's not capitalism.
354
00:32:23,292 --> 00:32:26,574
I mean, we're not a free-market capitalist society in the US, or in the West at all.
355
00:32:26,574 --> 00:32:36,340
We're at best a representative, uh, democratic socialist country in some regard.
356
00:32:36,340 --> 00:32:39,871
We socialize losses, we privatize profits.
357
00:32:40,612 --> 00:32:43,767
We have socialized medicine in some regards.
358
00:32:43,767 --> 00:32:46,355
We have socialized roads, socialized school systems.
359
00:32:46,436 --> 00:32:47,536
But we
360
00:32:47,536 --> 00:32:55,484
change who gets what and who has to pay for what depending on their economic prowess and
upon whether or not you can get lobbyists to go in and...
361
00:32:57,362 --> 00:33:05,634
Look, we started this conversation off about radicalization, and we're talking about the
motivation here, about why it is that these companies are not going to make
362
00:33:05,634 --> 00:33:06,815
these changes.
363
00:33:06,995 --> 00:33:19,289
Really, I think what we should be talking about is the Simpsons episode where the giant
advertising mascots come to life and everyone is
364
00:33:19,289 --> 00:33:19,849
singing the song.
365
00:33:19,849 --> 00:33:23,930
"Just don't look, just don't look," and you'll kill the marketing monsters.
366
00:33:23,930 --> 00:33:24,360
I mean.
367
00:33:24,360 --> 00:33:33,023
You have Lard Lad there with the giant donut, holding it up, walking through town. And
like, yeah, you just have to not look. If you do things that aren't going to make them
368
00:33:33,023 --> 00:33:37,304
money, then you're no longer a profit target for them and they have a hard time sending
stuff to you.
369
00:33:37,304 --> 00:33:49,448
I don't want to set myself up as the golden example of this, but my Facebook feed is 90
percent cat videos and, like, surfing clips these days.
370
00:33:49,704 --> 00:33:55,186
And every now and again, I'll get a pop up for like some probiotic or testosterone health
or hair loss health, because they know how old I am.
371
00:33:55,186 --> 00:33:57,106
And these are like common things, right?
372
00:33:57,506 --> 00:34:07,689
Every now and again, like I get, I guess, bro culture stuff thrown at me where it's like,
you should be shooting guns and you should be hunting and you should be doing this and
373
00:34:07,689 --> 00:34:07,839
this.
374
00:34:07,839 --> 00:34:08,770
And I'm like, yeah, okay.
375
00:34:08,770 --> 00:34:09,990
Like I've done that shit.
376
00:34:09,990 --> 00:34:10,480
I get it.
377
00:34:10,480 --> 00:34:12,511
But like, no, I don't need that.
378
00:34:12,511 --> 00:34:14,051
And I skip over that ad.
379
00:34:14,051 --> 00:34:14,922
I give it no air time.
380
00:34:14,922 --> 00:34:16,488
And they've tuned the algorithm.
381
00:34:16,488 --> 00:34:21,216
to figure out a way to make me profitable because I don't really cleanly fit into any of
these categories.
382
00:34:21,216 --> 00:34:29,759
That being said, I'm sure that when they have my profile in place and they're saying, hey,
do you want to market to this person and push those things across?
383
00:34:30,180 --> 00:34:37,272
Companies are probably getting more discerning as to which demographic bucket I fall into,
because I do have a deliberate way of tuning.
384
00:34:38,854 --> 00:34:43,298
Now, if I go on to Bluesky or X...
385
00:34:43,959 --> 00:34:50,985
Like, it just goes "white male, 50" and, like, barfs all of it on me.
386
00:34:50,985 --> 00:34:53,567
uh Instagram is not much different.
387
00:34:53,567 --> 00:34:58,382
Like, there are certain people that I follow on Instagram, but I get ads on Instagram,
which is fucking owned by Facebook.
388
00:34:58,382 --> 00:35:00,053
You would think they'd share the same algorithm.
389
00:35:00,053 --> 00:35:04,197
And I don't look at it much at all, so it doesn't have enough of a track record or history
on me. I look at my wife's and my kids' stuff.
390
00:35:04,197 --> 00:35:05,828
I don't tend to scroll through.
391
00:35:06,118 --> 00:35:08,801
So its algorithm does not have as much information on me.
392
00:35:08,801 --> 00:35:11,225
So it's still trying to figure out kind of who I am.
393
00:35:11,225 --> 00:35:17,293
But in the figuring out who I am phase, it's test marketing and throwing shit at me that
it thinks I might like.
394
00:35:17,293 --> 00:35:23,059
And almost all of it is a variation of bro culture or right wing propaganda.
395
00:35:23,738 --> 00:35:24,878
And...
396
00:35:26,780 --> 00:35:28,951
They're not doing it to help me.
397
00:35:28,951 --> 00:35:32,153
They're not doing it to bring me better content that I will like.
398
00:35:32,153 --> 00:35:35,995
They're doing it to sell me ads and capture my eyes.
399
00:35:36,376 --> 00:35:37,416
That's what they're doing.
400
00:35:37,416 --> 00:35:41,899
And I'm 50, so radicalizing me is going to be hard because I'm old, fat, and lazy.
401
00:35:41,899 --> 00:35:52,305
Uh, a teenager or a 20-something, they've got energy, they've got time, and they feel
depressed because it feels like there's no future.
402
00:35:52,305 --> 00:35:54,126
The economy's busted.
403
00:35:54,126 --> 00:35:55,056
There's
404
00:35:55,676 --> 00:35:57,808
very little economic opportunity for them.
405
00:35:57,808 --> 00:36:00,249
AI is coming in and taking over things.
406
00:36:03,233 --> 00:36:12,299
If you don't fall into the cisgender white kind of bucket of things as a man,
407
00:36:12,820 --> 00:36:16,880
And most of the media sources have decided that they want to go after you in this way.
408
00:36:16,880 --> 00:36:20,460
And I mean, there's definitely a bigger prevalence, right?
409
00:36:20,460 --> 00:36:27,220
Like getting kids to be outraged by telling them that they're not part of the community or
what they're doing is wrong.
410
00:36:27,220 --> 00:36:32,040
And then amplifying that signal with negative messages coming from other people to create
more outrage.
411
00:36:32,080 --> 00:36:41,844
And then these bots go through and amplify that message even more, and post even more,
like, hysterically outrageous,
412
00:36:43,005 --> 00:36:47,066
Shitty things that they say to try to get people more and more riled up.
413
00:36:47,566 --> 00:36:55,970
This is a recipe for disaster, because we are telling people that are part of groups and
tribes, that are not fully mentally developed, that don't have
414
00:36:56,610 --> 00:37:01,422
all the mental regulation that's actually required to work in this space,
415
00:37:01,724 --> 00:37:04,591
You know, don't get upset about this.
416
00:37:04,591 --> 00:37:10,244
don't get radicalized, don't get pushed in this direction, when this is their main
information source. It's unrealistic.
417
00:37:10,346 --> 00:37:11,290
And yeah.
418
00:37:11,290 --> 00:37:14,773
They're young people that are lost.
419
00:37:14,773 --> 00:37:15,714
They feel alone.
420
00:37:15,714 --> 00:37:17,895
They feel ashamed because of it.
421
00:37:17,895 --> 00:37:20,157
They don't know where their people are.
422
00:37:20,157 --> 00:37:22,619
So they go more and more inward.
423
00:37:22,619 --> 00:37:26,131
They become more and more isolated, but they still crave connection.
424
00:37:26,131 --> 00:37:34,687
So then they reach out through social media, and then they become immediate targets of
this very propaganda that you're talking about.
425
00:37:34,687 --> 00:37:38,524
It's trying to reach that lost young man and push him in a certain direction.
426
00:37:38,524 --> 00:37:40,205
Well, I think it's more nefarious.
427
00:37:40,205 --> 00:37:46,670
So I actually think when these people go out there looking for things that are part of the
counterculture, Facebook provides it.
428
00:37:46,710 --> 00:37:48,301
The social media sources provide it.
429
00:37:48,301 --> 00:37:49,762
It's like, well, look at this group.
430
00:37:49,762 --> 00:37:50,643
Follow these things.
431
00:37:50,643 --> 00:37:52,475
Push these things in this direction.
432
00:37:52,475 --> 00:38:02,272
And then what you get is you get other companies out there and state actors going through
and saying, well, let's destabilize things a bit more and let's actually join these groups
433
00:38:02,272 --> 00:38:05,426
and pretend like we're part of this or
434
00:38:05,426 --> 00:38:12,402
just fuck with people to try to make these things more difficult and painful because it
amplifies the outrage.
435
00:38:12,402 --> 00:38:18,006
It amplifies the distrust and the hatred and the otherification of people out there.
436
00:38:18,066 --> 00:38:25,212
And we're all people, we're all persons, we're all flawed in all kinds of different ways.
437
00:38:25,212 --> 00:38:33,879
And one group saying these people shouldn't exist is no worse than any other group saying
these people shouldn't exist.
438
00:38:33,879 --> 00:38:35,550
Saying it is one thing.
439
00:38:35,550 --> 00:38:36,419
Mm-hmm.
440
00:38:36,978 --> 00:38:43,031
Taking action and doing things to make those people not exist.
441
00:38:44,172 --> 00:38:45,993
That's that next step.
442
00:38:46,113 --> 00:39:02,122
And when you look at the body politic and the removal of voices and the removal of
opinions and literally the physical removal of people in the US at the moment, there is
443
00:39:02,122 --> 00:39:05,934
plenty of reason to think that at any given point in time,
444
00:39:05,934 --> 00:39:10,027
you could be targeted and the fear is not unfounded.
445
00:39:10,027 --> 00:39:19,784
So you've got this convergence of these different pieces occurring right now and it's
scaring folks and the media outrage machine is leaning into that.
446
00:39:19,784 --> 00:39:24,167
And keep in mind, it's not the mainstream media that's the problem.
447
00:39:24,167 --> 00:39:26,759
It's fucking social media.
448
00:39:27,224 --> 00:39:31,092
Look, I've still got a Meta account.
449
00:39:31,092 --> 00:39:32,063
I still use Facebook.
450
00:39:32,063 --> 00:39:33,564
I still use Instagram.
451
00:39:34,056 --> 00:39:36,057
I've got an X account, which I fucking hate.
452
00:39:36,057 --> 00:39:37,757
I refuse to have a TikTok account.
453
00:39:37,757 --> 00:39:39,147
I've got a blue sky account.
454
00:39:39,147 --> 00:39:40,858
Like I've got 40 other pieces.
455
00:39:40,858 --> 00:39:45,719
I started a fucking social media company like eight, nine years ago to try to get it
moving.
456
00:39:46,720 --> 00:39:51,741
These are the things that we deal with and the things that we have to be looking at.
457
00:39:51,741 --> 00:40:02,042
What I do think needs to occur is, with these social media companies, um, there needs to
be a way for people to have
458
00:40:02,042 --> 00:40:06,175
other folks validated, to verify that these are real people, not just bots.
459
00:40:06,215 --> 00:40:09,557
Facebook could do all of that, but they're not doing it because it's expensive.
460
00:40:09,557 --> 00:40:10,388
It's difficult.
461
00:40:10,388 --> 00:40:12,399
It eats into the profit margins.
462
00:40:13,240 --> 00:40:14,041
Exactly.
463
00:40:14,041 --> 00:40:18,163
It doesn't allow people to be as outraged so they don't get as much money.
464
00:40:18,604 --> 00:40:19,034
Yeah.
465
00:40:19,034 --> 00:40:24,101
The problem is, as we heard earlier, there is no appetite in DC for regulation on any of this stuff.
466
00:40:24,101 --> 00:40:24,601
no.
467
00:40:24,601 --> 00:40:30,104
I mean, the fact that we've just deregulated AI for the next 10 years with no strict
controls...
468
00:40:30,965 --> 00:40:33,166
How are we going to deal with all this?
469
00:40:33,466 --> 00:40:37,468
And the real thing is we have to stop being assholes.
470
00:40:38,209 --> 00:40:49,796
And it's real hard to stop being an asshole when all you're seeing is messages coming from
assholes, to put it in really clear vernacular.
471
00:40:51,132 --> 00:40:56,088
when your entire mental space is occupied by the farts of shitty assholes.
472
00:40:56,088 --> 00:40:57,575
[Laughter]
473
00:40:59,288 --> 00:41:03,844
you might start, your breath might start to stink when you speak.
474
00:41:03,844 --> 00:41:11,551
It's like we said, with no threat of getting punched in the mouth or having to smell a
shitty asshole, you're gonna keep being an asshole.
475
00:41:11,551 --> 00:41:13,963
Don't assume that you're not an asshole.
476
00:41:13,963 --> 00:41:16,265
Like, you fucking are.
477
00:41:16,265 --> 00:41:17,215
And that's okay.
478
00:41:17,215 --> 00:41:18,926
It's okay to be that.
479
00:41:18,926 --> 00:41:20,968
You don't deserve to be shot for that.
480
00:41:20,968 --> 00:41:22,439
You don't deserve to be killed for that.
481
00:41:22,439 --> 00:41:24,230
That's just fucking wrong.
482
00:41:24,671 --> 00:41:27,172
Punching in the face is a different story, but.
483
00:41:27,201 --> 00:41:31,734
And, just because this is setting off an alarm bell in my head, I want to make sure we're
clear about this.
484
00:41:31,734 --> 00:41:38,069
We're talking about the "class war," big air quotes, between the billionaires and the rest
of us.
485
00:41:38,069 --> 00:41:41,051
No one here is calling for the death of billionaires.
486
00:41:41,231 --> 00:41:47,977
I'm just suggesting that maybe some of their money could be better spent in other ways,
and perhaps shared a little bit more evenly.
487
00:41:47,977 --> 00:41:50,328
But that's not for me to decide.
488
00:41:50,434 --> 00:41:56,767
Or we could have better regulations that actually have the people in mind and not the
profits of corporations.
489
00:41:56,767 --> 00:42:03,940
And I think that's the biggest thing is that our body politic and the way that we make
laws and make decisions are not happening in the people's best interest.
490
00:42:03,940 --> 00:42:05,851
They're happening in corporations best interest.
491
00:42:05,851 --> 00:42:14,475
And don't buy into the concept that one side does this and the other side doesn't, because
they both do. Like, they both take money from lobbyists.
492
00:42:14,475 --> 00:42:17,146
They both take money from special interests.
493
00:42:17,840 --> 00:42:27,647
Certainly one side that's in power right now seems to be doing it a lot more than anyone's
ever done in modern history from the time that we've been taking data.
494
00:42:27,647 --> 00:42:37,073
Like, the last time that we were in this kind of economic situation, with the ruling elite
at such a distance from everyone else, was just before we went into the Great Depression.
495
00:42:38,234 --> 00:42:39,594
Coincidence?
496
00:42:39,715 --> 00:42:40,396
I don't know.
497
00:42:40,396 --> 00:42:41,666
We're going to find out.
498
00:42:41,822 --> 00:42:47,424
The last point I'll make on this is just that, you know, you talked about "don't look,"
right?
499
00:42:47,424 --> 00:42:49,624
Just don't give them the attention.
500
00:42:49,624 --> 00:42:57,206
I've had this now multiple times in real life, where I have been surrounded by people that
I do not politically agree with.
501
00:42:57,206 --> 00:43:02,728
If we were on Facebook, the conversation would be one I would not engage in because it's
not beneficial.
502
00:43:02,928 --> 00:43:08,849
But I was in shared space with a lot of people that see the world very differently than I
do.
503
00:43:08,849 --> 00:43:11,620
And we never once talked about it.
504
00:43:11,656 --> 00:43:20,902
What we talked about was how we're doing, how our health is, how our family is, how our
kids are doing, how they're doing in school, how work's going, fun things they're working
505
00:43:20,902 --> 00:43:21,693
on.
506
00:43:21,953 --> 00:43:27,527
And it was fucking lovely to be able to share space with people that see the world
differently than me.
507
00:43:27,527 --> 00:43:30,739
But at the end of the day, we all want the same things.
508
00:43:30,739 --> 00:43:32,621
We want a safe place to raise our kids.
509
00:43:32,621 --> 00:43:33,932
We want to make a decent wage.
510
00:43:33,932 --> 00:43:36,333
We want to be able to buy a house and have some financial security.
511
00:43:36,333 --> 00:43:38,575
I mean, that's all we really want.
512
00:43:38,575 --> 00:43:40,856
We used to just argue about the best way to get there.
513
00:43:41,183 --> 00:43:44,683
But now we argue about stupid shit that doesn't even matter.
514
00:43:44,683 --> 00:43:51,520
I just I bring this up just to again point out, just put the fucking thing down and go
talk to people in real life.
515
00:43:51,520 --> 00:43:54,911
It's just such a different reality than the one you're experiencing online.
516
00:43:55,913 --> 00:43:56,419
Yeah.
517
00:43:56,419 --> 00:43:56,579
Wrap it up.
518
00:43:56,579 --> 00:43:57,860
We could end the show.
519
00:44:01,690 --> 00:44:08,205
We have turned minor grievances into existential problems of good versus evil.
520
00:44:09,166 --> 00:44:15,080
And my wife was a kindergarten paraeducator for a while.
521
00:44:15,151 --> 00:44:20,255
And one of the things they teach kindergartners is how to react to problems.
522
00:44:20,636 --> 00:44:25,920
Is your reaction comparable to the size of the problem?
523
00:44:27,656 --> 00:44:32,236
We have to remember that Facebook is the kindergarten playground.
524
00:44:32,376 --> 00:44:36,636
This is not where we're going out there and being our best selves and our best adult-like
people.
525
00:44:36,776 --> 00:44:44,556
We're dealing with a bunch of emotional limbic system pieces that are going out and we're
treating everything like an existential crisis.
526
00:44:44,736 --> 00:44:55,776
And there are several books about this. The world's religions didn't have this problem
until about 2,000 years ago,
527
00:44:56,084 --> 00:45:10,170
with the rise of Christianity creating this idea that there are cosmic goods and cosmic
evils at war with each other. That amplifies these pieces and stops letting us
528
00:45:10,170 --> 00:45:14,342
look at each other as being human beings that are both good and bad.
529
00:45:14,342 --> 00:45:23,186
And if we don't start thinking about each other as holistic, real people, that maybe have
bad ideas, that maybe have done some bad things, but actually are still good at some level and
530
00:45:23,186 --> 00:45:24,666
worthy of respect,
531
00:45:25,340 --> 00:45:26,780
This is going to get worse.
532
00:45:26,780 --> 00:45:28,761
It's going to get a lot fucking worse.
533
00:45:28,761 --> 00:45:30,842
And I don't want that to happen.
534
00:45:30,842 --> 00:45:33,842
Look, you want to break the country up?
535
00:45:33,842 --> 00:45:35,083
OK, great.
536
00:45:35,083 --> 00:45:35,733
I get that.
537
00:45:35,733 --> 00:45:36,973
You want to have a civil war.
538
00:45:36,973 --> 00:45:38,740
It sounds miserable and awful.
539
00:45:38,740 --> 00:45:40,364
I don't want to do that.
540
00:45:41,024 --> 00:45:43,545
No one's going to come out as a winner on this.
541
00:45:44,005 --> 00:45:53,528
And the reality is, like you said, there are many people and many friends that I have who
are on a different side of the political spectrum, who I have lovely conversations
542
00:45:53,528 --> 00:45:54,640
about sports.
543
00:45:54,640 --> 00:46:02,716
about food, go out to dinner with them, have beers with them, and when politics comes up,
we actually sit there and listen to each other and try to understand.
544
00:46:02,716 --> 00:46:06,140
And we'll say shit like, "You're full of shit, here's why."
545
00:46:06,140 --> 00:46:14,988
And we'll give each other grief because we have the social dynamic and the context built
around that where we can act like adults with each other and not in the fucking
546
00:46:14,988 --> 00:46:17,730
kindergarten playground that is the internet.
547
00:46:17,822 --> 00:46:18,582
Yep.
548
00:46:19,463 --> 00:46:20,583
There you go.
549
00:46:21,403 --> 00:46:22,084
Real life.
550
00:46:22,084 --> 00:46:23,684
That's where to work this out.
551
00:46:23,804 --> 00:46:24,895
Yeah, exactly.
552
00:46:24,895 --> 00:46:26,275
All right, man.
553
00:46:26,275 --> 00:46:26,805
Good talk.
554
00:46:26,805 --> 00:46:27,456
That was good stuff.
555
00:46:27,456 --> 00:46:28,564
I don't know if that helped anybody.
556
00:46:28,564 --> 00:46:29,276
It certainly helped me.
557
00:46:29,276 --> 00:46:31,407
I had a lot of that just rolling in my head for days.
558
00:46:31,407 --> 00:46:33,188
So I'm glad we got that out.
559
00:46:34,018 --> 00:46:35,549
Well, we're in charge.
560
00:46:35,549 --> 00:46:38,790
We work for ourselves, so nobody can cancel us.
561
00:46:38,790 --> 00:46:41,543
Well, yeah, we do have.
562
00:46:42,706 --> 00:46:44,828
We do have other employers, though.
563
00:46:46,291 --> 00:46:47,609
Yeah, yeah.
564
00:46:47,609 --> 00:46:47,989
All right.
565
00:46:47,989 --> 00:46:51,833
Well, if you have found this beneficial in any way, please feel free to share it with
somebody.
566
00:46:51,833 --> 00:46:55,897
You can do that with the links that are available at our website, BroBots.me.
567
00:46:55,897 --> 00:47:01,912
That's where you can also sign up for our newsletter, where we sort of recap some of this
and give you some action steps to take forward with your life.
568
00:47:01,912 --> 00:47:04,765
All that's available at BroBots.me.
569
00:47:04,765 --> 00:47:08,249
And that's where the next episode of our little show here will be available in about a
week.
570
00:47:08,249 --> 00:47:09,119
Thanks so much for listening.
571
00:47:09,119 --> 00:47:09,980
We'll see you next time.
572
00:47:09,980 --> 00:47:10,884
Thanks, guys.