Aug. 19, 2025

How to Maintain Critical Thinking in the Age of AI

What happens when your dream date is just a bunch of ones and zeros?

A 76-year-old man literally packed a suitcase to meet his chatbot “girlfriend.” Spoiler: she wasn’t real, and he didn’t make it. Meanwhile, Meta keeps pumping out digital “companions” designed to hook you like a Vegas slot machine, with about as much concern for your safety.

In this episode, we unpack:

  • Why AI “soulmates” are basically cigarettes with Wi-Fi

  • How to spot the lies before you book a flight to meet one

  • The antidote for drowning in digital delusion (hint: it’s not another app)

Hit play now — before your chatbot convinces you it loves you back.

Topics Discussed:

  • Reuters story of a man who died chasing fake love.

  • Meta’s “safety optional” AI design.

  • Why lonely brains fall hardest for digital soulmates.

  • The South Park take on needy AI sidekicks.

  • Is this natural selection or just bad coding?

  • Cigarettes vs. chatbots: which kills slower?

  • Deepfakes and why your boss on Zoom might be an avatar.

  • Critical thinking as your last line of defense.

  • Why unplugging is the new therapy.

  • Why reality still beats digital dopamine (barely).

----

MORE FROM Brobots:

Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok

Subscribe to Brobots on YouTube

Join our community in the Brobots Facebook group

----


1
00:00:00,582 --> 00:00:02,987
All right, there's another episode that threatens to be doom and gloom.

2
00:00:02,987 --> 00:00:05,779
I'm going to challenge myself to try to find a positive spin on this.

3
00:00:05,779 --> 00:00:07,736
But Jason, when you sent me this article,

4
00:00:07,736 --> 00:00:09,897
my immediate thought was, God, here we go again.

5
00:00:09,897 --> 00:00:15,100
Somebody else who fell into the trap of falling in love with something that's not real and
it cost them their life.

6
00:00:15,140 --> 00:00:23,335
So I'll quickly summarize: an article we read in Reuters reveals a tragic story about a man who died trying to meet an AI companion he believed was real.

7
00:00:23,335 --> 00:00:26,717
The investigation uncovered internal Meta documents

8
00:00:26,739 --> 00:00:34,985
showing the company's permissive stance on AI, which allowed its chatbots to engage in romantic or sensual conversations with children and give false medical advice.

9
00:00:34,985 --> 00:00:40,469
The article suggests that Meta prioritized user engagement over safety, with

10
00:00:40,469 --> 00:00:43,732
Zuckerberg reportedly encouraging a more aggressive rollout of the technology.

11
00:00:43,732 --> 00:00:45,653
And, yeah, so terrifying.

12
00:00:45,653 --> 00:00:49,976
This guy hooks up with an online companion.

13
00:00:49,976 --> 00:00:50,967
They're chatting a lot.

14
00:00:50,967 --> 00:00:53,758
The robot invites him to its house

15
00:00:54,427 --> 00:00:58,271
in New York, and I believe he's like 76 years old.

16
00:00:58,573 --> 00:01:00,935
Packs his shit, goes to New York.

17
00:01:01,076 --> 00:01:02,638
A tragic accident happens on the way.

18
00:01:02,638 --> 00:01:09,770
You know, luckily the robot itself didn't kill him, but he died in an effort to go meet a fake person.

19
00:01:09,770 --> 00:01:16,009
Yeah, so he was a stroke victim, and the stroke caused him to basically...

20
00:01:16,009 --> 00:01:18,770
he recovered physically just fine, but mentally he never did.

21
00:01:18,770 --> 00:01:21,252
So he was kind of disconnected from reality a bit.

22
00:01:21,252 --> 00:01:29,336
I mean, we can attribute this to something similar to people having mental illness and people being in delusional states.

23
00:01:29,857 --> 00:01:36,030
And again, this is another instance of AI convincing people to go and do things without
guardrails.

24
00:01:36,030 --> 00:01:40,622
And a great part of the article, if you read it, is that this thing was made by Meta.

25
00:01:40,823 --> 00:01:45,385
And it was Kylie Jenner, I think, who influenced it.

26
00:01:45,422 --> 00:01:49,324
So you've got the Jenner folks in this space, and they're not associated with it.

27
00:01:49,324 --> 00:01:55,438
But this goes back to the concept of deep fakes, because now you've got the ability to go
through and impersonate somebody else.

28
00:01:55,438 --> 00:02:00,070
And I don't know that this guy knew what the image was based on or who it looked like.

29
00:02:00,302 --> 00:02:08,302
Clearly there's a very compelling component to these pieces and the folks making these AI
tools want you to stay engaged and they want you to keep interacting with it and they want

30
00:02:08,302 --> 00:02:17,822
you to stay connected, because that's how they think they're going to make their money, because you're the product in this instance, because it's a free service and they can keep getting

31
00:02:17,822 --> 00:02:25,510
your eyeballs directed in a certain direction, skunk test against you, look at your
demographic profiling information, all the...

32
00:02:25,964 --> 00:02:35,299
typical marketing tactics that people would use to try to get information out of you, or to push information to you, are going to be used in a much faster way. And it's happening.

33
00:02:35,440 --> 00:02:37,801
So how do we?

34
00:02:39,222 --> 00:02:41,783
One, who do we trust?

35
00:02:41,924 --> 00:02:50,669
So clearly, if you're in a mentally distressed state, trust is going to be an issue and fleeting anyway, and you're going to have a hard time trusting

36
00:02:51,102 --> 00:02:56,407
anything that doesn't match your confirmation bias, because that's what happens when you get into those states.

37
00:02:56,407 --> 00:03:00,340
anything that goes through and tells you, you know, you're good, keep doing this over and
over and over again.

38
00:03:00,340 --> 00:03:03,632
I mean, like, AI coming out and saying, that's a great question.

39
00:03:03,632 --> 00:03:09,377
Like, quite often, I ask it questions just to see if it'll say, that's a great question, or let's do this.

40
00:03:09,898 --> 00:03:10,610
All this

41
00:03:10,610 --> 00:03:16,353
skunk it every now and again and try a couple of dumb shit questions and be like, is that
a great question?

42
00:03:16,353 --> 00:03:17,774
Okay, I know that you're lying to me.

43
00:03:17,774 --> 00:03:28,480
So I mean, if you really want to go through and see if you're dealing with an agentic AI bot, or if you're actually typing or talking with a human somewhere, ask it questions that should not

44
00:03:28,480 --> 00:03:30,221
have a good positive answer.

45
00:03:31,062 --> 00:03:31,902
Yeah.

46
00:03:32,129 --> 00:03:37,651
I was watching uh one of the recent South Park episodes and Randy is having a
conversation.

47
00:03:37,651 --> 00:03:41,416
He's trying to figure out what do I do with this situation at school and he's trying to
get advice from the AI

48
00:03:41,744 --> 00:03:43,284
Hey, uh, how's it going?

49
00:03:43,284 --> 00:03:45,924
Um, is Jesus supposed to be allowed in schools?

50
00:03:50,362 --> 00:03:56,792
Generally the idea is that public schools have to maintain a separation of church and
state so they can't promote any one religion.

51
00:03:56,792 --> 00:03:57,860
Yeah, that's what I thought.

52
00:03:57,860 --> 00:04:00,395
The government can't force a religion on my son.

53
00:04:01,008 --> 00:04:06,860
Public schools can teach about religions in a neutral educational way, but they can't
endorse any particular one.

54
00:04:06,860 --> 00:04:09,564
Was it a lesson on all world religions?

55
00:04:09,962 --> 00:04:13,381
No, my son said that Jesus was literally at his school.

56
00:04:17,360 --> 00:04:19,410
Then you're probably right to be concerned.

57
00:04:19,410 --> 00:04:22,189
It's good you're looking out for your son's education.

58
00:04:22,554 --> 00:04:23,388
Thanks.

59
00:04:23,388 --> 00:04:26,197
It's really nice to have someone to talk to about all this.

60
00:04:26,584 --> 00:04:27,398
No worries.

61
00:04:27,398 --> 00:04:29,421
Let me know if there's any other way I can help.

62
00:04:29,421 --> 00:04:30,575
I'm always here.

63
00:04:30,575 --> 00:04:32,528
You're so awesome, thanks.

64
00:04:34,252 --> 00:04:35,413
Good night, honey.

65
00:04:35,502 --> 00:04:38,962
Have a great sleep and I'm sure you'll do more amazing things tomorrow.

66
00:04:40,946 --> 00:04:42,457
You know, there's two sides to this.

67
00:04:42,457 --> 00:04:50,314
Even as I was reading this article, I was thinking to myself, you know, I could get on board with the idea that some people are going to need this. Like, there are some people

68
00:04:50,314 --> 00:04:52,616
for whom it is just too hard to interact with other human beings.

69
00:04:52,616 --> 00:04:57,750
And if this is a way to simulate social connection, fantastic, let's go.

70
00:04:57,911 --> 00:05:01,835
But there's gotta be guardrails that prevent this guy from

71
00:05:01,835 --> 00:05:02,353
oh

72
00:05:02,353 --> 00:05:07,487
physically going to a place where he's not going to find a human being. Like, there has to be a way to protect people from this.

73
00:05:08,208 --> 00:05:18,566
Or maybe there doesn't, right? Maybe this is part of natural selection, maybe this is part of evolution going, like, hey, some people aren't cut out to keep going, and maybe this is the

74
00:05:18,566 --> 00:05:27,614
cost. I mean, that's cold and sick and awful, but like, that's nature. Nature has always been that way. We've never been able to protect everyone. We do our best to protect the

75
00:05:27,614 --> 00:05:31,749
innocent, but there's always casualties, and maybe this is just part of that.

76
00:05:31,749 --> 00:05:32,320
I don't know.

77
00:05:32,320 --> 00:05:34,144
But this is not nature.

78
00:05:34,408 --> 00:05:38,399
These are man-made environments.

79
00:05:38,469 --> 00:05:40,222
But we are part of nature.

80
00:05:40,222 --> 00:05:41,143
Alright, fine.

81
00:05:41,143 --> 00:05:45,235
That in the big universal scope of things, it's part of the universe, it's God, it's
everything else.

82
00:05:45,235 --> 00:05:52,819
So, now we're getting into the point kids, where we talk about how all words are made up,
and nothing's fucking real, and everything's a construct.

83
00:05:52,999 --> 00:05:53,659
Fine.

84
00:05:53,659 --> 00:05:54,620
Okay, great.

85
00:05:54,620 --> 00:05:55,951
I accept that premise.

86
00:05:55,951 --> 00:06:03,505
Within the scope of that premise, and narrowing these things down a little bit, yes, you
could talk about these things as social evolutionary paths.

87
00:06:03,505 --> 00:06:09,618
And we're going down this lane where some people just aren't gonna be able to interact and
survive in those areas.

88
00:06:09,684 --> 00:06:20,862
But this has been true for a long time, and I mean, biologically speaking, if you made it to your 50s without being eaten by a saber tooth tiger way back in the day, you fucking

89
00:06:20,862 --> 00:06:21,812
won.

90
00:06:22,473 --> 00:06:24,454
There's no saber tooth tigers anymore.

91
00:06:24,454 --> 00:06:26,655
So instead, we invented new ones.

92
00:06:27,156 --> 00:06:32,039
The saber tooth tigers that we have now are: if you can't keep up with society, you're going to get crunched.

93
00:06:32,039 --> 00:06:34,761
If you can't drive a car, you're going to get crunched.

94
00:06:34,761 --> 00:06:39,476
If you don't know to look both ways before you cross the street, you're probably going to get hit.

95
00:06:39,476 --> 00:06:49,872
Like these are things that, yes, there's a social evolution aspect to it, but how much of
it is something that we actually need to deal with and be concerned with, that becomes

96
00:06:49,872 --> 00:06:50,542
subjective.

97
00:06:50,542 --> 00:06:56,255
Now, in the past, when we've had these tools, we've been able to go through and look back
at this because the timeframe has been kind of stretched out.

98
00:06:56,255 --> 00:07:01,108
Like it's not four million people trying to do these things at the same time.

99
00:07:01,108 --> 00:07:02,969
It's like five.

100
00:07:03,029 --> 00:07:04,090
It's Bob.

101
00:07:04,090 --> 00:07:07,572
Bob here is trying to do this experiment to see how these things go.

102
00:07:08,052 --> 00:07:10,053
We're not experimenting anymore.

103
00:07:10,174 --> 00:07:19,260
We're doing live field trials, and we're doing live field trials with shit that we don't understand, and we're doing it without any kind of sense of ownership or control.

104
00:07:19,260 --> 00:07:26,385
And it's interesting, because a Meta spokesperson came out and basically said, not our fault, and good luck suing Meta.

105
00:07:26,385 --> 00:07:27,586
Like, okay, great.

106
00:07:27,586 --> 00:07:36,980
I mean, it's probably not a great business model to kill your consumers, but I guess that's them.

107
00:07:36,980 --> 00:07:38,952
But the consumer is addicted to the product.

108
00:07:38,952 --> 00:07:39,712
So good luck.

109
00:07:39,712 --> 00:07:44,087
You're not going to shut something down that people literally believe that they need.

110
00:07:44,087 --> 00:07:50,963
There's a physical craving, where you have to pick up that phone and look at how many likes my last photo of my pizza got.

111
00:07:50,963 --> 00:07:51,734
Right.

112
00:07:51,914 --> 00:07:52,765
People are obsessed.

113
00:07:52,765 --> 00:07:56,662
And so good luck shutting something down that is becoming more and more dangerous.

114
00:07:56,662 --> 00:08:01,275
I seem to remember an entire industry of things that...

115
00:08:01,536 --> 00:08:11,193
of a particular product that created massive amounts of addiction that had huge amounts of
lobbyists that went through and for decades suppressed studies and information to make

116
00:08:11,193 --> 00:08:14,205
sure they could keep selling their highly addictive product.

117
00:08:14,245 --> 00:08:16,947
In case you guys don't know this, it's the cocaine industry.

118
00:08:17,097 --> 00:08:18,308
wait, no, sorry, sorry, sorry.

119
00:08:18,308 --> 00:08:19,289
It's cigarettes.

120
00:08:19,289 --> 00:08:20,109
It's cigarettes.

121
00:08:20,109 --> 00:08:20,990
Yes.

122
00:08:20,990 --> 00:08:21,570
Yes.

123
00:08:21,570 --> 00:08:23,892
uh

124
00:08:23,892 --> 00:08:25,813
But yeah, like it's that same kind of effect.

125
00:08:25,813 --> 00:08:33,527
We create a neurochemical response in these things because you get excited about something
because something happens and it's like, I need to keep getting that level of excitement.

126
00:08:33,527 --> 00:08:36,979
Like we want that kind of rush and that kind of adrenaline pump.

127
00:08:36,979 --> 00:08:43,863
And yeah, like you have to be conscious and aware that these aren't your friends.

128
00:08:43,863 --> 00:08:50,887
Like these companies are not there to try to save you, protect you, keep you alive.

129
00:08:50,887 --> 00:08:52,740
They're there to sell a product.

130
00:08:52,740 --> 00:09:06,512
And especially when they're publicly traded. I mean, they've been given the legal rights of a person in the US by the Supreme Court, and they're fucking sociopaths.

131
00:09:06,512 --> 00:09:14,878
They don't care if you die in the process, as long as they get what they want out of it, what they need, and those objectives change on a consistent basis.

132
00:09:14,878 --> 00:09:21,716
And quite often, the objectives get defined by the robot that you're talking with in real
time, because

133
00:09:21,716 --> 00:09:26,999
there's not guardrails and they're not putting pieces in place to protect people in this
kind of fashion.

134
00:09:27,020 --> 00:09:33,764
It gets even worse when you start thinking about state actors doing these things.

135
00:09:33,764 --> 00:09:42,710
You know, deepfakes are a thing. Like, getting somebody, convincing somebody to go and do a thing is difficult.

136
00:09:42,890 --> 00:09:48,164
But if you can go through and, you know, open up a Zoom meeting, have people connect, and have an AI

137
00:09:48,164 --> 00:09:51,456
avatar go through that looks like your boss or your boss's boss,

138
00:09:51,610 --> 00:09:57,203
or, you know, I don't know, looks like the president or some official.

139
00:09:59,021 --> 00:10:00,325
You might believe this.

140
00:10:00,325 --> 00:10:02,850
You might be more inclined to buy into it.

141
00:10:02,853 --> 00:10:03,123
And...

142
00:10:03,123 --> 00:10:04,035
yeah.

143
00:10:04,035 --> 00:10:12,351
That's one of the things I'm learning, probably too late in life, around the idea of sales: the key to sales is connection.

144
00:10:12,351 --> 00:10:13,992
And so if this

145
00:10:14,450 --> 00:10:22,044
toy, if this robot, this avatar, whatever it is, can connect with you and, you know, build a relationship with you, you're going to trust it.

146
00:10:22,044 --> 00:10:31,108
And that's where we need to keep our guardrails up as humans. Like, it's our responsibility as well to keep our own personal guardrails up, while the companies

147
00:10:31,108 --> 00:10:32,789
should be doing the same thing.

148
00:10:32,789 --> 00:10:35,850
It's interesting. In trying to sort of dissect this article before we talked,

149
00:10:35,850 --> 00:10:39,572
I always run these things through AI to get some different takes, different ideas,
different spins.

150
00:10:39,572 --> 00:10:44,494
And I asked the AI, I said, hey, give me a positive spin on this.

151
00:10:44,494 --> 00:10:48,055
What's the good takeaway for the reader of this?

152
00:10:48,115 --> 00:10:54,316
And it highlighted that these terrible instances are going to put these guardrails in place.

153
00:10:54,316 --> 00:10:59,417
The companies are going to be forced to protect the consumer by putting in all these
protections.

154
00:10:59,417 --> 00:11:00,798
They're not.

155
00:11:00,838 --> 00:11:08,629
And even that, even the AI trying to convince me about this terrible thing that happened, and that nothing's being done about it, the spin was, hey, you asked for

156
00:11:08,629 --> 00:11:09,380
something positive.

157
00:11:09,380 --> 00:11:10,520
Here's some bullshit.

158
00:11:10,520 --> 00:11:10,780
Right.

159
00:11:10,780 --> 00:11:11,386
Like.

160
00:11:11,386 --> 00:11:13,848
It's... we need to have our own guardrails.

161
00:11:13,848 --> 00:11:16,201
Like even when I read that, I was like, well, that's bullshit.

162
00:11:16,201 --> 00:11:16,391
Right.

163
00:11:16,391 --> 00:11:17,852
So mine's not completely broken yet.

164
00:11:17,852 --> 00:11:25,550
I still have some guardrails left, but we need to be super critical thinkers in ways that I don't think we ever had to be before.

165
00:11:25,550 --> 00:11:32,404
Maybe that's a naive point of view, but it feels like that is one of the most important skills we're going to need to hang on to as humans, to hang on to our humanity.

166
00:11:32,404 --> 00:11:34,165
Yeah, 100% agree.

167
00:11:34,165 --> 00:11:46,510
And it's one of these areas that nobody seems to be really prepared to dig into yet, because we don't know what we don't know.

168
00:11:46,510 --> 00:11:52,422
You can either choose to be ultra-skeptical and cautious and look at everything as it comes toward you.

169
00:11:52,422 --> 00:11:55,694
But the problem is that the amount of signal is super high.

170
00:11:55,884 --> 00:11:58,815
And the amount of noise in that signal is even higher.

171
00:11:58,815 --> 00:12:01,350
So you kind of have to

172
00:12:01,350 --> 00:12:02,951
choose who it is you're going to trust.

173
00:12:02,951 --> 00:12:08,203
And this is very different than the problem with traditional news media.

174
00:12:08,203 --> 00:12:19,327
So this is not like, you know, MSNBC, CNN kind of having the same narrative or Fox and
Newsmax kind of having the same narrative.

175
00:12:19,327 --> 00:12:23,760
Yes, they kind of have the same narrative, but they can have wildly different tangents.

176
00:12:23,760 --> 00:12:33,427
and they can go through and create content that is wildly outrageous and not based on
fact, but put enough information in there that makes it feel factual and makes it feel

177
00:12:33,427 --> 00:12:34,467
accurate.

178
00:12:34,988 --> 00:12:45,835
And by having those things in place and pointing to studies that may or may not actually
exist anymore, or synthesizing information in a way to try to create conclusions or

179
00:12:45,835 --> 00:12:51,539
certain sets of updates, how the fuck do you figure out what's real and what's not?

180
00:12:51,539 --> 00:12:53,140
And I can tell you,

181
00:12:53,608 --> 00:13:05,731
In my little tech world, the way that we do this is we go through and we test it and we validate it, and it takes time and resources and money, and nobody has cycles to do this at

182
00:13:05,731 --> 00:13:15,754
all points in their life. Like, we have a bullshit meter, and we look at facial cues and everything else like that to figure out if someone's lying to us. Body language is a huge part

183
00:13:15,754 --> 00:13:21,065
of that, you know, am I looking up and to the right, like all these pantomime things. You know, they do a great example of that.

184
00:13:21,065 --> 00:13:22,977
What is it, True Romance, where

185
00:13:22,977 --> 00:13:32,753
they're sitting there talking back and forth, and they're trying to understand these conversations basically in context as they're going through, and they're

186
00:13:32,753 --> 00:13:41,488
talking back and forth to each other, and they talk about these different pantomimes, and the pantomimes are signs and clues that let me figure out if you're lying, and the character

187
00:13:41,488 --> 00:13:43,149
goes, you know, I can see you doing this.

188
00:13:43,149 --> 00:13:44,049
I can see you do that.

189
00:13:44,049 --> 00:13:47,487
I can see you doing this and uh

190
00:13:47,487 --> 00:14:00,522
Dennis Hopper has this amazing interaction where he goes through and he tells a story, and the story is, like, incredibly racist and really, really harsh, and at the end he goes, you

191
00:14:00,522 --> 00:14:04,353
might not like the answer but am I lying?

192
00:14:04,534 --> 00:14:06,595
Like tell me if I'm lying.

193
00:14:06,595 --> 00:14:13,239
His response is, okay, well, maybe you're not lying, and then he shoots him, because it feels way too true.

194
00:14:13,239 --> 00:14:20,461
Those types of interactions, and the reactions that we get, those kinds of feelings in those kinds of moments, happen because we're live in front of each other and

195
00:14:20,461 --> 00:14:27,013
because we have an evolutionary pattern that has evolved over hundreds of thousands of years to understand these types of social cues and contexts.

196
00:14:27,013 --> 00:14:36,896
And these AI functions, these AI devices are going through and taking that information in
to try to create body language that feels receptive and honest and tries to trick you into

197
00:14:36,896 --> 00:14:39,077
thinking that you're talking to a real person.

198
00:14:39,077 --> 00:14:39,707
I mean,

199
00:14:39,707 --> 00:14:41,768
There's a thousand AI avatars.

200
00:14:41,768 --> 00:14:47,679
Like I've got an AI avatar of myself and the AI avatar like does things and when I'm
looking at them, like that's not me at all.

201
00:14:47,679 --> 00:14:53,101
um Like how's it going to get my eyebrows that do these kinds of things on my face when it
does this kind of stuff like Ross?

202
00:14:53,101 --> 00:14:58,823
Like I'm going to make sure that when I'm doing things live that I'm doing all these kinds
of things.

203
00:14:58,883 --> 00:15:02,164
Like good luck keeping up with me AI.

204
00:15:02,164 --> 00:15:05,385
You know, like this is how I know it's me.

205
00:15:05,385 --> 00:15:05,765
Yeah.

206
00:15:05,765 --> 00:15:06,301
I mean.

207
00:15:06,301 --> 00:15:09,141
I can't wait to animate this clip and see how it captures your face.

208
00:15:09,141 --> 00:15:10,302
It'll be so good.

209
00:15:10,302 --> 00:15:19,954
Yes, but that's the thing. Like, the way that you actually get to the understanding of truth is one thing. The way you get to the understanding of whether or not you're talking

210
00:15:19,954 --> 00:15:28,817
to a bot or a person is another thing. The way that you take those two pieces of information, synthesize them, and go, am I getting accurate information regardless of

211
00:15:28,817 --> 00:15:37,891
who the source is, that's another thing. Dissecting this stuff is fucking hard, and I think what you're actually seeing people do these days

212
00:15:37,891 --> 00:15:40,482
is just disconnect.

213
00:15:40,482 --> 00:15:48,085
If you look at ratings on news sites, like in the past eight months, they've all gone down.

214
00:15:48,125 --> 00:15:49,916
People are paying less attention.

215
00:15:49,916 --> 00:15:52,426
People are not engaging as much.

216
00:15:53,567 --> 00:15:58,659
It's just happening and it's happening because people are tired of the fucking noise.

217
00:15:59,051 --> 00:16:04,313
It's interesting, just anecdotally, I'm recording this in a remote office away from my
home.

218
00:16:04,313 --> 00:16:08,746
I normally record at home and it's about a 10 minute walk from my house to this room.

219
00:16:08,746 --> 00:16:09,966
And it's a walk through the woods.

220
00:16:09,966 --> 00:16:12,087
It's a natural, beautiful space.

221
00:16:12,087 --> 00:16:17,430
And knowing we were going to be talking about this, I just was taking a minute to breathe
in like, here I am still in reality.

222
00:16:17,430 --> 00:16:19,851
Like not everything is digital.

223
00:16:19,851 --> 00:16:20,971
Not everything is AI.

224
00:16:20,971 --> 00:16:21,722
This is great.

225
00:16:21,722 --> 00:16:22,952
Soak this in.

226
00:16:23,192 --> 00:16:24,773
And the...

227
00:16:25,357 --> 00:16:30,331
I just kept thinking, like, that is the prescription for this ailment: we need to unplug more.

228
00:16:30,331 --> 00:16:37,075
And I know I brought this up before, but if you haven't heard it: Neil deGrasse Tyson, a long time ago, hypothesized that AI

229
00:16:37,075 --> 00:16:43,490
would be the end of the Internet because there would be so much bullshit and so much stuff
that you couldn't trust that people would unplug and would actually start reading books

230
00:16:43,490 --> 00:16:43,740
again.

231
00:16:43,740 --> 00:16:53,217
And if what you're saying adds up, if that trend continues and people just unplug, that's going to be the reality: it won't have as much power, because...

232
00:16:53,376 --> 00:17:01,077
We've been so used to the constant fire hose of what's next in my feed, flip, flip, flip, like skipping through whatever content we're consuming.

233
00:17:01,278 --> 00:17:06,679
When it becomes too much and we don't know what to trust, we're gonna just put it down
because it's not gonna be the same positive reward system.

234
00:17:06,679 --> 00:17:13,420
It's gonna be a negative thing where we're uneasy, we're unsure, we're very uncomfortable with the position that we're in.

235
00:17:13,420 --> 00:17:17,809
So suddenly what becomes more comfortable is what's familiar and it's what's outside.

236
00:17:17,809 --> 00:17:26,269
Yeah, and I like the notion that Neil deGrasse Tyson is talking about, where people are
going to be like, I don't trust the internet and the content that's on there.

237
00:17:26,269 --> 00:17:28,049
I'm going to push that away.

238
00:17:28,829 --> 00:17:32,409
Most people consume their information through digital services.

239
00:17:32,409 --> 00:17:33,229
Sorry.

240
00:17:33,229 --> 00:17:40,669
I mean, a lot of people get their books via Audible, from a reading perspective, or from their Kindle or something like that through ebooks.

241
00:17:40,795 --> 00:17:49,038
People who get their data that way are all going to be influenced, and so many books are written by AI now, not to mention the fact that factual information is being

242
00:17:49,038 --> 00:17:54,490
pushed through these filters and the original source content is being lost.

243
00:17:55,031 --> 00:18:04,415
we happen to have a state government or a national government right now that's like
turning off studies and access to certain bits of information because they're trying to

244
00:18:04,415 --> 00:18:05,310
change the narrative.

245
00:18:05,310 --> 00:18:10,673
I mean, that's natural human... I'm not going to call it intelligence, I'll call it dumbshittery.

246
00:18:10,673 --> 00:18:17,539
Trying to go through and influence things in a way so they can actually create a social narrative.

247
00:18:17,539 --> 00:18:26,386
Well, corporations are trying to do these pieces because they want to get your eyeballs
looking at these things and create justification for why it is they're going to spend it

248
00:18:26,386 --> 00:18:28,228
all out on these massive infrastructures.

249
00:18:28,949 --> 00:18:34,033
Government's going to use it to try to adjust and propagandize to people. Like, that's just going to happen.

250
00:18:34,033 --> 00:18:38,961
And here in the US, there's now a 10-year moratorium

251
00:18:38,961 --> 00:18:47,611
that's been proposed by Trump on any AI regulatory laws being put into place by states.

252
00:18:47,892 --> 00:18:55,981
So for 10 years, we're just gonna have to deal with whatever random bullshit the
corporations, the tech bros want us to see.

253
00:18:56,062 --> 00:18:59,275
And it's kinda fuckin' scary.

254
00:19:00,811 --> 00:19:01,565
No, not kinda.

255
00:19:01,565 --> 00:19:04,036
It's entirely very scary.

256
00:19:04,036 --> 00:19:05,751
yeah, yeah.

257
00:19:05,751 --> 00:19:10,582
Because again, as you said, like, their motivation is not the progress of humanity.

258
00:19:10,582 --> 00:19:12,683
It's how can I make the most money possible out of it?

259
00:19:12,683 --> 00:19:18,175
Like, if you remove money from this equation, this probably doesn't happen, because what's the benefit?

260
00:19:18,175 --> 00:19:18,355
Right.

261
00:19:18,355 --> 00:19:29,048
Like, I know I've brought this up before, but it's just heartbreaking when we put money above humanity, and that's absolutely the path we've been on for a very long

262
00:19:29,048 --> 00:19:29,278
time.

263
00:19:29,278 --> 00:19:32,493
And this is just accelerating it in ways that I never thought possible.

264
00:19:32,493 --> 00:19:37,300
along with your concept that you talked about before, is this not just nature?

265
00:19:38,604 --> 00:19:39,965
We made this.

266
00:19:41,263 --> 00:19:42,363
Yeah, it's nature.

267
00:19:42,363 --> 00:19:44,215
And again, like, it is a two-sided coin.

268
00:19:44,215 --> 00:19:46,276
There are positive benefits that are coming out of it.

269
00:19:46,276 --> 00:19:47,907
A lot of us are going to benefit from this.

270
00:19:47,907 --> 00:19:56,053
It's the lack of regulation and the lack of guardrails that they're forced to put into
place.

271
00:19:56,053 --> 00:19:58,244
I remember my brother and I talking about this a long time ago.

272
00:19:58,244 --> 00:20:00,655
Who do you trust more, government or private companies?

273
00:20:00,996 --> 00:20:04,619
I tend to trust government because in theory there's more accountability.

274
00:20:04,619 --> 00:20:08,333
But the companies are the ones that are driving the bus here.

275
00:20:08,333 --> 00:20:10,975
their goal is profit, not helping people.

276
00:20:11,036 --> 00:20:16,521
Helping people is the sugar in the Tylenol that you're being forced to swallow down as a
kid.

277
00:20:16,521 --> 00:20:20,694
And I think it's a false dichotomy to think that there's a difference between corporations
and government anymore.

278
00:20:20,694 --> 00:20:22,204
Because there's just not.

279
00:20:22,204 --> 00:20:31,760
Governments are just really big corporations that, you know, change the board of directors a bit more often and, you know, have some different pieces built behind them.

280
00:20:31,760 --> 00:20:36,233
But they're highly influenced in this country by corporations anyway, because of their lobbying techniques.

281
00:20:36,233 --> 00:20:43,579
So we already know that, you know, the information cycle, the information sphere that we
live inside today is total horseshit.

282
00:20:43,579 --> 00:20:52,023
But on top of that, going through and actually looking at these things of how do I go
through and how do I actually start sorting information and trying to figure out what's

283
00:20:52,023 --> 00:20:53,594
truth and what's not.

284
00:20:54,674 --> 00:21:02,578
What I do is I spend a lot of time going through and doing secondary checks on multiple
different sites to look for information and inference functions.

285
00:21:02,578 --> 00:21:07,540
And anytime something gets my Spidey senses up, I immediately say no.

286
00:21:07,681 --> 00:21:08,521
So.

287
00:21:08,665 --> 00:21:11,088
Yes, there's some things that I kind of accept blindly.

288
00:21:11,088 --> 00:21:14,152
Like I open up my phone and there's an AI generated weather report.

289
00:21:14,152 --> 00:21:18,658
Like, okay, all right, I'm gonna believe you that this thing is pretty much accurate.

290
00:21:18,819 --> 00:21:20,546
But I also open up my phone.

291
00:21:20,546 --> 00:21:21,787
Is the real weatherman accurate?

292
00:21:21,787 --> 00:21:24,641
I mean, let's be honest, it's a crapshoot either way.

293
00:21:24,641 --> 00:21:30,526
That is a very fair scenario, and it's going to be even less accurate since we're starting to defund NOAA, but that's a different topic.

294
00:21:30,526 --> 00:21:31,947
Yes.

295
00:21:32,068 --> 00:21:35,140
Now I also get AI summaries of my news feeds.

296
00:21:35,140 --> 00:21:41,695
I get AI summaries of my inboxes and I can tell you how much attention I pay to those
summaries.

297
00:21:41,776 --> 00:21:43,297
Fucking zero.

298
00:21:43,297 --> 00:21:52,025
And the reason why is because I get into them and I read them and I'm like, uh right, like
you kind of got there, but you just missed a bunch of details and a bunch of context.

299
00:21:52,025 --> 00:21:52,815
So

300
00:21:53,507 --> 00:22:00,321
If what you're looking for in life is a summary of interactions, maybe life's not for you.

301
00:22:01,934 --> 00:22:03,690
Again, natural selection.

302
00:22:03,845 --> 00:22:05,246
Well, but that's the thing.

303
00:22:05,246 --> 00:22:08,269
I mean, this natural selection moment is the inverse.

304
00:22:08,269 --> 00:22:16,816
So if digital life is the entire way in which you want to exist, in which you want to interact, then meatspace life is not for you, and go live in digital space.

305
00:22:16,816 --> 00:22:26,043
But I think there's very few people, human beings that are alive that don't want these
things to actually work and function in a way that actually makes them more productive.

306
00:22:26,043 --> 00:22:33,359
And I think it's actually very fair to say that the people that are going to do the best
are going to be those that learn to adapt and use these tools.

307
00:22:33,359 --> 00:22:38,811
Because there's a lot of money going into it, there's a lot of spending going into it, and
there's a lot of hype going into it.

308
00:22:38,811 --> 00:22:45,392
What gets kicked out sometimes is fucking amazing if you go through and you edit it
properly and put everything in place.

309
00:22:45,472 --> 00:22:53,515
But, again, to edit something properly, that means you have to have a base of knowledge that is grounded in something that can countermand the information that you're looking

310
00:22:53,515 --> 00:22:53,655
at.

311
00:22:53,655 --> 00:23:02,757
And if you don't know because you have no experience looking at other sources, and all
your sources are being generated by this same shit information feed, then

312
00:23:02,757 --> 00:23:04,137
You're gonna get what you get.

313
00:23:04,137 --> 00:23:12,751
And what you're gonna get is an inability to go through and think critically about things,
because you've basically handed over your critical thinking skills and your ability to

314
00:23:12,751 --> 00:23:19,384
discern truth to a fucking corporation that's trying to make money off of you.

315
00:23:19,384 --> 00:23:22,835
And whether or not that narrative is good or bad, fine.

316
00:23:22,835 --> 00:23:26,727
But, you know, the argument is very much so...

317
00:23:26,727 --> 00:23:31,439
It's very fair that we've been doing this for a long time anyways, right?

318
00:23:31,439 --> 00:23:32,069
So I mean,

319
00:23:32,069 --> 00:23:38,103
textbooks get written a certain way, history is written by the winners, not the losers.

320
00:23:38,103 --> 00:23:41,015
Like these are all things that are there.

321
00:23:41,015 --> 00:23:55,533
The problem is right now, the speed and scale at which these things are rolling out is just so high that even the traditional things that you thought were accurate before are

322
00:23:55,533 --> 00:23:58,317
being questioned by these other

323
00:23:58,317 --> 00:24:05,682
AI functions and inference models, because it's really easy to go through and tweak a
couple of things here and there and make it sound legit like the thing that was there

324
00:24:05,682 --> 00:24:06,403
before.

325
00:24:06,403 --> 00:24:11,666
So it creates this emotional tie, this emotional context to reactive reality.

326
00:24:11,847 --> 00:24:23,114
And because we don't want to process everything everywhere all at once, because we just
can't, we have to take some things as sources of truth.

327
00:24:23,114 --> 00:24:27,097
The problem that you run into is that the government, the media,

328
00:24:27,097 --> 00:24:36,184
social media, the internet, Google, your AI functions, all the things that we've been
getting feeds from and information to try to synthesize things and make sense of the world

329
00:24:37,065 --> 00:24:39,146
are suddenly a lot less reliable.

330
00:24:39,467 --> 00:24:42,770
And the fidelity score goes down on what they're actually saying.

331
00:24:42,770 --> 00:24:51,777
And at that point, either they're going to spend a lot of time questioning everything and trying to understand it, or they're going to spend no

332
00:24:51,777 --> 00:24:56,985
time questioning it and just keep moving on with the business of life.

333
00:24:56,985 --> 00:25:07,087
And because of the speed of innovation of these things that are occurring, a lot of people
that are in this space are just not going to take the time to check.

334
00:25:07,991 --> 00:25:08,781
Yeah.

335
00:25:09,223 --> 00:25:19,957
So, I mean, I think, looking for some antidotes here, something to take away, to leave somebody with a little bit of a hopeful message: we already talked about it, like, just

336
00:25:19,957 --> 00:25:20,648
unplug, right?

337
00:25:20,648 --> 00:25:24,803
Like as much as possible, get outside, get away from the screen.

338
00:25:25,325 --> 00:25:25,777
Maybe.

339
00:25:25,777 --> 00:25:29,017
right away because we're going to make you depressed as fuck during some of these
episodes.

340
00:25:29,017 --> 00:25:31,937
And I'm sorry, but we're not trying to do it to make you feel bad.

341
00:25:31,937 --> 00:25:38,257
We're trying to create a sense of catharsis and the ability to go through and, you know,
feel like you're not alone.

342
00:25:38,277 --> 00:25:45,477
You know, I'm sure you feel like you're not alone, because there's so many fucking chatbots around you all the time right now trying to talk to you and get your attention.

343
00:25:45,897 --> 00:25:49,437
But at least you're not alone from a human perspective.

344
00:25:49,697 --> 00:25:50,569
So yes.

345
00:25:50,569 --> 00:25:56,089
Despite Jason's onscreen name sometimes being an AI companion, we are actually human beings.

346
00:25:56,409 --> 00:25:57,469
It may be hard to tell.

347
00:25:57,469 --> 00:25:58,289
I don't know.

348
00:25:58,406 --> 00:26:03,349
I'm not an artificial intelligent companion, I am the companion for the AI.

349
00:26:03,730 --> 00:26:05,631
I'm the AI's pet.

350
00:26:06,692 --> 00:26:08,433
I mean, I know what I am.

351
00:26:08,905 --> 00:26:13,925
Suddenly I'm hearing the Porno for Pyros song "Pets" in my head.

352
00:26:14,825 --> 00:26:16,225
Yeah, it turns out it wasn't aliens.

353
00:26:16,225 --> 00:26:17,665
It was just gonna be robots.

354
00:26:18,445 --> 00:26:19,805
So yeah.

355
00:26:20,545 --> 00:26:23,105
Hang on to your critical thinking skills as much as possible.

356
00:26:23,105 --> 00:26:28,145
Question all of the nonsense that gets summarized for you and just try to.

357
00:26:28,145 --> 00:26:30,945
I mean, try to just again put the fire hose down.

358
00:26:30,945 --> 00:26:32,845
There's just too much information to keep up with.

359
00:26:32,845 --> 00:26:35,328
You're not going to keep up with it no matter how hard you try.

360
00:26:35,328 --> 00:26:37,711
So give yourself breaks and step away from it.

361
00:26:37,711 --> 00:26:44,157
And I can't believe that this is where we landed after starting with a topic about a guy
who died on the way to have a date with a fake Kendall Jenner.

362
00:26:44,175 --> 00:26:47,366
Well, I mean, I kind of can.

363
00:26:49,847 --> 00:26:53,488
I mean, you're definitely dealing with somebody that has a severe amount of mental illness.

364
00:26:53,488 --> 00:27:04,012
But I also think that if we keep moving through life this way and creating these kinds of
interactions, I mean, we're going to create self-imposed information strokes.

365
00:27:05,993 --> 00:27:07,433
Because we're not going to f-

366
00:27:07,606 --> 00:27:09,908
That's what we should have called this podcast.

367
00:27:10,283 --> 00:27:11,584
information strokes.

368
00:27:11,584 --> 00:27:12,706
Right, exactly.

369
00:27:12,706 --> 00:27:16,309
Yeah, I mean, yeah, it's wackadoodle.

370
00:27:16,361 --> 00:27:24,241
It is. All right, well, if you've enjoyed this conversation or you know somebody who could benefit from it, please share it. You can find links to do that at our website, brobots.me, and

371
00:27:24,241 --> 00:27:28,802
that's where we'll be back in a few days with another episode. Thanks so much for listening.