How The Superman Movie Gave Me Hope About Our AI Future
My productivity hack: https://www.magicmind.com/FITMESS20 Use my code FITMESS20 for 20% off #magicmind
----
Who controls the machines when AI gets superpowers?
We're living through the most significant technological shift in human history, and most people are arguing about all the wrong things. While culture warriors battle over Superman's immigration status, the real story is staring us in the face: the war for control of artificial intelligence. This isn't some distant sci-fi fantasy anymore – it's happening right now, and the stakes couldn't be higher.
The newest Superman movie accidentally became the perfect metaphor for our AI moment. You've got Lex Luthor commanding an army of machines like he's playing the world's most dangerous video game, while Superman fights back with his own AI companions. Sound familiar? That's because we're already living it. The question isn't whether humans and machines will merge – it's whether the good guys or the Lex Luthors of the world get to decide how it happens.
Listen now to discover how a comic book movie reveals the three critical choices we're making about AI right now that will determine whether technology saves humanity or enslaves it.
10 Topics Discussed:
- The BroBots Rebrand - Why The Fit Mess is evolving into something bigger as we embrace the human-machine future
- Superman as AI Metaphor - How James Gunn's film accidentally became the perfect commentary on our current AI moment
- The Lex Luthor Problem - Why the people building AI might not be the people we want controlling it
- Intent vs Technology - How AI amplifies human nature, both good and evil, rather than changing it
- The Video Game Controller War - Lex Luthor's command system and what it reveals about human-machine interfaces
- Mr. Terrific's Cool Factor - Why the best AI integration makes humans more capable, not obsolete
- Biological Augmentation - The Engineer's sacrifice and what giving up humanity for technology really costs
- Real-World Supervillains - How tech billionaires are becoming the comic book antagonists we used to only fear in fiction
- Breaking Echo Chambers - Why putting down your screen and talking to real humans is the ultimate AI defense
- The Culture War Distraction - How fake outrage over Superman's "woke" themes distracts from the real technological threats
----
NEW WEBSITE:
www.brobots.me
----
MORE FROM THE FIT MESS:
Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok
Subscribe to The Fit Mess on YouTube
Join our community in the Fit Mess Facebook group
----
LINKS TO OUR PARTNERS:
- Explore the many benefits of cold therapy for your body with Nurecover
- Muse's Brain Sensing Headbands Improve Your Meditation Practice.
- Get a Free One Year Supply of AG1 Vitamin D3+K2, 5 Travel Packs
- You Need a Budget helps you quickly get out of debt, and save money faster!
1
00:00:00,141 --> 00:00:04,043
Who controls the machines when AI gets superpowers?
2
00:00:04,048 --> 00:00:07,770
We're living through the most significant technological shift in human history.
3
00:00:07,770 --> 00:00:10,972
And most people are arguing about all the wrong things.
4
00:00:10,972 --> 00:00:16,516
While culture warriors battle over Superman's immigration status, the real story is
staring us right in the face.
5
00:00:16,516 --> 00:00:19,938
The war for who gets to control artificial intelligence.
6
00:00:20,041 --> 00:00:22,523
This isn't some distant sci-fi fantasy anymore.
7
00:00:22,523 --> 00:00:25,765
It's happening right now and the stakes couldn't be higher.
8
00:00:26,581 --> 00:00:30,224
The new Superman movie is the perfect metaphor for this moment in our history.
9
00:00:30,224 --> 00:00:38,529
You've got Lex Luthor commanding an army of machines like he's playing the world's most
dangerous video game, while Superman fights back with his own AI companions.
10
00:00:38,947 --> 00:00:47,210
The question isn't whether humans and machines will merge. It's whether the good guys or
the Lex Luthors of the world get to decide how it happens.
11
00:00:57,963 --> 00:01:02,733
This movie reveals three critical choices we're making about AI right now that will
determine whether technology saves humanity or enslaves it.
12
00:01:03,949 --> 00:01:05,893
This is The Fit Mess, for now.
13
00:01:05,893 --> 00:01:09,934
It's a podcast for men who want to use AI to safely manage their mental health.
14
00:01:10,241 --> 00:01:15,003
All right, Jason, I finally saw the Superman movie that everybody is fighting about, over
all the wrong things.
15
00:01:15,003 --> 00:01:16,174
I can't wait to talk about that.
16
00:01:16,174 --> 00:01:17,665
We're going to get into that today.
17
00:01:17,665 --> 00:01:20,126
But also, we have a big announcement.
18
00:01:20,126 --> 00:01:27,129
We are making some big changes here at the show that we want to make sure you are aware of,
you long-time listeners and anybody who's come along recently and noticed some things have
19
00:01:27,129 --> 00:01:28,269
changed a bit.
20
00:01:28,269 --> 00:01:31,091
And so I'll just put it out there.
21
00:01:31,091 --> 00:01:33,151
We've been doing The Fit Mess.
22
00:01:33,312 --> 00:01:38,094
Zach and I, and Jason, you and I for the last few months, and our friend Joe has been
participating as well.
23
00:01:38,354 --> 00:01:40,205
And we've been doing The Fit Mess for
24
00:01:40,205 --> 00:01:40,956
seven or eight years.
25
00:01:40,956 --> 00:01:44,938
It's been a very long time, and it's time for The Fit Mess to end.
26
00:01:45,058 --> 00:01:47,050
It doesn't mean that this show is going away.
27
00:01:47,050 --> 00:01:51,823
We're just going to be taking a different approach, calling it something else, and
rebranding.
28
00:01:51,823 --> 00:02:00,228
So rather than just say out loud what we're going to do, I thought what I would do is, I've
been doing a lot of research on this, asking a lot of people, getting some
29
00:02:00,228 --> 00:02:01,429
different advice.
30
00:02:01,429 --> 00:02:05,372
And one of the pieces of advice I got was from a Hall of Fame podcaster.
31
00:02:05,372 --> 00:02:10,135
And so I wanted to share the clip with you of me going to him to find out what he
32
00:02:10,135 --> 00:02:12,335
thought of our crazy new idea.
33
00:02:12,335 --> 00:02:16,827
So let me just bring that up here and play this.
34
00:03:15,171 --> 00:03:17,062
And we're kind of sarcastic and fun.
35
00:03:17,062 --> 00:03:18,494
At least that's the goal.
36
00:03:18,494 --> 00:03:24,990
We don't always land there, it's certainly the goal every time we... It's true.
37
00:03:24,990 --> 00:03:26,704
is the real fungible amount.
38
00:03:26,704 --> 00:03:29,941
I mean, fun, you can't spell fungible without fun.
39
00:03:29,941 --> 00:03:30,435
Right.
40
00:03:30,435 --> 00:03:32,131
We put the fun in fungible.
41
00:03:32,265 --> 00:03:40,731
That of course was Dave Jackson from the School of Podcasting during a uh webinar he
hosted for Podpage, which is an amazing podcast hosting website, by the way, if you're
42
00:03:40,731 --> 00:03:46,167
looking for one, we'll make sure we have a link in the show notes for that as well, in
case you're looking for a place to host your podcast.
43
00:03:46,309 --> 00:03:47,601
ah
44
00:03:47,601 --> 00:03:55,764
I mean, listen, this is a decision we make with a heavy heart because the brand is
something that's been, you know, part of my life for many years, part of Zach's life for
45
00:03:55,764 --> 00:03:56,125
many years.
46
00:03:56,125 --> 00:03:57,726
And Zach is still involved.
47
00:03:57,726 --> 00:03:59,527
He's just kind of on the beach right now.
48
00:03:59,527 --> 00:04:02,138
Life has significantly gotten in the way.
49
00:04:02,138 --> 00:04:08,251
And as we've talked about on the show many times, when things are not serving you,
put them down and make some space.
50
00:04:08,251 --> 00:04:10,342
So when Zach is ready, he'll return.
51
00:04:10,342 --> 00:04:12,102
Joe was kind of in a similar place.
52
00:04:12,102 --> 00:04:14,033
uh Life has suddenly gotten super busy.
53
00:04:14,033 --> 00:04:16,905
And so, Jason, you and I have been sort of steering the ship for the last few months.
54
00:04:16,905 --> 00:04:17,525
And this is the
55
00:04:17,525 --> 00:04:19,166
direction that the ship has gone.
56
00:04:19,166 --> 00:04:25,559
And so just every time I said The Fit Mess opening the show, it just felt like, are we even,
like, we're not even really doing that anymore.
57
00:04:25,559 --> 00:04:27,491
This is a different thing.
58
00:04:27,491 --> 00:04:32,154
And so we just want to make it clear to the audience exactly what they're going to get
when they hit play with us.
59
00:04:33,255 --> 00:04:34,075
Right.
60
00:04:34,696 --> 00:04:39,059
is now augmented by artificial intelligence.
61
00:04:43,084 --> 00:04:44,485
Yes, exactly.
62
00:04:44,485 --> 00:04:48,687
So what you will find going forward is me probably still accidentally saying The Fit Mess
a lot.
63
00:04:48,687 --> 00:04:53,509
But the new website will be BroBots.me because we're all becoming robots.
64
00:04:53,509 --> 00:04:56,691
So BroBots.me and your feed will change.
65
00:04:56,691 --> 00:04:58,031
Suddenly you'll have a new logo for us.
66
00:04:58,031 --> 00:05:05,905
So things will seem a little out of sorts, but it's still us, still talking about the ways
that we can help benefit your mental health, just primarily around technology and AI
67
00:05:05,905 --> 00:05:09,567
because that's where a lot of attention is going right now and people have a lot of
questions.
68
00:05:09,567 --> 00:05:14,229
We have a lot of questions, and in trying to answer them, we want to share what we find with
you.
69
00:05:14,409 --> 00:05:15,870
That's the big announcement.
70
00:05:18,171 --> 00:05:19,862
Absolutely, absolutely.
71
00:05:19,862 --> 00:05:25,841
We will definitely employ the robots to do the heavy lifting on the giant donuts for sure.
72
00:05:26,343 --> 00:05:31,956
Speaking of evolving and becoming something better, you know what's helped me stay sharp
during this whole rebrand process?
73
00:05:31,956 --> 00:05:33,137
That's right, MagicMind.
74
00:05:33,137 --> 00:05:40,760
Look, when you're making big changes like we are, new name, new focus, diving deep into AI
and technology, you need your mind firing on all cylinders.
75
00:05:40,760 --> 00:05:47,537
MagicMind combines nootropics, adaptogens, and functional mushrooms to help with focus and
mental clarity without the jitters.
76
00:05:47,537 --> 00:05:54,530
Honestly, it's been a game changer for me when trying to wrap my head around complex AI
concepts and well, Superman's robot army tactics.
77
00:05:54,864 --> 00:05:57,772
It's made a huge difference in my life and I think it will for you as well.
78
00:05:57,772 --> 00:06:06,946
So use that big brain of yours to head over to magicmind.com forward slash FITMESS20. Use
the promo code FITMESS20 to get 20% off your order.
79
00:06:07,044 --> 00:06:12,172
Again, that's magicmind.com forward slash FITMESS20 to get 20% off.
80
00:06:12,607 --> 00:06:14,159
OK so that's the big announcement.
81
00:06:14,159 --> 00:06:19,851
The other big thing we have to talk about is Superman because I finally saw it and this is
not going to be movie review time.
82
00:06:19,851 --> 00:06:22,553
But when I saw this thing.
83
00:06:23,477 --> 00:06:25,638
First of all, I was like, everyone's arguing about the wrong stuff.
84
00:06:25,638 --> 00:06:29,510
Like, the immigrant and the nice guy and the woke bullshit, blah, blah, blah.
85
00:06:29,510 --> 00:06:36,464
Like, what a stupid fucking culture war to wage over a fantastic movie that I loved for a
million reasons.
86
00:06:36,464 --> 00:06:37,605
It's, it's...
87
00:06:38,157 --> 00:06:42,019
It's ridiculous in all the right ways and super cool in all the right ways.
88
00:06:42,019 --> 00:06:54,413
But when I saw what we've been talking about here for months, the blending of human
biology with machine, the use of AI, the use of technology for good and evil on display in
89
00:06:54,413 --> 00:06:57,445
such a just-hit-you-over-the-head manner.
90
00:06:57,445 --> 00:06:58,695
I was like, we got to talk about this.
91
00:06:58,695 --> 00:07:00,526
I'm so glad you've already seen it, too.
92
00:07:00,526 --> 00:07:06,128
What was your reaction when you saw the way that they've sort of approached this part of
the story?
93
00:07:06,165 --> 00:07:06,438
So.
94
00:07:06,438 --> 00:07:07,651
um
95
00:07:09,443 --> 00:07:18,907
One, as far as like a record of uh sci-fi that's been used for years as tropes and
everything else, it is like old sci-fi concepts.
96
00:07:18,907 --> 00:07:24,440
A lot of it is pulled from way back in the day, from old literature, like Isaac Asimov
level.
97
00:07:24,440 --> 00:07:27,591
Like we're talking like stuff from like the 60s and 70s.
98
00:07:27,971 --> 00:07:36,315
And it's put in a way that makes it easy to understand, like the parlance of it, like this
idea of having a bunch of people sitting around computer screens and each of them running
99
00:07:36,315 --> 00:07:37,535
different commands.
100
00:07:37,823 --> 00:07:40,964
and putting them in play and like running these playbooks all the way down.
101
00:07:42,245 --> 00:07:51,568
The implication is that Lex Luthor is so smart that he doesn't need the power of the
artificial intelligence because his naturally derived intelligence is so good.
102
00:07:51,729 --> 00:08:00,792
But there's AIs and automation and computer functions sitting everywhere, and he has to
have minions around him where he has to say, run play E9, to like make these things happen,
103
00:08:00,792 --> 00:08:02,693
which is incredibly dramatic.
104
00:08:02,873 --> 00:08:07,575
The way the artificial intelligence would do it is it would not say run play E9.
105
00:08:07,655 --> 00:08:12,537
It would use its intelligence to go beep, and then Superman would be toast.
106
00:08:13,358 --> 00:08:21,021
And it was crazy because, I mean, you find out in the story the origin of the people
that he's fighting and the way that he's put these pieces together.
107
00:08:21,021 --> 00:08:30,785
And I don't want to give spoilers away, but the interactions and the way that they run
kind of the script and the playbook on it is interesting, because
108
00:08:31,805 --> 00:08:35,657
it's a holistic approach to try to tackle a problem using
109
00:08:35,657 --> 00:08:46,736
really augmented intelligence functions and groupthink, which, if you look at it, I mean,
I'm sure there's plenty of, you know, people in SOCOM, the Special Forces realm,
110
00:08:46,736 --> 00:08:48,758
who look at that and laugh and go, ha ha.
111
00:08:48,758 --> 00:08:52,541
We did that stuff like 35 years ago with like imaging satellites.
112
00:08:52,541 --> 00:08:55,547
But it's neat because it kind of brings you into this fold.
113
00:08:55,547 --> 00:08:57,985
And none of it felt
114
00:08:58,726 --> 00:09:04,871
like, implausible, even though it was an entirely implausible, like, real-life cartoon.
115
00:09:05,019 --> 00:09:05,339
Right.
116
00:09:05,339 --> 00:09:08,181
And I think that's kind of what I thought about that.
117
00:09:08,181 --> 00:09:09,882
Like I was just willing to go.
118
00:09:09,882 --> 00:09:10,362
Yep.
119
00:09:10,362 --> 00:09:13,544
I accept that this is the trope that you're using and it's fun.
120
00:09:13,544 --> 00:09:18,447
It's enjoyable and I can get behind it because it's like a bunch of adults playing a video
game.
121
00:09:18,510 --> 00:09:21,061
That was the thing like when he was shouting out those commands.
122
00:09:21,061 --> 00:09:24,923
I literally was envisioning holding like an Xbox controller and playing it.
123
00:09:24,923 --> 00:09:35,428
He's verbally playing a video game, and people are responding to his commands the
way the controller would respond, to make the bad guy punch the good guy in the face.
124
00:09:35,428 --> 00:09:37,099
And it was wild to see that.
125
00:09:37,099 --> 00:09:47,213
But then also, in contrast, Superman has his own robots and his friends have their own
nanotech, and it was such an amazing blending of humanity and machine.
126
00:09:47,213 --> 00:09:48,714
And it also like just
127
00:09:48,714 --> 00:09:52,800
It's so clearly uh displayed.
128
00:09:52,800 --> 00:10:00,589
What I think I have been saying on the show is this conflict that I, that a lot of us, feel
about AI: we're excited about the potential of it, but we're scared of what it could
129
00:10:00,589 --> 00:10:00,990
become.
130
00:10:00,990 --> 00:10:03,096
uh
131
00:10:03,096 --> 00:10:06,477
I've seen experts in this field talking more and more about this.
132
00:10:06,477 --> 00:10:09,277
It's not the tool, it's the people using it.
133
00:10:09,277 --> 00:10:14,049
It literally is a tool and however the people decide to use the thing is what it's going
to become.
134
00:10:14,049 --> 00:10:24,452
So if we let the Lex Luthors, who, you know, a lot of people are comparing to your Elon
Musks and your, you know, Bezos and super powerful tech lords, if we let them be the ones that
135
00:10:24,452 --> 00:10:29,524
decide what the thing is going to become, it feels a little scary if I'm being honest.
136
00:10:29,524 --> 00:10:30,312
But if.
137
00:10:30,312 --> 00:10:33,583
To be realistic, they're also the ones kind of building it.
138
00:10:33,583 --> 00:10:35,034
But they are, right.
139
00:10:35,034 --> 00:10:42,137
I mean, and that's, I think that's kind of, I'm not here to speak for James Gunn, but I think
that's kind of the point that he's getting at with some of that.
140
00:10:42,137 --> 00:10:52,217
And then when the good guys use it, it's, you know, it's medical droids and housekeepers and
companions, like all the things that I think a lot of us are excited about it being, as an
141
00:10:52,217 --> 00:10:54,242
assistant in our life that can improve it.
142
00:10:54,242 --> 00:10:58,634
And it was just so crazy to see it and be in it, and it felt good.
143
00:10:58,634 --> 00:11:02,666
Of course, spoiler: if you haven't seen it by now, too fucking late.
144
00:11:02,666 --> 00:11:03,536
When the good guys
145
00:11:03,536 --> 00:11:18,059
win, you're like, okay, there's still hope. The good guys can still run the machines and win
and help humanity save the day. My god, just to geek out for a minute, whatever they
146
00:11:18,059 --> 00:11:18,569
do with Mr.
147
00:11:18,569 --> 00:11:30,370
Terrific, shut up and take my money. He is the best character I've seen on a screen in years.
My god, Edi Gathegi, I think, yeah.
148
00:11:30,370 --> 00:11:31,072
He's great.
149
00:11:31,072 --> 00:11:35,304
Like he's, he's been in other Marvel properties, and every time I see him, he's fantastic.
150
00:11:35,304 --> 00:11:38,445
There was another show that he had, I can't remember the name of it, where they were
laundering money.
151
00:11:38,445 --> 00:11:48,469
Um, but it's that, it's that same kind of empathetic, smart character that kind of, you
know, is dynamic and actually has all these different emotions and pieces to him.
152
00:11:48,469 --> 00:11:52,455
And he brings that level of dynamism, you know, to
153
00:11:52,455 --> 00:11:54,410
using robots and getting them to do things.
154
00:11:54,410 --> 00:11:56,731
And just way cooler than everybody else in the room.
155
00:11:56,731 --> 00:11:57,959
Just way cooler.
156
00:11:57,959 --> 00:12:03,361
Well, and I mean, in the DC realm, you know, you have Cyborg, and Cyborg is an interesting
character.
157
00:12:03,361 --> 00:12:04,061
But Mr.
158
00:12:04,061 --> 00:12:08,272
Terrific is also a very, very interesting character that needs to be explored in more depth.
159
00:12:08,272 --> 00:12:19,065
But the big thing that you brought up there was the intent of the people actually using
the technology and when the technology itself is hardwired into the things that you do and
160
00:12:19,065 --> 00:12:23,266
if they're meant to carry out your actions and your orders.
161
00:12:24,087 --> 00:12:25,417
I mean, if.
162
00:12:25,417 --> 00:12:29,359
If you're a terrible piece of shit, it's going to make you more of a terrible piece of
shit.
163
00:12:29,359 --> 00:12:30,749
Like that's not going to change.
164
00:12:30,749 --> 00:12:34,531
And the same is going to be true for how we all interact with AI.
165
00:12:34,531 --> 00:12:37,752
Like I can use these things to make myself bigger, faster and stronger.
166
00:12:37,752 --> 00:12:39,893
It's both a weapon and a tool.
167
00:12:39,893 --> 00:12:42,014
I mean, it's a knife, right?
168
00:12:42,014 --> 00:12:44,955
It can cut multiple different ways and you can use it for lots of different things.
169
00:12:44,955 --> 00:12:46,015
It can be used for defense.
170
00:12:46,015 --> 00:12:47,576
It can be used for offense.
171
00:12:47,576 --> 00:12:50,297
It can be used to cut steak and vegetables.
172
00:12:50,297 --> 00:12:55,019
And, you know, if I'm really intoxicated after I've cut a lime to mix my drink,
173
00:12:55,113 --> 00:12:56,037
You
174
00:12:56,037 --> 00:12:58,500
to make sure everything gets rolled around the right way.
175
00:12:58,581 --> 00:12:59,962
But that's the thing.
176
00:13:02,133 --> 00:13:05,165
The way they brought it forward in the movie was not at all,
177
00:13:05,165 --> 00:13:12,960
um, that it felt, like, unrealistic or outside the grasp of what the technology itself could
do, because
178
00:13:12,960 --> 00:13:16,213
it's robots, we've been seeing them for years.
179
00:13:16,213 --> 00:13:22,617
And, you know, this goes back to the idea of the thing we're talking about before, like
Mattel having these really smart toys.
180
00:13:22,897 --> 00:13:28,641
Superman's robotic companions, you know, played by the amazing Alan Tudyk.
181
00:13:29,162 --> 00:13:31,382
They were just toys like.
182
00:13:31,382 --> 00:13:31,933
Yeah.
183
00:13:31,933 --> 00:13:35,344
Because, like, the other metahumans went in there and just ripped them to pieces.
184
00:13:35,885 --> 00:13:38,186
But also they served a really, really good function.
185
00:13:38,186 --> 00:13:44,828
So how many of our toys that are like this, you know, when the bad guys come in, they're
just going to get ripped to pieces.
186
00:13:45,269 --> 00:13:47,870
And I think that's a legitimate thing that you need to look at.
187
00:13:47,870 --> 00:13:49,991
And it's this exploitation of intent.
188
00:13:49,991 --> 00:13:50,991
That's really.
189
00:13:51,841 --> 00:13:55,319
Yeah, I just was blown away because I.
190
00:13:55,319 --> 00:14:05,494
You know, I've been a Superman nerd for, I mean, when I was a kid, that's one of the first
movies I remember going to, with Superman, and to see it this way, and to just have read
191
00:14:05,494 --> 00:14:07,275
so many things before going in.
192
00:14:07,275 --> 00:14:11,462
And it's all about, you know, immigration and wokeism and all this.
193
00:14:11,462 --> 00:14:22,752
I was just like, why, like, in the age we live in now, with the way the headlines are right
now, like how this is being ignored, how this part of this story is being ignored for all
194
00:14:22,752 --> 00:14:25,533
this other culture war bullshit was just
195
00:14:25,934 --> 00:14:35,175
Because the story really was, who's going to control the machines, like who's going to be in
charge of these things, and what is the impact going to be on humanity going forward.
196
00:14:35,175 --> 00:14:43,450
And it might be a lot of robots that are writing the collateral material and the false
outrage, to distract you from actually being worried about the robots by giving you a human
197
00:14:43,450 --> 00:14:44,820
thing to focus on.
198
00:14:48,943 --> 00:14:51,564
That, that is my favorite part.
199
00:14:51,764 --> 00:14:53,325
That is my favorite part.
200
00:14:53,325 --> 00:14:54,145
Yes.
201
00:14:54,646 --> 00:14:55,777
It's totally the Simpsons.
202
00:14:55,777 --> 00:14:56,017
Yeah.
203
00:14:56,017 --> 00:14:56,281
Yeah.
204
00:14:56,281 --> 00:14:56,814
Sorry.
205
00:14:56,814 --> 00:15:07,783
I love the Engineer, because you've specifically talked about, like, literally
blending our biology with machines, and that character so explicitly is
206
00:15:07,783 --> 00:15:16,131
like, literally says, I gave up my humanity for this, to blend with technology, to become
this crazy supervillain.
207
00:15:16,131 --> 00:15:20,174
ah Again just to so.
208
00:15:20,769 --> 00:15:29,535
dumb it down to simplify these ideas of what we're talking about here of blending humanity
and technology to become something else for good or evil.
209
00:15:29,535 --> 00:15:33,862
That character really articulated that position brilliantly.
210
00:15:34,037 --> 00:15:38,409
Well, I mean, and she was chock-full of nanotech.
211
00:15:38,510 --> 00:15:48,555
Like, if you're going to make everything go away and you can manipulate things at a
molecular structure level, which is what that nanotech is doing, why did they need the
212
00:15:48,555 --> 00:15:51,917
chemical guy to go in and actually create Kryptonite on his face?
213
00:15:51,917 --> 00:15:53,338
Why couldn't she just do this?
214
00:15:53,338 --> 00:15:57,380
Like, there's all kinds of like science nerdy things in my head that's like, this math is
all wrong.
215
00:15:57,380 --> 00:15:59,771
And I get the dramatic effect of it.
216
00:15:59,771 --> 00:16:02,667
Yes, like she had to give up a huge chunk.
217
00:16:02,667 --> 00:16:06,747
of herself, her person, to make these pieces happen.
218
00:16:07,747 --> 00:16:10,687
I watched RoboCop 2014 again.
219
00:16:10,687 --> 00:16:12,567
I was in, I was in the air.
220
00:16:12,987 --> 00:16:14,487
It's amazing.
221
00:16:14,487 --> 00:16:17,227
Yeah, yeah, it's actually really, really good.
222
00:16:17,247 --> 00:16:28,367
The guy that plays RoboCop is the same guy that played, no, no, Takeshi Kovacs
from the Altered Carbon stories.
223
00:16:28,527 --> 00:16:30,407
Oh, I can't even remember his name.
224
00:16:30,407 --> 00:16:31,327
Anyways.
225
00:16:32,509 --> 00:16:41,086
It's also one of those things where it's amazing because he's got control of himself and
the doctors go in and go, OK, we're going to convince him he has control of himself, but
226
00:16:41,086 --> 00:16:42,313
we're really going to take control.
227
00:16:42,313 --> 00:16:47,431
We're going to make all of our thoughts and patterns when it comes to fighting, feel like
his thoughts and ideas and patterns.
228
00:16:47,771 --> 00:16:51,174
And it's one of those things where it's like, yeah, that's where things are going to
evolve.
229
00:16:51,174 --> 00:16:52,505
And that's what the Engineer was doing.
230
00:16:52,505 --> 00:16:56,678
Like the Engineer was doing things as code, but she thought she had independence.
231
00:16:57,439 --> 00:17:01,092
That's what somebody like a Lex Luthor is going to do to you.
232
00:17:01,092 --> 00:17:02,133
There's no doubt.
233
00:17:02,133 --> 00:17:04,250
There's a lot more Lex Luthors than you think.
234
00:17:04,250 --> 00:17:07,183
We all have a little Lex Luthor inside of us.
235
00:17:07,652 --> 00:17:16,158
There was a great headline I saw the other day that was something like, uh, Superman
displays that we actually have real-life supervillains in our lives now.
236
00:17:16,719 --> 00:17:17,811
It's true.
237
00:17:17,811 --> 00:17:24,064
And I hate to paint them with that broad of a brush, but the actions seem to speak pretty
loudly.
238
00:17:24,064 --> 00:17:27,925
So it's hard to ignore them when you see the kinds of things that are being done.
239
00:17:27,925 --> 00:17:30,256
Well, and they don't think that they're supervillains.
240
00:17:30,256 --> 00:17:32,778
Like this all comes down to how they think about themselves.
241
00:17:32,778 --> 00:17:38,521
They're like, I, they feel like they're on a mission to save something, and they're willing
to do whatever it takes to make sure those things happen.
242
00:17:38,521 --> 00:17:40,412
And, you know, right.
243
00:17:40,412 --> 00:17:44,044
But morality is all in the eye of the beholder.
244
00:17:44,044 --> 00:17:50,488
And I mean, morality is a man-made human concept that we came up with, and we've just had to
ascribe value to things.
245
00:17:50,748 --> 00:17:55,765
Um, we can talk about religion and all those pieces, but those things in context, but
ultimately speaking, you know,
246
00:17:55,765 --> 00:18:04,517
people ascribe value to a thing that is very much so uh personal to their own perspective.
247
00:18:05,378 --> 00:18:08,518
Lex Luthor did not feel like he was doing anything wrong.
248
00:18:08,518 --> 00:18:10,219
He does not feel like he's doing anything wrong.
249
00:18:11,419 --> 00:18:13,200
That it was saving humanity.
250
00:18:13,200 --> 00:18:18,781
Yes, he thought he was saving humanity from the evil aliens that are trying to come
in.
251
00:18:18,781 --> 00:18:25,293
And there's definitely something, an argument that, you know, needs to be made, that the
movie itself had a message.
252
00:18:25,293 --> 00:18:25,713
Mm-hmm.
253
00:18:25,713 --> 00:18:38,752
said that, um, aliens, whatever you want to call them, from extraterrestrial origins or from
across the borders, are somehow coming in to invade and take things over and do
254
00:18:38,752 --> 00:18:40,052
terrible things.
255
00:18:40,193 --> 00:18:46,277
And Superman was very, very anti-Nazi.
256
00:18:46,637 --> 00:18:53,722
Like, you want to talk about people that were very much so about isolationism and
nationalism and taking over things.
257
00:18:53,722 --> 00:18:55,273
Nazis are a pretty good example of that.
258
00:18:55,273 --> 00:18:57,324
And Superman fought the Nazis.
259
00:18:57,625 --> 00:19:07,653
Well, the new Nazis in this scenario were Lex Luthor, like trying to wipe these things
out, trying to control these pieces, trying to create a master race by creating the
260
00:19:07,653 --> 00:19:18,662
Engineer and, uh, the other character, which was, well, yeah, like he was trying to
manufacture and run eugenics to create the best and the brightest of a thing.
261
00:19:18,662 --> 00:19:20,544
And he did.
262
00:19:20,544 --> 00:19:24,329
And it's this extrapolation of the idea that, you know,
263
00:19:24,329 --> 00:19:27,311
these concepts are, ah, different or new.
264
00:19:27,311 --> 00:19:27,931
I mean, they're not.
265
00:19:27,931 --> 00:19:34,706
But we're just plugging things together in such a way that we can put them together to
tell interesting stories and arcs that actually make sense and are relatable.
266
00:19:34,766 --> 00:19:44,632
And I think it would have been hard for them to make this movie and have it as relatable
as it is 10 years ago, because people would have been like, what is this nerd bullshit?
267
00:19:44,713 --> 00:19:45,203
A.I.
268
00:19:45,203 --> 00:19:46,694
moving stuff around like this.
269
00:19:46,694 --> 00:19:47,915
This is too far-fetched.
270
00:19:47,915 --> 00:19:49,686
And there's plenty of movies that tried this.
271
00:19:49,686 --> 00:19:49,966
Right.
272
00:19:49,966 --> 00:19:51,087
And they did OK.
273
00:19:51,087 --> 00:19:54,299
But this one, like it was the central theme of
274
00:19:54,358 --> 00:19:56,520
what, and like you said, a lot of this is
275
00:19:56,520 --> 00:19:58,031
old sci-fi being brought back to life.
276
00:19:58,031 --> 00:19:58,430
A ton.
277
00:19:58,430 --> 00:20:01,883
I mean, this is, this is like deep comic book stuff from what I've read.
278
00:20:01,883 --> 00:20:08,786
I've certainly never read the comic books that James Gunn is pulling from, but it is a lot
of, like, even Superman's robots.
279
00:20:08,786 --> 00:20:13,278
I never knew Superman had robots, but apparently in comic books a million years ago he
did.
280
00:20:13,278 --> 00:20:20,301
So I was watching it with fresh eyes going like, wow, that's such a cool idea to bring
these robots in and the way things are now.
281
00:20:20,301 --> 00:20:24,783
But it is, it's just adapting these old ideas to current times and making it very
relatable.
282
00:20:24,783 --> 00:20:26,484
And I mean, just so brilliant.
283
00:20:26,484 --> 00:20:32,747
displaying the war for the idea and who gets to control these things and how they impact
our lives.
284
00:20:32,747 --> 00:20:43,707
Well, that's the big thing, is that, like, they were making moral and philosophical arguments,
like, the whole way through, and they were executing things with unrealistic, unfettered
285
00:20:43,707 --> 00:20:44,547
power.
286
00:20:45,287 --> 00:20:49,127
Most of us don't have access to that.
287
00:20:49,487 --> 00:20:56,707
Like, there's a certain economic threshold that you have to hit before you can,
you know, make an attack drone, right?
288
00:20:56,989 --> 00:20:59,401
And that economic threshold is coming down.
289
00:20:59,401 --> 00:21:09,788
Like we can look at the broader aspect of things, like what's happening in Ukraine, where they
took drones and they pushed them in and they blew up a bunch of uh planes inside of
290
00:21:09,788 --> 00:21:10,649
Russia.
291
00:21:11,910 --> 00:21:14,112
This technology is becoming more and more accessible.
292
00:21:14,112 --> 00:21:24,351
But it's becoming more and more accessible because it's making things smaller, easier to
understand, and you have an infrastructure that's pushing those pieces out.
293
00:21:24,351 --> 00:21:31,494
The far edges of technology, like if we have these drones that you can send in to do
things, with bomb payloads on them, it's pretty effective for guerrilla warfare strategy.
294
00:21:32,714 --> 00:21:41,738
We don't have the ability for you to make a Superman or an Engineer, and when somebody
gets that technology, they're probably not going to share it.
295
00:21:42,619 --> 00:21:53,627
So, yeah, like, actually I think something more like the Universal Soldier concept,
where they're, you know, blanking human brains and, you know, using them for
296
00:21:53,627 --> 00:21:56,951
terrible things and augmenting them with different cyber tech.
297
00:21:57,792 --> 00:22:06,242
I mean, that's probably realistic, but I think we'll probably bypass that altogether and
just go to really, really powerful, not two-legged humanoid robots to
298
00:22:08,396 --> 00:22:15,417
Yeah, and I mean, there's so many analogies it brings up with like the Clone Wars and Star
Wars and humans versus robots.
299
00:22:15,417 --> 00:22:19,390
I mean, these are concepts that have been explored over and over again.
300
00:22:19,690 --> 00:22:21,371
Usually the good guys win.
301
00:22:21,371 --> 00:22:28,794
And I know it's usually fiction, but I'm putting a lot of stock in the good guys
overcoming.
302
00:22:29,522 --> 00:22:33,426
Maybe it's just the fact that whoever wins writes the history books and they end up being
the good guys.
303
00:22:33,426 --> 00:22:40,754
But uh I'd really like to believe that we are going to end up in a better place uh
because of all this.
304
00:22:40,754 --> 00:22:44,297
And in the case of Superman, we do.
305
00:22:44,981 --> 00:22:45,301
Yeah.
306
00:22:45,301 --> 00:22:47,552
And, and yet again, that goes back to the narrative, right?
307
00:22:47,552 --> 00:22:48,303
Like, I mean, you're right.
308
00:22:48,303 --> 00:22:58,728
The victor gets to write the history books, but even more importantly, as we're going
through and we're having conversations about these things, the way that you prevent those
309
00:22:58,728 --> 00:23:07,613
types of big conflicts from happening is actually having conversations, actually talking
to other people with other ideas and trying to find common ground and looking for ways to
310
00:23:07,613 --> 00:23:14,975
make these things better and realizing that you don't need to be xenophobic and afraid of
people just because they're from different countries or, you know,
311
00:23:14,975 --> 00:23:15,965
Maybe they're not straight.
312
00:23:15,965 --> 00:23:30,123
These are things that should not be an issue anymore, but they are, because they
create wedges between us and create these different social camps and this notion of
313
00:23:30,123 --> 00:23:31,044
tribes.
314
00:23:31,044 --> 00:23:41,389
And we can't let AI turn tribes into these, like, ultra-powerful ideologies that are going
to fight each other.
315
00:23:41,389 --> 00:23:43,050
And some of that's happening today.
316
00:23:43,050 --> 00:23:43,370
Right.
317
00:23:43,370 --> 00:23:44,969
Like you're not going to get around that.
318
00:23:44,969 --> 00:23:46,400
with certain state actors.
319
00:23:46,460 --> 00:23:55,128
But as people living in North America, like, you should be able to agree on things like
certain things are okay, and certain things are not.
320
00:23:55,128 --> 00:23:59,251
And we should try to be nice to each other and, you know, live and let live, that whole kind
of thing.
321
00:24:01,047 --> 00:24:13,224
And it doesn't take everyone in the tribe to go through and make the argument that
things need to change ad nauseam and start creating collateral material that inundates
322
00:24:13,224 --> 00:24:14,975
people with different ideas anymore.
323
00:24:14,975 --> 00:24:17,075
Because AI enables that.
324
00:24:17,075 --> 00:24:23,750
Like, you can create bots that can go through and can artificially look like these are
people coming from this tribe saying these things, and this must be how everyone thinks in
325
00:24:23,750 --> 00:24:24,800
that space.
326
00:24:24,940 --> 00:24:30,423
And as a viewer or a listener, a target, um and a producer,
327
00:24:30,987 --> 00:24:32,510
You need to be aware of both sides of that.
328
00:24:32,510 --> 00:24:39,371
And as someone who actually takes this type of data and information coming inbound
towards them, you have to be very careful of your confirmation bias.
329
00:24:39,371 --> 00:24:43,667
And you got to check it all the time, because these models will play into it.
330
00:24:45,794 --> 00:24:53,139
And where you started there, I think, is probably the most critical thing: you don't have
to use these to interact with other human beings.
331
00:24:53,139 --> 00:24:58,331
I think you will find the more that you put that down and walk away from this thing you're
looking at right now.
332
00:24:58,331 --> 00:25:07,543
If you're watching this on YouTube, and go, like, you know, talk to people, hang out with
your neighbors, like, hear what real people are saying and engage with them.
333
00:25:08,217 --> 00:25:09,888
We don't do that anymore.
334
00:25:10,470 --> 00:25:19,973
These screens, these barriers between us, they help us all feel like warriors and sit here
and sling arrows at each other, because we don't have to actually see the look on the other
335
00:25:19,973 --> 00:25:23,126
person's face when we say the horrible things we say to each other.
336
00:25:25,349 --> 00:25:35,300
Just re-engage, like, unplug, re-engage with humanity, and you'll start to find that, like, a
lot of the things that you're angry about or the things you hate about other people are
337
00:25:35,300 --> 00:25:43,319
ideas that you're being fed by someone else and when you actually meet those people and
they become your friends and your neighbors, you start to wonder like, what the hell was I
338
00:25:43,319 --> 00:25:43,909
so mad about?
339
00:25:43,909 --> 00:25:46,562
Like they want the same things I do.
340
00:25:46,713 --> 00:25:54,355
Hopefully it gets you to brush your teeth and put deodorant on which you probably don't do
often enough because you're behind the screen all the time thinking nobody can smell you.
341
00:25:55,315 --> 00:25:58,216
When the smell-o-vision happens on the podcast.
342
00:26:01,797 --> 00:26:04,378
Does it smell like coffee and whiskey?
343
00:26:04,378 --> 00:26:06,778
Because if not, it's probably not me.
344
00:26:08,019 --> 00:26:12,410
Smells like New York sewage, which I had to trudge through, like, half of the time it
was flooding there this week.
345
00:26:12,410 --> 00:26:13,320
No.
346
00:26:13,953 --> 00:26:15,581
Yeah, I mean, that.
347
00:26:16,089 --> 00:26:17,910
It's such a good point.
348
00:26:18,530 --> 00:26:22,389
If you want to engage in your humanity, you're not going to do that over the screen.
349
00:26:22,389 --> 00:26:24,243
Like, you'll get some of it, you'll get some interaction.
350
00:26:24,243 --> 00:26:26,053
It's a great way to exchange ideas.
351
00:26:26,174 --> 00:26:35,027
But there's nothing like, you know, being across from somebody and shaking their hand or
giving them a hug or seeing their physical body language look like they're going to rip
352
00:26:35,027 --> 00:26:36,708
your head off because you said something the wrong way.
353
00:26:36,708 --> 00:26:40,860
Like, these are all cues that you actually have to understand and evolve as a creature.
354
00:26:40,860 --> 00:26:43,691
And there's a whole generation of kids that aren't doing that.
355
00:26:43,769 --> 00:26:52,197
And there's a whole generation of adults that don't practice it often enough and forget
and really do feel like they can hide behind their keyboard and say something because
356
00:26:52,197 --> 00:26:54,338
they're not afraid to get punched in the mouth.
357
00:26:55,220 --> 00:26:59,123
Well, maybe we need to start making people afraid of that again.
358
00:26:59,123 --> 00:27:01,149
Like, and actually try to be nice to.
359
00:27:01,149 --> 00:27:02,278
I mean that was...
360
00:27:02,278 --> 00:27:03,283
uh
361
00:27:03,375 --> 00:27:11,159
When I was a kid, I think when you were a kid, I think our generation was the last one
that you didn't really hear your parents talking about politics at a party.
362
00:27:11,159 --> 00:27:17,942
You didn't hear them talking about religion, because there are some things you just don't
fucking talk about, because they're too confrontational.
363
00:27:17,942 --> 00:27:20,443
And you're not going to change someone's mind.
364
00:27:20,443 --> 00:27:23,444
That person is going to come to that point on their own.
365
00:27:23,444 --> 00:27:28,337
They're going to find that their religion isn't what they thought it was, or their
political belief isn't what they thought it was.
366
00:27:28,337 --> 00:27:33,469
I think in the feeds I'm seeing, again, confirmation bias, but there's a lot of MAGA right
367
00:27:33,469 --> 00:27:42,291
going, like, wait, what the fuck, this isn't what I thought. Like, Jon Stewart the other day on
the Daily Show was talking about, there's a lot of MAGA out there that's going, like,
368
00:27:42,291 --> 00:27:52,113
wait, this guy's lying, and this isn't what he said he was gonna do. And he's like, yeah,
this is the guy we've all been dealing with for the last 10 years. Welcome. Right? Like, yeah,
369
00:27:52,113 --> 00:28:03,155
yes, it is. Like, that is the thing. Like, yeah, we could, I don't want to get into
politics, because why talk about politics, I'm not gonna change my mind, but
370
00:28:03,343 --> 00:28:06,405
The point is there are some things that it's like, why?
371
00:28:06,405 --> 00:28:12,490
Like you're not going to make friends by continuing to blast your political opinion every
day on social media.
372
00:28:12,490 --> 00:28:13,673
Put the thing down.
373
00:28:13,673 --> 00:28:20,775
There's always a third rail of politics, conversations, whatever it is, like the hot wire
thing that you can tread your way into.
374
00:28:20,775 --> 00:28:22,556
But don't touch that thing.
375
00:28:23,636 --> 00:28:24,536
And
376
00:28:25,757 --> 00:28:40,021
Outrage algorithms tied to bots, like, live on the third rail. Like, they're like, wow, to
everybody all of the time, on purpose, because they know what's going to elicit a response.
377
00:28:40,061 --> 00:28:40,901
So.
378
00:28:41,335 --> 00:28:49,860
If you know that you're being fed this and you know that you're being sent supercharged
information, insulate yourself a little bit.
379
00:28:50,342 --> 00:28:51,745
Put some rubber boots on.
380
00:28:51,745 --> 00:28:52,400
Yeah.
381
00:28:52,400 --> 00:28:53,500
Exactly.
382
00:28:53,641 --> 00:28:54,071
All right.
383
00:28:54,071 --> 00:28:56,382
Well, that's all I have to say about that for now.
384
00:28:56,382 --> 00:28:59,294
But if you have not seen the Superman movie, hopefully we haven't spoiled it too much.
385
00:28:59,294 --> 00:29:02,123
But absolutely go see it, because it is such a fun ride.
386
00:29:02,375 --> 00:29:08,687
Real quickly before we get out of here, I want to remind you that you can get 20% off of
your order of MagicMind, which is my favorite mental performance shot.
387
00:29:08,687 --> 00:29:11,058
I literally take it at least once a day.
388
00:29:11,058 --> 00:29:14,569
It has literally changed my life and I think it could do the same thing for you.
389
00:29:14,876 --> 00:29:22,569
Head over to magicmind.com forward slash FITMESS20, use that promo code FITMESS20 to
get that 20% off of your order.
390
00:29:22,952 --> 00:29:24,312
For now, that's going to do it.
391
00:29:24,312 --> 00:29:27,054
This is our final sign off as the Fit Mess.
392
00:29:27,054 --> 00:29:32,256
For now, until the next episode publishes, you can still find us at thefitmess.com.
393
00:29:32,256 --> 00:29:35,866
So yeah, so we will be back as BroBots next week at brobots.me.
394
00:29:35,866 --> 00:29:42,420
But for now, you can find us at thefitmess.com. Whichever one you use should work to get you
to us next week.
395
00:29:42,420 --> 00:29:43,391
Thanks so much for listening.
396
00:29:43,391 --> 00:29:46,783
Thanks so much for supporting us all these years uh as what we have been.
397
00:29:46,783 --> 00:29:53,266
We look forward to trying to entertain and inform you for a long time to come as BroBots
at brobots.me.
398
00:29:53,266 --> 00:29:53,877
We'll see you there.
399
00:29:53,877 --> 00:29:54,641
Thanks.
400
00:29:54,641 --> 00:29:55,682
See you everyone, bye bye.