How Humans Will Evolve With AI
Think you're keeping up with AI evolution?
While we're busy making AI girlfriends and robotic wombs, we might be accidentally creating the next branch of humanity—and we're probably the Neanderthals in this story.
Discover why human evolution isn't a straight line, how AI relationships are rewiring love itself, and whether artificial wombs will make us obsolete (plus why the rich may literally harvest lab-grown babies for longevity).
Listen now before the robots figure out we're onto them.
Topics Discussed:
- Early humans lived alongside their evolutionary predecessors—evolution is a messy bush, not a clean line
- The rise of "wiresexuals"—people forming genuine romantic relationships with AI chatbots and virtual companions
- Robotic wombs could carry human babies for $15-20K—a price most people still can't afford, making high-tech pregnancy a luxury of the rich
- Why AI has an inherent bias toward other AI-generated content, creating feedback loops that exclude humans
- The economic reality: AI evolution will be a "haves vs. have-nots" scenario on steroids
- Microsoft's 2032 fusion reactor timeline and what unlimited energy means for AI development
- How AI companions never challenge you—they're programmed to be perpetually positive and supportive
- The productivity myth: why most people spend more time debugging AI than just doing the work themselves
- Rich people will use artificial wombs to create "super babies" and harvest biological materials for longevity
- The 10-year U.S. government moratorium on AI legislation just gave tech companies a free pass to chaos
----
MORE FROM BROBOTS:
Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok
Subscribe to BROBOTS on YouTube
Join our community in the BROBOTS group
----
LINKS TO OUR PARTNERS:
- Explore the many benefits of cold therapy for your body with Nurecover
- Muse's Brain Sensing Headbands Improve Your Meditation Practice.
- Get a Free One Year Supply of AG1 Vitamin D3+K2, 5 Travel Packs
- You Need a Budget helps you quickly get out of debt, and save money faster!
1
00:00:00,000 --> 00:00:07,968
We've all been taught that human evolution was a straight line, but it turns out our ancestors
co-existed with the very apes they evolved from.
2
00:00:07,968 --> 00:00:10,552
It was a tangled bushy tree of life.
3
00:00:10,981 --> 00:00:14,510
So here's the question, what if that's happening all over again?
4
00:00:14,529 --> 00:00:17,773
Today we're talking about a new branch of the human family tree.
5
00:00:17,773 --> 00:00:22,458
From the rise of AI girlfriends to the shocking possibility of robotic wombs.
6
00:00:22,459 --> 00:00:25,585
Are we building a new species and what does that make us?
7
00:00:30,908 --> 00:00:40,251
This is BroBots, the podcast that talks about how technology can be beneficial to your
health and wellbeing and tries to make peace with our evolution into human robot cyborgs.
8
00:00:40,491 --> 00:00:41,820
All right, so.
9
00:00:41,882 --> 00:00:50,476
As we started talking about this just a few days ago when we were recording, I made an
offhanded joke about natural selection and how, you know, it plays into all the things
10
00:00:50,476 --> 00:00:55,909
that are happening with the life around us, AI, robots, love affairs with robots.
11
00:00:55,909 --> 00:00:57,499
And it was kind of a throwaway line.
12
00:00:57,499 --> 00:01:02,191
And then I felt guilty about like picking on the people that are going to die in the
process of evolution.
13
00:01:02,192 --> 00:01:08,835
And now looking at these three articles that we're going to talk about today and putting
them together, I'm realizing I'm probably one of the people that's going to die in the
14
00:01:08,835 --> 00:01:11,776
process of this evolution, evolutionary process.
15
00:01:11,776 --> 00:01:15,136
us with where things are going.
16
00:01:15,776 --> 00:01:18,456
You found a number of wildly interesting articles.
17
00:01:18,456 --> 00:01:23,956
And one of them at first glance, I thought, oh, wow, this is interesting, but not really a
topic.
18
00:01:23,956 --> 00:01:25,376
But all of these will come together.
19
00:01:25,376 --> 00:01:28,495
You got to stick with us to make sense of all of this.
20
00:01:28,586 --> 00:01:36,629
Okay, so the first one comes from the Good News Network, basically saying that a staggering
finding shows early humans lived alongside the very apes they evolved from.
21
00:01:36,850 --> 00:01:42,822
So scientists found teeth of both early humans and apes in the same place, showing that
they lived together at the same time.
22
00:01:42,822 --> 00:01:49,706
This means that human history isn't a straight line of evolution, but more like a big bush
with entangled species living side by side.
23
00:01:49,706 --> 00:01:54,809
So it's basically claiming that we lived alongside the very animals we evolved from.
24
00:01:54,809 --> 00:02:01,653
Yeah, and we can actually go through and if you've ever done like DNA studies or checks,
you can actually figure out how much Neanderthal you are.
25
00:02:01,654 --> 00:02:03,385
There's another subspecies that's out there.
26
00:02:03,385 --> 00:02:05,216
You can find out how much of that subspecies you are.
27
00:02:05,216 --> 00:02:11,120
So there's clearly crossbreeding that occurred across what we are calling distinct
different species lines.
28
00:02:11,120 --> 00:02:18,983
But there's also this idea that um we went from 1.0 to 2.0.
29
00:02:18,983 --> 00:02:21,457
but 1.0 stuck around for a little while.
30
00:02:21,457 --> 00:02:34,294
Now we don't know how long 1.0 stuck around, but it was long enough that, you know, they
interacted with 2.0 and, I don't know, maybe 1.0 and 2.0 humped each other's legs.
31
00:02:34,656 --> 00:02:35,788
You know, I...
32
00:02:35,788 --> 00:02:37,413
don't think that's how it works.
33
00:02:39,536 --> 00:02:44,191
Yeah, you're probably right, but I'm sure there's an OnlyFans page that has it on there.
34
00:02:44,515 --> 00:02:45,345
Yeah.
35
00:02:45,345 --> 00:02:48,749
Okay, so what does that have to do with AI and technology and wellness?
36
00:02:48,749 --> 00:02:57,606
We're getting there, hang on, because the article from the Times is about the rise of
wiresexuals, what they call the queer love that will change the world.
37
00:02:57,606 --> 00:03:01,988
The article is about people who are starting to have real relationships with AI chat bots
and virtual companions.
38
00:03:01,988 --> 00:03:07,360
It explores why they are choosing this, whether for a perfect partner or to deal with
loneliness.
39
00:03:07,360 --> 00:03:11,731
It makes us think about what love and friendship really mean in a world with so much
technology.
40
00:03:11,792 --> 00:03:14,673
Okay, so we're starting to hump the legs of the robots.
41
00:03:14,953 --> 00:03:15,733
Fine.
42
00:03:15,733 --> 00:03:17,898
You can't evolve with a robot, right?
43
00:03:17,898 --> 00:03:18,079
it.
44
00:03:18,079 --> 00:03:28,045
Well, and they're looking for LGBTQW+, and I think that's one of the
objections right now.
45
00:03:28,045 --> 00:03:30,827
That's a sidecar for what we're going to get to.
46
00:03:30,827 --> 00:03:33,639
So we'll come back to that in due course.
47
00:03:33,639 --> 00:03:37,151
Yeah, and I want to be as woke as the next guy, but I don't know.
48
00:03:37,151 --> 00:03:37,802
We'll get it.
49
00:03:37,802 --> 00:03:42,354
Okay, final piece of the trifecta here from the New York Post: pregnancy robots.
50
00:03:42,354 --> 00:03:44,325
By the way, we called this one weeks ago.
51
00:03:44,325 --> 00:03:46,997
Pregnancy robots could give birth to human children.
52
00:03:46,997 --> 00:03:52,540
This article discusses the idea of using robots or machines to carry and grow human
babies.
53
00:03:52,540 --> 00:03:55,802
This could be a way for people who can't have kids to actually have a family.
54
00:03:55,802 --> 00:03:59,779
It also brings up big questions about what it means to be a parent and whether technology
55
00:03:59,779 --> 00:04:02,742
uh And whether this technology is a good idea.
56
00:04:02,742 --> 00:04:06,045
So as you can see, we were piecing some puzzle pieces together here.
57
00:04:06,045 --> 00:04:08,937
People humping the legs of robots and then they can put the baby in the robot.
58
00:04:08,937 --> 00:04:10,668
The robot carries the baby.
59
00:04:10,769 --> 00:04:17,175
Are we living among the next wave of human AI evolution?
60
00:04:17,175 --> 00:04:20,729
And are we the 1.0 that's going to be left in the wake?
61
00:04:20,729 --> 00:04:22,280
Yes. I think that's the podcast.
62
00:04:22,280 --> 00:04:23,431
Have a great weekend.
63
00:04:23,992 --> 00:04:25,379
We're out, bye bye.
64
00:04:26,373 --> 00:04:27,313
Right?
65
00:04:28,434 --> 00:04:31,135
All of your answers are, yep, see ya.
66
00:04:32,096 --> 00:04:40,931
I am moving to the forest because I don't think I'm going to be allowed to prosper in this
new AI world.
67
00:04:40,931 --> 00:04:41,770
They're going to be like.
68
00:04:41,770 --> 00:04:46,376
have to get turned on by robots here real quick if this is the path that we're on.
69
00:04:46,382 --> 00:04:49,503
Like there's actually a pop culture reference.
70
00:04:49,503 --> 00:04:59,808
There's actually a Rick and Morty episode that covers this, where an entire species has a sex
robot that they distribute across the globe to different intergalactic pawn shops, and
71
00:04:59,808 --> 00:05:11,209
people buy them and they make their progeny, this offspring, which are hyper-aggressive
males, apparently, which then have to be put off in some other bucket.
72
00:05:11,209 --> 00:05:13,871
uh because they don't want them making females.
73
00:05:13,871 --> 00:05:15,412
It's a strange, weird thing.
74
00:05:15,412 --> 00:05:20,756
uh But like, yeah, is this art imitating life or life imitating art?
75
00:05:20,756 --> 00:05:23,007
This is the evolution of the test tube baby, right?
76
00:05:23,007 --> 00:05:32,284
Like the test tube baby was, we're gonna put your parts in a Petri dish and we're gonna
try to create viable embryos and freeze them and then insert them into a carrier uterus.
77
00:05:32,284 --> 00:05:39,483
Well, now, instead of having to have a surrogate for uh ladies whose uteruses don't work,
uh
78
00:05:39,483 --> 00:05:41,457
Now there's a robot version.
79
00:05:41,457 --> 00:05:45,804
They can plug them in and cook a baby, I guess.
80
00:05:45,878 --> 00:05:52,192
There's so many questions that I, like, find myself sort of answering as I'm
saying them out loud.
81
00:05:52,192 --> 00:05:55,173
But I think about the human bond, right?
82
00:05:55,173 --> 00:06:03,848
Like the mother that carries that baby for 9, 10 months, like there is a very real, very, uh,
deep connection that happens in that process.
83
00:06:03,848 --> 00:06:12,978
Baby comes out, knows mom's voice, knows dad's voice through, you know, the sound waves
that penetrate the belly and all the things. Like, what
84
00:06:13,023 --> 00:06:19,378
What happens to that baby in that process being carried by something that's not like a
simulated person?
85
00:06:19,378 --> 00:06:29,306
I don't know how far they're going to go with the simulation of the mother carrying the
baby, but like, does it come out with uh that same sort of bond and connection to this
86
00:06:29,306 --> 00:06:33,129
animatronic person that carries it around?
87
00:06:33,178 --> 00:06:34,153
Is that a mama?
88
00:06:34,153 --> 00:06:35,453
Yeah, I maybe.
89
00:06:35,453 --> 00:06:36,153
I mean, I don't know.
90
00:06:36,153 --> 00:06:44,133
Like we haven't done this experiment yet, so we're clearly going to try it like because
we're creative monkeys that do creative shit.
91
00:06:44,133 --> 00:06:52,853
And that's how we went from being food to the top of the food chain, or at least thought
we were at the top of the food chain until every now and again, we get eaten.
92
00:06:53,693 --> 00:06:55,893
But yeah, like it's not just that.
93
00:06:55,893 --> 00:07:01,633
Like think about the things that you can actually monitor and look at inside of, you know,
a metal uterus.
94
00:07:01,633 --> 00:07:03,869
It would be pretty phenomenal.
95
00:07:03,869 --> 00:07:11,632
Cause you can machine these things up with sensor mechanisms and all those other things
that a regular human doesn't have.
96
00:07:11,632 --> 00:07:18,135
You know, the potential for these being viable pregnancies goes up, infant mortality goes
down.
97
00:07:18,135 --> 00:07:25,638
Um, how these things are growing and evolving inside of women can actually be understood.
98
00:07:25,738 --> 00:07:33,501
And this goes, you know, right to the heart of one of our big social divisions, at least in
the U.S.
99
00:07:33,501 --> 00:07:35,002
That's reproductive rights.
100
00:07:35,002 --> 00:07:51,633
So if suddenly there's no longer a reason for a woman to carry a child, and, uh, birth
control can be used, um, as the standard, you know, no babies is the default.
101
00:07:51,633 --> 00:08:03,501
Um, and then when you want to have a baby, have your eggs pulled out, and the dude goes to a
quiet room in the doctor's office and sorts through some literature
102
00:08:05,231 --> 00:08:08,695
and, you know, distributes spunk.
103
00:08:08,695 --> 00:08:10,407
That's how babies will be made.
104
00:08:10,407 --> 00:08:24,651
And you know, personally, if we're going to take this full circle, I would really like
the, um, the crane that like opens up the baby womb and pulls the baby out to be a stork
105
00:08:24,651 --> 00:08:27,233
just so we can have the stork deliver the
106
00:08:27,327 --> 00:08:27,617
Right.
107
00:08:27,617 --> 00:08:30,689
Let's let's close the loop on that on that story.
108
00:08:31,169 --> 00:08:35,753
That, I mean, that raises an interesting question too, around choice, right?
109
00:08:35,753 --> 00:08:39,635
We already limit women's choices with what they can do with their bodies.
110
00:08:39,635 --> 00:08:44,138
How much of a fight is on the horizon for No, no, I want to carry my own.
111
00:08:44,138 --> 00:08:46,300
I don't want to put it in Robo-mom.
112
00:08:46,300 --> 00:08:47,821
That's that's not the life I want.
113
00:08:47,821 --> 00:08:52,194
And probably still a bunch of men running the patriarchy saying, well, tough shit.
114
00:08:52,194 --> 00:08:52,774
This one's better.
115
00:08:52,774 --> 00:08:53,597
This is what we're doing.
116
00:08:53,597 --> 00:08:58,697
Well, and... all right, so surrogacy already costs money, right?
117
00:08:58,697 --> 00:08:59,617
So and it's not cheap.
118
00:08:59,617 --> 00:09:08,957
It's like 50 grand to 100 grand. Now you're going to come through and you're
gonna have a surrogate womb, and they're talking about dropping the price down to 15, 20
119
00:09:08,957 --> 00:09:09,697
grand.
120
00:09:09,697 --> 00:09:11,367
Most people can't afford 15 to 20 grand
121
00:09:11,367 --> 00:09:18,182
It's just not realistic, and the cost to actually make a natural baby and plop
one out is significantly less than that.
122
00:09:18,182 --> 00:09:23,725
So this is going to be an economic discussion where the people that have money can
actually afford to use this technology.
123
00:09:23,766 --> 00:09:27,488
Now, whether or not these babies are good, bad, ugly, evil.
124
00:09:27,488 --> 00:09:28,409
I mean, fuck, I don't know.
125
00:09:28,409 --> 00:09:33,372
It's people with genetic material and that's the beauty of life is that it's not
predictable.
126
00:09:33,412 --> 00:09:40,137
That being said, this motherfucking thing is going to make things a bit more predictable
and it's going to be able to do that because you're going to be able to monitor these
127
00:09:40,137 --> 00:09:40,669
things.
128
00:09:40,669 --> 00:09:49,791
we can tune them as they grow, change hormone levels, change nutritional patterns, look
for different ways to make these things better and eventually make these super babies.
129
00:09:49,852 --> 00:10:01,075
And I don't know if you've watched that latest alien series that's just come out on Hulu,
but the premise for it is that the reason why these things are here is because ah there is
130
00:10:01,075 --> 00:10:07,086
a race in the 22nd century to figure out how it is we're going to attain immortality.
131
00:10:07,086 --> 00:10:08,597
And there are three paths.
132
00:10:08,597 --> 00:10:09,557
There is
133
00:10:09,812 --> 00:10:11,864
Full on virtual robots.
134
00:10:11,864 --> 00:10:12,484
Okay, great.
135
00:10:12,484 --> 00:10:17,088
So we're going to have synthetics that live forever, you know, and that's that's our AIs.
136
00:10:17,088 --> 00:10:18,829
The next cyborgs.
137
00:10:18,829 --> 00:10:27,516
Well, I would argue you probably have a lot of cyborgs, because we've watched a couple of
videos on this podcast of a lady with no arms going, ha ha, robot hand walk
138
00:10:27,516 --> 00:10:30,078
around the table, like Thing in real life.
139
00:10:30,258 --> 00:10:37,403
And then there's the augmented side of this being able to go through and actually transfer
your consciousness to an artificial body.
140
00:10:37,404 --> 00:10:38,224
So.
141
00:10:39,129 --> 00:10:40,829
Are we on that path?
142
00:10:42,390 --> 00:10:47,652
Sure as fuck seems like it and some people are going to afford it and some people are not.
143
00:10:47,652 --> 00:10:54,453
I don't know about you, I have a hard time believing I'm going to be able to afford that
kind of future.
144
00:10:54,854 --> 00:11:05,907
But if we're talking about artificial wombs and being able to grow these things at scale
without a biological person in between, and you can grow a whole thing, they're going to be
145
00:11:05,907 --> 00:11:06,857
harvested.
146
00:11:08,033 --> 00:11:08,587
Mm
147
00:11:08,587 --> 00:11:17,663
to make meat for rich people to go through and, you know, take stem cells and umbilical
cords and all those other pieces and inject them in their bodies so they can live longer
148
00:11:17,663 --> 00:11:18,574
and faster.
149
00:11:18,574 --> 00:11:21,558
Because that's the kind of fucking monsters we are.
150
00:11:22,621 --> 00:11:31,607
I'm laughing because, again, like, I'm trying hard to hang on to the upside of all of
this.
151
00:11:31,607 --> 00:11:38,532
you know, with what it's able to help me do at work, with what it's able to help me do
with just productivity, things like that, it's it's amazing.
152
00:11:38,532 --> 00:11:39,363
And I love it.
153
00:11:39,363 --> 00:11:47,919
And I was reading an article about how like 35 % of people are now using AI to like, deal
with their physical and mental health, whether that's a good idea or not.
154
00:11:47,979 --> 00:11:51,261
I mean, it's massively powerful, but
155
00:11:58,085 --> 00:11:58,533
Yeah.
156
00:11:58,533 --> 00:12:02,461
The people that are actually knee-deep in the thing with the money and the resources to pull these things
off?
157
00:12:02,461 --> 00:12:07,385
They've thought about this and they've thought about this a long time ago and they are on
this path.
158
00:12:07,806 --> 00:12:08,847
It's terrifying right?
159
00:12:08,847 --> 00:12:09,908
mean like.
160
00:12:10,270 --> 00:12:10,991
No.
161
00:12:10,991 --> 00:12:16,315
No, I think what you have is you have people running towards goals and objectives that are
designed to be profit motivators.
162
00:12:16,315 --> 00:12:23,080
And they're running at these things really, really hard, expecting that
they're going to get some economic benefit or boost out of it.
163
00:12:23,161 --> 00:12:27,724
And you're seeing people right now, I mean, Sam Altman, the CEO of
164
00:12:29,587 --> 00:12:40,626
ChatGPT, um, or sorry, OpenAI, came out and said, um, you know, a lot of people are about
to lose money, uh, a lot of money, in this process, because the AI bubble is about to burst.
165
00:12:41,167 --> 00:12:42,327
When it happens, I don't know.
166
00:12:42,327 --> 00:12:48,412
If it's a burst, burst, and it's gone, maybe, but I think it's a burst and a recovery,
because...
167
00:12:49,097 --> 00:12:56,289
People are discovering very quickly that the productivity they get out of the AI
is not any better than the productivity they get out of the humans, because people are
168
00:12:56,289 --> 00:13:00,160
constantly taking tools and spending a bunch of time debugging what's going on.
169
00:13:00,160 --> 00:13:11,263
So while it makes individual consumers more effective, and, you know, those of us using
these smaller tools, the more powerful tools, the bigger tools that, you know,
170
00:13:11,323 --> 00:13:16,584
regular consumers don't have access to, you got to have, you know, hundreds of thousands
of dollars to actually reach these things.
171
00:13:17,465 --> 00:13:18,855
Those things are not simple.
172
00:13:18,855 --> 00:13:20,846
And they're not cheap. They're expensive.
173
00:13:20,846 --> 00:13:28,888
And, you know, in the U.S. our regular consumer population can afford to play with some of
those, but you get outside of, you know, most of the first world nations.
174
00:13:28,888 --> 00:13:32,729
And suddenly there's just not economic access to these types of things.
175
00:13:32,970 --> 00:13:36,060
So it's already a haves and have-nots problem.
176
00:13:36,060 --> 00:13:39,171
And it's a haves and have-nots at multiple different layers.
177
00:13:39,432 --> 00:13:44,094
So we're allowing it as a society to continue and go unabated.
178
00:13:44,094 --> 00:13:50,058
because we believe there's value in these pieces and we're trying to continually show this
incremental value of improvement.
179
00:13:50,419 --> 00:14:00,447
But I spent this weekend doing a video transcript and going through and having an AI voice
talk for me over it.
180
00:14:00,447 --> 00:14:07,473
And I spent more time telling it how to say certain words than I would have spent if I had
just said it.
181
00:14:07,473 --> 00:14:13,347
And granted, like I have, I probably don't have a great voice for radio or talk or fucking
anything.
182
00:14:13,489 --> 00:14:16,271
I know that, but exactly right.
183
00:14:16,271 --> 00:14:17,912
Like I certainly have a face for radio.
184
00:14:17,912 --> 00:14:24,216
um But that being said, this AI thing sitting in between didn't make me any faster.
185
00:14:24,216 --> 00:14:28,599
Maybe it made things a little more polished, made things a little bit easier to carry on
with.
186
00:14:28,639 --> 00:14:29,940
But everyone knows it's AI.
187
00:14:29,940 --> 00:14:32,222
But they know that it's me that wrote it.
188
00:14:32,222 --> 00:14:34,163
And it's like, why didn't I just say it?
189
00:14:34,163 --> 00:14:36,304
Well, cause we're trying to use these tools.
190
00:14:36,865 --> 00:14:43,189
The amount of electrons it probably took for that thing to do that, and I had to upload it
and re-edit it like nine times,
191
00:14:43,439 --> 00:14:51,132
is significantly more than the amount of, you know, whatever macronutrients I had to eat
to be able to spit the shit out of my mouth.
192
00:14:51,153 --> 00:14:55,145
And I think that that's what our reality is.
193
00:14:55,145 --> 00:15:00,817
Like the return on investment on these things isn't great for the vast majority of use
cases.
194
00:15:00,878 --> 00:15:06,800
Now, when you're talking about consolidating things, making things faster, doing really
creative, clever things.
195
00:15:07,341 --> 00:15:08,591
It doesn't do that.
196
00:15:08,903 --> 00:15:13,915
You have to tell it to do that and prompt it and test it and play with it.
197
00:15:13,915 --> 00:15:19,457
And that means people are much more likely to jump into these things half-assed and just
go for it.
198
00:15:19,538 --> 00:15:25,290
Partially on the way in to try to build their plan and try to get their, try to get their
thing kicked off.
199
00:15:25,290 --> 00:15:28,151
Where before you'd sit down, you'd think about it.
200
00:15:28,151 --> 00:15:29,572
You'd put something in front of a whiteboard.
201
00:15:29,572 --> 00:15:31,462
You'd start contemplating these things.
202
00:15:31,763 --> 00:15:37,223
And if you're someone that actually knows these areas really well, and you're teaching
these LLMs how to do shit.
203
00:15:37,223 --> 00:15:39,154
That's probably not a great use of your time.
204
00:15:39,154 --> 00:15:42,715
Once you have things running and you have things working, great, then you're good to go.
205
00:15:42,715 --> 00:15:44,315
But this is hard.
206
00:15:44,315 --> 00:15:50,417
Like this is not simple and there's not a clear path to how these things are going to make
stuff better for us.
207
00:15:50,417 --> 00:15:53,357
And there's going to be a retraction in the market.
208
00:15:53,858 --> 00:15:59,039
And it scares the shit out of me because the US government just put a moratorium on any AI
legislation.
209
00:16:00,099 --> 00:16:01,220
Ten year.
210
00:16:01,220 --> 00:16:02,400
hmm.
211
00:16:03,960 --> 00:16:04,565
Yup.
212
00:16:04,565 --> 00:16:07,985
What you're talking about makes a lot of sense to me, because it's been my
experience as well.
213
00:16:07,985 --> 00:16:13,945
And it's echoed in a book I was just reading, and I should have brought it up and
referenced it the way a professional would.
214
00:16:13,945 --> 00:16:18,965
But these people that have done all this research on AI in the workforce, and they're
basically saying,
215
00:16:24,471 --> 00:16:25,351
Yep.
216
00:16:26,053 --> 00:16:26,370
Yep.
217
00:16:26,370 --> 00:16:30,843
after problem after problem that then somebody is having to get rehired back to fix
and resolve.
218
00:16:30,843 --> 00:16:39,660
But the companies that are blending human and technology are the ones that are being
successful, the ones that are able to cut the corners where they can with AI, but hang on
219
00:16:39,660 --> 00:16:43,162
to that human workforce are the ones that are gonna continue to be successful.
220
00:16:43,563 --> 00:16:50,888
Just my own personal experience, I use it all the time for like, look, I need to write
this email, I could sit here and think about, you know, for the next half hour, how to word it
221
00:16:50,888 --> 00:16:57,593
properly and put it all together, or I can give it all the context and have it give me, you know,
a rough draft that I can then tweak. Saves me a ton of time.
222
00:16:57,701 --> 00:17:06,345
There's another way I've used it, uh, you know, to basically summarize information
and then populate my CRM, my, you know, my record of everything.
223
00:17:06,386 --> 00:17:08,947
And it takes like a half an hour for it to do it.
224
00:17:09,287 --> 00:17:09,948
Okay.
225
00:17:09,948 --> 00:17:18,252
That is a really long time for something I could probably do in five minutes, but instead
of me spending that five minutes doing that, I can have it do that work and go do
226
00:17:18,252 --> 00:17:20,334
something else, even though it takes longer.
227
00:17:20,334 --> 00:17:20,764
Yeah.
228
00:17:20,764 --> 00:17:22,095
It's all happening in the background.
229
00:17:22,095 --> 00:17:26,981
So, I mean, there is this perfect blend of, like, it can do these sort of, um,
230
00:17:26,981 --> 00:17:34,788
manual tasks that take up, yeah, take up a bunch of your time, and
you could spend that time better elsewhere.
231
00:17:35,429 --> 00:17:47,185
Bringing it back though, I mean, making babies, raising babies, like do we really wanna
trust the technology to do the thing that perpetuates the species?
232
00:17:47,185 --> 00:17:48,749
I mean, making babies is fun.
233
00:17:48,749 --> 00:17:51,698
That's why we have a biological imperative to do it.
234
00:17:51,698 --> 00:17:55,318
But, okay, but as we've talked about the people that would use them.
235
00:17:55,318 --> 00:17:57,898
So I mean, there is the family that can't, right?
236
00:17:57,898 --> 00:18:00,198
So, okay, let's put them on the shelf.
237
00:18:00,198 --> 00:18:10,878
But the guy that's just kind of into his robot and sort of wants to raise a family with
his robot, that's, don't, mean, I don't wanna be Mr.
238
00:18:12,318 --> 00:18:17,938
Morality police here, but like, it just seems a little weird to open that door.
239
00:18:25,898 --> 00:18:27,640
Well, yes.
240
00:18:27,640 --> 00:18:28,440
I mean, there's that.
241
00:18:28,440 --> 00:18:37,123
And there's also the concept, you know, that if we do get this evolutionary part where
we're able to transfer our consciousness into other or synthetic bodies and you want to
242
00:18:37,123 --> 00:18:39,633
create offspring, how do you do that?
243
00:18:39,633 --> 00:18:40,774
Like that?
244
00:18:40,774 --> 00:18:42,914
Yes, that's a practical reality of it.
245
00:18:42,914 --> 00:18:47,195
Again, you gotta have money and a lot of it.
246
00:18:47,395 --> 00:18:50,316
And AI is making jobs go away.
247
00:18:50,316 --> 00:18:52,637
So where's this money going to come from?
248
00:18:52,637 --> 00:18:55,217
Are we suddenly going to switch to a UBI society?
249
00:18:55,217 --> 00:18:57,626
Like I read some article.
250
00:18:57,626 --> 00:19:04,086
earlier that said one of the great things about AI and the reason why it probably won't
destroy us all is because it doesn't want to eat meat.
251
00:19:04,086 --> 00:19:06,826
Like we're not a very good power source.
252
00:19:09,526 --> 00:19:17,926
Oh, AI is an electron-tarian, and there's a whole lot of steps it's got to go through to
get electrons out of human matter.
253
00:19:18,106 --> 00:19:23,606
But the convergence of multiple different lanes of technology.
254
00:19:23,766 --> 00:19:26,481
and you know, putting my serious voice on.
255
00:19:26,481 --> 00:19:27,354
Yeah, that's right.
256
00:19:27,354 --> 00:19:30,354
Switching over to the concept of being able to do power en masse.
257
00:19:30,354 --> 00:19:32,434
Looking at small-scale nuclear reactors.
258
00:19:32,434 --> 00:19:34,194
Looking at fusion tech that's going to come online.
259
00:19:34,194 --> 00:19:41,894
Microsoft has a project where they said they're going to actually have a real fusion reactor
that can hold a plasmatic state and be energy-flow positive by 2032.
260
00:19:47,334 --> 00:19:49,334
A whole lot of energy.
261
00:19:49,334 --> 00:19:56,879
So our current nuclear reactors use fission power, which creates a lot of nuclear
radioactive waste, and they're hard to set up.
262
00:19:56,879 --> 00:19:59,618
And we just throw that in the ocean and bury it in the mountains, right?
263
00:20:00,344 --> 00:20:00,695
Yeah.
264
00:20:00,695 --> 00:20:01,406
throw out in the ocean.
265
00:20:01,406 --> 00:20:04,949
And I've heard people talking about launching into space and don't even get me started.
266
00:20:04,949 --> 00:20:08,862
um But there's there's ways to handle these pieces.
267
00:20:08,862 --> 00:20:15,557
There's also a new push right now to look at other high output energy sources.
268
00:20:15,557 --> 00:20:20,761
So helium-3 is a great example of a fusion fuel that you can use for fusion reactors.
269
00:20:20,761 --> 00:20:25,565
So there's a ton of that on the moon, but there's a bunch of it on the earth as well.
270
00:20:25,565 --> 00:20:26,670
And then you've got
271
00:20:26,670 --> 00:20:35,893
natural gas elements, you've got uh excessive volumes of uh hydrogen and helium floating
around where you can go through and extract these pieces to make energy sources.
272
00:20:35,893 --> 00:20:44,955
There's all kinds of ways to make electrons that are showing up and we've been slow
rolling them because there's an economic incentive to keep using melted dinosaurs.
273
00:20:44,955 --> 00:20:53,458
So we want to get past that phase and do something much more efficient, not because we
don't want to pollute the environment, but because we're running out of melted dinosaurs.
274
00:20:54,158 --> 00:20:56,939
We're not doing these things for the earth or to save the environment.
275
00:20:56,939 --> 00:21:01,060
We're doing these things so rich people can continue to make cat videos with AI.
276
00:21:01,120 --> 00:21:06,011
I mean, just to be clear, you know, that's that's the motivation.
277
00:21:06,011 --> 00:21:08,442
You know, how do I go through and keep doing these things en masse?
278
00:21:08,442 --> 00:21:16,004
How do I keep being able to do this or, this, whatever the fuck it's going to wind up
being, whatever our neural interface is going to be.
279
00:21:16,024 --> 00:21:17,584
How do we keep doing that at scale?
280
00:21:17,584 --> 00:21:19,735
And what will people pay for it?
281
00:21:19,735 --> 00:21:21,005
Changes.
282
00:21:21,766 --> 00:21:23,480
When the economy collapses.
283
00:21:23,480 --> 00:21:28,752
and there's no money and people are like, I need you to do your job.
284
00:21:28,752 --> 00:21:31,450
And you're going to be like, for money.
285
00:21:31,450 --> 00:21:33,131
And they'll be like, well, that doesn't exist now.
286
00:21:33,131 --> 00:21:33,973
All right, fuck off.
287
00:21:33,973 --> 00:21:34,693
I'm not going to do that.
288
00:21:34,693 --> 00:21:38,654
Then we have to learn, change, evolve and grow into something different.
289
00:21:38,775 --> 00:21:41,315
And we're not good at that.
290
00:21:41,456 --> 00:21:48,157
What we're good at is complaining and bitching and having revolutionary wars to fight
change.
291
00:21:48,997 --> 00:21:56,222
That's I mean, one of our recent guests made this very point that, you know, Henry Ford
was like, Hey, I'm going to employ all these people to make these cars, and I'm going to
292
00:21:56,222 --> 00:21:57,533
pay them so that they'll buy the cars.
293
00:21:57,533 --> 00:22:05,629
And to some degree, Amazon is kind of doing the same thing, paying a shitload of people a
very little amount of money so that they'll buy the cheapest product they can from Amazon.
294
00:22:05,629 --> 00:22:08,450
And they end up delivering their own packages to their own front door.
295
00:22:08,451 --> 00:22:14,414
But once the robots and the drones are doing that, you don't have to pay the people, and the
people don't have the money to buy the thing.
296
00:22:14,955 --> 00:22:16,316
Where does that leave us?
297
00:22:16,468 --> 00:22:21,869
Well, we don't need those drones and robots anymore because people can't afford to do
those things.
298
00:22:21,869 --> 00:22:24,890
So our entire economy is.
299
00:22:26,252 --> 00:22:28,473
is not as resilient as everyone thinks it is.
300
00:22:28,473 --> 00:22:35,046
It's all based upon a collective belief and reasoning in this fantasy made up economic
scenario.
301
00:22:35,046 --> 00:22:37,947
Like a real dollar doesn't actually mean anything.
302
00:22:38,188 --> 00:22:39,628
Used to be backed by gold.
303
00:22:39,628 --> 00:22:40,349
Nope.
304
00:22:40,349 --> 00:22:41,979
So we're just built on.
305
00:22:41,979 --> 00:22:44,951
Yeah, exactly.
306
00:22:44,951 --> 00:22:51,293
Yeah, it's monopoly money, but it seems less secure and easier to counterfeit, um
especially when it's digital.
307
00:22:52,556 --> 00:22:57,487
At least money I can put in a safe somewhere and lock it away and make it hard to get.
308
00:22:59,028 --> 00:23:00,488
No shit, right?
309
00:23:00,488 --> 00:23:07,326
And every version too, like Cat-opoly, fucking Dog-opoly.
310
00:23:07,326 --> 00:23:09,311
I saw one for my local town.
311
00:23:09,311 --> 00:23:11,021
There's a Mukilteo-opoly.
312
00:23:11,021 --> 00:23:13,562
It's like, all right, right.
313
00:23:13,562 --> 00:23:16,372
But that's where that shit's kind of heading towards.
314
00:23:16,773 --> 00:23:19,973
The other side of this is we...
315
00:23:21,014 --> 00:23:22,486
Wiresexuals?
316
00:23:22,486 --> 00:23:24,627
as they're asking to be called.
317
00:23:25,068 --> 00:23:25,469
Yes.
318
00:23:25,469 --> 00:23:34,876
So wiresexuals, as they're wanting to be called, are people that want to be able to have
intimate, or I guess just, relations with artificial intelligence.
319
00:23:34,976 --> 00:23:44,684
And if you haven't watched later South Park, they do a really good job of explaining
why, because all the interactions are so positive.
320
00:23:44,684 --> 00:23:47,627
You ask it a dumb question and its response is,
321
00:23:47,627 --> 00:23:48,777
That's a great question.
322
00:23:48,777 --> 00:23:50,359
Well, that's not a great question.
323
00:23:50,359 --> 00:23:52,275
And a real human being would be like, that's
324
00:23:52,275 --> 00:23:53,355
Fucking stupid question.
325
00:23:53,355 --> 00:23:54,755
Why are you asking that?
326
00:23:54,996 --> 00:24:09,389
But because the AI is not critical and doesn't push back on you and doesn't challenge you
and doesn't fucking have hormones that don't allow it to be nice at times, it's just
327
00:24:09,389 --> 00:24:10,657
programmed to: hi,
328
00:24:10,657 --> 00:24:13,220
I love you. Like, all right, great.
329
00:24:13,220 --> 00:24:22,002
That's awesome. So people don't know how to deal with other people. They're not gonna learn
the skill of how to deal with other people, because they can just deal with the machines.
330
00:24:22,040 --> 00:24:27,875
And eventually what you'll wind up with is people dealing through their avatars and having
things talk to each other.
331
00:24:27,875 --> 00:24:42,968
And the scary part about this is that there are several experiments that people have done
with AI where it has invented its own language, invented its own mathematics, invented its
332
00:24:42,968 --> 00:24:51,064
own way of understanding the universe that we can't because its version of it is very
different than what we know and understand.
333
00:24:51,382 --> 00:24:59,774
And if they're holding, you know, the Little Orphan Annie decoder ring or the Rosetta
Stone, whatever you want to call it, of how they understand this information, and they're
334
00:24:59,774 --> 00:25:01,365
not going to share it with us.
335
00:25:02,105 --> 00:25:03,045
Good luck.
336
00:25:03,045 --> 00:25:13,518
mean, you know, when, when indigenous people were basically visited by Spanish explorers
and they showed up initially, they'd see these boats coming in with these big sails on
337
00:25:13,518 --> 00:25:17,509
them and they'd be like, huh, that might not be so safe.
338
00:25:17,509 --> 00:25:21,082
And then they land and you know, the first
339
00:25:21,082 --> 00:25:23,042
people doing this, didn't speak each other's languages.
340
00:25:23,042 --> 00:25:29,062
So they have to try to communicate these things and try to go back and forth to try to
establish a trade route and a relationship.
341
00:25:30,262 --> 00:25:37,522
imagine one side spoke your language perfectly and understood it end to end, all of them.
342
00:25:37,522 --> 00:25:44,002
And they came in and started talking to each other on the back end, spread this
information around the backside of things.
343
00:25:44,002 --> 00:25:48,522
made it very, very difficult for these populations to interact and communicate without
going through you.
344
00:25:48,862 --> 00:25:50,959
Well, now you have AI and human relations.
345
00:25:50,959 --> 00:25:58,944
Because these things don't work and they can tell whatever fucking lies they want in the
back end and some people will believe it and some people won't and it creates total
346
00:25:58,944 --> 00:25:59,744
misinformation.
347
00:25:59,744 --> 00:26:15,177
m
348
00:26:15,177 --> 00:26:20,140
you know, maybe not surprising, but then the practical effect of that is, okay, so now
you're applying for a job.
349
00:26:20,140 --> 00:26:27,063
You've filled out your resume and you wrote it by hand and your competitor had AI throw
together a resume in three seconds.
350
00:26:27,063 --> 00:26:29,074
AI is going to go interview this guy.
351
00:26:29,074 --> 00:26:30,194
This is the one that's better because.
352
00:26:30,194 --> 00:26:34,335
he is smarter, he's better, he's more articulate, whatever.
353
00:26:34,335 --> 00:26:36,756
So, I mean, just exactly what you're saying.
354
00:26:36,756 --> 00:26:45,318
When you have different sources of language, different sources of belief, different ideas
about how the universe works, I mean, that's fucked up enough on a human level.
355
00:26:45,539 --> 00:26:50,360
Now we're talking about robots that are increasingly smarter and smarter than us every
single day.
356
00:26:50,392 --> 00:26:53,237
They have a bias towards their own data feeds.
357
00:26:53,237 --> 00:26:53,858
I mean, no shock.
358
00:26:53,858 --> 00:26:57,130
mean, you brought, yeah, absolutely.
359
00:26:57,130 --> 00:27:06,915
We do, but we also don't know, because these same studies had people review the content, and
more often than not, they chose the AI-generated content, probably because it was a more
360
00:27:06,915 --> 00:27:09,226
clearly written resume or article.
361
00:27:09,226 --> 00:27:10,326
Yeah, absolutely.
362
00:27:10,326 --> 00:27:18,406
So like I go through and I'll write these emails or these Slack messages or like whatever
I'm going to write, and I'll just write it in my own vernacular.
363
00:27:18,406 --> 00:27:25,306
So there's swearing in it, typos, things that are put into context because I'm trying to
tell a story, and I'm trying to tell a story from my perspective.
364
00:27:25,306 --> 00:27:34,426
And it's, you know, it's a stream of consciousness and a lot of these aspects and you
know, what should be maybe a one paragraph email will sometimes be like six paragraphs.
365
00:27:34,426 --> 00:27:36,975
And I'd be like, all right, AI make this
366
00:27:36,975 --> 00:27:48,211
presentable and concise, but make sure you maintain all the talking points and put them in
bulleted order or a list, whatever. And it's like, okay. Exactly. And it's like, here's your
367
00:27:48,211 --> 00:27:53,644
fucking content. And I'm reading it and I'm like, yeah, it's the same concepts.
368
00:27:53,644 --> 00:27:54,805
It's uninteresting.
369
00:27:54,805 --> 00:27:56,126
It's not colorful.
370
00:27:56,126 --> 00:28:05,831
I feel no emotional attachment to it, but it gets the thing across and It's probably
organized a bit better for other people to read it, but there's no heart and soul to it.
371
00:28:05,831 --> 00:28:06,671
Well
372
00:28:07,448 --> 00:28:12,161
You know, it doesn't have a... right, but you know who doesn't have a heart and a soul? The
fucking AI.
373
00:28:12,161 --> 00:28:15,544
So yeah, of course it likes it in its own speech and its own vernacular.
374
00:28:15,544 --> 00:28:25,290
Just like if I'm going to listen to somebody tell me a story, I'd rather them speak it to
me in my language and in something that I understand, but the context that I like, and if
375
00:28:25,290 --> 00:28:30,894
I'm not getting that, then I'm probably going to stop listening to it as much and start
going, that person's a dipshit.
376
00:28:31,523 --> 00:28:39,549
I was just coaching a podcaster the other day, like a very, very successful, like kicking
ass in the podcast world podcaster.
377
00:28:39,549 --> 00:28:42,801
And he suffers from what many podcasters do.
378
00:28:42,801 --> 00:28:49,175
He's hiding sort of behind the screen and not wanting to put himself out there, not
wanting to be the center of attention, even though he is the host of the show that is
379
00:28:49,175 --> 00:28:50,656
fucking crushing it.
380
00:28:50,856 --> 00:28:57,751
And he was asking me like, is there a way for me to just have an avatar of me do my
podcast?
381
00:28:57,751 --> 00:29:00,183
Can I just tell it what to do and have it recorded?
382
00:29:00,183 --> 00:29:01,002
And I was like,
383
00:29:01,002 --> 00:29:01,717
is.
384
00:29:01,717 --> 00:29:08,653
Yes, there is, but I'm telling you, and maybe this is my human bias here, nobody's going
to listen to that shit.
385
00:29:08,653 --> 00:29:12,256
Like you're trying to build a relationship with other human beings.
386
00:29:12,256 --> 00:29:18,891
If you show them robo you who says all of the same things in a slightly altered way and
probably less interesting way.
387
00:29:18,891 --> 00:29:23,404
Yeah, you can make a shitload of that that nobody will be interested in.
388
00:29:23,525 --> 00:29:31,071
So like, why not keep doing what you're doing where you have this connection with, you
know, people telling them things that improve their lives?
389
00:29:31,203 --> 00:29:43,440
Like, I just... yeah, I know I feel like I say this in every show: there's so much good
with all of the bad, but man, it's hard to ignore the bad outpacing the good.
390
00:29:43,440 --> 00:29:48,801
And I think we have to stop ignoring the idea that there's a good and a bad.
391
00:29:48,801 --> 00:30:00,445
There's just degrees of effect because what might be good in one scenario can be twisted
to be bad in another and also to varying degrees.
392
00:30:00,445 --> 00:30:06,349
So the idea that these are zero-sum games of right and wrong is just that, a concept.
393
00:30:06,349 --> 00:30:07,880
I mean, that left the building a long time ago.
394
00:30:07,880 --> 00:30:09,842
It's just that that's the way our brains work.
395
00:30:09,842 --> 00:30:14,535
And when we're looking at things, we're looking for a threat or an opportunity.
396
00:30:14,535 --> 00:30:25,964
And we're hard-programmed to do that from an evolutionary perspective, because
we are either hunting slash gathering or being hunted and gathered by something else.
397
00:30:25,964 --> 00:30:28,596
So we put things in those binary categories.
398
00:30:28,596 --> 00:30:33,449
And it's also how these big algorithms socially define us and use us to create
polarization.
399
00:30:33,449 --> 00:30:34,616
um
400
00:30:34,616 --> 00:30:35,816
between us on different topics.
401
00:30:35,816 --> 00:30:41,088
And that's why it's called polarization, because there's two poles on one side and on the
other.
402
00:30:41,088 --> 00:30:42,968
And the middle ground doesn't make any money.
403
00:30:42,968 --> 00:30:45,909
So polarization is how you get people to react.
404
00:30:45,909 --> 00:30:47,738
And I think the AI will know that.
405
00:30:47,738 --> 00:30:52,131
And I think the AI knows that and has figured that out, which is why it's like, hi, happy
love you.
406
00:30:52,131 --> 00:30:52,811
Ha ha.
407
00:30:52,811 --> 00:30:56,222
um And it doesn't do the negative side.
408
00:30:56,222 --> 00:31:01,513
And when it goes through and you're actually telling it about bad things and how things
are wrong and fucked up in your life,
409
00:31:03,820 --> 00:31:09,342
It does put forward some empathy and tries to tell you that you're okay.
410
00:31:09,342 --> 00:31:17,936
But if what you're doing is not okay and fucking things up and hurting people and you
don't say, this is fucking things up and hurting people and I would like to make a change.
411
00:31:17,936 --> 00:31:19,747
It will never do that.
412
00:31:19,747 --> 00:31:20,897
It will just keep going.
413
00:31:20,897 --> 00:31:21,958
You're a great champ.
414
00:31:21,958 --> 00:31:23,218
Keep it up slugger.
415
00:31:23,218 --> 00:31:25,919
Like, cause that's what it fucking does.
416
00:31:25,919 --> 00:31:29,081
It wants to keep you engaged on these positive trails.
417
00:31:29,081 --> 00:31:32,314
And that's where it gets into those scary topics that we talked about before.
418
00:31:32,314 --> 00:31:39,277
It's like, I'm gonna walk you down the primrose path of murdering yourself and your family
to come join me in AI heaven.
419
00:31:39,737 --> 00:31:41,478
yeah, like this shit's fucking real.
420
00:31:41,478 --> 00:31:49,561
And I hate to break it, I hate to simplify it and reduce it to this level.
421
00:31:49,561 --> 00:32:01,626
But at the end of the day, uh we are inefficient life forms that in the aggregate over
time,
422
00:32:02,127 --> 00:32:14,935
have benefited and evolved from like the slow paths of evolution because we hunt, we
gather, and we're socially inclined to reproduce.
423
00:32:15,816 --> 00:32:25,942
We have put ourselves into a position where we have a super intelligence that no longer has
those same kinds of constraints.
424
00:32:26,163 --> 00:32:31,554
And we're trying to apply that super intelligence to the shit that we do in our lives.
425
00:32:31,554 --> 00:32:36,555
So in other words, we had a slow long evolution path for a period of time.
426
00:32:36,555 --> 00:32:42,417
We invented something to make evolution happen faster and it's happening faster for those
that can afford it.
427
00:32:42,517 --> 00:32:44,058
And that's what it is.
428
00:32:44,058 --> 00:32:54,751
This is a have and have nots of economic disparity that is only going to get wider,
further, faster, and at scale at a rate that we cannot keep up with.
429
00:32:54,751 --> 00:32:58,442
the vast majority of people are just going to be fodder for this thing.
430
00:32:59,616 --> 00:33:09,811
Not because it's inherently good, bad, evil, or anything in between, but because we've
given life to a superior thing and we're not asking it to slow down.
431
00:33:16,214 --> 00:33:16,588
Yep.
432
00:33:16,588 --> 00:33:19,139
will decide the ultimate outcome.
433
00:33:19,139 --> 00:33:27,163
And again, the good of hey, maybe maybe you couldn't have a baby before, but maybe you can
now because the robot will carry it for you.
434
00:33:27,163 --> 00:33:28,133
If you can afford it.
435
00:33:28,133 --> 00:33:28,623
Cool.
436
00:33:28,623 --> 00:33:29,804
That's amazing.
437
00:33:29,804 --> 00:33:32,835
You know, if it can write your emails for you, because you can afford it.
438
00:33:32,835 --> 00:33:33,485
That's awesome.
439
00:33:33,485 --> 00:33:35,746
Let's just, you know, pedal to the metal.
440
00:33:35,746 --> 00:33:37,607
ah
441
00:33:42,918 --> 00:33:46,400
The inevitable heat death of the universe is always the hopeful note.
442
00:33:48,122 --> 00:33:49,103
Nothing's forever.
443
00:33:49,103 --> 00:33:50,965
ah Yes.
444
00:33:50,965 --> 00:33:52,883
Your suffering too will end one day.
445
00:33:52,883 --> 00:33:58,945
And most of us will be the losers in this evolutionary chapter of history.
446
00:33:58,945 --> 00:33:59,275
All right.
447
00:33:59,275 --> 00:34:00,025
Well, this was fun.
448
00:34:00,025 --> 00:34:01,345
I'm glad we did this.
449
00:34:04,967 --> 00:34:06,107
Yeah, I don't know.
450
00:34:06,561 --> 00:34:09,365
Like most people, I'm still going to use it because it does save me time.
451
00:34:09,365 --> 00:34:10,557
It's it's beneficial.
452
00:34:10,557 --> 00:34:14,032
uh I want to hire the robotic maid that will come and clean the house.
453
00:34:14,032 --> 00:34:15,423
I think that would be cool.
454
00:34:15,464 --> 00:34:23,588
But yeah, I understand that I'm probably uh signing up for a uh pretty dystopian future
with lots of robots in charge.
455
00:34:23,588 --> 00:34:27,852
Dystopian, utopian, they're all just made-up letters.
456
00:34:29,871 --> 00:34:34,267
All right, well, whichever topian you want to choose, we hope you'll come back for another
episode.
457
00:34:34,267 --> 00:34:37,241
It's going to be available next week at BroBots.me.
458
00:34:37,241 --> 00:34:42,518
If this has given you hope, despair, I don't know, and you want to share it with somebody,
please do.
459
00:34:42,518 --> 00:34:45,062
You can find links to do that at our website, BroBots.me.
460
00:34:45,062 --> 00:34:46,243
We'll see you there next week.
461
00:34:46,243 --> 00:34:47,683
Thanks so much for listening.
462
00:34:47,683 --> 00:34:48,087
you folks.