Aug. 26, 2025

How To Filter AI Noise Without Missing Game-Changing Tools

Drowning in AI hype and feeling like a digital caveman?

Every day brings another "revolutionary" AI tool you MUST master or risk career extinction. The constant barrage of "7 prompts that'll make you rich" and "AI will steal your job tomorrow" content is creating anxiety, not opportunity. You're not alone in feeling overwhelmed by the machine takeover propaganda.

Learn to cut through the AI noise, set healthy boundaries with technology, and maintain your human edge while AI continues its relentless march. Discover why unplugging might be your secret weapon and how to spot the difference between genuine innovation and snake oil salesmanship.

Stop drowning in AI anxiety – listen now and reclaim your sanity.

Topics Discussed:

  • AI overwhelm is universal – why everyone's secretly freaking out but pretending they're not
  • The dangerous homogenization of human thought through AI-generated responses
  • How we're accidentally outsourcing our humanity to machines (even for grief responses)
  • Why setting boundaries with AI content consumption isn't weakness, it's survival
  • The cereal aisle effect – infinite choices leading to the same nutritional garbage
  • Mass layoffs at tech giants reveal the real AI employment apocalypse timeline
  • Why Bill Gates thinks programmers are safe (spoiler: he might be wrong)
  • The Henry Ford paradox – if AI replaces workers, who buys the products?
  • Simple career advice that predates AI but works better than any ChatGPT prompt
  • Finding your three life goals as the ultimate filter against digital noise

----

GUEST WEBSITE:
https://www.linkedin.com/in/pattyaluskewicz-aicoach/ 

----

MORE FROM BROBOTS:

Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok

Subscribe to BROBOTS on YouTube

Join our community in the BROBOTS Facebook group

----

1
00:00:00,000 --> 00:00:03,646
Are you sick of AI bros screaming that your job is already dead?

2
00:00:03,646 --> 00:00:08,022
Do you feel like every AI-related headline is one step closer to a panic attack?

3
00:00:09,259 --> 00:00:12,220
We're joined by AI educator and trainer Patty Aluskewicz.

4
00:00:12,220 --> 00:00:17,724
We'll talk to her about overwhelm, scams, and why the cereal aisle is the perfect metaphor
for AI.

5
00:00:17,937 --> 00:00:22,403
We'll also talk about why unplugging is your best weapon, how to filter out the hype,

6
00:00:22,403 --> 00:00:26,408
and the one piece of career advice that actually matters in the chaos.

7
00:00:32,376 --> 00:00:39,954
This is BroBots, the podcast that talks about how technology can be beneficial to your
health and well-being and why sometimes unplugging from it is the best thing you can do

8
00:00:39,954 --> 00:00:41,086
for your sanity.

9
00:00:41,643 --> 00:00:43,883
so my first question is just a personal one.

10
00:00:43,883 --> 00:00:46,543
This is something that I have been feeling lately.

11
00:00:46,543 --> 00:00:47,843
We talked about this on the show recently.

12
00:00:47,843 --> 00:00:53,983
Seeing the new Superman movie actually relieved some of my anxiety around AI because the
good guys won against the machines and I felt very good about it.

13
00:00:53,983 --> 00:00:55,743
Oh, there's hope for the future.

14
00:00:56,131 --> 00:01:02,857
But there's just, I mean, anyone who's following this stuff is just battered over the head
daily with, like, the new thing you must be doing.

15
00:01:02,857 --> 00:01:07,221
If you don't know this, then you're going to be left behind and your job is going down the
gutter.

16
00:01:07,442 --> 00:01:11,455
It's overwhelming the amount of content that we are hit with every day about how AI is
changing the world.

17
00:01:11,455 --> 00:01:14,228
So validate my feelings here.

18
00:01:14,228 --> 00:01:19,192
Am I the only one that feels this overwhelm and this fear or is this something you're
seeing a lot of?

19
00:01:19,192 --> 00:01:22,695
oh

20
00:01:22,896 --> 00:01:24,379
So I'm glad you asked.

21
00:01:24,379 --> 00:01:34,226
No, often we do feel like we're the only ones experiencing something, but when you talk to
other humans, not machines, but humans in this case, you realize that we are more alike

22
00:01:34,226 --> 00:01:35,287
than we are different.

23
00:01:35,287 --> 00:01:39,330
uh Yes, I am overwhelmed with AI as well.

24
00:01:39,330 --> 00:01:44,553
uh I try to keep up with it and I just realize I can't.

25
00:01:44,994 --> 00:01:45,594
I can't.

26
00:01:45,594 --> 00:01:48,416
um It's in my face every day.

27
00:01:48,618 --> 00:01:50,449
I work in the AI space.

28
00:01:50,449 --> 00:01:54,011
uh I curate content for AI.

29
00:01:54,011 --> 00:01:59,274
So I'm constantly watching videos, reading articles, and it does get overwhelming.

30
00:01:59,735 --> 00:02:04,197
So I have felt like I have to put some boundaries around it.

31
00:02:04,277 --> 00:02:06,879
So they always talk about boundaries, don't they?

32
00:02:08,120 --> 00:02:08,541
Yeah.

33
00:02:08,541 --> 00:02:09,624
So what does that look like for you?

34
00:02:09,624 --> 00:02:12,549
What kind of boundaries have you set up for yourself to protect yourself?

35
00:02:12,600 --> 00:02:24,745
um Yeah, like I said, for my job, part of what I do as a content strategist and curator,
um especially in the AI space, is to review content.

36
00:02:24,845 --> 00:02:30,378
And I often review a lot of content on the same topic.

37
00:02:30,378 --> 00:02:36,040
So AI is killing jobs, or new college grads, guess what?

38
00:02:36,040 --> 00:02:37,611
You have no jobs.

39
00:02:37,931 --> 00:02:38,551
Sorry.

40
00:02:38,551 --> 00:02:39,978
um

41
00:02:39,978 --> 00:02:45,851
and you just watch video after video and at some point I start to get anxious and I have
to stop.

42
00:02:45,851 --> 00:02:57,938
I have to press pause and I have to walk away and say I will pick this up later and I will
do something else because getting anxious, I mean it's very easy for humans to get anxious

43
00:02:57,938 --> 00:03:06,022
especially when your mind goes to the worst place possible but it doesn't help because I
know I have no control over this.

44
00:03:06,022 --> 00:03:09,204
You have no control over this so...

45
00:03:10,712 --> 00:03:14,182
Panicking doesn't help the situation at all.

46
00:03:14,798 --> 00:03:22,103
signal to noise ratio, uh looking at what's coming in towards this, because AI has done
some great things.

47
00:03:22,103 --> 00:03:32,290
But one of the things that I think it's done terribly, or that has been terrible for the
infosphere, is homogenizing perspectives and homogenizing viewpoints.

48
00:03:32,290 --> 00:03:39,835
Because it's no longer dealing with the facile human existence of

49
00:03:40,541 --> 00:03:43,112
made up perspective and variation.

50
00:03:43,193 --> 00:03:50,638
And because it's homogenized everything in such a way that the inference functions have
become very similar.

51
00:03:50,638 --> 00:03:56,362
So, like, I can ask ChatGPT to create me something, or ask it a question, and I can run it across.

52
00:03:56,362 --> 00:03:58,253
I can run it through Claude or anybody else out there.

53
00:03:58,253 --> 00:04:02,726
And I get an answer in a response that's like 80, 85 percent similar.

54
00:04:02,726 --> 00:04:08,309
But if I went to different people and I asked the same question, there would be a lot
more variance.

55
00:04:08,570 --> 00:04:10,591
And when I look at that,

56
00:04:12,265 --> 00:04:23,564
I wonder if the value of AI is in homogenizing these perspectives and creating kind of a
universal talking track and understanding of things for people.

57
00:04:23,564 --> 00:04:35,994
If that is a good thing, and if it's not a good thing, what do you recommend people do to
disconnect from that or to try to alter things when interacting with AI to try to get

58
00:04:35,994 --> 00:04:41,938
responses that aren't so, I guess, monolithic uh in the way that they approach questions?

59
00:04:42,111 --> 00:04:51,019
I ask because I see a lot of people go through and, like, craft email questions or try to
ask AI to do things for them.

60
00:04:51,120 --> 00:05:00,029
And they get crappy answers and crappy responses, but because they don't have a lot of
information on it, like it's the self-imposed Dunning Kruger effect, where I think I know

61
00:05:00,029 --> 00:05:06,515
a thing because the AI has the Dunning Kruger effect and it is following that kind of same
pattern.

62
00:05:06,515 --> 00:05:08,058
Is this something that you're actually seeing?

63
00:05:08,058 --> 00:05:13,040
Or again, am I making this up all on my own and thinking that I have the Dunning-Kruger
effect because I've seen this?

64
00:05:13,040 --> 00:05:14,381
No, it's just you.

65
00:05:15,561 --> 00:05:16,622
Just you, sorry.

66
00:05:16,622 --> 00:05:27,026
uh No, that's a really great question and that's a tough one um because there is that
risk, isn't there?

67
00:05:27,026 --> 00:05:30,227
um

68
00:05:32,451 --> 00:05:43,756
Are we going to lose our ability to think and reason and uh interact with each other and
just get the same answers spit out to us because we're interacting with this interface and

69
00:05:43,756 --> 00:05:46,698
I'm sure at some point it'll have an avatar in front of it.

70
00:05:46,698 --> 00:05:51,840
It's what it's, I mean, to try to maybe steer this a little bit.

71
00:05:51,840 --> 00:05:52,288
oh

72
00:05:52,288 --> 00:05:57,981
One of the things that reminds me of is when we had three TV channels, we all got the same
information.

73
00:05:57,981 --> 00:06:06,607
We were all operating from the same facts where now we are so divided over all of our
social media channels and whether we watch Fox or CNN or MSNBC or just the local news or

74
00:06:06,607 --> 00:06:15,712
only read the paper, we're all getting so much different information that it's hard to
know what's what's fact, what's right, what is the common like, what does everyone accept

75
00:06:15,712 --> 00:06:16,993
as the fact here?

76
00:06:17,065 --> 00:06:25,895
So I do wonder if this is sort of accidentally steering us back to something
like that, where we're all sort of asking the same question of the same machines.

77
00:06:25,895 --> 00:06:34,584
And then we'll suddenly have, oh, we're all sort of operating from the same place based on
what the machines are feeding us rather than our own research of, you know, TikTok that

78
00:06:34,584 --> 00:06:35,779
day or whatever.

79
00:06:35,779 --> 00:06:44,115
Yeah, especially because those news feeds are all designed and set up to sell you more
toilet paper, Kleenex, anything else out there that you may or may not need.

80
00:06:44,115 --> 00:06:47,937
And the perspective itself is not necessarily the valuation of it.

81
00:06:47,937 --> 00:06:50,269
It's the engagement function of how you push those pieces.

82
00:06:50,269 --> 00:06:56,993
Sorry, just adding color to that example, because I think it's apropos in today's world,
because their motivation is kind of the same.

83
00:06:56,993 --> 00:06:58,154
Get you to look at this.

84
00:06:58,154 --> 00:06:59,905
um Yeah.

85
00:07:00,220 --> 00:07:06,903
What's interesting about the TV, um, when you talk about how we all used to have the same
three TV channels.

86
00:07:06,903 --> 00:07:11,626
There's a lot of people that did not have that experience because they grew up after that.

87
00:07:11,626 --> 00:07:15,048
So we're not going to talk about how old we are.

88
00:07:15,048 --> 00:07:18,610
But if you think about it, right, we have infinite choices.

89
00:07:18,610 --> 00:07:20,431
I'm just thinking of Netflix.

90
00:07:20,431 --> 00:07:25,979
But we are all steered towards the same top 10.

91
00:07:26,314 --> 00:07:28,755
You know, I mean, I don't know what

92
00:07:28,893 --> 00:07:35,187
what um shows you watch, but I'm sure I can mention them and you might have seen them.

93
00:07:36,309 --> 00:07:38,770
But there are millions of shows out there.

94
00:07:38,850 --> 00:07:39,991
So what are the chances?

95
00:07:39,991 --> 00:07:43,073
So we're all being steered to the same content anyway.

96
00:07:44,295 --> 00:07:47,296
Interesting, interesting.

97
00:07:47,677 --> 00:07:51,780
So I don't know if homogenization is good or not.

98
00:07:51,860 --> 00:07:56,161
Maybe in some senses because there's too much out there.

99
00:07:56,161 --> 00:07:56,922
Sure.

100
00:07:57,311 --> 00:08:00,719
But is it going to lead us to all think the same thing?

101
00:08:01,431 --> 00:08:04,235
And I think that could be dangerous.

102
00:08:04,235 --> 00:08:05,086
Mm-hmm.

103
00:08:05,236 --> 00:08:12,358
Yeah, I think of it, well, so one, it's maybe not news services, it might be the cereal
aisle.

104
00:08:12,518 --> 00:08:15,087
So when you look at it back in the day,

105
00:08:15,087 --> 00:08:19,692
You know, when I was a kid, there were probably 10 cereals I could choose from.

106
00:08:19,692 --> 00:08:23,328
And then there were 10 generic versions of those same cereals.

107
00:08:23,328 --> 00:08:28,050
Again, Jason and I are 85 years old in case you millennials are listening.

108
00:08:28,050 --> 00:08:28,931
same age, right?

109
00:08:28,931 --> 00:08:31,873
Born in the 70s or 80s or something, right?

110
00:08:32,474 --> 00:08:33,675
I didn't say that.

111
00:08:34,111 --> 00:08:35,632
that's OK, we can edit that out.

112
00:08:36,794 --> 00:08:45,902
But along those lines, like we've created so many small micro variations of the same
pieces, and we've gone through and we've gone, you know, this is now a general cereal that has

113
00:08:45,902 --> 00:08:46,703
no gluten in it.

114
00:08:46,703 --> 00:08:48,965
And this is a variation of blah, blah, blah, blah.

115
00:08:48,965 --> 00:08:51,547
Now there's 20 million things in the cereal aisle.

116
00:08:51,547 --> 00:08:56,089
and we think we're getting something different, but it's slight variations in flavor
profiles.

117
00:08:56,089 --> 00:08:58,791
And the macronutrients are essentially the same.

118
00:08:58,791 --> 00:09:02,622
So it doesn't matter which cereal you choose.

119
00:09:02,622 --> 00:09:06,594
You're probably putting the same garbage in your body over and over and over again.

120
00:09:06,594 --> 00:09:10,496
But it tastes good, so we keep going back to it over and over and over again.

121
00:09:10,836 --> 00:09:16,779
Are we running into the same problem with AI where we're going after

122
00:09:17,083 --> 00:09:26,169
what we think might taste, feel good for us, and getting these kinds of confirmation bias
put towards us, and essentially getting crap nutrition as a result of it.

123
00:09:26,169 --> 00:09:33,394
And we're going to grow up being the you are what you eat kind of thing from a digital and
information perspective.

124
00:09:33,995 --> 00:09:41,841
And going back to this idea of homogenization of information and sources and losing the
creativity pieces, how do we get out of the cycle?

125
00:09:41,841 --> 00:09:45,667
Like how do people stop eating the same garbage cereal?

126
00:09:45,667 --> 00:09:49,690
and switching to muesli and yogurt is not really going to be an option.

127
00:09:49,690 --> 00:09:51,741
Like, how do we break this pattern?

128
00:09:51,741 --> 00:09:56,174
And I don't know because I like cereal.

129
00:09:58,921 --> 00:10:00,822
Yeah, that's a really good question.

130
00:10:00,822 --> 00:10:06,005
It's almost like you have to force yourself out of the box that is being built around you.

131
00:10:06,005 --> 00:10:13,630
um If we're talking about the information theme, right?

132
00:10:14,751 --> 00:10:19,634
Go to the library and take a book off the shelf that you would have never read before.

133
00:10:20,215 --> 00:10:25,258
That's not within, you know, your, I guess,

134
00:10:25,744 --> 00:10:30,358
four or five topics of interest and try something else.

135
00:10:30,358 --> 00:10:34,773
em That would be my suggestion.

136
00:10:34,773 --> 00:10:36,324
How do you do that with AI?

137
00:10:37,514 --> 00:10:38,620
Yeah.

138
00:10:39,872 --> 00:10:41,532
It's funny, I literally did that the other day.

139
00:10:41,532 --> 00:10:47,352
I was at the library and there was a book, it was about AI, but it was from an angle like
I haven't read a book about this.

140
00:10:47,352 --> 00:10:49,912
I wanna see what this perspective is.

141
00:10:50,012 --> 00:10:52,512
But I also think like sometimes the simplest answers are the easy ones.

142
00:10:52,512 --> 00:10:54,492
And I think they are the correct ones.

143
00:10:54,492 --> 00:10:55,872
And you alluded to it earlier.

144
00:10:55,872 --> 00:11:04,692
It's like when we're just getting hit over the head with this constantly, go outside, just
turn the thing off and go take a walk in the woods for God's sake.

145
00:11:04,752 --> 00:11:07,592
Like we are just, I mean, it's just the nature of our society.

146
00:11:07,592 --> 00:11:09,826
Now we're constantly on this screen or the one

147
00:11:09,826 --> 00:11:10,916
we're looking at now.

148
00:11:10,944 --> 00:11:14,324
And we just gotta unplug, and I think that is how we beat that.

149
00:11:14,324 --> 00:11:16,504
I mean, I did this the other day.

150
00:11:16,504 --> 00:11:19,524
A friend of mine had a terrible thing happen to them.

151
00:11:19,524 --> 00:11:21,864
They posted about it on Facebook and I was trying to find the right words.

152
00:11:21,864 --> 00:11:23,804
And in those situations, there are never the right words.

153
00:11:23,804 --> 00:11:24,904
And I did by default.

154
00:11:24,904 --> 00:11:25,464
I went to AI

155
00:11:25,464 --> 00:11:28,844
and I was like, what is something appropriate to say here?

156
00:11:28,844 --> 00:11:30,124
And I felt horrible.

157
00:11:30,124 --> 00:11:34,484
I'm like, how am I so out of touch with my own human feeling that I can't find the words
for this?

158
00:11:34,484 --> 00:11:40,898
And ultimately, they came back with sort of the same bland thing I did because I
desperately wanted someone else to be able to have the

159
00:11:40,898 --> 00:11:43,941
perfect words that don't exist, no matter how brilliant you are.

160
00:11:43,941 --> 00:11:52,187
ah So it is interesting how much we're outsourcing a lot of our human interaction.

161
00:11:52,188 --> 00:11:59,584
But I think it does change when you are face to face and you have to on the fly, connect
with that feeling or the thought in your head and make the words fall out of your mouth in

162
00:11:59,584 --> 00:12:06,059
the right way, rather than perfectly crafting the right comeback to really nail that
sucker on Twitter or whatever.

163
00:12:06,675 --> 00:12:08,187
It's just a different interaction.

164
00:12:08,187 --> 00:12:16,019
And so the more that I think we unplug and reconnect in reality with nature and human
beings, I think that that's probably the best way we can combat this.

165
00:12:17,558 --> 00:12:22,761
Well, quickly from an agility perspective and a Scrum perspective.

166
00:12:22,761 --> 00:12:24,262
So, I mean, my background is software.

167
00:12:24,262 --> 00:12:28,804
um So yeah, so we can nerd out like real hard on this.

168
00:12:28,804 --> 00:12:32,906
And I was CPO for an observability stack company out of Sweden for a while.

169
00:12:34,055 --> 00:12:36,676
I work in AI kind of frequently.

170
00:12:37,136 --> 00:12:47,241
But looking at these pieces and taking these types of tools and going through and trying
to understand how you're going to build things and create break fix functions and kind of

171
00:12:47,241 --> 00:12:58,285
handling these pieces over and over and over again, I've applied the same kind of practice
to myself with my own processes and logic and trying to put those things into context.

172
00:12:58,706 --> 00:13:01,267
And it kind of works at an individual level.

173
00:13:01,267 --> 00:13:01,991
But from

174
00:13:01,991 --> 00:13:10,351
an overarching perspective as to how it is people go through and actually use these tools
to go through design and build things.

175
00:13:10,371 --> 00:13:23,331
Are you seeing AI creep more and more into this space where people are trying to go
through, you know, PRDs and MRDs and everything else using AI and watching it do things

176
00:13:23,331 --> 00:13:29,051
that they thought were really, really black magic that they could only do themselves and
put these pieces together.

177
00:13:29,131 --> 00:13:31,783
And now they're seeing AI do these things, you know,

178
00:13:31,783 --> 00:13:35,383
90 % of what they did before, how many people are just giving up?

179
00:13:35,383 --> 00:13:36,483
Or just like, fine, whatever.

180
00:13:36,483 --> 00:13:37,523
I don't care about the 10%.

181
00:13:37,523 --> 00:13:47,743
Because I can make this thing happen now with the PRD or an MRD or write these component
pieces in like 60 seconds with ChatGPT versus going through and taking a week or two

182
00:13:47,743 --> 00:13:50,003
weeks to go through and craft these pieces altogether.

183
00:13:50,503 --> 00:13:55,723
What's the fallout going to be for those of us that are writing software or consuming
software?

184
00:13:56,563 --> 00:14:01,543
When all these things start to play out in this kind of way, how does that actually

185
00:14:02,001 --> 00:14:03,061
Does it actually benefit us?

186
00:14:03,061 --> 00:14:04,211
Does it break us?

187
00:14:04,211 --> 00:14:09,954
A lot of companies right now are saying, hey, we're seeing these things come through, and
our development core is actually not getting better.

188
00:14:09,954 --> 00:14:12,794
They're getting slower because they're leaning on these tools.

189
00:14:13,395 --> 00:14:22,238
And I kind of wonder how much of that is because they're not using their own brains to do
this, and they're just going, I can kind of complete the same work in this period of time,

190
00:14:22,638 --> 00:14:24,028
doing that, and then doing other things.

191
00:14:24,028 --> 00:14:25,834
Yeah.

192
00:14:25,834 --> 00:14:30,274
Yeah, and that's a really good question.

193
00:14:32,074 --> 00:14:44,754
I'm not here to represent my company or talk about them, but just very high level, the
company that I work for is very much on the forefront of AI adoption.

194
00:14:45,134 --> 00:14:52,094
And they're using it everywhere, and everybody needs to use it, and every place they
possibly can.

195
00:14:53,554 --> 00:14:57,759
The people that I'm around and interacting with are really excited about it.

196
00:14:57,941 --> 00:15:01,546
But I'm sure that there are companies where that's not the case.

197
00:15:01,546 --> 00:15:05,812
um Maybe I'm just incredibly fortunate.

198
00:15:06,153 --> 00:15:09,898
But I have stopped to think.

199
00:15:12,036 --> 00:15:23,212
you know, understanding the software development life cycle, if 90 % of the work is
replaced by AI and whether it's, you know, it takes the same amount of time or doesn't

200
00:15:23,212 --> 00:15:26,134
take the same amount of time, I don't know, I guess the jury's out on that.

201
00:15:26,134 --> 00:15:31,637
But let's say it does cut out 80 % of the work that people do.

202
00:15:32,558 --> 00:15:34,578
What else are people gonna do?

203
00:15:34,886 --> 00:15:40,830
And I don't know that anybody has that answer, but I was watching something from the...

204
00:15:40,830 --> 00:15:48,250
This is a video from the World Economic Forum and they said software engineers are going
to be completely replaced by AI.

205
00:15:49,052 --> 00:15:55,274
And like if you are in that, if that's your job or you're going, like you better pivot
quickly.

206
00:15:55,274 --> 00:15:59,195
And maybe it's not tomorrow, but it's going to be soon.

207
00:15:59,570 --> 00:16:09,398
And I'm seeing that, you know, if you, if everybody doesn't have to learn to write code,
like what is that going to make our software products?

208
00:16:10,484 --> 00:16:19,801
But I've also seen like, you know, vibe coding and such where, you know, the people that
have the technical experience can actually take that tool to the next level.

209
00:16:19,862 --> 00:16:24,666
Whereas somebody that doesn't have the technical skills, they can't build as much.

210
00:16:24,666 --> 00:16:36,536
And I don't know if that's just a matter of where the tools are currently, or like senior
developers or tech leads will always be a step ahead of the people that just don't have

211
00:16:36,536 --> 00:16:38,057
coding experience.

212
00:16:38,461 --> 00:16:41,846
and they'll come out at the top anyway.

213
00:16:42,148 --> 00:16:47,856
But all these software engineers that sit around and write code, what are they going to
do?

214
00:16:49,647 --> 00:16:58,551
It's interesting, Bill Gates said recently that the three professions he thinks are
unlikely to be replaced by AI are biologists, energy experts and programmers because he

215
00:16:58,551 --> 00:17:05,794
believes that they require human creativity and critical thinking and the ability to make
intuitive leaps that AI systems currently don't.

216
00:17:05,955 --> 00:17:12,268
I mean, I'm not going to compete with, you know, brain power with Bill Gates here, but I
would think that a lot of that can be replaced.

217
00:17:12,268 --> 00:17:14,219
I mean, we see so much.

218
00:17:14,219 --> 00:17:16,990
I mean, it literally is artificial intelligence.

219
00:17:16,990 --> 00:17:18,751
It's a simulation of how our brains work.

220
00:17:18,751 --> 00:17:19,701
And if we

221
00:17:19,701 --> 00:17:28,588
code it to think like our brains do and when we get to a point where it's literally
merging, you know, the meat space that Jason likes to talk about and the technology.

222
00:17:28,781 --> 00:17:32,571
I think every job is suddenly replaceable.

223
00:17:32,667 --> 00:17:41,045
Other than and I even question this is as I say it out loud, but like I heard Gary V
talking about, you know, in 20 years jobs are going to be like going for a walk with a

224
00:17:41,045 --> 00:17:44,939
human because they just need to, you know, connect with the human being.

225
00:17:44,939 --> 00:17:52,586
Like I think that that will be sort of the space that can't really be touched unless we
get really good at making like incredibly lifelike robots.

226
00:17:52,586 --> 00:17:57,911
But even then, like I would know, I would think that I'm talking to a robot and I would
miss that human connection.

227
00:17:58,100 --> 00:18:03,494
But again, I'm 85 years old, so maybe somebody who's being born today wouldn't even know
the difference, right?

228
00:18:03,494 --> 00:18:13,380
So it's so interesting to consider that what if we did end up in a situation where all of
the actual work is being done by machines and we're left to just still, I mean, we

229
00:18:13,380 --> 00:18:20,595
struggle with this now to discover purpose, but to then really have to discover purpose
when all of the work is being done for us.

230
00:18:22,131 --> 00:18:24,231
you

231
00:18:24,435 --> 00:18:33,568
But is it going to be like that because I mean, I look at like a large corporation, for
example, like they have 100,000 workers.

232
00:18:33,848 --> 00:18:35,589
I'm just throwing out a number.

233
00:18:36,229 --> 00:18:39,030
Are they really going to replace all of them?

234
00:18:39,030 --> 00:18:42,291
And there's going to be like one person operating the machine.

235
00:18:42,871 --> 00:18:44,322
Is that what we're talking about?

236
00:18:44,322 --> 00:18:45,452
Or is it?

237
00:18:46,245 --> 00:18:48,193
I am going to be.

238
00:18:51,081 --> 00:18:53,241
It's we don't know, right?

239
00:18:54,521 --> 00:18:58,441
And then the 99,999, like they're all unemployed?

240
00:18:59,061 --> 00:19:04,078
Or do we need all these people to operate all these agents or whomever?

241
00:19:04,078 --> 00:19:14,906
Well, I think Microsoft, Amazon and Google are giving us a really good early preview of
what's happening there because they are eliminating huge portions of their workforce.

242
00:19:14,906 --> 00:19:16,107
Anywhere from five to 10%.

243
00:19:16,107 --> 00:19:20,410
And they have been doing it really over the last 18 months.

244
00:19:20,410 --> 00:19:23,932
And most of the people that they're getting rid of are middle managers.

245
00:19:23,932 --> 00:19:28,415
people that kind of manage workflow and help people to...

246
00:19:28,711 --> 00:19:31,051
operate and get done what they need to get done.

247
00:19:31,191 --> 00:19:41,271
And the people that are sticking around are the really good individual
contributors that they see being able to use AI to augment their skill sets and create

248
00:19:41,271 --> 00:19:43,891
more things faster and better.

249
00:19:44,891 --> 00:19:51,251
Whether or not they actually do that, that's a protracted period of time they have to look
at before they can actually get a really good result.

250
00:19:51,291 --> 00:19:58,465
So it's hard to understand, you know, what the value is of the folks actually doing these
pieces until you've got a couple of years.

251
00:19:58,465 --> 00:20:04,339
of life cycle iteration to figure out if what they're producing is good or garbage.

252
00:20:04,419 --> 00:20:18,659
So I've worked at AWS, I've worked at Microsoft, I've done all kinds of work
at Google for different companies, and they all have very, very smart people, but it's all

253
00:20:18,659 --> 00:20:22,931
about how you apply your intelligence and your motivation to push those things through.

254
00:20:23,012 --> 00:20:28,505
And those big companies that, you know, really run in these big cloud environments that
have huge numbers of GPUs.

255
00:20:28,807 --> 00:20:40,627
LPUs in place, running these big LLMs, they're profit-motivated and they're trying
to make money because they have a fiduciary responsibility to do that for their

256
00:20:40,627 --> 00:20:41,487
shareholders.

257
00:20:41,687 --> 00:20:51,127
So anywhere they think they can actually save money and increase revenue, um, or at least
increase margin, they're going to do that because that's what their motivation is.

258
00:20:51,367 --> 00:20:53,647
Do we have the wrong motivation?

259
00:20:54,267 --> 00:20:58,951
Are our motivations as a society to go after and chase money completely antithetical

260
00:20:58,951 --> 00:21:10,259
to the ability to have a world that's driven and dominated by AI because we know that
people are going to use this tool to do those things.

261
00:21:11,867 --> 00:21:15,209
And I think at some level, we all kind of want this.

262
00:21:15,209 --> 00:21:29,059
But as an employee who makes a fair amount of money, now suddenly not having a job because
I don't fall into the AI scope of model, what's that long-term effect look like?

263
00:21:29,059 --> 00:21:41,317
then going back to the idea of doing spot checking, iteration testing, if you lose all
those voices and all those independent thinkers, how does

264
00:21:41,705 --> 00:21:42,573
How does that help?

265
00:21:42,573 --> 00:21:44,563
Does that actually make products worse?

266
00:21:44,563 --> 00:21:45,514
Right.

267
00:21:45,514 --> 00:21:47,865
Well, and that's an unknown, right?

268
00:21:47,865 --> 00:21:49,596
That's a big question mark.

269
00:21:49,596 --> 00:22:00,283
But it's making me think of, uh you know, the first industrial revolution, maybe like, I
know it kind of started in England and it was there for about 100 years before it came

270
00:22:00,283 --> 00:22:01,024
here.

271
00:22:01,024 --> 00:22:03,815
But thinking about like Henry Ford.

272
00:22:04,401 --> 00:22:14,665
So he didn't invent the assembly line, he didn't invent the eight hour workday, but he
standardized this and he revolutionized how things were manufactured in this country.

273
00:22:15,365 --> 00:22:23,428
What he was trying to do was make the car affordable for the average person so people
could buy it.

274
00:22:23,748 --> 00:22:29,090
And he was paying his workers enough so they could be the customers, because he knew that.

275
00:22:29,090 --> 00:22:33,632
He said, I have all of these employees, these are going to be my customers.

276
00:22:33,670 --> 00:22:38,105
And then of course they go out into their communities and more people buy cars.

277
00:22:38,266 --> 00:22:43,712
So it seems kind of like the opposite is happening, right?

278
00:22:43,712 --> 00:22:49,840
They want to get rid of all of these employees, whether they get other jobs or everybody's
just unemployed.

279
00:22:49,840 --> 00:22:53,504
Who's going to buy all the products?

280
00:22:55,067 --> 00:22:56,713
Yeah.

281
00:22:56,713 --> 00:22:59,316
that doesn't make sense as a business person.

282
00:22:59,316 --> 00:23:03,231
And I don't know, are they thinking long term or are they thinking short term?

283
00:23:03,231 --> 00:23:06,364
Like we've got to increase our stock this quarter.

284
00:23:07,424 --> 00:23:10,025
Well, I think they think short term.

285
00:23:10,025 --> 00:23:14,457
I mean, they think quarterly revenue numbers and how am I going to survive from
quarter to quarter?

286
00:23:14,457 --> 00:23:22,332
And you look at these large companies and organizations, I mean, yes, they've got long
term plans and long term methodologies, but I mean, their line of horizon is typically

287
00:23:22,332 --> 00:23:24,173
36 months or less.

288
00:23:24,173 --> 00:23:25,503
And I mean, we're actually taught that.

289
00:23:25,503 --> 00:23:28,805
So if you look at most companies out there that have a vision,

290
00:23:28,987 --> 00:23:37,213
the vision when it changes, changes about every three years because they've gone through
that cycle and it's like, okay, well now we've kind of hit this vision or the horizons

291
00:23:37,213 --> 00:23:38,333
changed on it.

292
00:23:38,373 --> 00:23:40,895
And now we have to move things in these different directions.

293
00:23:41,335 --> 00:23:51,582
If, if there's nobody to buy this cool stuff that everyone's making, like, well then
you're making something for nothing, but we're not exactly known as a species for making

294
00:23:51,582 --> 00:23:52,823
good long-term decisions.

295
00:23:52,823 --> 00:23:56,106
Like we shoot ourselves in the foot consistently.

296
00:23:56,106 --> 00:23:58,757
I mean, in this country alone,

297
00:23:59,205 --> 00:24:05,842
We do things like, hey, let's make healthcare for profit.

298
00:24:05,842 --> 00:24:10,707
People are gonna... the number one cause of bankruptcy in the US is healthcare.

299
00:24:10,707 --> 00:24:15,551
Like, expenses have gone out of control and they want to declare bankruptcy because they can't
afford to do these things.

300
00:24:17,209 --> 00:24:27,384
If we're putting the collective value of human intelligence into the hands of these things
and saying, go do great stuff.

301
00:24:28,025 --> 00:24:34,388
It's probably not going to do that because the people actually giving it motivation and
telling it to go and do certain things.

302
00:24:34,949 --> 00:24:43,193
Aren't motivated by the long-term benefit and value to human society and empathy.

303
00:24:43,193 --> 00:24:45,615
They're, they're doing it for money.

304
00:24:45,615 --> 00:24:46,495
So.

305
00:24:50,567 --> 00:24:53,167
How do- I get nothing.

306
00:24:53,167 --> 00:24:56,376
I just- I mean, I just-

307
00:24:56,376 --> 00:24:59,041
so, I mean, I'll try to sort of bring this full circle.

308
00:24:59,041 --> 00:25:08,029
We started with the idea of overwhelm and all, and we see just from this conversation, like,
we're overwhelmed with the unknown of the future that we're walking into.

309
00:25:08,069 --> 00:25:15,626
Just on a really small scale, I'm overwhelmed with the posts of the here's the seven
prompts you need to be using right now, or else you're a thousand years behind, you're

310
00:25:15,626 --> 00:25:17,718
toast, and your job's out the door.

311
00:25:18,278 --> 00:25:20,510
It's just like, it starts to eat at you.

312
00:25:20,510 --> 00:25:24,193
Like, am I so dumb that I haven't figured out how to be an overnight millionaire because
this amazing new tool exists?

313
00:25:24,193 --> 00:25:26,195
Like, there's got to be a lot of people feeling that way.

314
00:25:26,195 --> 00:25:27,526
yeah.

315
00:25:28,768 --> 00:25:31,870
Yeah, they all are.

316
00:25:31,870 --> 00:25:32,941
Yeah, they all are.

317
00:25:32,941 --> 00:25:34,272
That's the thing.

318
00:25:35,494 --> 00:25:36,266
Yes.

319
00:25:36,266 --> 00:25:44,490
on, like, you have to be very mindful and realize, what is that person trying to sell me? So
sorry to interrupt, but keep going.

320
00:25:44,490 --> 00:25:45,631
And I think that's it.

321
00:25:45,631 --> 00:25:47,051
It's just like trying to.

322
00:25:47,051 --> 00:25:49,863
And this is where like we have to hang on to that critical thinking.

323
00:25:49,863 --> 00:25:55,055
And we have to be able to look at the information we're given and rather than just assume,
they must know more than me.

324
00:25:55,055 --> 00:25:57,616
I better follow whatever this thing says.

325
00:25:57,616 --> 00:26:05,200
Start wondering like, OK, is there a link to the course that they're selling for a
thousand dollars at the bottom of this or like where is this going?

326
00:26:05,200 --> 00:26:13,244
So I guess I'm just kind of wondering, like, are there some simple things that you
recommend to people that are like, here's what to look for when you're getting this

327
00:26:13,244 --> 00:26:13,946
advice or

328
00:26:13,946 --> 00:26:20,380
hearing about this new thing? We've talked about just unplugging, going away, like, take a
walk, you know, go talk to an actual human being.

329
00:26:20,380 --> 00:26:28,925
What are some other like sort of takeaways that people can take from this and just go
like, I'm overwhelmed, I'm scared to, how do I feel better today about this?

330
00:26:29,840 --> 00:26:36,562
Yeah, I've thought about this a lot and this actually, my answer has nothing to do with
AI.

331
00:26:36,562 --> 00:26:39,463
It has nothing to do with AI; it probably predates AI.

332
00:26:39,463 --> 00:26:43,925
um This is just general career advice.

333
00:26:43,925 --> 00:26:50,227
um Find the three things in life that are important to you.

334
00:26:50,227 --> 00:26:51,788
What are your goals?

335
00:26:55,046 --> 00:27:03,401
really commit to them and then see what is coming into your space and does it align with
your goals.

336
00:27:03,501 --> 00:27:05,902
And if it does not, push it away.

337
00:27:06,543 --> 00:27:08,984
If it does, go for it.

338
00:27:09,805 --> 00:27:14,047
So whenever you decide, should I take this new job?

339
00:27:14,348 --> 00:27:17,009
I have an opportunity to apply for this, should I take it?

340
00:27:17,009 --> 00:27:20,311
Is it going in the direction of my goal in life or is it not?

341
00:27:20,311 --> 00:27:21,872
Is it taking me away?

342
00:27:23,185 --> 00:27:25,951
That's why Jason only watches cat videos on TikTok.

343
00:27:26,876 --> 00:27:29,958
No, no, I have no TikTok account.

344
00:27:30,099 --> 00:27:31,839
I use Reels.

345
00:27:32,800 --> 00:27:35,522
I intentionally have trained my feed to do this.

346
00:27:35,522 --> 00:27:41,286
So I used to work in ad tech and I used to watch demographic profile targeting functions
work in this direction.

347
00:27:41,286 --> 00:27:43,097
So that Facebook algorithm.

348
00:27:43,397 --> 00:27:45,549
Definitely knows who I am very, very well.

349
00:27:45,549 --> 00:27:54,836
And every now and again, in between my cat feeds, like, it'll stuff in
something either political or old-male-related, like apparently I need more testosterone

350
00:27:54,836 --> 00:27:56,457
all the time and my hair is falling out.

351
00:27:56,457 --> 00:27:59,240
ah But it's.

352
00:27:59,240 --> 00:28:01,241
seven prompts that you need to fix that problem, Jason.

353
00:28:01,241 --> 00:28:03,409
Just click on my course link and you're all set.

354
00:28:03,409 --> 00:28:05,623
Well, and the prompts are pretty simple.

355
00:28:05,623 --> 00:28:07,205
Only watch cat videos.

356
00:28:08,028 --> 00:28:09,120
That's the solution.

357
00:28:09,120 --> 00:28:14,063
Only watch cat videos and choose two cereals out of the cereal aisle and you're probably
okay.

358
00:28:14,063 --> 00:28:15,214
If they would just bring back Mr.

359
00:28:15,214 --> 00:28:19,878
T cereal, as we've discussed in depth on this show, then our lives would all be so much
better.

360
00:28:20,719 --> 00:28:25,683
Patty, you do some work with folks to help them with their AI overwhelm and issues.

361
00:28:25,683 --> 00:28:28,185
Where can folks learn more about you and what you offer?

362
00:28:28,232 --> 00:28:30,413
Yeah, so I'm on LinkedIn.

363
00:28:31,034 --> 00:28:32,154
I try to post content.

364
00:28:32,154 --> 00:28:34,277
You know, we talked about putting boundaries down.

365
00:28:34,277 --> 00:28:45,126
um I used to post content every single day, but I've decided like that's not necessary
anymore because you get sucked into that machine, into that algorithm, and you have to put

366
00:28:45,126 --> 00:28:46,368
boundaries around.

367
00:28:46,368 --> 00:28:53,192
But I do try to put meaningful content out there, talking about topics just like this so
people could follow me on LinkedIn.

368
00:28:53,192 --> 00:28:56,325
I have a meetup, which I'm always promoting through LinkedIn.

369
00:28:56,325 --> 00:28:56,898
Thank

370
00:28:56,898 --> 00:29:05,033
Meat Machine, where I bring intelligent folks like yourselves on and I ask them questions
just kind of like we're talking about here.

371
00:29:05,294 --> 00:29:07,685
How is AI affecting their space?

372
00:29:07,685 --> 00:29:09,116
How is it affecting their life?

373
00:29:09,116 --> 00:29:12,438
And what are we doing to get through this together?

374
00:29:12,638 --> 00:29:16,000
Because this is a journey, this is a human journey.

375
00:29:16,000 --> 00:29:17,541
We gotta figure this out together.

376
00:29:17,541 --> 00:29:22,926
uh But I would say that, that's a good way to find me.

377
00:29:22,970 --> 00:29:25,736
Awesome, we'll make sure there's a link for that in the show notes for this episode.

378
00:29:25,736 --> 00:29:26,327
You're awesome.

379
00:29:26,327 --> 00:29:27,078
This was a lot of fun.

380
00:29:27,078 --> 00:29:27,990
Thank you so much for doing this.

381
00:29:27,990 --> 00:29:29,573
We'd love to do this again with you sometime.

382
00:29:29,573 --> 00:29:31,843
Yeah, thanks again for your time.

383
00:29:31,843 --> 00:29:32,734
Thanks,

384
00:29:33,017 --> 00:29:35,617
Our thanks again to Patty Aluskewicz for joining us on the show today.

385
00:29:35,617 --> 00:29:36,828
Really fun conversation there.

386
00:29:36,828 --> 00:29:39,729
If you're feeling overwhelmed like we are, she's a great resource.

387
00:29:39,729 --> 00:29:40,789
You can follow her on LinkedIn.

388
00:29:40,789 --> 00:29:43,930
We've got the links to that in the show notes for this episode.

389
00:29:43,930 --> 00:29:46,010
And you can find that at brobots.me.

390
00:29:46,010 --> 00:29:51,902
If you have found any value in this episode and enjoyed this conversation, know of
somebody else who might feel the same way.

391
00:29:55,363 --> 00:29:53,569
Feel free to share the link to this episode with them.

392
00:29:53,569 --> 00:29:55,470
You can find that at brobots.me.

393
00:29:55,604 --> 00:29:58,061
That's also where you'll find a new episode from us next week.

394
00:29:58,061 --> 00:29:59,945
We'll see you again at brobots.me.


Patty Aluskewicz

AI Educator and Trainer

Patty Aluskewicz, founder of Agile Mindset Consulting, is an AI Champion, Organizational Coach, Consultant, and Professional Educator.

With over 15 years of experience, she helps businesses and their teams build essential skills to thrive in an AI-driven world, successfully integrating them into daily operations.

Patty is dedicated to empowering professionals to adapt to the changing workforce landscape, with a focus on equipping individuals with the knowledge and tools needed to excel in the future of work.