How to Raise Resilient Kids in an AI-Dominated World
Are your kids making deeper, more meaningful connections to AI chatbots than they are to their friends, family, and even you?
As more kids choose artificial validation over real human connection, it's creating a generation that can't handle authentic relationships or genuine feedback. While schools push AI integration, we're accidentally teaching kids that algorithms understand them better than parents, teachers, or friends do.
Discover how to help your kids use AI as a tool without losing their humanity. Learn the warning signs of AI dependency, understand why current education systems are failing our kids emotionally, and get practical strategies for building real-world resilience in an artificial world.
Listen now to learn how to raise emotionally intelligent humans in an AI-dominated world.
TOPICS DISCUSSED
- The AI Therapy Epidemic - Why kids are turning to chatbots for emotional support and validation instead of trusted adults
- Educational Over-Optimization - How schools have created mouse traps that kill curiosity and natural learning instincts
- The Agency Crisis - Why kids feel powerless in their own education and how this creates learned helplessness
- Loneliness in Plain Sight - The isolation epidemic hitting young men especially hard in our hyper-connected world
- Friction vs. Frictionless - Why we need healthy challenges and why the "easy button" mentality is dangerous
- The Validation Trap - How AI constantly affirms kids without providing the growth that comes from real feedback
- Social Media's Role - The connection between social platforms and the shift toward artificial relationships
- Teacher Agency - How educators can break free from rigid curriculums to create meaningful learning experiences
- The Human Connection Currency - Why authentic relationships will be the most valuable skill in an AI world
- Practical Solutions - Real strategies for balancing AI tools with human connection
----
GUEST WEBSITE:
https://www.theimpactleague.org/
1
00:00:00,129 --> 00:00:07,462
Are your kids making deeper, more meaningful connections to AI chatbots than they are to
their friends, family, and even you?
2
00:00:07,543 --> 00:00:15,901
As more kids choose artificial validation over real human connection, it's creating a
generation that can't handle authentic relationships or genuine feedback.
3
00:00:15,901 --> 00:00:22,661
While schools push AI integration, we're accidentally teaching kids that algorithms
understand them better than their parents, teachers, or friends do.
4
00:00:22,699 --> 00:00:28,371
So today we're gonna talk about how you can help your kids use AI as a tool without losing
their humanity.
5
00:00:29,552 --> 00:00:36,317
We'll talk about the warning signs of AI dependency, understand why current education
systems are failing our kids emotionally, and get practical strategies for building real
6
00:00:36,317 --> 00:00:39,168
world resilience in an artificial world.
7
00:00:39,201 --> 00:00:44,671
So hang out as we try to discover how to raise emotionally intelligent humans in an AI
dominated world.
8
00:00:50,051 --> 00:00:59,840
Hi there, this is BroBots, the podcast formerly known as The Fit Mess, and along with Jason,
I'm Jeremy, and this is a podcast for men who want to use AI safely to manage their mental
9
00:00:59,840 --> 00:01:00,660
health.
10
00:01:01,126 --> 00:01:09,663
And our conversation today is really a continuation of one that we've had on this show for
some time now, as we've been talking more and more about how so many people are using AI
11
00:01:09,663 --> 00:01:16,239
to manage their mental health, whether they don't have access to human mental health
through therapy or counseling, things like that.
12
00:01:16,239 --> 00:01:19,221
People are turning to AI because they are finding a lot of validation.
13
00:01:19,221 --> 00:01:21,423
They're finding a lot of advice, good and bad.
14
00:01:21,423 --> 00:01:26,807
And it's creating a lot of opportunity for people to get the help that they need.
15
00:01:26,807 --> 00:01:30,460
But it's also creating a lot of danger for people that don't really know how to navigate
it well.
16
00:01:30,460 --> 00:01:32,341
And so as I mentioned, this is sort of a continuation.
17
00:01:32,341 --> 00:01:35,223
I posted something on LinkedIn recently.
18
00:01:35,464 --> 00:01:36,262
I'll just read it to you.
19
00:01:36,262 --> 00:01:39,046
I said, the danger of AI therapy isn't that it doesn't work.
20
00:01:39,046 --> 00:01:40,928
The danger is that it works too well.
21
00:01:40,928 --> 00:01:44,127
Perfect responses, instant availability, zero judgment.
22
00:01:44,127 --> 00:01:47,830
Relationships can't compete with that level of artificial perfection.
23
00:01:47,830 --> 00:01:51,271
We're accidentally making human connection feel inadequate.
24
00:01:51,271 --> 00:01:54,486
This one got an interesting response from a guy named Scott Merkel.
25
00:01:54,486 --> 00:01:55,434
He responded saying,
26
00:01:55,604 --> 00:01:56,414
Jeremy, you nailed it.
27
00:01:56,414 --> 00:02:04,688
While everyone is arguing about cognitive offloading and brain rot in education, which is
totally off base in my opinion, this is the real issue, outsourcing all types of
28
00:02:04,688 --> 00:02:07,829
relationships to AI: girlfriends, therapy, mentors.
29
00:02:08,770 --> 00:02:11,350
In a therapeutic context, one of the most nuanced things a mental health professional can
do
30
00:02:11,384 --> 00:02:16,126
is use a belief in the other person's potential to challenge them and push them beyond
their comfort zone.
31
00:02:16,126 --> 00:02:23,988
You would need an MA in counseling psychology to even know how to prompt for that level of
nuance. Young people and disenfranchised folks want validation.
32
00:02:23,988 --> 00:02:26,633
Our systems and institutions haven't offered them that,
33
00:02:26,633 --> 00:02:30,056
and/or they became jaded about their ability to navigate these systems.
34
00:02:30,057 --> 00:02:35,423
Our education system is a great example of failing so many kids and not validating their
strengths.
35
00:02:35,423 --> 00:02:38,726
The trust has eroded and they shift to AI.
36
00:02:38,726 --> 00:02:41,669
This is the problem we should all be paying attention to.
37
00:02:41,914 --> 00:02:45,494
Naturally, I looked up Scott's background to try to understand who he is and a little bit
of his context.
38
00:02:45,494 --> 00:02:47,336
And what I found was very compelling.
39
00:02:47,335 --> 00:02:55,075
He's one of the people with boots on the ground to try to help solve this problem as we
explore more about how to use AI in our real lives, but also try to hang on to our
40
00:02:55,075 --> 00:02:58,586
humanity and what it is that makes us different from machines.
41
00:02:58,803 --> 00:03:06,490
So we invited Scott to join us here on the show and have a conversation about what he's
seeing working directly with young people and where the school systems are letting them
42
00:03:06,490 --> 00:03:10,653
down and where AI is starting to fill in the gaps for better and for worse.
43
00:03:10,653 --> 00:03:13,353
So we'll talk more with Scott about that right after this.
44
00:03:14,074 --> 00:03:15,215
Our guest today is Scott Merkel.
45
00:03:15,215 --> 00:03:17,556
He's a school counselor and founder of the Impact League.
46
00:03:17,556 --> 00:03:22,918
It's a program that helps young people take awesome ideas and make them happen in the real
world to help others.
47
00:03:22,918 --> 00:03:27,590
He's also interested in how things like AI are changing school systems and how we think
about ourselves.
48
00:03:27,590 --> 00:03:34,574
And Scott, you and I connected on LinkedIn a few days ago, really in response to something
I posted, which was in response to something we talked about on our show.
49
00:03:34,574 --> 00:03:38,785
And it's how we're turning so much of our therapy over to AI.
50
00:03:38,826 --> 00:03:43,998
Jason, you were sharing recently that mental health is like the number one thing people
are using AI
51
00:03:43,998 --> 00:03:45,209
for these days.
52
00:03:45,509 --> 00:03:53,496
And you had some concerns about how people are outsourcing more and more of not only
therapy, but our actual human relationships.
53
00:03:53,496 --> 00:03:55,497
What are the concerns that you have about this?
54
00:03:57,055 --> 00:04:08,088
Yeah, I mean, I think a lot of the world that I've been occupying for the last couple
years in education, a lot of the conversation has been about the actual production of work
55
00:04:08,168 --> 00:04:10,889
and cognitive offloading.
56
00:04:11,129 --> 00:04:16,405
There was a recent study done by some folks at MIT that gained a lot of traction.
57
00:04:16,405 --> 00:04:18,131
I'm sure you've seen it.
58
00:04:18,291 --> 00:04:25,553
And there's just so much conversation about the function of school and learning and the
cognitive offloading of that.
59
00:04:25,749 --> 00:04:26,970
For me, that's a non-issue.
60
00:04:26,970 --> 00:04:37,894
I think there's plenty of evidence out there that suggests that humans have some
intrinsic motivations, some natural curiosity, some natural inclination to learn.
61
00:04:37,894 --> 00:04:48,708
I don't think that stuff goes away with AI, but the bigger concern I think, and we've seen
this play out for young people, especially over the last decade, is just the loneliness
62
00:04:48,708 --> 00:04:55,253
epidemic, continued isolation, continued push away from
63
00:04:55,807 --> 00:05:10,231
trusted connections and relationships into more siloed spaces where they are ultimately
interacting with algorithms that change their thinking and really take away their
64
00:05:10,231 --> 00:05:11,970
humanity in a lot of ways.
65
00:05:11,970 --> 00:05:18,423
And so I think it's not even a conversation in education right now, which I think it needs
to be.
66
00:05:18,423 --> 00:05:23,784
And I think that's part of the AI framework and policy stuff that gets talked about in
schools.
67
00:05:23,955 --> 00:05:34,705
In a lot of places it's still not codified and doesn't exist in a way that addresses, like, how do
we teach young people that there are some great use cases for AI and this is what that
68
00:05:34,705 --> 00:05:40,280
looks like and how do we develop some fluency around that and we don't want to outsource
our humanity.
69
00:05:40,280 --> 00:05:46,175
Like, that's the quickest way for us to, you know, a few generations from now to not be
around anymore.
71
00:05:50,030 --> 00:05:57,550
Yeah, I mean, it's kind of crazy that you mention the idea that kids are feeling
isolated and alone when we're supposed to be more connected than ever.
72
00:05:57,770 --> 00:06:10,890
But do you feel that the personification of the individual in social media comes down to
kind of our own mental cognitive map trying to go through and remap the value of self back
73
00:06:10,890 --> 00:06:15,854
to whatever it is this presentation layer is that we're calling social media,
74
00:06:15,854 --> 00:06:23,314
and trying to create this interaction where human beings just feel disconnected from
themselves because they look at this thing that's supposed to be a reflection of who they are.
75
00:06:23,474 --> 00:06:25,554
And they kind of know it's not.
76
00:06:25,554 --> 00:06:34,694
And they're trying to live up to their own ideals that they're putting on themselves that
are a mirrored reflection of what their social group says they should be.
77
00:06:35,594 --> 00:06:44,834
But I guess my question to you is, since you're working in this space, what's the judo to
get out of it for kids?
78
00:06:44,914 --> 00:06:45,670
Because...
79
00:06:45,794 --> 00:06:47,095
I mean, I've got two daughters.
80
00:06:47,095 --> 00:06:48,515
They're 20 and 24.
81
00:06:48,515 --> 00:06:51,157
They came up right around the time of social media.
82
00:06:51,217 --> 00:06:57,440
And not like Jeremy, because his kids are quite a bit younger and they're more steeped
in it than my kids are.
83
00:06:57,541 --> 00:07:08,086
And I don't see a simple way to fix this problem from a continued use perspective.
84
00:07:08,206 --> 00:07:12,389
I know China's gone through and they basically limit kids to like two hours a day of social
media.
86
00:07:15,608 --> 00:07:24,233
They've had some success with that, but man, I don't know, like two hours of doomscrolling
is probably enough to taint things pretty well.
87
00:07:24,233 --> 00:07:30,506
So I don't know what the right amount is, what the right portion is, and I guess I'm hoping
you have some insight.
88
00:07:32,129 --> 00:07:45,416
Yeah, I mean, I think you're a technologist, Jason, so you'd know more about the ins and
outs of the manipulation and the traps that are created psychologically for even somebody
89
00:07:45,416 --> 00:07:53,931
like myself, who understands psychologically what's happening: physiologically, our
nervous systems are not in a position to be able to navigate this world.
90
00:07:53,931 --> 00:07:56,292
And of course, kids don't know any of that.
91
00:07:56,292 --> 00:08:00,484
They're just trying to make sense of their identity, especially as they come into
adolescence.
92
00:08:00,484 --> 00:08:01,813
But I think
93
00:08:01,813 --> 00:08:16,402
where I'm trying to work in particular is, social media is obviously a huge problem, and AI
will be a continuation of that, but I think one of the things kids have
94
00:08:16,402 --> 00:08:24,957
struggled with in the last decade plus is this idea that they don't see themselves in the
system.
95
00:08:24,957 --> 00:08:26,098
They don't have any voice.
96
00:08:26,098 --> 00:08:27,198
They don't have any agency.
97
00:08:27,198 --> 00:08:28,679
They don't have any
98
00:08:29,075 --> 00:08:38,562
sense of co-creation or shared power, and in that they lose some validation for who they
are and what their identity is.
99
00:08:38,562 --> 00:08:45,086
There's a lot of, of course, over optimization that's happened in education, like
optimizing for all these things.
100
00:08:45,086 --> 00:08:50,089
And so in a lot of ways we're treating kids as less human to optimize for these things.
101
00:08:50,490 --> 00:08:53,782
We're not validating them and they don't have any agency in the process.
102
00:08:53,782 --> 00:08:56,354
And so it feels very transactional.
103
00:08:56,734 --> 00:08:58,095
And so they
104
00:08:58,611 --> 00:09:01,733
look for a place where they can feel some validation and comfort.
105
00:09:01,733 --> 00:09:09,438
And I think the downstream implications of that are we've just created a lot of learned
helplessness, we've created some fragility.
106
00:09:09,438 --> 00:09:17,104
And these are things that I think we have to look back at ourselves as adults in the
system and go, like, what are we doing?
107
00:09:17,104 --> 00:09:26,330
We have created this mousetrap that really stunts curiosity, stunts the innate desire for
individuals to learn.
109
00:09:28,073 --> 00:09:29,864
It's essentially bean counting.
110
00:09:29,864 --> 00:09:43,401
So going back to the MIT study, the idea that AI has ruined education is a farce to me
because, Jason, to your point, it's a mirror back on the system.
111
00:09:43,401 --> 00:09:48,574
If the system was operating effectively, it wouldn't be...
112
00:09:50,175 --> 00:09:55,028
Intrinsic motivation is uh predicated on self-determination theory.
113
00:09:55,028 --> 00:09:57,749
It's autonomy, it's competence, it's relatedness.
114
00:09:57,761 --> 00:10:04,216
If those things are happening in a meaningful way in the environment, like there's not
this desire to take the path of least resistance.
115
00:10:04,216 --> 00:10:06,111
So that path has always been there.
116
00:10:06,111 --> 00:10:10,670
I used to have, you know, a friend named Alex Engel or something.
117
00:10:10,670 --> 00:10:11,861
He was my AI, right?
118
00:10:11,861 --> 00:10:13,743
And he had all the answers to the homework.
119
00:10:13,743 --> 00:10:22,349
ah So the path of least resistance looks different now, but it doesn't change the
mechanics of like what's happening in the system itself.
120
00:10:22,349 --> 00:10:25,571
So to me, that is...
121
00:10:26,645 --> 00:10:38,055
directly where we need to be working as adults, to go, like: we need to give up the tenets
that we created around what's actually important and what work we're doing, and we need to bring
122
00:10:38,055 --> 00:10:43,099
kids into the process and give them some agency and treat them like human beings.
123
00:10:43,099 --> 00:10:45,941
And I think that really goes a long way.
124
00:10:45,941 --> 00:10:47,222
I'm oversimplifying it.
125
00:10:47,222 --> 00:10:52,263
It might seem a little esoteric, but I've done some work in that space and I've just seen
126
00:10:52,263 --> 00:10:57,779
the results from kids with no agency: just, okay, tell me what to do.
127
00:10:57,779 --> 00:11:08,900
They're just so trained into that mode that it's hard for them to then have any agency to go
create and design something that is important to them and gives them some sense that they have some
128
00:11:08,900 --> 00:11:12,353
agency and ownership over what can happen in the
129
00:11:15,731 --> 00:11:20,313
Give me an example or two of the kids you've worked with that are struggling with this,
that don't have that agency. Like, what does that look like in your
130
00:11:20,313 --> 00:11:22,036
office, on the campus?
131
00:11:22,036 --> 00:11:23,327
Who is that kid walking down the hall?
132
00:11:23,327 --> 00:11:25,268
What is he dealing with on a day-to-day basis?
133
00:11:25,268 --> 00:11:26,590
And what does that look like?
134
00:11:26,590 --> 00:11:34,524
Yeah, I mean, I think one of the best exercises that we can do as adults is, it's been a
long time since we all sat in a classroom.
135
00:11:34,545 --> 00:11:39,327
And so one of the things that I've tried to do every year is be a student for a day.
136
00:11:39,327 --> 00:11:41,649
And that's human-centered design, that's empathy, right?
137
00:11:41,649 --> 00:11:43,350
And that's something that AI can't do.
138
00:11:43,350 --> 00:11:54,096
AI can, you know, simulate trying to be empathetic and talking about what that
experience is like, but it can't live it as closely as you might
140
00:11:55,080 --> 00:11:57,201
as a human being in kind of a surrogate way.
141
00:11:57,201 --> 00:12:04,914
So I think for kids, it's twofold.
142
00:12:04,914 --> 00:12:13,898
So one, we've created this system by which it becomes zero-sum competitive to
get to this place, which is higher education in a lot of ways.
143
00:12:13,898 --> 00:12:16,979
And I've worked in public schools for a long time as well.
144
00:12:16,979 --> 00:12:18,650
And there's a lot of that there.
145
00:12:18,650 --> 00:12:20,911
It's like, I have to take all the AP classes.
146
00:12:20,911 --> 00:12:22,081
I have to check all the boxes.
147
00:12:22,081 --> 00:12:23,692
I have to do all the things.
148
00:12:23,742 --> 00:12:30,468
And it becomes just this exercise of stuffing the quote unquote resume.
149
00:12:30,468 --> 00:12:35,902
And what that looks like in practice for a kid is they wake up at, I don't know, 6:30, 6
a.m.
150
00:12:35,902 --> 00:12:37,403
and they get to school.
151
00:12:37,924 --> 00:12:44,189
An example of a kid I followed: we went to their one elective for the day, which is
like graphic design.
152
00:12:44,189 --> 00:12:46,091
And that was fun.
153
00:12:46,091 --> 00:12:53,152
And then we went to honors chemistry, took a test for 90 minutes, went to AP World
History,
154
00:12:53,152 --> 00:12:56,893
took another test for 90 minutes and then went to pre-calc.
155
00:12:56,893 --> 00:13:01,004
And by the time we're in pre-calc, you know, I'm struggling to keep my eyes open.
156
00:13:01,004 --> 00:13:02,194
I'm doodling.
157
00:13:02,194 --> 00:13:03,785
There's been no opportunity to move.
158
00:13:03,785 --> 00:13:06,286
There's been no opportunity to discuss.
159
00:13:06,286 --> 00:13:16,718
There's been no opportunity to uh get any type of feedback other than to sit there, rote
memorization and dump information or listen to a lecture.
160
00:13:16,718 --> 00:13:22,430
Graphic design aside a little bit, but those same kids then go to extracurriculars.
161
00:13:22,430 --> 00:13:23,240
They go
162
00:13:23,400 --> 00:13:31,835
to music sports, theater, scouts, whatever it may be, they're not getting home ah
sometimes until eight, nine in the evening.
163
00:13:31,835 --> 00:13:33,206
They haven't eaten.
164
00:13:33,226 --> 00:13:36,928
We've just taken these things that sound crazy.
165
00:13:36,928 --> 00:13:41,251
That's a crazy day for any one of us in terms of a work day.
166
00:13:41,411 --> 00:13:48,936
And to do that day in and day out in search of this thing that we've all determined is
important, I think is...
168
00:13:50,846 --> 00:13:51,757
it's really challenging.
169
00:13:51,757 --> 00:13:58,183
So when they come in, I think they just feel like they're just trying to survive.
170
00:13:58,484 --> 00:14:06,232
There's no sense of, like... they don't even have the space to sit and
unpack whether they should be doing this.
171
00:14:06,232 --> 00:14:08,514
Like they feel like they have to.
172
00:14:08,956 --> 00:14:13,690
And that's just the messaging that they've been given from the jump.
173
00:14:13,715 --> 00:14:18,759
So do you feel like administrators and folks that make curriculum are focusing on the
wrong KPIs?
174
00:14:18,759 --> 00:14:29,536
So I mean, they're data-driven and they're trying to elicit certain results, but they're
not taking into account the physiological, the psychological, and the emotional toll that
175
00:14:29,536 --> 00:14:32,348
driving and pushing this direction actually takes on individuals.
176
00:14:32,348 --> 00:14:42,839
And as far as I can tell, we're trying to use AI to elicit a certain response out of kids
in terms of putting them through a certain curriculum criteria and trying to get them
177
00:14:42,839 --> 00:14:45,439
to strive towards a certain output or a certain outcome.
178
00:14:45,719 --> 00:14:50,919
But we don't do anything to measure mental health of kids.
179
00:14:50,958 --> 00:14:52,319
Like, we really just don't.
180
00:14:52,319 --> 00:14:54,099
Like, there's not a mental health scorecard.
181
00:14:54,099 --> 00:14:57,879
There's not a class on how to deal with shit in schools.
182
00:14:57,879 --> 00:14:59,419
And there should be.
183
00:14:59,419 --> 00:15:09,219
I mean, we don't teach kids how to balance a checkbook or how to pay rent or any of those
pieces, but we'll push them through like trig and geometry, like not really practical
184
00:15:09,219 --> 00:15:09,619
skills.
185
00:15:09,619 --> 00:15:11,979
And we'll do story problems that, you know,
186
00:15:11,979 --> 00:15:17,201
This is how long a train takes to go from somewhere in Paris to somewhere in
London.
187
00:15:17,642 --> 00:15:18,982
And there's a schedule for that.
188
00:15:18,982 --> 00:15:26,295
Like, that's not really a skill I need, but I need some more practical skills, and we're
bypassing these practical skills and putting them in these
189
00:15:26,295 --> 00:15:34,559
situations, even while they're in school and when they're leaving school, where they're not
prepared, but we're also putting them at a disadvantage while they're in school to push
190
00:15:34,559 --> 00:15:39,951
them towards KPIs that, you know, honestly are probably not really going to be relevant
anymore because
191
00:15:40,001 --> 00:15:47,615
With the advent of AI, a lot of these jobs that we're talking about pushing them towards
that require a college degree are just not gonna be there in 10 years.
192
00:15:48,016 --> 00:15:50,217
Probably even sooner than that.
193
00:15:52,658 --> 00:15:58,242
Schools are titanic in terms of their curriculum and how they actually can transform and
move things.
194
00:15:58,242 --> 00:16:01,643
It takes a long time and most of the time they can't slow down before they hit an iceberg.
195
00:16:02,384 --> 00:16:07,991
It sounds very much like we're heading towards an iceberg, and like there's a lot of
people waving their hands in the air saying,
196
00:16:07,991 --> 00:16:12,697
we're gonna run into this thing as AI becomes more prevalent in schools themselves.
197
00:16:13,779 --> 00:16:17,163
Are we just kind of hosed?
198
00:16:17,284 --> 00:16:20,109
I guess that's my real question to you.
199
00:16:21,344 --> 00:16:29,249
I don't think we're hosed, but I think there's a lot of stepping back and reflection that
needs to happen on the part of the educators.
200
00:16:30,391 --> 00:16:38,337
There's identity there, and I don't want to treat that as if that is easy to think through
and reflect upon.
201
00:16:38,337 --> 00:16:48,164
But it's a change in identity from being the holder of all the knowledge and the
information and the content to a co-creator because to your point, Jason, we don't know
202
00:16:48,164 --> 00:16:48,788
where
203
00:16:48,788 --> 00:16:50,019
it's going to be in 10 years.
204
00:16:50,019 --> 00:16:52,751
And I think we have to relinquish some of that control.
205
00:16:52,751 --> 00:16:54,932
It's a compliance and control model right now.
206
00:16:54,932 --> 00:17:09,322
And what it has ended up doing downstream, or, you know, the misaligned incentives or the
KPIs like you were talking about, has just created this over-pathologizing of childhood.
207
00:17:09,523 --> 00:17:17,308
And, you know, I think Jonathan Haidt probably talked about that a little bit in both his
books, The Coddling of the American Mind and The Anxious Generation, but
208
00:17:18,208 --> 00:17:30,688
you're taking kids that are generally fine and putting them in a place where they are no
longer fine because be it the expectations, the environment, the ask of them, the lack of
209
00:17:30,688 --> 00:17:41,228
validation, the lack of true human connection in a place that's supposed to be an
opportunity to learn and develop a lot of the skills that you mentioned, it's just not
210
00:17:41,228 --> 00:17:42,508
what's happening.
211
00:17:43,248 --> 00:17:47,352
I think that's kind of the
212
00:17:48,372 --> 00:17:53,155
we're going towards a place where, and we're already there, right?
213
00:17:53,155 --> 00:18:02,121
I think about young men in particular who are less likely to form, especially if
they're isolated through adolescence, like they're less likely to form these tight
214
00:18:02,121 --> 00:18:04,143
relationships in the same way.
215
00:18:04,143 --> 00:18:13,709
And I also have daughters, but in the same way that girls might. Females are much better,
I think, at maintaining a network and staying in touch.
216
00:18:13,709 --> 00:18:17,952
But young males, you could be a young male
217
00:18:17,952 --> 00:18:29,272
who orders your coffee online, picks it up, you don't have to talk to anybody, you get your
Amazon delivery, you get your groceries delivered, you interact with an AI girlfriend or
218
00:18:29,272 --> 00:18:36,172
AI therapist, and you play games all day, and you might gamble online.
219
00:18:36,172 --> 00:18:38,172
You don't have to talk to a human.
220
00:18:38,212 --> 00:18:40,745
And you can still function. We've made it.
221
00:18:40,745 --> 00:18:42,019
You just described my dream world.
222
00:18:42,019 --> 00:18:43,572
That was amazing.
223
00:18:45,761 --> 00:18:46,593
But it's dangerous.
224
00:18:46,593 --> 00:18:47,841
Yeah, you're right, it's dangerous.
225
00:18:47,841 --> 00:18:53,363
It's completely frictionless, and you need friction for growth.
226
00:18:53,363 --> 00:19:05,636
And so I think we've created all these points of quote unquote friction in
our systems, but they're the wrong points of friction, where the right points of friction
227
00:19:05,636 --> 00:19:09,907
would be how do we create kids that are not fragile but anti-fragile?
228
00:19:09,947 --> 00:19:12,372
And the only way to do that is in
229
00:19:12,372 --> 00:19:18,407
like true relational environments where there is the messy middle.
230
00:19:18,407 --> 00:19:23,240
There are these opportunities for feedback in different ways.
231
00:19:23,240 --> 00:19:29,185
There are opportunities to let go of the control and let it play out and everyone's going
to be fine.
232
00:19:29,185 --> 00:19:30,055
Everyone's going to be safe.
233
00:19:30,055 --> 00:19:36,970
I think we've just pushed so far away from that into this other direction where it's just
an easy button now.
234
00:19:38,932 --> 00:19:41,074
Yeah, I don't need to tell either of you, you know that.
235
00:19:41,074 --> 00:19:52,573
It's just hard to watch young people who don't have kind of that fully formed sense of
identity and maybe don't have the language to honestly even, and this is where we can go
236
00:19:52,573 --> 00:20:01,000
maybe into the therapy piece, but like they don't even have the language to articulate a
lot of times what they're consciously feeling and what that looks like.
237
00:20:01,000 --> 00:20:07,706
And they're kind of relying on an LLM that's probably going to either over-validate or send
them into, you know,
238
00:20:07,706 --> 00:20:13,107
a different direction that really isn't what they needed in that moment.
239
00:20:13,771 --> 00:20:24,947
Yeah, it's almost like they need a default profile in their prompt engineering that takes
into account the Dunning-Kruger effect for kids who think they actually know something
240
00:20:24,947 --> 00:20:28,929
because the AI bot spits back something and says, you're right, it's validating.
241
00:20:28,929 --> 00:20:38,205
And it's really hard to attain emotional maturity when you're dealing with something that
just keeps going, yeah, you're doing great.
242
00:20:38,205 --> 00:20:42,807
Something needs to check you in the process because when something in the real world
243
00:20:42,807 --> 00:20:46,567
does and you don't know how to handle those pieces, you're going to break down.
244
00:20:46,567 --> 00:20:51,487
You know, I used to coach softball, and I watched it with some of my players.
245
00:20:51,487 --> 00:20:56,467
Like if you came at them and criticized them in a certain way, they really couldn't
take it.
246
00:20:56,467 --> 00:20:59,207
And it was a lot different when I was a kid.
247
00:20:59,207 --> 00:21:12,187
And I just, this seems to be accelerating and becoming more of a divergent path and
becoming more and more difficult to have regular emotional conversations with kids.
248
00:21:12,367 --> 00:21:12,689
And
249
00:21:12,689 --> 00:21:13,640
for good reason.
250
00:21:13,640 --> 00:21:17,484
It's not like there's not a real reason why these things are hard.
251
00:21:17,484 --> 00:21:21,804
But yeah, sorry, just to comment on that.
252
00:21:21,804 --> 00:21:22,128
You
253
00:21:22,128 --> 00:21:28,568
know, that's been my experience with working with AI: it's going to validate you and validate
you and validate you.
254
00:21:28,868 --> 00:21:30,628
And you're perfect.
255
00:21:30,628 --> 00:21:32,368
You're the perfect person to do this thing.
256
00:21:32,368 --> 00:21:35,228
And it's like, well, I don't know about that.
257
00:21:35,228 --> 00:21:36,368
I have a lot of gaps.
258
00:21:36,368 --> 00:21:41,888
But you can see kids taking that in such a way.
259
00:21:41,948 --> 00:21:47,468
And that's not always good for their growth either, because they need to be given the
space to take healthy risks.
260
00:21:48,734 --> 00:21:53,129
Yeah, I just think the isolation and the loneliness piece of it is really the biggest
challenge.
261
00:21:53,129 --> 00:22:01,460
think that's an epidemic and I think ah we need relational friction and a lot of our
systems are dependent on human beings ultimately.
262
00:22:01,460 --> 00:22:05,394
And so we're just cutting that out of the process.
So what I'm hearing from this conversation is really two kind of opposing goals. I mean, on one hand, we need to be teaching these kids how to really integrate AI, how to really understand it and use it as the tool that it is, for any sort of future role in an economic system, whatever is left of our current one. But we also need to have them use it less and interact more in real life, with real people and real challenges. And maybe the answer here is, I don't know. But is there any agency at the education level, from the teacher, from the district, whatever, to build that? I mean, can a teacher experiment and find ways to combine the two, or try new things? Or are they all really locked into a curriculum that's enforced so that they pass the test and the school continues to get the funding that it needs?
Yeah, I think that's a great question. I probably can't speak for every environment, but I think leadership dictates a lot of that. There are some environments, I think, where teachers are given some agency to go explore and try things. And those are probably the environments where you'd see the most innovative and unique solutions, of course.

A lot of the dialogue I've heard around it, in the environments that I've been in, is just this concern about cognitive offloading and losing critical thinking. And that's not a concern for me, because I think the other things playing on the system that we were just talking about are much bigger barriers to what happens going forward than this cognitive offloading piece. And that has existed, like I said previously, for quite some time. It's just very, very visible now to everybody that's watching.
So I think any young person that lights up about something they're interested in, something that gives them energy, engages them, that they feel some confidence around, and that they feel they have some autonomy to follow and pursue: that is so innate. That's how we derive purpose and meaning in life. I don't think that goes away. But what is happening is that we are not leaning into those things, and we're not cultivating environments for those things to take place.

So I guess in a perfect world, in an educational system, you'd have, like you were saying, Jeremy, a barbell strategy here. You have a lot of time without AI, where you have some opportunity to take some risks in the real world, try some stuff, get some feedback, work with some different people, mix up the ideas, and the adult becomes more of a mentor and a guide than the holder of all the information. Because that's where learning comes from, that place of curiosity, and there's neuroscience around that too.

But then there's some piece of it as well where we need to have some fluency around how to use the tool, and we need to be able to sit down and go, hey, this is a great use case for it. We think you should use it as a writing partner. You should use it as a sparring partner to come up with different ideas and brainstorm and think about how you can do X, Y, or Z. But you don't want it to take the place of all these important human relationships, which, again, going back to the most deeply human things about us, are our connection with others.
I just think there's not a ton of thought about that. It's such a deep place to go to, I think. And in education, I think we're still stuck in standards and, well, kids have to go to college and kids have to do this and kids have to do that. And oftentimes, if you ask the kids, they have a completely different view. I did a project like this in a school, and the kids' perceptions of what was happening in the environment and the adults' perceptions were on polar opposite ends. I think oftentimes we don't ask the kids what their perspective is, and that ends up inadvertently creating these types of dynamics.
Is there a positive to all the AI influence coming into education? Let me relay a quick story. I grew up a military brat, hopped around a lot. Had a lot of teachers, a lot of different schools. And I get to fifth grade, and my teacher just goes, you're going to be bored with the curriculum. Here's the student version, or the teacher book, and go for it. See how far you can get. So I did that for fifth and sixth grade, and by the time I had finished sixth grade, I'd finished the seventh and eighth grade curriculum. If I'd had an AI, it would have been a great tool for moving those pieces forward. And it's one of those pieces where I think people can thrive in certain environments and certain conditions, but education needs to be geared towards the masses. And while you need to look at the different fringe levels of those pieces, to me it seems like there's value in AI being used in those kinds of ways, to answer questions about curiosity like you're mentioning.
Like, Google was great for that. We didn't used to know the answer; now everyone knows the answer, because you can Google those things and settle an argument right then and there. But then the question kind of becomes, with this instant access to information, how much am I going to hold onto? How much do I actually remember? I don't need to remember phone numbers anymore. I don't need to remember addresses anymore. I don't need to know directions anymore. I just need to know how to operate my GPS and unlock my phone.

Those things, I think, are great and a value add, but yeah, there's a declining value on the other pieces. And if society collapses and falls apart, sure, I don't have telephones anyway, whatever. But my point being that it seems like there are actually good things in here that could be pulled out of it, and that the way we get those into the system needs to be integrated with a more comprehensive education program, one that actually takes into account the emotions of kids, but also the interests of kids, because not every kid gives a shit about going to college. They don't, and they shouldn't, to be honest.
I mean, my degree in social sciences does nothing for me in my current field of occupation, but I have it. And my wife's a real estate agent, and she's an English lit major. These things don't really cross over. And yes, I think there's value in going through these processes and learning how to operate in the system, going to college and getting that learning experience. But ultimately speaking, I don't think we're teaching kids enough about how to use the existing tools of society. And now we have these whole new sets of tools coming in, and we're not really doing a good job of showing them how to use them and how to regulate them. And we're trying to push it into everything right now. And it's scary.

I mean, Microsoft just came up with this whole new thing for all the Office line of products where you are forced to use Copilot. You are forced to use their AI system, and they are charging you more for it, upping your subscription cost to force you to eat more AI, which is bad for the environment and terrible for this and that, and really takes away from the individual creative process and the push to think on your own.

If the tools themselves are being shifted and moved in that direction, and you're not going to get corporations to really change those aspects, do we just need to be more adaptive in the way that we address these things, especially with children, who are probably our most vulnerable members of society? What's the trick? I mean, I don't know if anyone's figured this out yet, but where does that balancing line come between utilizing these cool new technology pieces to try to derive some value out of them, versus taking this new technology piece that you're bludgeoning children over the head with and trying to turn into another tool that they're only going to be able to use this much of, where ultimately it winds up being a negative?
Yeah, I don't know if I know the answer, but I laughed when you said Copilot, because logging into my work email the other day, I couldn't figure out how to get to my email because Copilot made it very challenging.

I think the best kind of metaphor for this is what's been going on with SEL, social-emotional learning, for the last 10 to 12 years. The short version of that is, instead of going down to the root and figuring out some of what I was talking about before, the environment itself and the ways it can change, there became this movement that kids just need more of these SEL skills. I'm a mental health professional; I agree they need those skills. I just don't think that calling them out and siloing them as a separate thing is the effective way to do that.
And so what happened is, that's how it was presented. It got siloed. And then schools said, hey, we need to do SEL. Every teacher is going to have an advisory. They're going to teach these SEL lessons and provide these skills. And the teachers were like, no, we don't want to do this. We're not qualified. We don't know how to talk about self-awareness and social awareness and responsible decision making and all that stuff. It wasn't equitable in how it was rolled out. And so then it becomes, well, this ed tech company swoops in, and they have the solution, and they go, we can do it for you. And the school feels some responsibility to say that they're doing it well.
And so, worse than the teachers struggling through it and acknowledging, hey, this is messy, I don't know exactly what I'm doing (and that's part of this whole thing; that's what SEL is anyway), we've got ed tech on the scene, and they're like, yeah, just press this button and it'll give you everything that you need. Really, kids, especially middle school and high school kids, see right through that stuff. They're not participating. They just want authenticity.

And that goes back to where I started with this: the validation they're looking for is because they don't feel those authentic connections. There's a lack of trust in the system in terms of meeting me as a human being. And so, to bring this back full circle, that's where I think the biggest concern is: for all the power and agency that we have to just be human beings in these systems, we have made ourselves more robotic and less human. And so the robot actually seems like the better option, because it's going to show up with validation first, instead of just leaning in. It's just the over-optimization and the hustle and the busyness, even in schools, treated like it's not that big of a deal. And everyone is burnt out. The teachers, the kids, the administrators. We don't have to be in that spot, but we are. And it's in the name of the thing. It's artificial. We're not becoming more dynamic and more organic. We're becoming more artificial.
The tool's not adjusting to us as much as we're adjusting to it. And this seems a lot like social physics, just like regular meatspace physics, where every action has an equal and opposite reaction, and there's only so much emotional space kids can deal with. And how do you...
Yeah, that search for authenticity is huge. And also the fear of being authentic is huge. Like, you might actually have terrible thoughts, and if you express them and the AI sees it and listens to it, it might amplify those pieces, or you might be shunned for it. I mean, it feels like there's a real lack of space for organic growth and control, because we've just embraced what we think is so much better, which is artificial. Which, by the way, is very inefficient. It's not great. It's not cost effective. There are all kinds of problems with it, but it makes things happen really, really fast. For some reason, we're more concerned with fast results as opposed to quality and accuracy. And it really feels like that's being pushed into education and onto teachers in an unfair way.
100%. Yeah, I think if the teacher was given some agency... I'll talk about a little bit of a project I did, and I know an eighth grade teacher in Colorado that's doing this work as well. He's basically doing whatever he wants and then tying it back, and "whatever he wants" is giving the kids agency and having them dictate, within some red line boundaries, where the learning goes. The kids drive it, and the kids have to present, and the kids have to write, so they have to do all these things. There's plenty of friction there, but he's letting the kids drive within what I'm calling these red line boundaries, which is: don't walk out into traffic, don't go to this place on the internet. And if you don't do those things, we're all good here, and I want you guys to drive it, and then I'm going to be able to tie these back.
And so I did a similar deal with some kids. I started a peer counseling program five years ago, and, going back to control, I couldn't control the flow of the kids that were coming in to seek their support. I spent a semester training them, giving them credit. So at one point I said, I want you guys to do a project. It wasn't for a grade; I just said, I'm giving you credit, it's pass/fail, but you're going to do this project. What the project looks like is 100% up to you. I'm going to give you some data, school-level data. I want you to look at it. We'll gallery walk it. And I want you to ask some questions, and we'll go from there. And the first four weeks were painful, because they just wanted me to tell them what to do. That's just how they were trained; we trained the curiosity out of them. So, fast forward: they run with it. It was a great experience. They publish this paper, and they're presenting it to the adults.
They're the thought leaders. We've shared power now. I've opened doors for them to get in front of adults. And I think the reaction from adults, when they see kids who have taken some agency, have gone through a thoughtful process, have collected some data, have taken all these meaningful steps, is like a standing ovation. And yet we go back to business as usual, and we don't give them any space to actually do that.

And so that's kind of where my side project, the Impact League, came from. It's just this idea that when you give kids some space to lean into their agency, into something that they're super motivated to run with, and you give them a little bit of light mentorship and guidance, they're gonna do amazing things. We just need to get out of their way and create those red lines. And that's how they learn. I mean, the level of confidence, the capacity, the change in 12 to 18 months is way more than you see in a typical classroom.
10x. If we can bring more of that into the classroom, give the teachers the agency to do this sort of thing, or build new curriculum that does this, I think that is at least one answer to this big puzzle.
Treat them like IEPs, right? Individualized education programs that can be tailored to the individual, that can run through this kind of process to give them that kind of feedback. But you have to make space, like Scott was saying, for the emotional connection and interactions. Where I think if you used these things as derivative tools that go through and look at these KPI indicators, and then also added in the emotional context piece, you could actually go through and have those kinds of interactions. I think you'd be much better off. And I think we have to stop measuring things as As, Bs, Cs, Ds, and Fs, because that's not what the criteria for output is anymore. I mean, if you look at the way these systems actually work, they're 51% positive. That 1% is the effective value that they actually make and grow and generate in those LLMs. In business, most successful corporations aren't 100% successful. They're 51% successful or better.
Yeah.
That 1% can drive a lot. I think that needs to be put into education, and I think other countries have done this. Scott, it sounds like your program is definitely at the forefront on this. Where can folks learn more about it and follow your work?
Yeah, theimpactleague.org. We provide micro-grants to youth. We give them some light mentorship. The goal is to amplify impact and really support the next generation of changemakers in doing the type of work that I know they can do. They have incredible potential. We just need to give them some space to do that. So that's where you can find more information about the program.
We'll have the links in the show notes. Anything you want to add here that we didn't get to?
No, I think the only thing would be to look at the people you love and stay connected to them. I've been trying myself to go back to friendships and folks I've gotten disconnected from, just to reconnect. And I think that's just going to be the most important currency going forward: that human connection.
Could not agree more. Thank you so much, Scott. This has been great. I really appreciate your time.

Thank you, Scott.

Likewise.
All right, our thanks to Scott Merkel. Great conversation there. I found it validating in a lot of ways as a parent. There were so many things where he was talking about the red lines and just sort of giving kids the freedom to experiment and grow within those boundaries. Essentially: don't run into traffic, but go play in the yard. These are the kinds of things I do as a parent: try to give my kids the freedom to explore, to go out and be kids. It seems like a novel concept now, but when I was a kid, that's how we grew up. We came home from school, said hi to our parents if they were home (more often they weren't), and we were just out in the neighborhood until dinner. And if you weren't home by dinner, parents might've been concerned, but for the most part, kids were allowed to be kids and explore.

Now there's so much emphasis on testing, getting it right, and being on the optimized path to make the most of every opportunity. And I get it; there's a lot of value in that. There's a lot of missed opportunity, I think, that comes along if you don't work towards something. But if you're allowed to explore what that something is for yourself, I think it's just going to lead to a much more fulfilling existence for that kid you're trying to raise in the world.

Lots of great information from Scott that you can find at his website, theimpactleague.org, which is linked at our website, brobots.me. And that's where we'll be back in just a few days with another episode. Thanks so much for listening.

Scott Merkel
Educator/Founder/Mental Health Professional/Dad
Scott Merkel is a school counselor, systems thinker, and the founder of the Impact League — a youth social entrepreneurship engine designed to help young people turn their ideas into real-world impact.
Based in San Diego, Scott has spent the last 15+ years working in schools across the country, from Washington D.C. to Colorado to California, supporting students not just academically, but emotionally — helping them build resilience, agency, and purpose in a rapidly changing world.
While chairing a high school counseling department in Colorado, Scott launched a peer counseling program that evolved into something bigger: a platform for youth-led action research. Students used their voices to tackle mental health stigma in schools, earning statewide recognition and influencing decision-makers. That experience became the foundation for the Impact League.
Scott is particularly interested in how artificial intelligence is reshaping education, identity, and mental health — and what it means to be human in a world where outsourcing our relationships to machines is no longer science fiction.