How Doctors Are Using AI To Help Reverse Chronic Illness
Healthcare is catastrophically broken. We're spending $100K on preventable hospital visits while doctors get 12 minutes with patients, never read charts, and wonder why people don't "just take their meds." Type 2 diabetes patients sit at 10% control rates while the system shrugs.
Dr. John Oberg took 50 patients with dangerously high A1Cs and got 49 into control in 12 weeks. Then he did it again with 79 patients, dropping them into healthy ranges and taking 30% off insulin entirely. His secret? Reading the chart. Treating humans like humans. Changing ice cream flavors instead of lecturing about willpower.
This episode reveals how combining radical empathy, AI tools, and actual common sense is revolutionizing chronic disease care—and why the same principles work whether you're managing diabetes, depression, or just trying to unfuck your life one decision at a time.
Learn "implementation medicine"—Dr. Oberg's approach that treats patients as partners, not problems. We cover AI co-pilots, peer counseling, the loneliness epidemic, and why calling your doctor by their first name might actually save your life.
Key Topics:
- 98% success rates through radically small changes
- AI in healthcare: when it helps vs. when it kills
- The loneliness crisis: men now have fewer than 3 close friends
- Why the food pyramid was corporate propaganda
- White coat syndrome and first-name medicine
- Using tech without losing your humanity
----
GUEST WEBSITE:
----
MORE FROM BROBOTS:
Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok
Subscribe to BROBOTS on YouTube
Join our community in the BROBOTS Facebook group
----
LINKS TO OUR PARTNERS:
-
Explore the many benefits of cold therapy for your body with Nurecover
-
Muse's Brain Sensing Headbands Improve Your Meditation Practice.
-
Get a Free One Year Supply of AG1 Vitamin D3+K2, 5 Travel Packs
-
You Need a Budget helps you quickly get out of debt, and save money faster!
1
00:00:00,000 --> 00:00:09,590
Imagine ending up in the hospital with uncontrolled diabetes and with that a $100,000 bill
that could have been prevented with a little well-timed insulin.
2
00:00:09,590 --> 00:00:10,591
Our guest today is Dr.
3
00:00:10,591 --> 00:00:11,451
John Oberg.
4
00:00:11,451 --> 00:00:13,451
That happened to his mother-in-law.
5
00:00:13,451 --> 00:00:19,354
As a result, he got pissed, got his doctorate, and helped 49 out of 50 patients
6
00:00:19,354 --> 00:00:20,852
who had failed for years
7
00:00:20,852 --> 00:00:24,248
get their condition under control in just 12 weeks.
8
00:00:25,056 --> 00:00:25,977
How?
9
00:00:26,058 --> 00:00:30,064
Not willpower, not keto, tiny changes.
10
00:00:30,145 --> 00:00:36,936
One patient could only eat ice cream, so they changed the flavor to less sugar and their
blood sugar dropped.
11
00:00:37,906 --> 00:00:40,823
This is what healthcare can look like when someone actually gives a shit.
12
00:00:40,861 --> 00:00:49,388
Today we're talking about implementation medicine, groundbreaking AI health tools, and why
loneliness might be killing you faster than your blood sugar.
13
00:00:49,584 --> 00:00:58,687
Hey, folks, this is BROBOTS, the podcast that talks about AI technology and your health and
your wellness and how they all combine sometimes into a big clusterfuck of a mess.
14
00:00:59,156 --> 00:01:01,890
And that big clusterfuck is something that I've complained about.
15
00:01:01,890 --> 00:01:02,831
You've complained about it.
16
00:01:02,831 --> 00:01:05,414
We've complained about it here on the show plenty.
17
00:01:05,418 --> 00:01:07,091
So there's no shortage of complaints.
18
00:01:07,091 --> 00:01:08,379
We can all do that.
19
00:01:08,379 --> 00:01:14,271
But when it comes to the mess of what the US healthcare system is, we found somebody who's
actually trying to do something about it.
20
00:01:14,271 --> 00:01:21,634
We were lucky enough to convince him to hang out with us for a little bit and talk about
the work that he is doing about how a lot of it is built on the backbone of AI and
21
00:01:21,634 --> 00:01:22,803
advanced technologies
22
00:01:22,803 --> 00:01:29,066
are helping him scale his business, which in effect means helping more sick people get
better faster.
23
00:01:29,095 --> 00:01:30,058
So our guest today is Dr.
24
00:01:30,058 --> 00:01:31,693
John Oberg from Prasena Health.
25
00:01:31,693 --> 00:01:32,774
He is an expert
26
00:01:32,774 --> 00:01:34,346
in reversing type two diabetes.
27
00:01:34,346 --> 00:01:41,516
And so we asked him to start by sharing the story that inspired his mission to take on
this incredibly massive challenge.
28
00:01:41,744 --> 00:01:50,641
Working with, or trying to resolve, the problems that many people face with type 2 diabetes
is what sort of set you on this path. Tell us more about that. What is it that sparked that
29
00:01:50,641 --> 00:01:53,023
curiosity and put you on this road?
30
00:01:53,626 --> 00:02:02,840
Yeah, so I worked with organizations doing change management for a long time and then
started a healthcare company in 2014 that was doing something called medical cost sharing
31
00:02:02,840 --> 00:02:05,361
where it's like insurance, but it's not insurance.
32
00:02:05,361 --> 00:02:07,642
so people bear one another's burdens.
33
00:02:07,642 --> 00:02:17,446
And we saw lots of people that dealt with, you know, chronic disease, cancer, but it hit
home, you know, years later when my wife and I went to visit her mom, and her mom had
34
00:02:17,446 --> 00:02:21,067
type two diabetes and hadn't checked her blood sugar in a few days.
35
00:02:21,067 --> 00:02:23,188
And the blood sugar was so bad that she ended up in the
36
00:02:23,188 --> 00:02:26,608
hospital and it was a totally avoidable hospital visit.
37
00:02:26,608 --> 00:02:29,908
You know, my guess is the hospital visit cost $100,000.
38
00:02:29,908 --> 00:02:31,348
It was paid for by Medicare.
39
00:02:31,348 --> 00:02:32,268
So the U.S.
40
00:02:32,268 --> 00:02:38,928
paid the bill and you have $100,000 bill that could have been solved with, you know, a
couple of shots of insulin well-timed.
41
00:02:38,928 --> 00:02:40,268
And I got really angry.
42
00:02:40,268 --> 00:02:44,968
And so I went back to get my doctorate and I told people at USC that I wanted to solve
type two diabetes.
43
00:02:44,968 --> 00:02:50,448
And they told me that was cute that we've been working on that globally for like 60 years
and we couldn't do it.
44
00:02:50,468 --> 00:02:52,084
They didn't say couldn't, they said it'd be
45
00:02:52,084 --> 00:02:53,664
a big challenge.
46
00:02:53,664 --> 00:02:57,304
And we took 50 patients and they were all really sick.
47
00:02:57,304 --> 00:03:04,184
You know, A1Cs of 9.6, and 6.5 is diabetes, and they'd all been uncontrolled for years.
48
00:03:04,184 --> 00:03:09,964
And 12 weeks later we had 49 out of 50 patients in control and kept them controlled for
the next two years.
49
00:03:09,964 --> 00:03:12,244
And these are like low-income patients.
50
00:03:13,144 --> 00:03:14,124
Yeah.
51
00:03:14,664 --> 00:03:15,144
Yeah.
52
00:03:15,144 --> 00:03:20,124
Like I think if you have more than one comorbidity, you get like 10% control rates, versus
50-something percent
53
00:03:20,124 --> 00:03:20,617
if you have,
54
00:03:20,617 --> 00:03:22,088
you know, a single disease state.
55
00:03:22,088 --> 00:03:26,691
Yeah, our latest study hasn't been released yet.
56
00:03:26,691 --> 00:03:28,592
We just now collected the data for it.
57
00:03:28,592 --> 00:03:36,908
We're about to send it off for peer review, but we took 79 patients with an average A1C of
11.06 and dropped them into the sevens in 12 weeks.
58
00:03:36,908 --> 00:03:41,171
And we took 30 % of those patients off of insulin at six months.
59
00:03:41,171 --> 00:03:46,254
So we're doing things radically differently and we're getting radically different results
and the patients love it.
60
00:03:46,254 --> 00:03:47,814
Our providers love it.
61
00:03:47,835 --> 00:03:49,777
And we're saving the system a ton of money.
62
00:03:49,777 --> 00:03:52,059
And so we think we're on the right path.
63
00:03:52,981 --> 00:03:54,623
So what's the mechanism?
64
00:03:54,804 --> 00:03:56,777
What's the trick you're using?
65
00:03:57,193 --> 00:03:59,074
yeah, yeah, yeah.
66
00:03:59,074 --> 00:04:00,544
I wish it was a thing, right?
67
00:04:00,544 --> 00:04:05,136
I wish it was like one thing we could give people and just solve the problem, but it's,
you know, it's a thousand little things.
68
00:04:05,136 --> 00:04:12,530
And so we take all of everything that we know about medicine today, clinically, and all of
the devices that are available.
69
00:04:12,530 --> 00:04:19,803
And we take everything we know about mental health, primarily looking at anxiety,
depression, and stress that are caused by disease states like diabetes.
70
00:04:19,803 --> 00:04:21,916
And then we evaluate everything.
71
00:04:21,916 --> 00:04:26,338
and we look at severity scores for what's the most important thing to be treating for a
patient.
72
00:04:26,338 --> 00:04:30,741
And then we measure the patient's readiness for change to find out what they want to
change.
73
00:04:30,741 --> 00:04:33,722
And then we give them a radically incremental program.
74
00:04:33,722 --> 00:04:35,843
Like how small a change can you make?
75
00:04:35,843 --> 00:04:40,226
Like, can you just get them on the right medicine, taking their medicine at the right
time?
76
00:04:40,226 --> 00:04:42,257
Can you get them to change their coffee creamer?
77
00:04:42,257 --> 00:04:43,751
Like the littlest of things.
78
00:04:43,751 --> 00:04:45,165
And we talk to them all the time.
79
00:04:45,165 --> 00:04:47,807
I heard you talking about this in another interview.
80
00:04:47,807 --> 00:04:51,428
Just the small things, because a lot of times you hear, oh, they didn't take their
medicine.
81
00:04:51,428 --> 00:04:52,199
They don't do the things.
82
00:04:52,199 --> 00:04:58,811
But you are giving these great examples about people like literally changing the flavor of
ice cream because the person could only eat ice cream.
83
00:04:58,811 --> 00:05:00,428
That was the only way they would get calories was ice cream.
84
00:05:00,428 --> 00:05:01,532
You said, well, let's change the flavor.
85
00:05:01,532 --> 00:05:03,293
And that helped drop their blood sugar.
86
00:05:03,336 --> 00:05:04,227
Yeah, it's crazy.
87
00:05:04,227 --> 00:05:05,307
I mean, but that's what we do.
88
00:05:05,307 --> 00:05:11,142
And I had one person who was so excited by their visit with the physician that they wanted
to go walk a mile.
89
00:05:11,142 --> 00:05:13,204
And we said, hey, let's just try to get to the mailbox.
90
00:05:13,204 --> 00:05:14,595
Like, let's just make it smaller.
91
00:05:14,595 --> 00:05:16,597
Like, just tone it down.
92
00:05:16,597 --> 00:05:19,910
And so really small changes that people can get their arms around.
93
00:05:19,910 --> 00:05:21,611
But then we have to talk to them more regularly.
94
00:05:21,611 --> 00:05:24,723
Like, you can't wait two weeks or three months to talk to the patient again.
95
00:05:24,723 --> 00:05:30,788
So we interact with them as often as they need to make sure they keep making these small
changes and they build momentum.
96
00:05:30,788 --> 00:05:32,860
They change the inertia, build the momentum.
97
00:05:32,860 --> 00:05:33,527
And guess what?
98
00:05:33,527 --> 00:05:34,430
It works.
99
00:05:34,725 --> 00:05:43,265
I want to get to some of the technology that you're using to help with this in a minute,
but I also want to start there because I think that is the thing, whether it's trying to
100
00:05:43,265 --> 00:05:47,765
lose weight, trying to get my finances under control, trying to fix whatever is broken.
101
00:05:47,925 --> 00:05:49,045
There's that hurdle.
102
00:05:49,045 --> 00:05:52,385
There's that degree of being stuck.
103
00:05:52,785 --> 00:05:59,005
I don't know, are you getting closer to figuring out what that is and how to help people
get out of it?
104
00:05:59,316 --> 00:06:00,896
Yeah, we look at lots of things, right?
105
00:06:00,896 --> 00:06:05,956
So we look at someone's skillset and if they don't have the right skillset, we have to
give them some form of training.
106
00:06:05,956 --> 00:06:09,056
We look at their mindset, like what's their belief system.
107
00:06:09,056 --> 00:06:14,336
You know, the type 2 diabetes health belief systems are really important, but there are other
belief systems as well.
108
00:06:14,336 --> 00:06:19,476
We look at how they're prioritizing different parts of the problem to see if we can help
with that.
109
00:06:19,476 --> 00:06:27,316
And then we try to figure out what's the smallest unit of work we can kind of engage with
them to make it as easy as possible to have success.
110
00:06:27,316 --> 00:06:31,956
But that means that our providers have to be, you know, a lot more than what they were
trained to do in school.
111
00:06:31,956 --> 00:06:35,156
They have to take on new training and we train our providers every week.
112
00:06:35,156 --> 00:06:38,796
Like they have to go through weekly training with me where we're talking about psychology.
113
00:06:38,796 --> 00:06:40,136
We're talking about empathy.
114
00:06:40,136 --> 00:06:44,656
We're talking about skills training, all these things we have to be good enough at.
115
00:06:44,656 --> 00:06:47,076
And then we help our patients re-engage in the system.
116
00:06:47,076 --> 00:06:49,216
Like we're not going to try to be a nutritionist.
117
00:06:49,256 --> 00:06:50,616
There's nutritionists in the system.
118
00:06:50,616 --> 00:06:56,478
We can get people back to nutritionists when the time is right, or certified diabetes, you
know, educators
119
00:06:56,478 --> 00:07:01,351
that have all this great training, but they're frustrated too because patients aren't
doing what they're being asked to do.
120
00:07:01,351 --> 00:07:06,565
And so we try to tee up all those people for success, nurses, hotlines and care
coordination teams.
121
00:07:06,565 --> 00:07:08,556
We don't try to take the place of any of those people.
122
00:07:08,556 --> 00:07:13,702
We just want to have the patient show up ready to do what that next one step is.
123
00:07:13,702 --> 00:07:19,515
So is the mechanism to basically go through and apply kind of a minimum effective dose for
a change, and then go through and create
124
00:07:19,515 --> 00:07:28,355
homeostasis, and then do another small change and achieve homeostasis again, and
eventually get a trend where they're actually doing the right things.
125
00:07:28,696 --> 00:07:36,016
I would say homeostasis is probably a little bit of a strong term because sometimes we move
before homeostasis is kind of achieved scientifically.
126
00:07:36,436 --> 00:07:39,096
Let's use metformin as a great example.
127
00:07:39,096 --> 00:07:40,396
It's a really safe drug.
128
00:07:40,396 --> 00:07:42,156
We use it all the time.
129
00:07:42,156 --> 00:07:51,176
But there are people that self-medicate off of the drug because it can cause diarrhea,
right?
130
00:07:51,176 --> 00:07:53,596
And it's really uncomfortable, right?
131
00:07:54,771 --> 00:07:56,831
But we talk people through that.
132
00:07:56,831 --> 00:07:58,631
We can't stop the diarrhea.
133
00:07:58,631 --> 00:08:02,311
We can get them on Metformin Extended Release instead of regular release.
134
00:08:02,311 --> 00:08:07,331
We can have them taking it twice a day instead of once a day, which is a little bit less
difficult.
135
00:08:08,051 --> 00:08:12,031
So there's things we can do to make it a little bit more palatable for people.
136
00:08:12,031 --> 00:08:15,071
But then as soon as they're kind of ready for the next change, we want to make the next
change.
137
00:08:15,071 --> 00:08:19,391
We might not be all the way to homeostasis necessarily, but we do want to keep them
moving.
138
00:08:19,891 --> 00:08:22,125
I think your idea is the right idea.
139
00:08:22,125 --> 00:08:29,158
And then from an overall measurement perspective, I'm guessing you guys do have a metabolic
panel up front, and then you get continuous monitors like CGMs sitting on them, but maybe
140
00:08:29,158 --> 00:08:30,449
another wearable that tracks
141
00:08:30,449 --> 00:08:33,434
activity functions, sleep, all those different pieces.
142
00:08:33,434 --> 00:08:40,836
And you try to correlate that data into some type of time series graph chart and create
trend sets around that to try to understand what things are happening.
143
00:08:40,836 --> 00:08:44,641
And then you can make small incremental changes to strive to see if you can move those
markers.
144
00:08:44,805 --> 00:08:45,766
All of that.
145
00:08:45,766 --> 00:08:52,479
Now, what I would say is that, you know, CGMs are probably one of the most misunderstood
devices in our practice.
146
00:08:52,479 --> 00:08:59,603
We prefer to have people using blood sticks for their fingers because it's so much more
accurate when you're using insulin in particular.
147
00:08:59,603 --> 00:09:04,946
But CGMs are really great for like the person who wants the data for behavioral reasons.
148
00:09:05,046 --> 00:09:10,119
This is one of the big conversations we have between the medical doctors and the
behavioral people in our practice.
149
00:09:10,119 --> 00:09:13,881
It's like, hey, a CGM may not be clinically indicated for medicine.
150
00:09:14,075 --> 00:09:23,994
But it's really great because it can give somebody some data about their behavior and make
the behavior change so much easier and therefore has a really big clinical impact
151
00:09:23,994 --> 00:09:24,434
medically.
152
00:09:24,434 --> 00:09:28,108
And so we have to look at things through some different lenses sometimes.
153
00:09:28,108 --> 00:09:32,121
And so the data is really important for some people and for other people it's stressful.
154
00:09:32,325 --> 00:09:37,672
So are you tracking that stuff in real time or is that something that they sort of monitor
and report to you?
155
00:09:37,917 --> 00:09:42,800
Well, I mean, first of all, I would say CGM data isn't real-time data in the sense that it's
interstitial fluid.
156
00:09:42,800 --> 00:09:45,592
It's not blood. But no, our patients track it.
157
00:09:45,592 --> 00:09:49,595
And in some cases we do monitor the CGM data directly.
158
00:09:49,595 --> 00:09:53,998
In most cases, we want them entering that into an app so that they're processing it in
their brain.
159
00:09:53,998 --> 00:10:01,483
So they actually know what's happening versus just having the number reported out and
having their brain kind of out of the loop with the provider.
160
00:10:01,523 --> 00:10:05,406
So we try to get them engaged in that, but we try to get them engaged in just a few numbers,
right?
right?
161
00:10:05,406 --> 00:10:07,937
Because you could measure everything.
162
00:10:08,071 --> 00:10:10,853
And that isn't helpful for most people.
163
00:10:10,853 --> 00:10:12,284
Now I'm a little weird.
164
00:10:12,284 --> 00:10:13,455
I want to track more data.
165
00:10:13,455 --> 00:10:16,168
You know, I would say there's people in my family that want to track lots of data.
166
00:10:16,168 --> 00:10:17,019
Yeah, totally.
167
00:10:17,019 --> 00:10:20,281
And for those people, what we do is we try to meet everyone where they are.
168
00:10:20,281 --> 00:10:23,764
So for people that don't want to track a lot of data, we track a little data.
169
00:10:23,764 --> 00:10:27,087
We might collect a lot, but we only pay attention to a little.
170
00:10:27,087 --> 00:10:32,531
And then for some people, they want to track everything. Now, by the way, there are also people
that want to track too many things.
171
00:10:32,701 --> 00:10:35,493
Like there are patients that are like, I have this number, it's out of band.
172
00:10:35,493 --> 00:10:38,395
It's like, yeah, but when that one number is out of band, it's okay.
173
00:10:38,395 --> 00:10:41,277
Unless it's out of band with six other things.
174
00:10:41,717 --> 00:10:51,724
And so what it does is it causes a different type of treatment
burden, which is anxiety based on not getting an A plus on your paper.
175
00:10:52,185 --> 00:10:54,246
yeah, it's all yeah.
176
00:10:54,506 --> 00:10:55,057
You got it.
177
00:10:55,057 --> 00:10:55,309
Yeah.
178
00:10:55,309 --> 00:10:59,050
So we really try to help people understand like, when it's not important to track a
number
179
00:10:59,050 --> 00:11:01,155
can be as important as when it's important to track.
180
00:11:01,155 --> 00:11:09,157
Yeah, because that low signal-to-noise ratio definitely throws people off, and whether
you're a provider or a person looking at it, any extra noise makes it that much harder.
181
00:11:09,157 --> 00:11:10,890
Yeah.
182
00:11:10,890 --> 00:11:15,897
A medical practice came to us the other day and said they track like 25,000 different
things in their panel.
183
00:11:15,897 --> 00:11:21,735
And we're like, wow, if you want that, great, but it's not great for most people.
184
00:11:21,898 --> 00:11:28,701
Well, and so what you're talking about here is a very individualized approach to health
care that we do not typically see in a doctor's office.
185
00:11:28,701 --> 00:11:29,081
Yes.
186
00:11:29,081 --> 00:11:36,444
I mean, typically you're walking in, you're giving a list of symptoms and you're getting
whatever drug the doctor thinks is going to attack most of them.
187
00:11:36,444 --> 00:11:37,264
How?
188
00:11:37,319 --> 00:11:38,039
medicine.
189
00:11:38,039 --> 00:11:46,762
I mean, there are services like this for multiple different treatment sets that are
available and they're financially inaccessible for the vast majority of people because
190
00:11:46,762 --> 00:11:48,463
they're very expensive.
191
00:11:48,923 --> 00:11:49,771
That's right.
192
00:11:49,771 --> 00:11:51,049
How do you approach that?
193
00:11:51,495 --> 00:12:00,099
I mean, for us, so in my doctorate, we did all the medicine and all the, you know, the
mental health work, but we also ripped apart the entire mechanism for the administrative
194
00:12:00,099 --> 00:12:10,114
practice of a medical office and all the financial practices and rebuilt the entire practice
so that it could work for, you know, most people. I wouldn't say everyone; people
195
00:12:10,114 --> 00:12:17,227
who are homeless, for example, are very difficult to treat for our practice
right now because insulin's a refrigerated drug and
197
00:12:29,078 --> 00:12:34,693
to work in that environment and to be successful financially, not just clinically.
198
00:12:36,019 --> 00:12:42,502
So a big part of what we discuss typically is AI and how AI interacts and works with these
different pieces.
199
00:12:42,502 --> 00:12:44,593
And AI has lots of different inference functions.
200
00:12:44,593 --> 00:12:52,946
And you're talking about tracking a lot of data, trying to organize it and using multiple
different LLMs to try to make these things make sense.
201
00:12:52,946 --> 00:13:00,350
I mean, Amazon has 15 different LLMs you can choose from in Bedrock just on medical devices.
202
00:13:00,350 --> 00:13:04,055
And then everybody has their own different pieces.
203
00:13:04,055 --> 00:13:09,000
Are you guys using AI in this way to try to actually help people with their information?
204
00:13:09,000 --> 00:13:11,753
And are you encouraging people to use AI to ask for help?
205
00:13:11,753 --> 00:13:18,289
Are you discouraging them from asking AI for help when it comes to treating themselves
medically?
206
00:13:18,548 --> 00:13:19,038
Great question.
207
00:13:19,038 --> 00:13:20,329
We're giving them AI to help.
208
00:13:20,329 --> 00:13:24,742
So we use AI on the backend to help with all of our administration for our providers, of
course.
209
00:13:24,742 --> 00:13:31,267
Like there are things that are table stakes like ambient scribing and getting information
into and out of medical charts that are online.
210
00:13:31,267 --> 00:13:34,279
That's all happening, of course, inside of our medical practice.
211
00:13:34,279 --> 00:13:38,181
And we're doing more and more of that as time goes on.
212
00:13:38,181 --> 00:13:45,008
Right now, what we're really excited about, we're releasing a co-pilot for our patients
that can actually talk to their medical chart and talk to their provider.
213
00:13:45,008 --> 00:13:49,359
And it can help the patient learn what questions they should be asking about with their
provider.
214
00:13:49,359 --> 00:13:51,530
But it also tells providers what to be looking out for.
215
00:13:51,530 --> 00:13:57,762
So if something exotic is happening, it can call that out and say, hey, you might need to
consider a consult with this other specialist.
216
00:13:57,762 --> 00:13:59,723
Talk to our medical director.
217
00:13:59,723 --> 00:14:01,423
And the medical director's all over that.
218
00:14:01,423 --> 00:14:09,325
So we're constantly looking for what are the threads that humans wouldn't normally be able
to see because they're not specialists in every area.
219
00:14:09,325 --> 00:14:11,416
And then the medical director can then, you
220
00:14:11,416 --> 00:14:15,439
know, pull the chart and say, we think we need this specialist involved and here's why.
221
00:14:15,439 --> 00:14:23,054
And then we have this really beautiful continuation of care note that we can give to that
specialist and say, hey, here's what's relevant to what's going on.
222
00:14:23,054 --> 00:14:25,486
Here's what's less relevant, but important to understand.
223
00:14:25,486 --> 00:14:26,987
We have an entire chart available.
224
00:14:26,987 --> 00:14:30,940
If you need to see labs, images, whatever, let us know, we'll get it over to you.
225
00:14:30,940 --> 00:14:34,282
And so, I was joking with someone the other day.
226
00:14:34,282 --> 00:14:35,483
We do crazy stuff.
227
00:14:35,483 --> 00:14:40,070
We ingest like all of their medical charts for a patient before they come see us for the
first time.
228
00:14:40,070 --> 00:14:44,154
And then our providers do this crazy thing: we read the chart before we see the patient.
229
00:14:45,335 --> 00:14:47,498
I mean, crazy stuff, right?
230
00:14:47,498 --> 00:14:48,639
And now we don't have to read all of it.
231
00:14:48,639 --> 00:14:49,359
We just read the parts.
232
00:14:49,359 --> 00:14:59,169
The AI has kind of flagged it appropriately, and it goes in and characterizes it like, you must
read this, should read this, probably don't need this, it's archived, but here's the
233
00:14:59,169 --> 00:14:59,794
summary of it.
234
00:14:59,794 --> 00:15:01,375
So it kind of gives you a heat map.
235
00:15:01,375 --> 00:15:04,227
of the actual medical information and says, look here first.
236
00:15:04,227 --> 00:15:05,248
That's actually very interesting.
237
00:15:05,248 --> 00:15:06,139
That's a great use case.
238
00:15:06,139 --> 00:15:17,008
The next kind of question I have along the same area is that Western medicine,
typically speaking, or at least here in the US, is very much symptomatic treatment,
239
00:15:17,008 --> 00:15:23,613
and we don't tend to spend a lot of time working on actual root cause analysis of why
people are actually sick.
240
00:15:23,613 --> 00:15:29,018
And a lot of people have turned to naturopaths and naturopathic medicine because
naturopaths tend to stay more involved.
241
00:15:29,018 --> 00:15:30,879
And, and I guess
242
00:15:30,943 --> 00:15:33,354
Um, more frequent with their updates and their approach.
243
00:15:33,354 --> 00:15:38,627
Like I see my naturopath and my appointment is an hour, like legitimately an hour.
244
00:15:38,627 --> 00:15:45,911
If I go see somebody like one of my GPs, I might get 12 minutes with them, maybe 15 at
the most.
245
00:15:47,792 --> 00:15:52,609
Are you plugging a fundamental gap that we have with our existing medical infrastructure
today?
246
00:15:52,609 --> 00:15:55,076
Is that where you see this problem at?
247
00:15:55,076 --> 00:15:59,210
Because I'm guessing the techniques are trainable and you could teach other people how to
do this.
248
00:15:59,210 --> 00:16:01,049
And we could do this at scale.
249
00:16:01,049 --> 00:16:08,983
for everybody out there at the medical practice level? Or would actually changing that require
massive changes in insurance providers and how we actually treat medicine and treat
250
00:16:08,983 --> 00:16:09,943
people.
251
00:16:10,124 --> 00:16:17,727
Do you see yourself as doing this, and what are the prospects for the future given
where our medical system is today?
252
00:16:18,578 --> 00:16:27,385
We think we may have found an entirely new school of medicine, you know, inside of the
internal medicine kind of, you know, realm.
253
00:16:27,586 --> 00:16:33,251
Right now, you know, it's based on chronic metabolic disease, but it really is
implementation medicine.
254
00:16:33,251 --> 00:16:35,523
And how we implement these things is really important.
255
00:16:35,523 --> 00:16:42,158
So we think there's a whole new set of protocols that providers need to learn to do this
very specific thing for these very sick people.
256
00:16:49,903 --> 00:16:51,584
Yeah.
257
00:16:51,584 --> 00:16:53,175
and pulmonologists.
258
00:16:53,175 --> 00:16:56,876
So, we think it's, you know, infinitely scalable.
259
00:16:56,876 --> 00:17:00,338
We're building technology to be infinitely scalable.
260
00:17:00,338 --> 00:17:02,419
We think that's true.
261
00:17:02,679 --> 00:17:05,940
And we spend a lot of time with patients, really.
262
00:17:05,940 --> 00:17:09,619
I mean, our first visit could be an hour if it needs to be.
263
00:17:09,619 --> 00:17:11,659
Patients don't generally need to spend that long with us.
264
00:17:11,659 --> 00:17:16,819
They don't feel like they want to, but we put that much time aside inside of our
practice.
265
00:17:16,819 --> 00:17:18,199
And then again, we talk to them regularly.
266
00:17:18,199 --> 00:17:25,998
Like if they've got, you know, their AI and they're interacting with their chart, and they're like, I
have a question for one of your providers, you know, the AI can say, well, do you want to
267
00:17:25,998 --> 00:17:29,059
schedule an appointment or do you just want somebody to call you between appointments?
268
00:17:29,059 --> 00:17:31,059
And a lot of times it's like, Hey, can you just give me a quick call?
269
00:17:31,059 --> 00:17:35,819
And it's not having to go through the front desk and it's like the provider just calls and
says, Hey, what's going on?
270
00:17:35,819 --> 00:17:37,994
I saw you were talking about, you know,
271
00:17:54,985 --> 00:18:00,557
Yeah, it sounds like you're meeting patients where they're at, which is great because a lot of
patients have a lot of
272
00:18:00,993 --> 00:18:02,593
nervousness about talking to doctors.
273
00:18:02,593 --> 00:18:03,995
I mean, white coat syndrome is a real thing.
274
00:18:03,995 --> 00:18:04,495
Like I have it.
275
00:18:04,495 --> 00:18:07,528
I go in and I see my doctor, my blood pressure spikes every single time.
276
00:18:07,528 --> 00:18:13,331
You know, I'm suddenly, you know, in the one thirties over nineties and they're like, you
should be on blood pressure medication.
277
00:18:13,331 --> 00:18:15,653
I'm like, you put me on it three times and I've crashed.
278
00:18:15,653 --> 00:18:20,096
Like you're going to kill me if we try to do this again, because I'm normal.
279
00:18:21,397 --> 00:18:23,228
Yeah, that's kind of it.
280
00:18:23,228 --> 00:18:30,703
And I mean, being able to have a conversation with something that's not human, I guess, or
maybe something that you're not
281
00:18:30,703 --> 00:18:32,603
worried about being judged by.
282
00:18:32,603 --> 00:18:34,003
I think it's huge.
283
00:18:35,463 --> 00:18:35,969
Yeah.
284
00:18:35,969 --> 00:18:38,349
Well, in some circumstances, but we've got to be careful too.
285
00:18:38,349 --> 00:18:40,980
Like the human interaction is really where it's at for us.
286
00:18:40,980 --> 00:18:44,261
You know, all of our providers are on a first name basis with our patients.
287
00:18:44,261 --> 00:18:47,122
You know, I often get Dr.
288
00:18:47,122 --> 00:18:50,733
Oberg, it's like, man, if you see me in the hospital or in my classroom, that's cool.
289
00:18:50,733 --> 00:18:56,235
But if you're a patient or a friend or a colleague like John, like that's just a much
better way that we're going to interact together.
290
00:18:56,235 --> 00:19:01,086
And there's some early evidence that like for some parts of cognitive behavioral therapy,
291
00:19:01,350 --> 00:19:04,882
people are interacting with it and getting better results than talking to a human.
292
00:19:04,882 --> 00:19:10,614
So I think that we're going to find that there are some, you know, places where anonymity
makes a ton of sense.
293
00:19:10,614 --> 00:19:17,237
But I want to be super cautious about that because, I mean, with all the hallucination and
things, the AI has got limitations as well.
294
00:19:17,237 --> 00:19:20,558
And we're bought in like we're, we're converts.
295
00:19:20,758 --> 00:19:21,499
Yeah.
296
00:19:21,499 --> 00:19:29,778
So we have some ways of putting expert systems alongside AI so
that we can actually make sure we don't have hallucinations
297
00:19:29,778 --> 00:19:31,198
in any kind of environment.
298
00:19:31,198 --> 00:19:36,318
And then we have the obviously human in the loop systems that work really closely with the
patients.
299
00:19:36,318 --> 00:19:39,438
And so we're very careful where we have fully autonomous systems.
300
00:19:39,438 --> 00:19:47,322
We would never use a fully autonomous system where something's going to have anything
clinical at all. Like, that just would not be appropriate for us.
301
00:19:47,322 --> 00:19:49,735
It plugs that gap of being able to have that constant contact.
302
00:19:49,735 --> 00:19:57,666
If I'm somebody that's having an anxiety problem and I can't get ahold of my doctor, my
physician, and I can talk to a robot and kind of feel better.
303
00:19:57,666 --> 00:19:58,526
Yeah.
304
00:19:58,948 --> 00:19:59,278
Yeah.
305
00:19:59,278 --> 00:20:00,370
Vet this out for me.
306
00:20:00,370 --> 00:20:01,197
Right.
307
00:20:01,197 --> 00:20:05,748
Then we have guardrails in our system that are like, yeah, no, that's not actually true.
308
00:20:05,748 --> 00:20:12,110
And it's popular science, and it's unfortunate. Because, you know, for us, we
have this kind of gradient for people.
309
00:20:12,110 --> 00:20:14,871
It's like, here's what the science knows for sure.
310
00:20:14,871 --> 00:20:18,021
Here's what the evidence is telling doctors to do.
311
00:20:18,021 --> 00:20:21,342
And what they do in practice. Then here's stuff that we just don't know.
312
00:20:21,342 --> 00:20:23,845
It's probably not going to hurt you, but like,
313
00:20:23,845 --> 00:20:27,289
If you want to give it a shot, just be informed that we don't think it works.
314
00:20:27,289 --> 00:20:28,390
Maybe it works.
315
00:20:28,390 --> 00:20:30,413
Try it on, see if it fits.
316
00:20:30,413 --> 00:20:32,355
Maybe it's placebo, maybe it's great.
317
00:20:32,355 --> 00:20:34,216
And then there's stuff that does harm.
318
00:20:34,317 --> 00:20:44,118
And so as long as we're telling patients, like they're getting full information about
where all of that is, then patients need to be agents and make their own choices based
319
00:20:44,118 --> 00:20:45,649
on what the science says.
320
00:20:46,949 --> 00:20:49,210
Enabling people to actually make their own decisions.
321
00:20:49,210 --> 00:20:51,299
That's an interesting concept.
322
00:20:51,299 --> 00:20:52,552
It's huge for us, yeah.
323
00:20:52,552 --> 00:20:55,836
I mean, because we're all taught very young to not do that.
324
00:20:55,836 --> 00:20:58,900
Like your doctor said to do this is the thing that you have to do.
325
00:20:58,900 --> 00:21:03,266
You know, get X amount of servings of dairy a day, get X amount of servings of meat.
326
00:21:03,266 --> 00:21:06,009
And there's just some people that don't do well with that.
327
00:21:06,711 --> 00:21:07,087
Yeah.
328
00:21:07,087 --> 00:21:08,509
Go read the book Fiat Food.
329
00:21:08,509 --> 00:21:09,707
It was startling.
330
00:21:09,707 --> 00:21:11,411
It was scary.
331
00:21:11,411 --> 00:21:15,254
The food pyramid is not a thing, for anyone listening.
332
00:21:15,254 --> 00:21:16,395
It's just not a thing.
333
00:21:16,395 --> 00:21:17,116
Whole food.
334
00:21:17,116 --> 00:21:19,817
We talk, you know, eat more whole food, move to the right.
335
00:21:20,058 --> 00:21:25,242
Meaning from the left hand side, processed food, right hand side, whole food, move to the
right.
336
00:21:25,242 --> 00:21:26,904
And then for exercise, move more.
337
00:21:26,904 --> 00:21:33,719
Like we can talk about optimization, and there are experts in optimization, which I would
say that our practice is not expert in, like
338
00:21:41,215 --> 00:21:43,191
Yeah.
339
00:21:43,191 --> 00:21:46,905
Well, and it's tricky, I mean, without the risk of getting political here.
340
00:21:46,905 --> 00:21:48,917
I mean, even the idea of eating whole food, right?
341
00:21:48,917 --> 00:21:56,517
If you stick to the outer rim of the grocery store, it sure looks like whole food, but it
was not raised that way for the most part.
342
00:21:56,517 --> 00:21:57,616
I mean, it's very hard.
343
00:21:57,616 --> 00:21:59,265
Yeah, exactly.
344
00:21:59,265 --> 00:22:07,671
You've got to read the labels, and, you know, we actually
read labels with patients. That ice cream story was us reading labels with patients, right?
345
00:22:07,671 --> 00:22:08,953
So we'll read labels with them.
346
00:22:08,953 --> 00:22:13,187
And so, you know, if it makes meal planning easier, it's great.
347
00:22:13,187 --> 00:22:24,887
Yeah, I wanted to ask you about your take on mental health because so many people are
using AI for a mental health surrogate, whether it's in place of an actual therapist or in
348
00:22:24,887 --> 00:22:25,807
conjunction with.
349
00:22:25,807 --> 00:22:29,827
I do it myself, very guardedly and very carefully.
350
00:22:30,147 --> 00:22:38,688
And there's a lot of things that used to be thought spirals in my head that would spin
there and keep me stuck that I can now take to that judgment-free zone,
351
00:22:38,688 --> 00:22:42,885
dump them out and get some organization to my thoughts and start to come up with an action
plan.
352
00:22:42,885 --> 00:22:45,408
So for me, it's been wildly beneficial.
353
00:22:45,529 --> 00:22:47,192
For other people, it's been deadly.
354
00:22:47,192 --> 00:22:52,121
So I'm curious your take on it and where you see the future of mental health with regard
to AI.
355
00:22:52,121 --> 00:22:53,862
Yeah, I'm pretty cautious here.
356
00:22:53,862 --> 00:23:00,746
I think, you know, in my first coaching experience ever in my life, I was getting coached
by a coach early in their career.
357
00:23:00,746 --> 00:23:09,531
And they took me into kind of the therapy zone and did some real therapeutic damage that
was undone by a doctor later on who had the right credentials and the right therapeutic
358
00:23:09,531 --> 00:23:10,151
background.
359
00:23:10,151 --> 00:23:19,156
So I think, um, here's a great analogy. When I'm using AI, which I use every single day,
I'm exceptional at using it in my domain of expertise.
360
00:23:19,156 --> 00:23:20,917
It really speeds me up.
361
00:23:20,956 --> 00:23:22,647
But I don't code software.
362
00:23:22,647 --> 00:23:25,390
And so I work with our chief technology officer who does that.
363
00:23:25,390 --> 00:23:30,053
And when I watch him use AI for coding, it's like a different language.
364
00:23:30,053 --> 00:23:36,058
Like he can use AI in his domain of expertise and it gets sped up, you know, 10 to a
hundred times.
365
00:23:36,058 --> 00:23:38,839
I can use it.
366
00:23:39,360 --> 00:23:40,381
Yeah.
367
00:23:40,381 --> 00:23:45,585
But I think if you're working in your domain of expertise, you can be
exceptional at it.
368
00:23:45,625 --> 00:23:51,033
If you're not in your domain of expertise, it can turn into an echo chamber and lead to
disastrous results.
369
00:23:51,033 --> 00:23:59,420
So if someone's interested in using AI for mental health and they're trying to understand
like, how do I engage with my professional?
370
00:23:59,941 --> 00:24:05,825
How do I engage with people that are in a future-looking environment?
371
00:24:05,825 --> 00:24:07,487
I think those things are less dangerous.
372
00:24:07,487 --> 00:24:15,173
I think taking AI into a therapeutic environment where the AI hasn't been built for that
specifically has a lot of danger to it.
373
00:24:15,173 --> 00:24:17,115
And I would recommend not doing that.
375
00:24:19,889 --> 00:24:31,177
Look, I think talking to an AI versus not talking to anybody, you know, gosh, I believe we
can lower the cost of mental health care by 90 or 95% using AI.
376
00:24:31,177 --> 00:24:32,978
Here's the way I think we should do it.
377
00:24:32,978 --> 00:24:42,615
I think we should train people to be peer counselors, to engage in subclinical settings,
and then have an AI listen in with the AI flagging a therapist when a therapist needs to
378
00:24:42,615 --> 00:24:43,096
get involved.
379
00:24:43,096 --> 00:24:45,227
That's the way that we see it working inside of our practices.
380
00:24:45,227 --> 00:24:48,069
Like, let's get people comfortable enough with the AI
381
00:24:48,069 --> 00:24:55,995
listening in as a peer to the peer counselor and the peer counselor who's trained in
handling subclinical issues.
382
00:24:56,036 --> 00:24:59,559
And then when you get into something that's therapeutically indicated, let's bring an expert
in.
383
00:24:59,559 --> 00:25:06,264
That'll let the expert kind of get a much broader scope of which people they can help,
right?
384
00:25:06,264 --> 00:25:10,247
And they can really be involved and work at the top of their license.
385
00:25:10,288 --> 00:25:17,173
And we can get people, peer counselors, modeling empathy again, like great listening.
386
00:25:40,397 --> 00:25:42,287
Well, that's weird because I think
387
00:25:42,287 --> 00:25:47,887
A lot of us used to have friends and we used to go to bars and hang out and chat and.
388
00:25:48,367 --> 00:25:55,487
We don't really do that anymore. I mean, I can go back to the early 2000s and be like,
yeah, OK, I used to meet people at the bar.
389
00:25:55,487 --> 00:25:58,187
We'd hang out, we'd chit-chat, we'd socialize.
390
00:25:58,607 --> 00:26:06,027
And it happened much more frequently, and I just feel like people are much more insulated
and much more insular and much more protected in kind of what they say, and they don't
391
00:26:06,027 --> 00:26:07,567
tend to have real deep conversations.
392
00:26:07,567 --> 00:26:12,583
It can be very superficial, and I think we've drifted into that kind of as a society
because we've
393
00:26:12,815 --> 00:26:18,016
created these Facebook monikers where everything's looked at, watched, monitored, and we're
looked at all the time.
394
00:26:18,016 --> 00:26:22,478
Full disclosure, I own a medical startup.
395
00:26:22,478 --> 00:26:30,310
I take wearable data from Dexcoms and Fitbits and Garmin and put those things into time
series charts and overlay them with other data sets to make sense of it.
396
00:26:30,310 --> 00:26:32,710
And we were way ahead of the market when we put it out there.
397
00:26:32,710 --> 00:26:33,701
So it's not gone anywhere.
398
00:26:33,701 --> 00:26:37,341
A thousand other companies have come in since and made changes.
399
00:26:37,882 --> 00:26:42,625
That being said, um we collect too much data.
400
00:26:42,625 --> 00:26:46,696
And even in our platform, it can be used in interesting ways.
401
00:26:46,696 --> 00:26:49,646
And we have all kinds of safeguards that lock those pieces down.
402
00:26:49,646 --> 00:26:53,158
And we have a sharing system and it requires multifactor authentication.
403
00:26:53,158 --> 00:26:57,159
And you only, have limited access with your key tokens to be able to get to things.
404
00:26:57,959 --> 00:27:00,160
How do you get people over this fear?
405
00:27:00,160 --> 00:27:06,722
Like you're using AI, you're using biometric data, you're collecting all this information,
you're pulling it back in.
406
00:27:07,802 --> 00:27:10,477
One, actually, the real question is, do your patients even
407
00:27:10,477 --> 00:27:13,368
worry about it or are they in the state where they're like, I got to do something.
408
00:27:13,368 --> 00:27:19,671
Or is it that, you know, this is another signal that creates another layer of anxiety that
you have to worry about in your practice?
409
00:27:23,933 --> 00:27:24,813
Yeah.
410
00:27:29,055 --> 00:27:29,955
Yeah.
411
00:27:39,079 --> 00:27:39,728
Yeah.
412
00:27:39,728 --> 00:27:47,014
These are the things that we really lean into. And when people ask, like you
guys did, you know, what's the thing?
413
00:27:47,394 --> 00:27:49,966
It is messy, muddy.
414
00:27:51,050 --> 00:27:53,151
Yeah.
415
00:27:59,471 --> 00:28:00,931
Yeah.
416
00:28:00,931 --> 00:28:05,010
You know, I think it was the fifties or the sixties, there was some study done that I can't
find.
417
00:28:05,010 --> 00:28:11,054
I went back and looked for it, but my recollection was like, men could talk to like 12
other men about anything.
418
00:28:11,054 --> 00:28:13,245
And then a more recent study was like less than three.
419
00:28:13,245 --> 00:28:15,455
And I heard somewhere recently it's even less.
420
00:28:15,455 --> 00:28:17,096
I can't find those studies by the way.
421
00:28:17,096 --> 00:28:18,856
I keep looking for the numbers.
422
00:28:25,527 --> 00:28:26,378
Yep.
423
00:28:26,378 --> 00:28:28,019
I would tend to believe that.
424
00:28:28,060 --> 00:28:30,023
We need to solve this.
425
00:28:30,023 --> 00:28:34,087
And so one of the things that we do is we really connect with people.
426
00:28:34,489 --> 00:28:37,354
I know it's a crazy idea, but...
427
00:28:37,354 --> 00:28:42,679
You tell a story of a colleague who came to you and said, how do you convince the patient that
you care about them?
428
00:28:42,679 --> 00:28:45,842
And you said, you actually care about them.
429
00:28:45,842 --> 00:28:46,902
right?
430
00:28:47,843 --> 00:28:53,306
Yeah, I mean, so many of us think that nobody gives a shit. And one, it's probably not true.
431
00:28:53,306 --> 00:28:55,747
I mean there probably are people that do but we don't
432
00:28:55,747 --> 00:29:02,611
know how to handle those signals, and we don't know how to take
empathy and friendship, because we're not used to it.
433
00:29:02,611 --> 00:29:11,195
Like I really think that's the case. I look at my kids and how they interact and
behave, and it's a whole different level, and they're way more open and honest than I've
434
00:29:11,195 --> 00:29:11,996
ever been.
435
00:29:11,996 --> 00:29:16,988
But as a Gen Xer, like we're just taught to swallow so much and not say anything.
436
00:29:16,988 --> 00:29:17,799
And yeah.
438
00:29:25,560 --> 00:29:27,651
It's a hundred-year cycle we go through.
439
00:29:27,732 --> 00:29:39,780
The way I had heard it described to me early in my life was: bad times
make strong people, strong people make good times, good times make weak people, and weak people make bad times, right?
440
00:29:39,780 --> 00:29:41,025
So it's the hard times.
441
00:29:41,025 --> 00:29:41,141
Yeah.
442
00:29:41,141 --> 00:29:44,883
So you go through this cycle and so, you know,
443
00:29:46,288 --> 00:29:49,408
And then we have people that are, you know... the kids grow up, right?
444
00:29:49,408 --> 00:29:55,428
It takes until you're 25, 26, 27 years old before your brain's fully formed, depending on lots of
factors.
445
00:29:55,428 --> 00:30:01,028
And so, well, empathy is a practiced skill, right?
446
00:30:01,028 --> 00:30:08,588
Like, I didn't make really good lifelong friends until I was in my early
thirties, after I started going to that therapist I told you about.
447
00:30:08,588 --> 00:30:11,208
And that wasn't because my parents did anything wrong.
448
00:30:11,208 --> 00:30:15,376
I had, you know, something happen: my dad died when I was really young, and
449
00:30:15,376 --> 00:30:19,096
It had an impact on me that no one really helped me through.
450
00:30:19,096 --> 00:30:24,356
And, not for lack of trying, I just never really processed it until I got into my
early thirties.
451
00:30:24,356 --> 00:30:26,796
And once I did, I was able to make lifelong friends.
452
00:30:26,796 --> 00:30:31,876
And now I've got, you know, lifelong friends that feel like family, but I'm really
fortunate.
453
00:30:31,876 --> 00:30:33,796
But it was a very deliberate process for me.
454
00:30:33,796 --> 00:30:35,096
Like I went out in my early thirties.
455
00:30:35,096 --> 00:30:35,976
Like I don't have this.
456
00:30:35,976 --> 00:30:36,956
I want this.
457
00:30:36,956 --> 00:30:39,896
And you can't just walk up to somebody like, Hey man, best friends.
458
00:30:39,896 --> 00:30:41,216
Like that's weird.
459
00:30:41,216 --> 00:30:41,676
Right?
460
00:30:41,676 --> 00:30:45,355
So you've got to learn about the layers of friendship and how that develops over time.
461
00:30:45,355 --> 00:30:46,241
you know.
462
00:30:46,301 --> 00:30:50,898
And I'm kind of the opposite way where like I hate small talk and I hate the ramping up to
the real connection.
463
00:30:50,898 --> 00:30:54,203
So I want to be the guy that just goes up to the randos like, hey, can we just be friends?
464
00:30:54,203 --> 00:30:56,656
Like, can we skip all the like, how's the weather?
465
00:30:56,656 --> 00:30:57,477
What are you doing this weekend?
466
00:30:57,477 --> 00:30:58,024
Right.
467
00:30:58,024 --> 00:30:59,504
That doesn't hold up, right?
468
00:30:59,504 --> 00:31:05,004
I've tried that before and I've had those relationships where it gets super intense, super
fast with male friendships.
469
00:31:05,004 --> 00:31:08,464
And it's like, but they, you know, it doesn't, doesn't generally last.
470
00:31:08,464 --> 00:31:10,828
so, yeah.
471
00:31:10,828 --> 00:31:16,724
I was going to ask you something along the lines of, what are some ways that people can use
technology to, you know, to better their lives?
472
00:31:16,724 --> 00:31:19,026
But it sounds like you guys are doing a lot of the technology on the back end.
473
00:31:19,026 --> 00:31:26,463
So I'll open it to you, like for the listener who's out there who read the description
that had something to do with like dealing with health issues and how technology can help
474
00:31:26,463 --> 00:31:26,903
them.
475
00:31:26,903 --> 00:31:34,120
What are some things that people can take away from what you see every day that are simple
places to change that ice cream or whatever it is that they need to do to get on the right
476
00:31:34,120 --> 00:31:34,800
path?
478
00:31:41,247 --> 00:31:42,767
I'll try and get it down into a minute.
479
00:31:42,767 --> 00:31:51,887
Like I want people to get involved with an expert so that they're making the plan with the
expert, right?
480
00:31:51,887 --> 00:31:54,767
So because the expert has to say, yes, this is a good plan.
481
00:31:54,767 --> 00:31:57,467
And the person has to say, yes, I'll do that plan.
482
00:31:57,927 --> 00:31:58,307
Right.
483
00:31:58,307 --> 00:32:02,575
But if, but if a person comes in and is like, I'm going to kick diabetes, I'm going to do
it by
484
00:32:02,575 --> 00:32:03,995
I'm going to cut out sugar.
485
00:32:03,995 --> 00:32:07,955
I'm only going to eat, like, you know, Captain Crunch. And it's like, whoa, hold on, time
out.
486
00:32:07,955 --> 00:32:08,995
Like that's sugar too.
487
00:32:08,995 --> 00:32:11,475
Like that, you know, okay, I won't eat Captain Crunch.
488
00:32:11,475 --> 00:32:12,375
I'll just eat rice.
489
00:32:12,375 --> 00:32:13,195
We'll call the home.
490
00:32:13,195 --> 00:32:18,995
Like, we've got to... So there has to be an expert, and there has to be a person who
says, here's what I'm willing to do.
491
00:32:18,995 --> 00:32:21,255
And they've got to put that expert plan together.
492
00:32:21,255 --> 00:32:24,215
And usually it starts with getting people on the right medicine.
493
00:32:24,215 --> 00:32:25,755
So the medicine is tolerable.
494
00:32:25,755 --> 00:32:31,175
And then it's move at least 30 minutes a day, five days a week, in at least 10-minute
chunks.
495
00:32:31,175 --> 00:32:31,639
Just.
496
00:32:31,639 --> 00:32:32,660
and constant movement.
497
00:32:32,660 --> 00:32:33,680
You don't have to break a sweat.
498
00:32:33,680 --> 00:32:35,271
Just get out and walk, right?
499
00:32:35,271 --> 00:32:39,363
And then eat more whole food, like just a little more.
500
00:32:39,363 --> 00:32:41,645
We tell people, if you're going to the steakhouse, get the steak.
501
00:32:41,645 --> 00:32:42,945
Like, that's fine.
502
00:32:43,826 --> 00:32:47,207
But make the right choice when you're not at the steakhouse, right?
503
00:32:47,828 --> 00:32:52,000
get the thing that's more whole food and just do a little more each day.
504
00:32:52,000 --> 00:32:54,121
Like wherever you are, just do a little better.
505
00:32:54,431 --> 00:32:55,541
Yeah, yeah.
506
00:32:55,894 --> 00:32:58,475
Constant, persistent, consistent refinement.
507
00:32:59,119 --> 00:33:00,079
That's it.
508
00:33:00,079 --> 00:33:01,419
Just, yeah, just a little better each day.
509
00:33:01,419 --> 00:33:02,439
A little better.
510
00:33:02,835 --> 00:33:03,196
Awesome.
511
00:33:03,196 --> 00:33:04,448
This has been great.
512
00:33:04,530 --> 00:33:05,391
John, Dr.
513
00:33:05,391 --> 00:33:06,675
Oberg, whichever you prefer.
514
00:33:06,675 --> 00:33:07,877
Thank you so much for your time.
515
00:33:07,877 --> 00:33:12,859
Anything important that we did not get to that you want to make sure we mentioned?
516
00:33:13,412 --> 00:33:17,665
Yeah, I mean, for me, I'd say, you know, our life's work is at Precina Health, which
is the medical practice.
517
00:33:17,665 --> 00:33:25,887
But we try to tell these stories of hope at our little podcast, Tales of Abundance, at
talesofabundance.com, where we talk about things like AI and, you know, the stress of economics
518
00:33:25,887 --> 00:33:30,306
But it's hopefully mostly stories of hope, like stories of redemption.
519
00:33:30,306 --> 00:33:35,410
And so if you want to come check it out and just feel good for a minute, come check out
talesofabundance.com.
520
00:33:35,410 --> 00:33:41,912
And yeah, just reach out if you want to be part of the movement. In whatever way you
want to be part of it, reach out and let us know how you want to join.
521
00:33:41,912 --> 00:33:51,700
We're definitely gonna come check that out, because our mantra is to try to be positive and
put positive spins on things, and we often wind up saying that during all of our AI chats.
522
00:33:51,700 --> 00:33:55,222
So I think we might listen in for some advice.
523
00:33:55,411 --> 00:33:56,373
Exactly, exactly.
524
00:33:56,373 --> 00:33:56,824
Cool.
525
00:33:56,824 --> 00:34:00,773
Links to all the above in the show notes for this episode.
526
00:34:00,773 --> 00:34:01,314
John, Dr.
527
00:34:01,314 --> 00:34:03,156
Oberg, thanks so much for your time.
528
00:34:03,378 --> 00:34:04,210
Really good conversation.
529
00:34:04,210 --> 00:34:05,361
Appreciate your time.
530
00:34:05,981 --> 00:34:06,622
Let's do it again.
531
00:34:06,810 --> 00:34:07,780
All right, that again is Dr.
532
00:34:07,780 --> 00:34:08,551
John Oberg.
533
00:34:08,551 --> 00:34:11,483
Our thanks again to him for being on the show with us here today.
534
00:34:11,483 --> 00:34:17,927
You can find all the links and necessary resources that we talked about in that
conversation at our website, that is brobots.me.
535
00:34:17,927 --> 00:34:21,829
That's also where you will find us again in about a week with another new episode.
536
00:34:21,829 --> 00:34:30,354
Hit subscribe if you have not, leave a rating and or a review if you would, and of course
share this with anybody who might be struggling with these same issues in the same way you
537
00:34:30,354 --> 00:34:32,784
are to help them get the results they need.
538
00:34:32,784 --> 00:34:33,617
Thanks so much for listening.
539
00:34:33,617 --> 00:34:36,183
We'll see you again in about a week at brobots.me.
John Oberg
CEO
John Oberg is a Founder, CEO, Board Director, Advisor, Professor, and Investor trusted by organizations to navigate growth, conflict, and change. He founded Precina Health with Dr. Dustyn Williams, and together they have developed the key to solving Type 2 Diabetes.
Previously, John founded another Austin-based healthcare innovator, Sedera, a community-based medical cost-sharing organization (#193 on Inc 500). John serves on the board of directors of OnlineMedEd (which writes curricula for Medical and Physician’s Assistant schools globally) and Health Admins (Healthcare Business Process Outsourcing). Through John Oberg Advisory, John advises large institutions, healthcare industry leaders, professional services firms, and nonprofits. John received his doctorate from the University of Southern California (Social Work) and his MBA from the University of New Mexico (Policy & Planning and Management of Technology). He is currently an adjunct professor at USC. He has authored patents, started companies, and managed teams larger than 1,000 people. Over the last three decades, he has studied the behavior of individuals, communities, and systems.
John was born and raised in Los Angeles, California. He currently lives with his family in Austin, where he engages in many outdoor activities. When traveling, he also enjoys SCUBA diving and golf and has recently taken to hunting and fishing. He holds a second-degree black belt in orthodox karate.
