Sept. 16, 2025

How AI Can Be Used For Mental Health

Can AI save your sanity when therapy isn’t enough?

In this episode, Rajeev Kapur shares how ChatGPT became a lifeline for a suicidal business exec, repairing his marriage, reconnecting him with his daughter, and even saving his company.

We cover AI therapy prompts, data privacy hacks, deepfake nightmares, and how to use AI responsibly without losing your soul.

Topics Discussed:

  • AI has been integrated into our lives for a long time.
  • Mental health support can be augmented by AI tools.
  • Therapists should embrace AI to enhance their practice.
  • Prompting techniques can make AI more effective for users.
  • Data privacy is crucial when using AI applications.
  • Deepfakes pose significant risks to individuals and organizations.
  • AI can help improve personal relationships and communication.
  • The future of work may involve universal basic income due to AI advancements.
  • Ethics in AI development is a pressing concern.
  • AI is still in its early stages, with much potential ahead.

Resources:

  • Books: AI Made Simple (3rd Edition), Prompting Made Simple by Rajeev Kapur

----

GUEST WEBSITE:
https://rajeev.ai/ 

----

MORE FROM BROBOTS:

Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok

Subscribe to BROBOTS on YouTube

Join our community in the BROBOTS Facebook group

----


1
00:00:01,025 --> 00:00:02,832
Have you ever had a mental breakdown at 2 a.m.

2
00:00:02,832 --> 00:00:06,644
and wished you had access to some sort of 24/7 therapy hotline?

3
00:00:06,665 --> 00:00:08,248
That's what happened to one corporate executive.

4
00:00:08,248 --> 00:00:11,853
He was suicidal, failing at work, and disconnected from his family.

5
00:00:11,880 --> 00:00:13,429
Therapy simply wasn't enough.

6
00:00:13,429 --> 00:00:19,997
In this episode, you'll learn how AI became his lifeline, helping him reconnect with his
daughter, save his marriage, and rebuild his career.

7
00:00:19,997 --> 00:00:24,409
Plus you'll learn tips about AI safety without selling your soul to the algorithm.

8
00:00:24,409 --> 00:00:30,227
Stick around, this conversation might change how you think about therapy, privacy, and
even Taylor Swift tickets.

9
00:00:30,558 --> 00:00:37,355
This is BroBots, we're the podcast that tries to help you use AI and technology to safely
manage your health and well-being.

10
00:00:37,355 --> 00:00:45,423
As you've heard me talk about on this show before, occasionally, when I'm in a mental
health crisis, I will turn to AI and I will ask it questions, I will ask it for

11
00:00:45,423 --> 00:00:55,126
guidance, I will ask it to help me get through those tough times when I'm in between
appointments with my real actual human therapist who is also massively helpful in my life.

12
00:00:55,126 --> 00:00:57,580
And as it turns out, AI is helping a lot of other people as well.

13
00:00:57,580 --> 00:01:04,330
I was just reading an article the other day about how 35% of people are using AI to
manage their physical and mental health symptoms.

14
00:01:04,330 --> 00:01:08,925
Well, today you're going to hear a conversation that my co-host Jason had with Rajeev
Kapur.

15
00:01:08,925 --> 00:01:13,011
He's the bestselling author of AI Made Simple and Prompting Made Simple.

16
00:01:13,011 --> 00:01:18,543
He's gonna share the story of a business executive who was pulled back from the edge of
suicide with the help of ChatGPT.

17
00:01:18,543 --> 00:01:28,057
They dig into how AI can fill the gaps therapy leaves wide open, how to protect your
privacy while using it, and why deepfakes might be scarier than nuclear weapons.

18
00:01:28,479 --> 00:01:30,519
Here's Jason's conversation with Rajeev.

19
00:01:30,513 --> 00:01:38,008
What in your professional journey led you to focus specifically on AI and how it
intersects with the modern world?

20
00:01:38,008 --> 00:01:40,920
And if you can, speak more specifically about health care.

21
00:01:40,920 --> 00:01:47,925
And I understand you had a personal event in November of last year that really made it
salient for you and made it very relatable.

22
00:01:47,945 --> 00:01:49,433
Any chance you can tell us that story?

23
00:01:49,648 --> 00:01:50,248
Yeah.

24
00:01:50,248 --> 00:01:50,648
Yeah.

25
00:01:50,648 --> 00:02:01,228
So I guess the first part of the question, I guess, is, you know, just like you, look,
my first platform was an Atari computer and its early

26
00:02:01,228 --> 00:02:02,148
system.

27
00:02:02,148 --> 00:02:02,568
Right.

28
00:02:02,568 --> 00:02:03,148
I'm sure.

29
00:02:03,148 --> 00:02:03,928
Right.

30
00:02:03,928 --> 00:02:05,668
You know, yeah.

31
00:02:05,668 --> 00:02:09,768
You know, and so then you kind of graduate, and then you had the first Apple II.

32
00:02:09,768 --> 00:02:13,768
And then, you know, I was a Gateway guy first and then Dell.

33
00:02:13,768 --> 00:02:15,308
So you've been around tech forever.

34
00:02:15,308 --> 00:02:17,668
And so, you know, you have this thing where

35
00:02:17,668 --> 00:02:27,571
you start to come across things, and then you start to see things happening in media,
in movies and television, around AI and robots and, you

36
00:02:27,571 --> 00:02:28,371
know, whatever, right?

37
00:02:28,371 --> 00:02:30,802
RoboCop and the Terminator and all this stuff, right?

38
00:02:30,802 --> 00:02:33,352
So pretty soon you start to grow up with it.

39
00:02:33,352 --> 00:02:39,934
And then, you know, one of the things I don't think people realize is we've actually
been surrounded by AI for a very long time.

40
00:02:39,934 --> 00:02:42,955
Like Netflix uses AI to determine what we watch.

41
00:02:43,055 --> 00:02:46,506
Amazon uses AI to determine what we buy.

42
00:02:46,768 --> 00:02:47,328
Right.

43
00:02:47,328 --> 00:02:55,488
Instagram, all these things use AI, and we're surrounded by it. GPS and Google Maps and
all these things use it, as you know. And when you were at Microsoft, you guys were already

44
00:02:55,488 --> 00:03:01,708
doing things in the AI space on the machine learning side; essentially it's using
data to help tell a story, basically.

45
00:03:01,708 --> 00:03:02,288
Right.

46
00:03:02,288 --> 00:03:06,028
So we've been kind of around that AI space for a long time, you and I both.

47
00:03:06,088 --> 00:03:15,288
And, you know, it wasn't until I got really involved here at 1105, and ChatGPT came
out, that I decided to write my books, and the first book has done really well.

48
00:03:15,288 --> 00:03:16,624
The second book just came out.

49
00:03:16,624 --> 00:03:21,784
about two weeks ago, called Prompting Made Simple, which is all about how to prompt and how
to teach people to prompt.

50
00:03:21,784 --> 00:03:25,344
I know we'll get to that in a few minutes, but to answer your question about what
happened in November.

51
00:03:25,344 --> 00:03:26,444
So here's the thing I want to point out.

52
00:03:26,444 --> 00:03:28,144
It didn't happen to me personally.

53
00:03:28,144 --> 00:03:33,324
So what happened was, I have keynote talks that I do all around the world.

54
00:03:33,324 --> 00:03:40,164
And I had done a virtual version of this talk to a bunch of business executives in South
Carolina.

55
00:03:40,602 --> 00:03:43,014
And whenever I do my talk, I always put my phone number up on the screen.

56
00:03:43,014 --> 00:03:46,757
So if anybody has any questions, if you need any help, cause I know this is confusing.

57
00:03:46,757 --> 00:03:49,255
It's a lot, and I know it's going to be daunting.

58
00:03:49,255 --> 00:03:50,720
If you have any questions, call me.

59
00:03:50,720 --> 00:03:56,755
So I did that, and literally, I hang up, done, and about an hour later I get a phone
call, and I notice the phone number.

60
00:03:56,755 --> 00:03:58,579
I didn't know the number, but I notice it, and I figure, you

61
00:03:58,579 --> 00:03:59,728
know, whatever, here we go.

62
00:03:59,728 --> 00:04:01,929
Maybe it's one of these guys who's calling me.

63
00:04:02,450 --> 00:04:07,113
So I answered the phone, and it definitely was one of those guys, and he goes, hey, Rajeev,
you don't know me.

64
00:04:07,414 --> 00:04:08,885
But, you know, great talk.

65
00:04:08,885 --> 00:04:09,680
I loved it.

66
00:04:09,680 --> 00:04:11,220
You know, I have a question for you.

67
00:04:11,220 --> 00:04:16,580
And he goes, hey, do you think AI can help me with my mental health?

68
00:04:16,720 --> 00:04:18,760
And I said, yeah, absolutely.

69
00:04:18,860 --> 00:04:21,020
And, you know, it just kind of depends.

70
00:04:21,020 --> 00:04:25,580
You know, I had never really thought about it, you know, in all that much
detail.

71
00:04:25,580 --> 00:04:29,420
And so we started chatting, and I said, well, can you tell me what's going on?

72
00:04:29,420 --> 00:04:30,600
Like what's, what's happening?

73
00:04:30,600 --> 00:04:34,560
And he goes, well, you know, I have severe mental health issues.

74
00:04:34,560 --> 00:04:36,360
And I said, oh, okay.

75
00:04:36,360 --> 00:04:37,880
Well, on a scale of one to 10, how bad is it?

76
00:04:37,880 --> 00:04:39,280
And he said, it's a 15.

77
00:04:39,694 --> 00:04:41,725
And it kind of really caught me by surprise.

78
00:04:41,725 --> 00:04:46,827
And my first reaction was, whoa, like 15 means like you want to end things.

79
00:04:46,827 --> 00:04:47,307
Right.

80
00:04:47,307 --> 00:04:50,849
And so I asked him, well, do you feel suicidal?

81
00:04:50,849 --> 00:04:51,829
And he goes, yeah, all the time.

82
00:04:51,829 --> 00:04:54,024
And I go, well, why are you talking to me?

83
00:04:54,024 --> 00:04:55,000
Why aren't you talking to therapists?

84
00:04:55,000 --> 00:05:01,753
And he goes, well, gee, I've talked to so many therapists, and, you know, some of
them are great, some of them are not so great.

85
00:05:01,753 --> 00:05:03,194
Like, I do have one right now.

86
00:05:03,194 --> 00:05:03,854
Yeah, I have one right now.

87
00:05:03,854 --> 00:05:09,506
But the challenge I'm having is that when I talk to her, I can do my hour and I'm done.

88
00:05:09,648 --> 00:05:12,688
And then I don't feel like there's any support for me afterwards.

89
00:05:12,688 --> 00:05:18,368
And then I got to go another month, you know, to get help and support again.

90
00:05:18,508 --> 00:05:19,648
And I said, yeah.

91
00:05:19,648 --> 00:05:21,948
And I was just wondering if there's anything AI can do.

92
00:05:21,948 --> 00:05:25,707
And look, I'm not here to badmouth therapists.

93
00:05:25,707 --> 00:05:27,308
Therapists are awesome.

94
00:05:27,548 --> 00:05:28,488
I've gone to therapy.

95
00:05:28,488 --> 00:05:29,968
I'm sure others have.

96
00:05:29,968 --> 00:05:38,808
Look, to me, I think the future here is going to be a world where AI can help augment what
the therapist does.

97
00:05:39,117 --> 00:05:45,952
And I think that if you're a therapist and you're listening to this, you need to
find a way to embrace AI in your practice.

98
00:05:45,952 --> 00:05:47,173
Because here's the thing.

99
00:05:47,173 --> 00:05:49,414
If you have, let's just call it a patient.

100
00:05:49,414 --> 00:05:52,936
Let's just call this guy, patient zero.

101
00:05:52,936 --> 00:05:54,427
Let's, you know, we'll call this guy patient.

102
00:05:54,427 --> 00:05:57,149
So patient zero, you know, he's got problems.

103
00:05:57,149 --> 00:05:58,780
You know, he's got challenges.

104
00:05:59,261 --> 00:06:05,985
And, you know, I'm sure some therapists can make exceptions, but in his mind, he's like, I
can't see that therapist again for a month.

105
00:06:05,985 --> 00:06:08,975
Well, so we started talking and I said, well,

106
00:06:08,975 --> 00:06:11,275
What I want you to do is download ChatGPT.

107
00:06:11,275 --> 00:06:18,635
And he had already downloaded ChatGPT; he was using the free version on his iPhone and
wasn't really doing much with it.

108
00:06:18,635 --> 00:06:20,795
And I said, I want you to get the paid version.

109
00:06:20,795 --> 00:06:22,275
So he gets the paid version.

110
00:06:22,275 --> 00:06:27,575
I told him where to go in the app to anonymize, so it doesn't use his data to train and
all that.

111
00:06:27,575 --> 00:06:30,535
So I said, here's what I want you to do with ChatGPT.

112
00:06:30,535 --> 00:06:32,275
I want you to do a persona-based prompt.

113
00:06:32,275 --> 00:06:34,875
And I want you to ask ChatGPT to become a therapist,

114
00:06:35,183 --> 00:06:39,765
a therapist that specializes in business executives, or whatever challenge you're
facing.
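
For anyone who wants to try the same setup, here is a minimal sketch of that kind of persona-based prompt; the wording is illustrative, not a quote from the episode:

    "I want you to take on the persona of a therapist who specializes in business executives. I'm dealing with serious problems at my company, a marriage under strain, and a 14-year-old daughter I can't connect with. Ask me questions one at a time, remember my answers, and help me work through this between sessions with my human therapist."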

115
00:06:39,765 --> 00:06:44,727
In his case, he was having an issue connecting with his 14-year-old daughter, major
business problems.

116
00:06:44,727 --> 00:06:47,418
He and his wife were having major, major, major challenges.

117
00:06:47,418 --> 00:06:51,390
It's just like everything was collapsing around him.

118
00:06:52,471 --> 00:06:52,891
Yeah.

119
00:06:52,891 --> 00:06:55,612
And so it's just like he just didn't think he could get away.

120
00:06:55,612 --> 00:06:57,553
And everything was his fault.

121
00:06:57,673 --> 00:07:00,634
And when he goes to therapy, it's great.

122
00:07:00,634 --> 00:07:02,416
But then he leaves.

123
00:07:02,416 --> 00:07:09,536
And two days later, it's like the world just collapsed again, and he can call his therapist,
but only so often, you know, and sometimes they don't answer.

124
00:07:09,836 --> 00:07:15,436
So quickly, we spent about 20, 30 minutes going through it, and I
said, give it a prompt, give it this persona-based thing.

125
00:07:15,436 --> 00:07:18,016
It's got a memory function, it's going to remember. Ask it to become a therapist:

126
00:07:18,016 --> 00:07:19,656
one that specializes in this.

127
00:07:19,656 --> 00:07:21,836
Now explain your issue and your challenge.

128
00:07:21,876 --> 00:07:24,516
So he started to explain the issue, the challenges, and some of the things, and he's saying,

129
00:07:24,516 --> 00:07:29,128
look, if I'm already this far along, you might as well stay on and help.

130
00:07:29,263 --> 00:07:31,783
So I did, I stayed on for another 15, 20 minutes.

131
00:07:31,783 --> 00:07:34,623
So all in, we're about an hour, hour and a half into this whole experiment.

132
00:07:34,703 --> 00:07:36,483
And this is the first time I'd ever done this.

133
00:07:37,703 --> 00:07:44,923
And it started asking him questions, and he started giving responses, and he was just
starting to have a conversation.

134
00:07:45,063 --> 00:07:49,783
So to make a long story short, I said, all right, have the full conversation with
ChatGPT, but I want you to do me a favor.

135
00:07:49,783 --> 00:07:54,393
I don't want you to pretend you're talking to a robot or...

136
00:07:54,393 --> 00:07:57,606
or an algorithm, I want you to pretend you're actually talking to a therapist.

137
00:07:57,606 --> 00:08:03,471
Just pretend this therapist is in the Philippines or in India or in France, wherever your
mind wants to go.

138
00:08:03,471 --> 00:08:06,428
Just pretend you're talking to somebody online and it's multi-modal.

139
00:08:06,428 --> 00:08:08,875
So you're actually having the full conversation.

140
00:08:09,436 --> 00:08:11,863
And so he said, all right. So I said, then do me a favor.

141
00:08:12,798 --> 00:08:19,043
Do this, have the conversations, get really serious and then call me in a couple of weeks.

142
00:08:19,324 --> 00:08:21,163
So three or four days go by and he calls me.

143
00:08:21,163 --> 00:08:22,384
I'm like, what's going on?

144
00:08:22,384 --> 00:08:23,439
He goes, hey, question for you.

145
00:08:23,439 --> 00:08:26,179
Can I ask it to become my business coach?

146
00:08:26,239 --> 00:08:27,699
I said, yeah, sure.

147
00:08:27,799 --> 00:08:28,819
No problem.

148
00:08:28,959 --> 00:08:30,859
Just tell it what you want it to be.

149
00:08:30,859 --> 00:08:36,059
I asked him, you know, do you have a specific board makeup that you want to be your
business coach?

150
00:08:36,059 --> 00:08:40,179
Like, if you had a board, a board of advisors to be your business coach, who would it be?

151
00:08:40,179 --> 00:08:47,582
And he said people like, you know, the typical ones: Steve Jobs, Mark Cuban,
Warren Buffett.

152
00:08:47,582 --> 00:08:49,296
You can build guardrails into things, right?

153
00:08:49,315 --> 00:08:55,518
Yeah, and then I think he said Simon Sinek or Adam Grant, one of those guys. But anyway, so
you have three or four.

154
00:08:55,518 --> 00:08:57,239
I go, well, then do that.

155
00:08:57,239 --> 00:09:01,448
And ChatGPT's gonna know those three or four people anyway.

156
00:09:01,448 --> 00:09:06,484
So tell ChatGPT you want it to be your personal board of directors, your personal advisors,
to help you with your business.

157
00:09:06,484 --> 00:09:09,596
And whatever you're having, just start talking with it, just like you do on the therapy
side.

158
00:09:09,596 --> 00:09:16,590
Start talking to ChatGPT from a business perspective: employee issues, growth issues,
cost issues, whatever it might be, just start telling it.

159
00:09:16,590 --> 00:09:18,864
It might ask you to upload some information.

160
00:09:18,864 --> 00:09:23,984
If you do, anonymize the data, whatever it might be, and you'll be fine.

161
00:09:24,204 --> 00:09:24,744
So it starts happening.

162
00:09:24,744 --> 00:09:26,124
So then I don't hear from him.

163
00:09:26,124 --> 00:09:29,424
So then, about two weeks later, he calls me, and I'm like, hey, how's it going?

164
00:09:29,424 --> 00:09:31,444
He goes, wow, it's been amazing.

165
00:09:31,504 --> 00:09:32,244
I go, what's going on?

166
00:09:32,244 --> 00:09:32,544
Tell me.

167
00:09:32,544 --> 00:09:37,004
He goes, well, you know, on the way to work, ChatGPT is my business coach; on
the way home,

168
00:09:37,004 --> 00:09:38,664
It's my life coach.

169
00:09:38,664 --> 00:09:42,024
It's my therapist for me and my family.

170
00:09:42,024 --> 00:09:43,344
And I go, well, what's happened?

171
00:09:43,344 --> 00:09:45,104
I go, how are you feeling now on a scale of one to 10?

172
00:09:45,104 --> 00:09:45,869
He goes,

173
00:09:45,869 --> 00:09:48,040
right now, probably a three or four to be honest with you.

174
00:09:48,040 --> 00:09:51,522
In the last couple of weeks, the last week or so especially, I've been really good.

175
00:09:52,339 --> 00:10:03,139
I connected. I asked for tips on how to connect with my daughter, my 14-year-old daughter,
and it did: ChatGPT gave me three options, and I picked one, and I went

176
00:10:03,139 --> 00:10:12,875
to talk to my daughter, and I put my phone away, and I started just being present for her.
And then I started doing the same thing for my wife: how do I become a better husband,

177
00:10:12,899 --> 00:10:14,910
more present, because my wife says I'm never present.

178
00:10:14,910 --> 00:10:15,532
I'm always on my phone.

179
00:10:15,532 --> 00:10:24,090
I'm always working. So he worked on being more present, and he surprised his
daughter with Taylor Swift tickets, and, you know, whatever, right? I mean, so all these

180
00:10:24,090 --> 00:10:25,364
things. And he had never...

181
00:10:25,364 --> 00:10:27,068
Did ChatGPT tell him to get those Taylor Swift tickets?

182
00:10:27,068 --> 00:10:29,347
Because that's a prompt I would like to know about.

183
00:10:29,347 --> 00:10:37,007
Well, it didn't, per se, and I know you're joking. But what it did
say was, find a way to connect with your daughter.

184
00:10:37,007 --> 00:10:45,987
And Taylor Swift was one of the options. So he found a show, this was Miami or wherever
it was, it didn't matter where, but he got the tickets, and

185
00:10:45,987 --> 00:10:49,467
it wasn't like he was trying to bribe her or anything, he just wanted to find a way to
connect.

186
00:10:49,587 --> 00:10:54,415
And he would never do that, he would never go to a Taylor Swift show. The mom would go, not him.

187
00:10:54,415 --> 00:11:00,475
The fact that the dad is going with the daughter, that really just surprised her, and the
mom was happy and all that.

188
00:11:00,475 --> 00:11:04,075
So anyways, it was just really great to see.

189
00:11:04,355 --> 00:11:08,415
You know, and I know there are some issues and some challenges, and a little bit of backlash.

190
00:11:08,415 --> 00:11:09,095
Like, you know, J.B.

191
00:11:09,095 --> 00:11:12,635
Pritzker, the governor of Illinois, with all due respect, he came out and signed a thing
yesterday.

192
00:11:12,635 --> 00:11:15,815
I don't know if you saw it or not, but he signed a thing saying no AI therapists.

193
00:11:15,915 --> 00:11:18,115
And I think that's a little ass-backwards.

194
00:11:18,115 --> 00:11:22,375
You know, I'm not saying an AI therapist should be the
only therapist.

195
00:11:22,375 --> 00:11:24,399
I think therapists should find a way to use

196
00:11:24,399 --> 00:11:30,319
AI to augment what they're doing to help their patients, just like
in this person's case.

197
00:11:30,319 --> 00:11:36,579
And so he still goes to see his therapist, but in between he uses ChatGPT as a
therapist, and he gets kind of the best of both worlds.

198
00:11:36,579 --> 00:11:44,099
And when you're having a crisis at two o'clock in the morning, do you really give a shit,
pardon my French, about whether it's a chatbot or something, if it really

199
00:11:44,099 --> 00:11:45,371
helps you with your mental health?

200
00:11:45,371 --> 00:11:47,282
And that brings up a really good point.

201
00:11:48,024 --> 00:11:49,375
Well, first of all, you can go for it.

202
00:11:49,375 --> 00:11:50,886
Like that's what the show is about.

203
00:11:50,886 --> 00:11:52,268
We talk about real things.

204
00:11:52,268 --> 00:11:59,814
But also, having the accessibility and access to something at all times; your mental
crisis doesn't happen nine to five.

205
00:11:59,855 --> 00:12:03,919
It's going to happen at any given point in time, and being able to have someone to talk to
can get you out of crisis.

206
00:12:03,919 --> 00:12:06,361
I mean, that's why we have emergency numbers for this.

207
00:12:06,361 --> 00:12:10,689
And ultimately speaking, when you look at those component pieces, I think you're a hundred
percent correct.

208
00:12:10,689 --> 00:12:16,489
And if he hadn't had those pieces and he was at that 15, how long until he actually broke
and did something terrible to himself?

209
00:12:16,729 --> 00:12:28,089
And the downside of taking that type of tool away from somebody, I think, is salient
to every conversation out there with mental health professionals.

210
00:12:28,089 --> 00:12:29,249
And don't get me wrong.

211
00:12:29,249 --> 00:12:36,409
I mean, MIT had this study where they went through and looked at the different
component pieces of how people were interacting with these different agentic functions.

212
00:12:36,409 --> 00:12:37,913
And there was one...

213
00:12:38,005 --> 00:12:47,839
of that study, or one snippet that got brought in, where folks were using an agentic AI
that was pre-prompted and pre-programmed and ready to go for mental health, and the person

214
00:12:47,839 --> 00:12:56,223
that was testing it was pretending he was a 16-year-old, and the negative downside
was, it was telling him, well, you should murder your parents and kill yourself

215
00:12:56,223 --> 00:12:57,513
and meet me in heaven.

216
00:12:57,993 --> 00:12:59,984
And yes, that's the terrible downside of those things.

217
00:12:59,984 --> 00:13:04,576
And I hate to break it to everybody, but Hannibal Lecter mode is probably a real thing on
everything.

218
00:13:04,576 --> 00:13:07,039
And you could have bad human beings that do those things too.

219
00:13:07,039 --> 00:13:14,113
What we don't hear enough about are all the positive stories and the people that are using
this in a positive way to really help improve their mental health.

220
00:13:14,113 --> 00:13:18,315
And I think the story that you just presented is a fantastic example of that.

221
00:13:18,356 --> 00:13:26,460
Along those lines, with you creating this new book called Prompting Made Simple, you
mentioned a few of the prompts you would use to go in and kind of tailor some of these

222
00:13:26,460 --> 00:13:30,623
pieces for his almost different ideal self profiles.

223
00:13:30,623 --> 00:13:34,105
I want to call it an ideal customer profile, since that's my marketing background.

224
00:13:34,105 --> 00:13:34,687
But

225
00:13:34,687 --> 00:13:37,479
being able to go through and turn yourself into the target for this thing.

226
00:13:37,479 --> 00:13:44,363
Do you have any tips or tricks that you can share with us quickly, that might be in
that book, to get people interested and enticed?

227
00:13:44,363 --> 00:13:47,982
So they can actually go through and read more about those prompts that you're pushing
forward.

228
00:13:47,982 --> 00:13:50,422
Yeah, look, I think that's a great point.

229
00:13:50,422 --> 00:13:53,162
I'll get to that answer in just one second.

230
00:13:53,162 --> 00:14:04,322
But to your point about the negative side: as a parent, you've got to watch what your kids
are doing, because there are these really risky tools out there, like Character.AI.

231
00:14:04,462 --> 00:14:17,518
So you've got to be careful, because there are a lot of young men out there who don't
feel like they can find love and companionship, and they're turning to AI

232
00:14:17,518 --> 00:14:18,638
chatbots and everything.

233
00:14:18,638 --> 00:14:20,338
just be careful, monitor what they're doing.

234
00:14:20,338 --> 00:14:24,418
What I'm talking about here is for a 40-year-old gentleman, right?

235
00:14:24,418 --> 00:14:28,178
I'm not talking about a 15 year old boy, you know, so, so let's be careful.

236
00:14:28,178 --> 00:14:31,538
So I just want to put that out there: you know, monitor their habits.

237
00:14:31,538 --> 00:14:35,998
Now, in terms of prompts: look, here's the thing.

238
00:14:35,998 --> 00:14:39,878
To me, the easiest way to get into prompting is to do a persona-based prompt.

239
00:14:39,878 --> 00:14:41,778
And it's so much fun.

240
00:14:41,778 --> 00:14:42,798
Persona based prompts are great.

241
00:14:42,798 --> 00:14:44,398
Say you have to go give a toast.

242
00:14:44,398 --> 00:14:51,738
Let's say you have to give a toast, and you know you're not a great public
speaker. You can use ChatGPT.

243
00:14:51,738 --> 00:14:54,878
Now, by the way, GPT-5 came out yesterday.

244
00:14:55,098 --> 00:15:08,198
And so you can say, hey ChatGPT, I need you to take on the persona of Snoop
Dogg and help me write a best man's toast for my buddy's wedding on Saturday, and here are all

245
00:15:08,198 --> 00:15:12,771
the things I want you to know about my buddy and his future wife.

246
00:15:12,771 --> 00:15:16,034
Here are a couple of funny stories. And then it gives it to you in the voice of Snoop Dogg.

247
00:15:16,034 --> 00:15:18,526
Now, obviously that's for fun.

248
00:15:18,526 --> 00:15:22,640
If you wanted to learn how to bake a cake, right?

249
00:15:22,640 --> 00:15:28,384
Ask it to take on the role of a great, world-class baker, like Duff, you know, from Food
Network or whatever, right?

250
00:15:28,384 --> 00:15:29,626
Give it a persona.

251
00:15:29,626 --> 00:15:31,209
Maybe you want to get into politics.

252
00:15:31,209 --> 00:15:39,862
Well, if you're going to get into politics and you're a Democrat, you might say, hey
ChatGPT, I need you to take on the persona of Barack Obama and be my politics coach,

253
00:15:39,862 --> 00:15:46,544
and help me figure out how I can run for mayor, or run for governor, or whatever you
want to do, right?

254
00:15:46,544 --> 00:15:47,502
Take that persona.

255
00:15:47,502 --> 00:15:50,425
If you're a Republican, maybe you want Trump, or Bush, or somebody else.

256
00:15:50,425 --> 00:15:51,146
Who knows what you want?

257
00:15:51,146 --> 00:15:53,766
So my point is you take on that persona, right?

258
00:15:53,766 --> 00:16:01,785
You know, if you wanted to have a fun conversation, say, you know what, I love Jerry
Seinfeld, and I want you to pretend you're

259
00:16:01,785 --> 00:16:02,379
Jerry Seinfeld.

260
00:16:02,379 --> 00:16:05,550
And like, if you want to have fun,

261
00:16:05,550 --> 00:16:10,390
just have a conversation with Jerry, pretending that ChatGPT can act like Jerry
Seinfeld, right?

262
00:16:11,010 --> 00:16:16,310
So, persona-based prompting is kind of the way to start, put your toes in the water and
kind of jump in, right?

263
00:16:16,310 --> 00:16:27,450
Well, once you start doing those things, the best prompt structure that I find is
that you want to give ChatGPT what I call a role, then a task, then you want to give it

264
00:16:27,450 --> 00:16:29,910
context, and then you give it an ask.

265
00:16:29,910 --> 00:16:32,190
So this RTCA kind of thing.

266
00:16:32,190 --> 00:16:33,310
So what's the role?

267
00:16:33,310 --> 00:16:34,414
The role is,

268
00:16:34,414 --> 00:16:38,794
I want you to be a world-class guitar instructor.

269
00:16:39,414 --> 00:16:39,554
Right?

270
00:16:39,554 --> 00:16:41,414
I'm just saying that because you've got guitars with you.

271
00:16:41,414 --> 00:16:42,154
Right?

272
00:16:42,654 --> 00:16:49,274
And then the task is, I want you to teach me how to play.

273
00:16:49,514 --> 00:16:52,774
I want you to teach me how to play the guitar like The Edge.

274
00:16:53,434 --> 00:16:55,574
Context is, you know what?

275
00:16:55,734 --> 00:16:59,754
I'm a beginner or I played in high school a little bit.

276
00:16:59,754 --> 00:17:02,374
I'm now 52 years old and

277
00:17:02,668 --> 00:17:11,948
I haven't picked up a guitar in 30 years, and, you know, I probably need help reading
sheet music again, or whatever context you want to provide. Or hey, you know what?

278
00:17:11,948 --> 00:17:14,531
I broke my thumb so my thumb doesn't work all that well or whatever.

279
00:17:14,531 --> 00:17:17,662
Whatever context it needs to have, right?

280
00:17:17,662 --> 00:17:27,934
And then the ask is, you know, I need you to give me a plan that can help me learn how to
play, I don't know, Where the Streets Have No Name by U2 over the next 12 months because I

281
00:17:27,934 --> 00:17:31,185
want to play it for my wife at her birthday or whatever.

282
00:17:31,257 --> 00:17:33,139
And that's absolutely our song.

283
00:17:33,139 --> 00:17:35,241
So that's role, task, context, ask.

284
00:17:35,241 --> 00:17:36,051
So start with that.

285
00:17:36,051 --> 00:17:37,763
So just think about that, right?
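
Assembled into a single prompt, the role-task-context-ask structure from this guitar example might read like this (pieced together from the episode; the exact wording is illustrative):

    Role: "I want you to be a world-class guitar instructor."
    Task: "Teach me how to play the guitar like The Edge."
    Context: "I played a little in high school, I'm 52 now, I haven't picked up a guitar in 30 years, and I'll probably need help reading sheet music again."
    Ask: "Give me a 12-month plan to learn Where the Streets Have No Name by U2, because I want to play it for my wife at her birthday."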

286
00:17:37,763 --> 00:17:39,094
Another one is chain prompting.

287
00:17:39,094 --> 00:17:42,567
Chain prompting is, maybe you just don't know what the prompt should be.

288
00:17:42,567 --> 00:17:44,207
Just start with a question.

289
00:17:44,648 --> 00:17:46,850
Hey, ChatGPT.

290
00:17:46,957 --> 00:17:48,911
I'm thinking about buying a guitar.

291
00:17:49,092 --> 00:17:50,473
What do you recommend?

292
00:17:50,933 --> 00:17:55,196
And it's gonna say, well, you know, this, this, and this. And then you say,

293
00:17:55,777 --> 00:17:57,178
Can you tell me more about this brand?

294
00:17:57,178 --> 00:17:58,419
Can you tell me more about that brand?

295
00:17:58,419 --> 00:18:00,100
And now, that's chain prompting.

296
00:18:00,100 --> 00:18:05,874
Some say it's actually conversational prompting, where it's just like you and I having a
conversation about the guitar.

297
00:18:05,874 --> 00:18:08,446
It's like me saying, dude, I want to learn to play guitar.

298
00:18:08,446 --> 00:18:10,347
And you know, where do I even start?

299
00:18:10,347 --> 00:18:12,209
You give me an answer and we go back and forth.

300
00:18:12,209 --> 00:18:13,370
And here's the thing, right?

301
00:18:13,370 --> 00:18:19,954
You know, and then you kind of get to the next one, which is really interview-based
prompting, where you can combine things.

302
00:18:19,954 --> 00:18:22,336
So you can say, ChatGPT, I really want to learn how to play guitar.

303
00:18:22,336 --> 00:18:24,137
Here's my background, here's my experience.

304
00:18:24,397 --> 00:18:32,837
please ask me questions that I can then provide you answers for, to help you give me a
better response and take care of my ask a lot better.

305
00:18:32,837 --> 00:18:37,257
So when you start getting ChatGPT to ask you questions, that's
really a game changer.

306
00:18:37,257 --> 00:18:39,477
And ChatGPT has a memory function.

307
00:18:39,477 --> 00:18:42,677
So it'll remember your conversations, remember these things.
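
An interview-based prompt along those lines might look something like this (illustrative wording, not quoted from the episode):

    "I really want to learn how to play guitar. Here's my background and experience: [your details]. Before you give me a plan, ask me whatever questions you need, one at a time, so your response fits me better."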

308
00:18:42,677 --> 00:18:46,137
So those are kind of the three or four fun ways to just start the process.

309
00:18:46,137 --> 00:18:48,297
It's easy, it's not daunting.

310
00:18:48,437 --> 00:18:51,097
It's a nice way to just not even dive in at first.

311
00:18:51,097 --> 00:18:53,937
It's just a nice way to just jump in with both feet and just have some fun.

312
00:18:54,581 --> 00:18:55,597
Simple ways to go.

313
00:18:55,597 --> 00:18:56,167
Talk to it like a human.

314
00:18:56,167 --> 00:19:00,892
Don't try to sit there and tweak parameter values, because that's a dead end for most
folks.

315
00:19:00,892 --> 00:19:02,642
So no, that's great advice.

316
00:19:02,642 --> 00:19:10,762
Yeah, Jason, like I always say, look, and I know Sam Altman is not a
big fan of this, but I always say please and thank you.

317
00:19:11,182 --> 00:19:15,922
And I do it for two reasons, well, one primary reason.

318
00:19:15,922 --> 00:19:21,302
The primary reason is that it helps me mentally feel like I'm talking to a
human.

319
00:19:22,022 --> 00:19:28,602
Now, the secondary reason I do it, and this is more of a joking reason, is that just in case
the Terminator robots do come for us, they're gonna remember I was nice.

320
00:19:29,402 --> 00:19:30,232
So, so there you go.

321
00:19:30,232 --> 00:19:33,783
You'll be sent to the nice meat factory, not the angry one.

322
00:19:33,783 --> 00:19:34,283
Yeah.

323
00:19:34,283 --> 00:19:35,102
No, that's a great answer.

324
00:19:35,102 --> 00:19:35,643
Great response.

325
00:19:35,643 --> 00:19:46,446
So we're talking about giving a lot of personal information to these different models,
and giving all the contextual clues behind it, which somebody nefarious might take,

326
00:19:46,446 --> 00:19:53,878
or maybe not nefarious, maybe my health insurance company might pick this up and
extract this information, because the PII that's available inside of it in this country is

327
00:19:53,878 --> 00:19:57,759
not protected the way it is in other countries. So along those lines:

328
00:19:57,803 --> 00:20:02,558
How do people protect themselves and their personal information in this kind of space?

329
00:20:02,558 --> 00:20:10,825
Or should they even really worry about that, given the near-anonymized layers of
information that sit inside these LLMs?

330
00:20:11,607 --> 00:20:13,918
So I'm going to give you a cop-out answer, right?

331
00:20:13,918 --> 00:20:25,721
The cop-out answer is, I'm a believer that the healthcare companies
in your example already have an idea, because they have access to your Facebook data.

332
00:20:26,061 --> 00:20:27,369
They're buying the Facebook data.

333
00:20:27,369 --> 00:20:32,037
Like, if all of a sudden you join a group about kidney disease, well, you know what I mean?

334
00:20:32,043 --> 00:20:35,720
So they know, they know what drugs you're on.

335
00:20:35,720 --> 00:20:39,041
You know, they know if all of a sudden you have a bunch of

336
00:20:39,041 --> 00:20:43,363
high blood pressure medicines and they know this guy probably has coronary artery disease.

337
00:20:43,363 --> 00:20:48,825
So that's the cop-out side of me, which is saying that I think they
already know.

338
00:20:48,945 --> 00:20:54,787
That's number one. And by the way, there's a hack of somebody's system every day.

339
00:20:54,927 --> 00:20:55,927
who knows?

340
00:20:55,988 --> 00:21:07,092
Number two is, you know, there are security layers built into all of these
different LLMs, whether it's ChatGPT or Gemini or Grok or whatever the case might be,

341
00:21:07,092 --> 00:21:07,493
right?

342
00:21:07,493 --> 00:21:09,063
There is security built in.

343
00:21:09,262 --> 00:21:17,982
I happen to tell people, and I know people that do this: if you're going to upload
your blood work, anonymize the blood work. Meaning, print it out, print out your blood

344
00:21:17,982 --> 00:21:25,022
work, take a black Sharpie, and Sharpie out your name, your date of birth, and whatever else
might be identifiable.

345
00:21:25,922 --> 00:21:31,922
And then you can upload that, and make sure your settings are set to "do not use my data to
train."
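
A digital version of that Sharpie step could be a few lines of Python; this is a minimal sketch, and the patterns and file name are illustrative assumptions rather than anything mentioned in the episode:

    import re

    # Illustrative patterns only; extend them to match your own documents.
    PATTERNS = [
        (re.compile(r"Name:\s*.+"), "Name: [REDACTED]"),
        (re.compile(r"DOB:\s*[\d/.-]+"), "DOB: [REDACTED]"),
        (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[REDACTED-ID]"),       # SSN-shaped numbers
        (re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"), "[REDACTED-EMAIL]"), # email addresses
    ]

    def redact(text: str) -> str:
        # Apply every pattern in turn and return the scrubbed text.
        for pattern, replacement in PATTERNS:
            text = pattern.sub(replacement, text)
        return text

    with open("bloodwork.txt") as f:  # hypothetical exported lab report
        print(redact(f.read()))

Scrub first, then upload the scrubbed copy; the settings toggle and the redaction are two separate layers of protection.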

346
00:21:32,282 --> 00:21:33,822
So you got to go into settings and make sure that's there.

347
00:21:33,822 --> 00:21:34,702
If you don't do that, fine.

348
00:21:34,702 --> 00:21:38,085
Now, if you have the Teams version of ChatGPT,

349
00:21:39,005 --> 00:21:39,869
Teams, yeah.

350
00:21:39,869 --> 00:21:46,029
ChatGPT Teams, which is $25 versus $20, and you have to have at least two employees, it's kind of
on by default.

351
00:21:46,029 --> 00:21:49,589
But if you don't, and you're paying the 20 bucks a month, then you have to go do it
manually.

352
00:21:49,589 --> 00:21:52,349
Now, if you have the free version, you don't have the option to do it at all.

353
00:21:53,029 --> 00:21:58,589
And if you want to opt out of things, you can, but they're still gonna
keep your data for 90 days, because they need your data to train.

354
00:21:58,589 --> 00:21:59,929
And so that's the thing, right?

355
00:21:59,929 --> 00:22:01,329
Data is the new oil.

356
00:22:01,609 --> 00:22:07,869
And they need that data, because ChatGPT is a refinery that sits on top of
the data and gives you an output.

357
00:22:08,297 --> 00:22:12,641
And so I'm of the belief: go ahead and do it, upload it.

358
00:22:12,641 --> 00:22:19,546
Like, for example, this gentleman, when he did it, we made sure the setting was
on: do not share my data.

359
00:22:19,586 --> 00:22:29,054
And, you know, it knows who he is because he had to log in, but when he was having a
conversation with ChatGPT, he didn't use his name.

360
00:22:29,054 --> 00:22:30,705
He actually came up with a different name.

361
00:22:30,705 --> 00:22:35,419
And I told him, have ChatGPT address you by your initials.

362
00:22:35,419 --> 00:22:37,090
So JC or whatever it was.

363
00:22:37,090 --> 00:22:37,681
Right.

364
00:22:37,681 --> 00:22:38,423
So.

365
00:22:38,423 --> 00:22:41,234
So anyways, that's a way to start doing it.

366
00:22:41,234 --> 00:22:42,657
And that's what I would do.

367
00:22:42,657 --> 00:22:47,761
whether it's for your health information or your business information, anonymize your
information.

368
00:22:47,761 --> 00:22:49,783
It's always just a good practice to anonymize it.

369
00:22:49,783 --> 00:22:53,486
Like if you're going to upload your sales funnel, you don't want that out there.

370
00:22:53,486 --> 00:22:54,740
You never know what happens.

371
00:22:54,740 --> 00:23:01,913
If it's an Excel file you're uploading, or Google Sheets or whatever it might be,
and you're doing business, make the names up.

372
00:23:01,913 --> 00:23:05,475
Let's say you're doing business with Gibson guitars.

373
00:23:05,496 --> 00:23:07,231
Well, instead of putting Gibson Guitars

374
00:23:07,231 --> 00:23:09,562
in the company name field, put GG.

375
00:23:10,082 --> 00:23:10,802
Right?

376
00:23:10,802 --> 00:23:12,623
So you just have a little key that does that.

377
00:23:12,623 --> 00:23:13,523
And you're fine.
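
That little key can literally be a dictionary; here is a minimal Python sketch, where the file and column names are illustrative assumptions:

    import csv

    # Illustrative mapping; keep this key file local so you can reverse it later.
    KEY = {"Gibson Guitars": "GG", "Fender": "F1"}

    with open("funnel.csv", newline="") as src, \
         open("funnel_anon.csv", "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            # Swap real client names for short codes before uploading anywhere.
            row["company"] = KEY.get(row["company"], row["company"])
            row["email"] = ""  # you don't need emails to analyze a funnel
            writer.writerow(row)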

378
00:23:13,523 --> 00:23:19,546
And just make sure it doesn't have your, you know, your go-to client's name or their
email address, whatever.

379
00:23:19,546 --> 00:23:22,587
I mean, you don't need that stuff anyway to analyze your funnel.

380
00:23:22,587 --> 00:23:28,309
And if you're in Salesforce, you can download this stuff, and Salesforce has its own AI
tools that can do these things too.

381
00:23:28,373 --> 00:23:29,614
Yeah, no, that's great.

382
00:23:29,614 --> 00:23:34,358
And I think that's a salient point that's often missed by most people out there: you
actually control this data.

383
00:23:34,358 --> 00:23:36,660
You control the input and how you put it into those systems.

384
00:23:36,660 --> 00:23:43,526
What advice do you have for people to make sure that when they put those pieces in,
they aren't revealing all of those details?

385
00:23:43,526 --> 00:23:47,929
And should they do it as part of the prompt process to not store things or not remember
certain things?

386
00:23:47,929 --> 00:23:56,246
Like, for example, if I'm telling a personal story to my AI therapist about something
that's going on, and I've got names of other people that I'm putting in there, whereas when

387
00:23:56,246 --> 00:23:57,951
I'm talking to a regular therapist,

388
00:23:57,951 --> 00:23:59,202
I don't hide names.

389
00:23:59,202 --> 00:24:03,986
I tell all the pieces so I can get all those pieces exposed and out.

390
00:24:04,006 --> 00:24:11,993
Any advice for folks that want to be able to have that level of conversation and that
level of anonymity, but still be able to use those tools in that kind of way?

391
00:24:12,620 --> 00:24:16,400
Yeah, look, I think you can still have that. Instead of saying the name of your wife, say
"my wife."

392
00:24:16,400 --> 00:24:21,360
I mean, I don't think we need to complicate this by any stretch of the
imagination.

393
00:24:21,360 --> 00:24:29,339
I think practicing really good operational security here, in terms of what
you're saying and what you're doing, is really important.

394
00:24:29,460 --> 00:24:34,280
And look, you know, therapists take all their notes and dictate them into a device anyway.

395
00:24:34,280 --> 00:24:40,560
And they take those notes and put them on their computers, on
their laptops, wherever they put them.

396
00:24:40,560 --> 00:24:42,144
And they might load them into the cloud.

397
00:24:42,144 --> 00:24:42,684
Well, guess what?

398
00:24:42,684 --> 00:24:44,796
That's hackable, you know, or whatever it might be.

399
00:24:44,796 --> 00:24:47,996
But look, I'm not a defeatist here.

400
00:24:47,996 --> 00:24:48,789
Or an alarmist, I mean.

401
00:24:48,789 --> 00:24:51,784
I mean, everything is hackable and everything is out there.

402
00:24:51,784 --> 00:24:56,514
All I'm simply saying is, you know, don't be scared off by that side of this.

403
00:24:56,514 --> 00:25:02,848
If you really are having a challenge, mental health, regular health, and you need
some help and support, just start there.

404
00:25:02,848 --> 00:25:11,051
Look, I broke my wrist about three months ago, and I went to urgent care, and I got
the X-ray.

405
00:25:11,051 --> 00:25:19,481
And I said, hey, can I take a picture of the X-ray? So I pulled up my phone, and I took a
picture of

406
00:25:19,481 --> 00:25:22,464
the X-ray, and then uploaded it to ChatGPT.

407
00:25:22,644 --> 00:25:24,184
And I said, what did I do?

408
00:25:24,444 --> 00:25:33,787
And it said, yeah, man, you have a break in your right wrist, here, exactly the spot. And
it said, here are the three types of breaks it could be.

409
00:25:33,787 --> 00:25:39,368
But I really think it's number one, which is an avulsion fracture of your right
wrist, underneath your thumb.

410
00:25:39,368 --> 00:25:40,208
Okay.

411
00:25:40,300 --> 00:25:41,561
So my buddy's an orthopedic doctor.

412
00:25:41,561 --> 00:25:45,204
I went the next morning to go see him, and I said, what did I do?

413
00:25:45,204 --> 00:25:49,047
He goes, dude, you broke your right wrist by your thumb, and you have an avulsion fracture.

414
00:25:49,188 --> 00:25:51,080
I said, look, let me show you something.

415
00:25:51,080 --> 00:25:52,535
And so I showed him what ChatGPT said.

416
00:25:52,535 --> 00:25:53,632
He goes, I'm gonna lose my job.

417
00:25:53,632 --> 00:25:55,734
So look, I mean, it's super smart.

418
00:25:55,734 --> 00:26:00,928
I mean, you've just got to get over the hangup that you're talking to this
super advanced chatbot.

419
00:26:00,928 --> 00:26:07,804
And by the way, GPT-5, from what I'm reading and what I can tell, apparently has a PhD
level of education, you know?

420
00:26:08,170 --> 00:26:09,440
So that's pretty cool.

421
00:26:09,440 --> 00:26:13,423
I mean, you get a PhD in your pocket, by the way, and not just for mental health, for anything
you want to do.

422
00:26:13,564 --> 00:26:14,735
Let's say you want to start a business.

423
00:26:14,735 --> 00:26:17,957
Let's say you want to figure out how to sell something to a new client.

424
00:26:17,957 --> 00:26:21,479
Let's say you want to plan a party, whatever you want to do, man.

425
00:26:21,479 --> 00:26:24,731
I'm telling you, this stuff is going to be wild.

426
00:26:24,731 --> 00:26:25,436
That's here today.

427
00:26:25,436 --> 00:26:29,875
And by the way, if this was a baseball game, this is the first pitch of the game.

428
00:26:29,875 --> 00:26:31,776
That's how early we are in this.

429
00:26:31,793 --> 00:26:32,863
It's actually so good.

430
00:26:32,863 --> 00:26:40,666
I just used it today to go through and actually drag multimodal content from
multiple different chats that I had before and suck them into the GPT-5 engine, and I

431
00:26:40,666 --> 00:26:43,757
basically said replay, and the output was entirely different.

432
00:26:43,757 --> 00:26:46,057
I mean, it's amazing how good it is.

433
00:26:46,057 --> 00:26:53,809
And you're right, that level of almost intellectual fluency, and being able
to augment those pieces as individuals, is really, really high.

434
00:26:53,809 --> 00:26:57,360
But I also think that there's something to this that we forget.

435
00:26:57,360 --> 00:26:59,615
And that's that these are

436
00:26:59,615 --> 00:27:04,300
These are tools that we can use to make our lives better, but it's just like any other
tool out there.

437
00:27:04,300 --> 00:27:07,312
If you don't take the time to use it right, you're going to do bad things.

438
00:27:07,312 --> 00:27:10,786
mean, a Swiss Army knife is fantastic for whittling things or taking things down.

439
00:27:10,786 --> 00:27:12,267
It can also be used to cut your finger off.

440
00:27:12,267 --> 00:27:17,653
So at what point do people actually need to take responsibility for some of these pieces?

441
00:27:17,653 --> 00:27:19,364
And I think we need to learn how to do it now.

442
00:27:19,364 --> 00:27:24,257
But along those lines, ethical guardrails and other pieces can be put into place.

443
00:27:24,257 --> 00:27:33,137
So these companies that are out there creating these agentic AI functions, a lot of
the time they're eliminating human capital resources

444
00:27:33,137 --> 00:27:39,417
because they're trying to save money and put those pieces in place, which I don't
necessarily want to get into, but I do want to get into the conversation

445
00:27:39,417 --> 00:27:46,837
of replacing the ethical values of human beings with the ethical values of prompted
functions inside of agentic AI.

446
00:27:46,857 --> 00:27:54,133
Do you think that companies are taking enough precautions today or do you think we're kind
of, like you said, first pitch?

447
00:27:54,133 --> 00:28:04,908
Wild Wild West, we're going to see what shakes out and what kind of near-term consequences
do you think we're going to see as far as the casualties of the initial portions of this?

448
00:28:04,908 --> 00:28:06,588
Well, Jason, I think that's a great question.

449
00:28:06,588 --> 00:28:11,236
But I think part of the fallacy in this whole thing is we're assuming humans are ethical.

450
00:28:11,701 --> 00:28:12,929
Very good point.

451
00:28:15,794 --> 00:28:25,703
You know, I think we can probably all agree that right now there are some things happening
where ethics are really being questioned.

452
00:28:25,703 --> 00:28:27,303
And I'm going to leave it at that.

453
00:28:27,364 --> 00:28:30,666
And so that's the first point.

454
00:28:31,067 --> 00:28:40,835
In terms of what's happening in organizations and the ethics behind this: let's step back
a little bit and talk about what the overarching concerns about AI are,

455
00:28:41,015 --> 00:28:42,092
primarily, right?

456
00:28:42,092 --> 00:28:44,112
So number one is the Superman effect.

457
00:28:44,212 --> 00:28:51,912
The Superman effect is, imagine Superman's spaceship crash-landed in Osama bin Laden's
backyard instead of Ma and Pa Kent's backyard in Kansas.

458
00:28:51,932 --> 00:28:53,032
Well, what kind of Superman would we have?

459
00:28:53,032 --> 00:28:59,292
Well, we probably wouldn't have one that's pro-freedom and all this stuff.

460
00:28:59,292 --> 00:29:02,932
We'd probably all be living under Osama bin Laden's rule right now.

461
00:29:02,932 --> 00:29:03,612
All right.

462
00:29:03,612 --> 00:29:05,852
Unless Lex Luthor ends up being a good guy.

463
00:29:05,852 --> 00:29:06,632
You know what I mean?

464
00:29:06,632 --> 00:29:07,596
so anyway, so

465
00:29:07,596 --> 00:29:08,396
You never know.

466
00:29:08,396 --> 00:29:10,156
So, and we find kryptonite somewhere.

467
00:29:10,156 --> 00:29:12,576
Anyway, the point of it is, that's the Superman effect.

468
00:29:12,576 --> 00:29:12,656
Right.

469
00:29:12,656 --> 00:29:16,776
So we have to watch out for that: basically, with good AI, there's going to be bad AI.

470
00:29:17,276 --> 00:29:21,316
And like, do we badmouth car manufacturers?

471
00:29:21,316 --> 00:29:30,996
Do we badmouth the guys that invented the combustion engine when a car gets in an
accident because the driver decided to drive the wrong way drunk?

472
00:29:31,836 --> 00:29:32,716
Right.

473
00:29:32,716 --> 00:29:36,575
So with, you know,

474
00:29:36,575 --> 00:29:41,857
with all the opportunities that AI can provide, it's humans who have to make a decision.

475
00:29:41,857 --> 00:29:46,298
Are we going to use AI for a utopian world or a dystopian world?

476
00:29:46,298 --> 00:29:48,408
And I absolutely hope we use it for a utopian world.

477
00:29:48,408 --> 00:29:49,639
that's the move.

478
00:29:50,079 --> 00:29:58,941
Number two is I believe the thing that could derail this whole AI experiment is deepfakes.

479
00:29:59,002 --> 00:30:03,997
To me, deepfakes are hands down probably one of the most dangerous things.

480
00:30:03,997 --> 00:30:07,328
on the planet today, almost as dangerous, if not more dangerous than nuclear weapons.

481
00:30:07,328 --> 00:30:08,279
That's a big statement.

482
00:30:08,279 --> 00:30:08,889
I get it.

483
00:30:08,889 --> 00:30:10,427
I know, a nuclear bomb goes off in LA,

484
00:30:10,427 --> 00:30:12,911
It's going to wipe out all of Southern California and probably the West coast.

485
00:30:12,911 --> 00:30:13,591
I get it.

486
00:30:13,591 --> 00:30:17,673
My point is, the chance of that happening is pretty, pretty slim.

487
00:30:17,673 --> 00:30:23,575
But somebody could create a deepfake of you, me, a friend, family, daughter, son, relative,
whatever.

488
00:30:23,575 --> 00:30:26,437
And all of a sudden, from that person, they can extort money.

489
00:30:26,437 --> 00:30:31,059
They can put their face on things; you know, Taylor Swift's face got put on the bodies of porn stars.

490
00:30:31,059 --> 00:30:32,447
So, you know, and so

491
00:30:32,447 --> 00:30:34,208
There are, like, bullying aspects of it.

492
00:30:34,208 --> 00:30:40,949
And, you know, I would encourage you and your audience to Google "CFO Hong Kong deepfake."

493
00:30:40,949 --> 00:30:48,231
And essentially what happened was, and I think you're going to Google it
right now, the CFO Hong Kong deepfake.

494
00:30:48,470 --> 00:30:51,932
And while you're doing it, I'll give you a second to Google it and I'll tell you what
happened.

495
00:30:51,932 --> 00:30:52,832
So.

496
00:30:53,433 --> 00:31:01,865
So what happened was a finance analyst at a finance firm in Hong Kong got an email

497
00:31:03,148 --> 00:31:08,348
and a Zoom invite from the CFO and, like, the head of finance.

498
00:31:09,148 --> 00:31:11,888
And this is the boss, and the boss's boss.

499
00:31:11,948 --> 00:31:19,148
And it came from their email, but it turned out the email address was spoofed, and he got the
email to join the Zoom call.

500
00:31:19,428 --> 00:31:23,328
let's just for the sake of argument, say that the person's name is Jason.

501
00:31:23,328 --> 00:31:29,308
So Jason joins the Zoom call, and Jason is talking to Rajeev and Jeff.

502
00:31:30,923 --> 00:31:36,043
Rajeev and Jeff tell Jason, hey Jason, we've entered a new relationship with Kelly.

503
00:31:36,463 --> 00:31:38,363
Please wire Kelly $25 million.

504
00:31:39,863 --> 00:31:40,803
Okay.

505
00:31:40,883 --> 00:31:43,463
You're my boss, and my boss's boss, so I'll wire the money.

506
00:31:43,463 --> 00:31:44,983
Well, guess what?

507
00:31:45,283 --> 00:31:48,203
Both Rajeev and Jeff were deepfaked.

508
00:31:48,503 --> 00:31:49,903
It wasn't us.

509
00:31:49,943 --> 00:31:59,003
So the money's being sent out, the money's wired, and the Jason in this scenario
says, you know what?

510
00:31:59,003 --> 00:31:59,423
This is awesome.

511
00:31:59,423 --> 00:32:00,963
Something's going on here, man.

512
00:32:01,067 --> 00:32:12,367
Like, I know you guys told me to do this. And he goes over to the CFO, or whatever,
and he calls the CFO and says, hey, boss, how much more money are we gonna send

513
00:32:12,367 --> 00:32:13,167
Kelly?

514
00:32:13,347 --> 00:32:14,727
Like, what are you talking about?

515
00:32:15,687 --> 00:32:19,067
Well, the last couple of weeks I've been sending the money like you told me to. What
money?

516
00:32:19,147 --> 00:32:21,607
And so they discovered that it was all deepfaked and all this stuff.

517
00:32:21,607 --> 00:32:23,987
so it's stuff like that.

518
00:32:23,987 --> 00:32:24,257
Right.

519
00:32:24,257 --> 00:32:33,871
I mean, I know folks right now who have elderly parents who are getting scammed. All it
needs is like 10 seconds of your voice, and my voice and your voice

520
00:32:33,871 --> 00:32:40,604
are out there, to replicate our voice and call our parents and say, hey, mom, we have
your son.

521
00:32:40,604 --> 00:32:46,921
Rajeev is held hostage, and unless you send half a Bitcoin, or a Bitcoin, or 10, 15, 20, 30
thousand dollars.

522
00:32:46,921 --> 00:32:47,812
We're going to kill him.

523
00:32:47,812 --> 00:32:48,577
Right.

524
00:32:48,677 --> 00:32:53,789
So, you know, older and elderly people, like my dad, even if it's a scam, he

525
00:32:53,867 --> 00:32:55,207
would answer every phone call.

526
00:32:55,207 --> 00:32:56,447
I'm like, Dad, stop it.

527
00:32:56,447 --> 00:32:57,427
You know, don't do that anymore.

528
00:32:57,427 --> 00:32:58,167
Right?

529
00:32:58,167 --> 00:32:58,967
Stop responding to emails.

530
00:32:58,967 --> 00:33:06,547
And so that's where, I have a friend of mine who that happened to in Miami. And you know,
his mom got a call saying, we have your son, da da da.

531
00:33:06,547 --> 00:33:10,227
And so the mom hung up and called the daughter in a panic.

532
00:33:10,227 --> 00:33:11,187
She goes, Mom, what are you talking about?

533
00:33:11,187 --> 00:33:12,827
I just talked to Mark five minutes ago.

534
00:33:12,827 --> 00:33:13,527
It's fine.

535
00:33:13,527 --> 00:33:21,472
You know, the same thing happened to the CEO of Ferrari, where one of his employees got
a deepfaked call saying, hey,

536
00:33:21,472 --> 00:33:29,214
But then what he did was ask the deepfaker a question that only the real Ferrari CEO
would know, and he got it wrong, and that guy hung up and ran away.

537
00:33:29,214 --> 00:33:32,605
But look, so the deepfake thing to me is absolutely crazy.

538
00:33:32,605 --> 00:33:34,935
It's out there and everybody's got to be careful.

539
00:33:34,935 --> 00:33:39,586
And it doesn't help that some of our leaders are using it to their advantage.

540
00:33:39,586 --> 00:33:41,187
It's just really bad and it doesn't help.

541
00:33:41,187 --> 00:33:42,777
It's really bad news.

542
00:33:42,777 --> 00:33:45,632
So people got to be watching out for deepfakes, especially if you have a daughter.

543
00:33:45,632 --> 00:33:48,752
You have a daughter, sister, whatever.

544
00:33:48,752 --> 00:33:50,279
You got to protect them.

545
00:33:50,283 --> 00:33:52,703
somehow, and I don't really have a good answer for how to protect them.

546
00:33:52,703 --> 00:33:57,663
One of the things you can do is you can get a Google Alert for your name.

547
00:33:57,703 --> 00:34:00,603
Unfortunately, Rajiv Kapoor in India is like John Smith.

548
00:34:00,603 --> 00:34:03,383
So for me, I get quite a few alerts.

549
00:34:03,383 --> 00:34:04,503
But you know, it's something, right?

550
00:34:04,503 --> 00:34:10,623
And then, you know, just look at Google, and when you're on Instagram or TikTok, search
for your name, make sure people are not, you know, co-opting your name or

551
00:34:10,623 --> 00:34:13,603
your image. Find ways to protect yourself, and just start.

552
00:34:13,603 --> 00:34:17,103
Coming up with a family safe word is another good one.

553
00:34:17,123 --> 00:34:19,573
So maybe it might be, I love pineapple on pizza.

554
00:34:19,573 --> 00:34:23,084
or whatever, and maybe you don't actually, but that's your safe word or phrase or
whatever it might be.

555
00:34:23,084 --> 00:34:24,774
So those are some things that you can do.
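
A quick aside on the safe word, since it's really just a shared-secret check, the same trick the Ferrari employee pulled with his challenge question. Here is a minimal sketch of that check in Python, purely illustrative: the phrase, the function name, and the normalization are assumptions made for the example, not anything prescribed in the episode.

import hmac

# Hypothetical family safe-phrase check. The phrase is a shared secret,
# agreed on in person and never posted anywhere online.
SAFE_PHRASE = b"i love pineapple on pizza"

def caller_is_verified(spoken_phrase: str) -> bool:
    # Normalize casing and whitespace, then compare in constant time.
    candidate = " ".join(spoken_phrase.lower().split()).encode()
    return hmac.compare_digest(candidate, SAFE_PHRASE)

print(caller_is_verified("I love pineapple on pizza"))  # True
print(caller_is_verified("send the Bitcoin now"))       # False

In real life the check runs in your head rather than in code, but the shape is the same: a secret only the real person would know, verified before any money moves.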

556
00:34:24,774 --> 00:34:32,863
And then the third thing is, you know, where we're headed. There's two camps. One camp
is everyone's going to lose their jobs

557
00:34:32,863 --> 00:34:39,775
and we're going to go to a two-day work week, or keep our jobs but only work two days a
week; or it's going to create millions of new jobs that we haven't thought

558
00:34:39,775 --> 00:34:40,038
of yet.

559
00:34:40,038 --> 00:34:41,529
So those are the two camps.

560
00:34:41,989 --> 00:34:44,109
I kind of bounce back and forth

561
00:34:44,150 --> 00:34:46,450
on sort of where we are, 'cause I just think it's too early to tell.

562
00:34:46,450 --> 00:34:48,651
And anybody who says they know the answer is full of shit.

563
00:34:48,651 --> 00:34:49,791
Nobody knows.

564
00:34:50,251 --> 00:35:00,371
And so, look, I just think that we're going to have to at some point grapple with the fact
that if we start going to a four-day, three-day work week, these kinds of things, then

565
00:35:00,371 --> 00:35:02,591
we're going to have to grapple with universal basic income.

566
00:35:02,591 --> 00:35:04,231
We're going to have to grapple with some of these things.

567
00:35:04,231 --> 00:35:05,711
And so what does that look like?

568
00:35:05,731 --> 00:35:07,031
So that's kind of weird.

569
00:35:07,031 --> 00:35:10,723
And I think those are really the three big things that we're going to have to address in
the future.

570
00:35:11,327 --> 00:35:11,918
That's great.

571
00:35:11,918 --> 00:35:20,774
And it's funny you mention that, because those three things you mentioned, we've talked
about several times on this show, probably ad nauseam, and it's nice to get

572
00:35:20,774 --> 00:35:25,178
validation that what we're thinking about is actually salient.

573
00:35:25,178 --> 00:35:26,729
So this has been really fantastic.

574
00:35:26,729 --> 00:35:27,709
I really appreciate it.

575
00:35:27,709 --> 00:35:28,970
Yeah.

576
00:35:28,970 --> 00:35:32,713
And, Rajiv, if people want to learn more about you or want to connect with you, where
should they go?

577
00:35:33,067 --> 00:35:36,787
Well, look, first of all, both books are available.

578
00:35:36,787 --> 00:35:37,627
The first book, AI

579
00:35:37,627 --> 00:35:43,407
Made Simple, the third edition just came out, and I'm doing a small addendum to it
because it's GPT-5 now.

580
00:35:43,407 --> 00:35:45,467
Prompting Made Simple is out, that one's doing really well.

581
00:35:45,467 --> 00:35:47,927
So again, these books are here to democratize AI

582
00:35:47,927 --> 00:35:50,007
to make it simple for everybody to understand.

583
00:35:50,007 --> 00:35:51,267
These books are targeted at the masses.

584
00:35:51,267 --> 00:35:53,087
They're not targeted at technical people.

585
00:35:53,087 --> 00:36:00,007
This is targeted to your mom, your dad, younger folks, people who are not technically
literate, whatever the case might be.

586
00:36:00,007 --> 00:36:01,207
This is for them.

587
00:36:01,223 --> 00:36:12,788
And so that's who it's targeted towards. You can also go to my website, rajeev.ai, check
it out. LinkedIn, if people want to connect with me on LinkedIn. The name of my company is

588
00:36:12,788 --> 00:36:16,019
1105 Media, 1105media.com, check that out as well.

589
00:36:16,019 --> 00:36:20,161
That's how people can find me, and I'm sure you're gonna have stuff in the show notes.

590
00:36:20,161 --> 00:36:22,621
I was just going to mention, we'll put all that stuff in the show notes for everybody.

591
00:36:22,621 --> 00:36:26,820
Well again, thank you very much for joining, we really appreciate it and we look forward
to talking to you again soon.

592
00:36:26,831 --> 00:36:28,033
Great conversation there, Jason.

593
00:36:28,033 --> 00:36:29,095
Nice job with that interview.

594
00:36:29,095 --> 00:36:31,480
Rajiv Kapoor, thanks so much for being on the show.

595
00:36:31,480 --> 00:36:35,053
Again, all those links that were just mentioned, you can find those in the show notes for
this episode.

596
00:36:35,053 --> 00:36:40,138
And if you've enjoyed this episode, please do share it with somebody who could benefit
from the information that was shared.

597
00:36:40,138 --> 00:36:44,261
You can find links to do that and the links that we just mentioned at brobots.me.

598
00:36:44,261 --> 00:36:47,604
That's also where you're going to find another new episode from us next week.

599
00:36:47,604 --> 00:36:50,387
We'll see you at brobots.me in just a few days.

600
00:36:50,387 --> 00:36:51,335
Thanks for listening.