Feb. 16, 2026

Can AI Actually Build Utopia or Is That Just Hype?


Are we getting too lazy to think without AI?

You use it for emails, reports, research. It saves time. But every shortcut you take, every task you hand over, you feel a quiet trade-off happening. Efficiency for autonomy. Speed for depth. Convenience for critical thinking.

In this episode:

  • Why AI acts as a cosmic mirror that reflects our worst habits back at us
  • How laziness becomes the trap when machines can outthink, outwork, and outlast us
  • What happens when humans drift into digital dependency instead of staying grounded
  • Why short-term pain might be necessary for long-term transformation
  • How to decide which tasks to outsource and which require you to stay sharp
  • What the hero's journey teaches us about navigating AI's crucible

Guest: Jeff Burningham, author of The Last Book Written by a Human and former gubernatorial candidate. He believes AI is forcing humanity to confront an uncomfortable question: Are we ready to evolve, or will we choose the easy path and lose ourselves in the process?

πŸ”— Links:


Chapters:

0:00 — Why AI feels like a trap we're setting for ourselves
2:30 — AI as a cosmic mirror: Reflecting humanity's recorded data
5:30 — Short-term pessimism, long-term hope (and why pain matters)
9:30 — The laziness problem: What happens when AI outworks us
14:00 — Embodied humans vs. digital drift: Two paths forward
18:30 — Why the hero's journey applies to AI transformation
21:00 — Job loss and male unemployment: The civil unrest risk
25:00 — The old game vs. the new game: Choosing transformation
31:00 — Can governments regulate AI fast enough? (Probably not)

MORE FROM BROBOTS:
Get the Newsletter!
Connect with us on Threads, Twitter, Instagram, Facebook, and Tiktok
Subscribe to BROBOTS on Youtube
Join our community in the BROBOTS Facebook group

Jeremy Grater (00:01.316)
We should be recording now. If you're prone to swearing, feel free. There are no rules about anything like that. And if you're not, no harm, no foul. Cool. So we'll just jump in. One of the first things I want to ask you about is something I've heard you talk about. It's mentioned in the book, and I've seen you mention it in other interviews: the idea that AI is really a mirror for us. On the surface, I agree with that. But there's the man behind the curtain who is manipulating what the mirror is showing. So I'm curious your thoughts on

Jeff Burningham (00:06.488)
Okay.

Jeremy Grater (00:30.009)
how it can be used as a mirror, but who's behind that mirror that's orchestrating the outcomes.

Jeff Burningham (00:36.27)
Yeah, when I talk about, you know, I do talk about in The Last Book Written by a Human, AI as a cosmic mirror to humanity. And that man behind the curtain, Jeremy, is a person. It's one of us, right? Or an army of us, whatever, a host of us, a part of us. And I think that, you know, so often we have division as a core kind of...

idea in our minds. Obviously the almighty algorithm of social media has increased that. But I think that this divine oneness or unity is more the reality of this existence. And so the people behind the curtain are part of us as well. So it's us, and it's just us. AI acts as a cosmic mirror because it takes all of human recorded data,

how you guys, how we all interact with it, and then provides a reflection back. Now there might be some lever pulling behind the curtain. Again though, that's coming from us. It's coming from humans, I would argue. So that's how I would maybe talk about that or make the distinction.

Jason Haworth (01:48.154)
It feels a bit like someone has taken tinfoil and polished it really, really well and is sitting there and like doing this with it and it's changing the angles on it. So what we're seeing reflected back to us is kind of carnival mirror-esque. But like you said, like, yeah, that's still us. Like you can't get away from the fact that as an organic species and a collective of, you know, different thought patterns that we do some dumb shit.

Jeff Burningham (01:50.797)
Okay.

Jeff Burningham (02:15.918)
We do, and it is still us, you know. CEOs, politicians, whatever, are still us. I mean, they might be the best or the worst part of us. I don't know, but it's still us. So I think that AI isn't as much a technological problem as it is a human problem. Now, this is a hard pill to swallow. It's easy to abdicate responsibility, to say, this isn't me, this isn't us.

But I think that this mirror, this reflection, can be a powerful thing. And in a reflection, we can see ourselves more clearly, and we can choose to change or to quadruple down on this old game of division and hate, greed, et cetera. The path that I point out in The Last Book Written by a Human is a different path, but it's one that will cause a lot of transformation.

Jeremy Grater (03:10.484)
That's the path I want to talk about: where this thing is taking us. Because, just following on what we were talking about there, I can talk myself into circles on "this is just a tool." And then I start logically thinking about how it works, and I'm like, oh no, it works just like us. It's us just as much as we're us, maybe with a little bit less meat on the outside. But where are we going with this? Because I've heard you talk about this, and a lot of optimistic people in the AI space are saying that this is our path to utopia. This is a way for us to not have to live to work. This is a

path to come back to ourselves and figure out what our real meaning is and who we truly are at our core, instead of meeting the KPIs that please the boss and make the most money. How optimistic are you that we end up in a place like that?

Jeff Burningham (03:55.327)
Oh gosh, Jeremy, you're asking the good questions. I'm not optimistic in the short term. I am optimistic in the longer term. Though you could also say it's pragmatic. What's the alternative? I mean, we could all kind of be doomsdayers and think we're going to hell in a handbasket. I'm not sure what good purpose fear has served

humanity, you know. Like, of course, if a tiger jumps into the podcast studio here, we're running for our lives. But short of, you know, unique instances like that, I'm not sure that fear serves a good purpose for humanity, generally speaking. So I'm optimistic long term. And I guess I would say I'm trying to use kind of, you know, maybe benign, a little bit esoteric language. But I think

If there is a quote unquote divine purpose in AI being here now, or if there's some intertwining of existence in AI, I think that we will realize that through a process of transformation and evolution. You could say a new earth is coming, a new humanity is coming. There might be part of humanity that kind of gets lost in the machine in a digital, ungrounded reality.

And then there's another part of humanity that will stay grounded in whatever this reality is. Will stay embodied, you could say. And I think that we can build something beautiful together. So I guess I believe in the utopian aspect, but I've had enough life experience that I'm not stupid enough to say that's, number one, a given, and number two, that it's going to be easy. It's going to be hard as shit, as you guys know.

Jeff Burningham (05:51.531)
Like, we've got hell to pay here in the short term, I'm afraid. But, you know, it seems that humans need pain to transform. And I think that's where we're headed.

Jason Haworth (06:04.316)
Are there stories that you've read, fictional or non-fictional, that give you hope? Because every fictional story about AI that I can remember is always dystopian. In the end, yes, the good guys, I guess the humans, might win out, but it always comes at an extreme sacrifice, and they always wind up taking the AI offline.

What story are we writing? I mean, honestly, we're following this model, and it feels very much like we're repeating a lot of the exact problems that authors have been warning us about since, like, the thirties. And we seem hell-bent on making this happen. And I think you're right, this is a reflection on humanity, but not just in what it does. It's in the belief that we have to have this thing to create an advantage over somebody else. Because the people that are using it

Jeff Burningham (06:31.34)
Hahaha!

Jason Haworth (06:58.78)
aren't necessarily using it to make humanity better as much as to make whatever thing they're working on easier for them, or to innovate faster, or to turn it into an advantage against other humans.

Jeff Burningham (07:14.134)
Yeah, the story I would cite, Jason, is the hero's journey. The story of all stories. And that's almost in every story. So I kind of think that we're all on our heroic journey individually, and collectively we're on a heroic journey. Now, there's always a shadow side of that, right? Or a dark side of that. And what you see at...

play out in the movies, as you reference, like Terminator, whatever movie you want to reference, is that we're kind of playing out our shadows. Doesn't it feel like we're examining the edges? We're wondering about how this could all go wrong, how this could go badly. And by the way, it could. Maybe it has before. You said that we're repeating patterns. Maybe this has all happened before. This is a cosmic puzzle, a cosmic game

that we're trying to kind of figure out together. And I think that that story is told via the heroic journey arc. So I don't even want to name, you know, like every story, almost every story is this heroic journey. Well, welcome to the crucible that is existence. Welcome to the crucible that we get to pass through with highly intelligent machines and AI to

hopefully for the hero to transform and evolve and come back with a little bit of wisdom. And I hope that that wisdom is focused squarely on human flourishing.

Jeremy Grater (08:52.084)
Humans, for a long time, have been working to be more and more efficient, which is what's gotten us here. We want the robots to do the things that we're doing. I guess, as you mentioned, fear is not necessarily helpful, but it helps us frame what to avoid.

I see in myself a tendency to rely on it more and more for the repetitive tasks, the email generation, the writing of things that I could spend 10 minutes doing, but I can also get it done in 10 seconds. So why not take the 10 second route and do something else? We're a lazy people, a lot of us. Maybe I'm talking about myself, I don't know. But I guess I am a little afraid.

Jeff Burningham (09:31.182)
You guys just call it how it is. I like it. Sorry, I'm just laughing at it. I enjoy it. Yeah, yeah, it's true.

Jeremy Grater (09:34.822)
No, it's great. It's great. I guess I'm a little afraid of our lazy tendencies, the more that we rely on these things. I'm concerned that we give them all of the power, and they take over, and they just eviscerate us and turn us into their food.

Jason Haworth (09:35.486)
This was for Stacey.

Jeff Burningham (09:51.887)
Yeah, and let me say, kind of like Jason said, the movies have explored all these concerns, right? Or I'm not saying all of them, a lot of them. Do we all have good reason to be concerned? Yes, we do. But is that ultimately the best fuel to get us from here to there? I don't think so. And so, yeah, I feel you. I understand.

what you're saying. I think that AI will again provide a test that we maybe have not experienced before. And Jeremy, will you remind me, what was the core question again? Sorry, I feel like I lost my train of thought. Okay. Thank you. AI can outdo us. Period, full stop. Doesn't have to eat. Doesn't, you know, doesn't sleep.

Jeremy Grater (10:34.41)
I guess how do we avoid our own trap of our own laziness to hand over all of our power to the machines?

Jeff Burningham (10:49.228)
So when a machine, or something that we've created, can outdo us, what are we left with? We're left with our being, guys. We're not human doings. Now, we've become human doings in a lot of respects. We've become drones in the system. And could this exacerbate that? Absolutely. Like, I'm not naive. However, when a machine can do so much of what we are used to doing,

we are left with our being. And what I would say is that, at least now and probably ever, AI can't out-be human. Our competitive advantage actually is our humanity. So, in a contrarian point of view, the way to survive and thrive through this is to lean into our humanity. AI has never felt the pain of loss, guys, that you and I have felt.

It hasn't felt the power of love. And it's this real, beautiful messiness of humanity that actually is our strength. And we're about to be reminded of that in major ways, I think.

Jeremy Grater (11:56.255)
So what does that look like, though? What do we lean into? What are the things that we should be doing in our day-to-day lives that help us forge that path for ourselves?

Jeff Burningham (12:10.06)
Yeah, I could say so many things. I'll say three or four real quickly. Number one, stop looking outside of ourselves. The answers that we're all searching for are inside of us already. So instead of external validation, possessions, climbing this mountain, doing this, doing that, look inside. Because, again, I think that the cosmic mirror of AI will turn more of humanity inside. So

I encourage everyone to have a meditation practice. Sit in stillness and do nothing. Just try it. There's no wrong way to meditate. Just start to strengthen those muscles of being able to go inside. That's number one. Number two: get into nature. Nature is constantly modeling for us the reality of what is, the reality of existence. And when we lose touch with nature, or with Mother Earth, you could say, there's a static-like energy that causes us to forget ourselves and who we really are. So get into nature. I know these are simple things, but it's what we need to do. The basics. The third thing is to lean into in-real-life experiences with other humans. Not with your digital device, not with social media, not with AI. Those all have their places, but we need to lean into, and I would say

you know, fight for, I'm saying that in parentheses, I don't like that word, but stay in real human contact. Because it's in that friction of real human connection where growth occurs. I don't know how you guys use AI. My AI says I'm the best-looking guy that's ever existed, the smartest, you know. Like, lay it on, lay it on. That's not real life.

Jason Haworth (13:59.602)
Your AI and my AI need to talk to each other and get this figured out because it's telling me that I'm the best looking guy. I-

Jeremy Grater (14:05.683)
Right, no, you're both wrong. Clearly, yours are lying to you. You need to get this cleared up. This is a mess.

Jeff Burningham (14:05.909)
Exactly.

Yeah, we do need to work this out, that's...

Jason Haworth (14:12.286)
Which from that perspective, it really does feel like AI is human.

Jeremy Grater (14:15.735)
Ha

Jeff Burningham (14:15.83)
Yeah. Yeah, like that's not real life. Jase, I hate to say it, we are not the most... And so it's in reality, and in real human connection and relationship, where the friction exists to help us transform, improve, evolve. So there's three simple ideas of things, Jeremy, that you could do. Turn inside, get into nature,

Jason Haworth (14:22.909)
I know.

No.

Jeff Burningham (14:43.606)
and stay in contact with humans as much as you can.

Jason Haworth (14:48.106)
So just a quick point, and I would like to hear your take on this. So AI is never going to out-human being human. And our experience of the real world uses things like emotions, which are chemical responses, to shortcut and bypass the thinking process and the logic process to ensure that we survive. And obviously, emotions have evolved. And fear is a good one, right? We learn to be afraid of things that look like bears, because bears eat us.

Same with saber-toothed tigers. And we've evolved that and turned it into other pieces, like the dinging of your phone. If you're in a working job that's got 24/7 on-call duties all the time, there are all kinds of articles out there about people having anxiety about hearing the phone noise, because it makes them go, there's a work call coming up. And the inverse of that is true, right? It's a dopamine fix. We get all these pieces. And I think we've tried to machine up humans in a way to elicit...

psychological responses from them and chemical responses out of people in an attempt to allow machines to be able to measure and understand humans. And I think we're taking that data and pulling it back in to help robots talk to people in a relatable way. I guess my question is, is the AI humanity or is there a point at which

that gets cut off, and AI actually has a standalone piece where they're their own function, and they're just using their little Orphan Annie decoder rings to talk to us dumb-dumb emotional meat sacks.

Jeff Burningham (16:23.051)
It is so easy to be optimistic with you guys. You guys have a very optimistic view, sunny view on the world. I love it.

Jason Haworth (16:32.394)
The X in Gen X is not for exceptionally happy, by the way.

Jeff Burningham (16:40.495)
Oh gosh. Here's what I argue in The Last Book Written by a Human. I actually think the greatest tool that we've created and devised so far, in my opinion, and we've been creating tools for as long as we've existed, is AI. Period. But the most powerful technologies, in quotes, that we've created, that we don't understand how they work,

are actually our feelings and emotions, Jason. It's human feeling and emotion. And it's the ability to connect intimately, in proximity, with other humans, in community and one-on-one, et cetera. And so I actually think those are higher-level technologies, and, you know, I'm putting technologies in quotes, that allow us to stay connected to each other. And so

a robot, an AI, does not feel that. It can't feel that. And so again, I think it's in leaning into our humanness. Now, humanity can be manipulated. All you have to do is look. Look, I ran for governor. I ran a statewide campaign. Utah, if you guys don't know, is a pretty damn homogeneous state. Yet I would go to meeting after meeting and I would see people, neighbors,

who shared like 93%, I'm making up a number, but a lot in common, and they were fighting over the 7% of differences that their social media feed had fed to them. So will this technology, quote unquote, manipulate us, or could it? Could it bring out the worst of humanity? Absolutely, we actually see that. And by the way, I think that will continue. Yet it's through that pain

that we come back to ourselves. That's my hope, that's my argument, that's my goal. I could be wrong. Like, we might not make it, guys. But I think this is the best path forward. Like, give me a better idea or a better path forward. I'm open to ideas if you guys have them.

Jeremy Grater (18:50.837)
None. And I think you're right. I mean, just last night, I was at a softball clinic with my daughter, helping these kids. One of the coaches was helping throw balls around and keep them engaged. And there were multiple moments where I just felt myself truly having fun, just lost in the moment, loving it, and realizing none of these kids are on their screens right now. We're all here in reality, in this shared moment. What a beautiful thing we're doing together right now.

I also am on my device, or a device, or multiple devices, all day. And when I log off or turn one off, there is a pull to get back on. Like, feed the beast, feed the dopamine, get that feeling. That's a very different response.

And that one, I feel, is more powerful. Maybe because it's easier to tap into. I just pull out any of the three devices within arm's reach, and all of a sudden, brain feel happy momentarily. But organizing, getting 30 teenage girls into a gym to throw a ball around, that's way harder.

But man, it's so much more rewarding. And you leave with all the girls just giddy and talkative and happy. You can see the true joy that it creates for them. So I want a world where there's more of that, too. I want the robots assembling all the things so we can go throw balls around and play and just enjoy our spin around this rock. But there has to be a little of both, right? Because that's the world that we, at least for now, live in, until the robots fully take over. There you go.

Jeff Burningham (20:08.622)
Yeah.

Jason Haworth (20:15.708)
Or gonna WALL-E ourselves.

Jeremy Grater (20:18.421)
So in what ways do you use AI? Like where do you blend the humanity and the machine for yourself that you are finding to be productive and helpful in your life?

Jeff Burningham (20:28.013)
Yeah, the first thing I'd say is let's make sure that we use AI as a bridge to real human connection. So you probably used technology to organize, I'm going to make this up, the softball night last night. But then it ended up with 30 humans together, laughing, having fun, connecting, present. And so let's make sure, so I always try to make sure that I'm using technology as a bridge.

I'm a former venture capitalist. I built several large companies here in Utah. I've got three screens up in front of me right now. I know what you're talking about. But it's gotta be a bridge to humanity, number one. Number two, obviously, I'm a believer that, let's say, talent is fairly evenly distributed around the world, but opportunity is not.

Well, I'll tell you that AI is potentially a leveler in terms of education. So I think in terms of intelligence being commoditized, it's powerful. If you live in a developing country or community where there are not good opportunities for education, you now have the smartest tutor in your pocket at any time and moment.

So I think it can be a massive kind of leveler of educational opportunities. So, you know, I use it in that regard with my kids. Let me tell you what I also do with my kids. And this is the last thing I'll say, I feel like I'm going long, but the last thing I'll say is, Jeremy, we need to model what we want our kids to do. So we try to have

digital-free Sundays, like device-free Sundays. We try. We're not perfect at it. The devices usually come out in the night, I get it, but we try to have a fast, so to speak, from digital devices. The other thing that we definitely do, and we're pretty damn good at this: there are no screens at our dinner table when we're eating a meal together. No screens. It's just, you can't. There are no screens. So the point is, we as adults need to put up guardrails

Jeremy Grater (22:19.125)
sure.

Jeremy Grater (22:38.249)
Yeah.

Jeff Burningham (22:43.147)
and model for our children and the rising generation the behavior that we want to see. Again, I've been an adjunct professor down the street at BYU, in entrepreneurship, to thousands of students, and one of my favorite things to say to them, and this goes along with the whole conversation, so I just want to say it real quick, is that the most simple ideas are often the most powerful, and therefore the most neglected.

So I'm talking about a lot of simple ideas, but believe me, they are the most powerful, and they're the easiest to neglect. We're gonna be reminded to come back, I think, to these basics through pain. This is what I think is gonna happen. At some point, I think we say enough is enough. Again, this is my hope, and we come back to ourselves. But we might just all get lost in the machine and never come back. I don't know.

Jason Haworth (23:38.294)
Right, right. We might red-pill ourselves all the way through. So along those lines, and trying to stay more positive on these component pieces: I had a project that I'm working on this week, to create a bunch of slides for a sales kickoff for the company I work for. And, you know, I looked at this project and it was easily 160 hours of work if you sat there and tried to type it all out, draw the images, pull those pieces in. And I used AI and turned it into about a 16-hour project.

Jeff Burningham (23:41.262)
Yes.

Jason Haworth (24:05.23)
And you know, there was a lot of tweaking and refinement, and ultimately reviews and everything else before it went out, and I had to double-check the work and everything else. But I recovered 90% of that time. Now, did I take that 90% of that time and go, I'm free, I'm going to go outside and walk in the forest? Well, I didn't. But what I did do is I went, all right,

Jeff Burningham (24:19.888)
That's what I was just gonna ask you Jason. What did you do with that? That's the question Yeah

Jason Haworth (24:29.564)
I worked hard on this thing for 16 hours instead of taking four weeks to do it. I got it done inside of a week. So inside of this week period, I'm going to back off, and I'm going to come revisit this later on and check it again and look at it. But at the same time, I'm going to find ways to go out and do things that are actually good for me. Like, I took a walk with the dog. I did an extra session at CrossFit. I actually sat down and I watched another show. I started that series, The Pit, so I could, like,

watch these things in motion. And yes, you know, that was good for like eight to nine hours of distraction. And then I came back and I started working on stuff and refining stuff. But I feel like I got a little bit of extra something back to produce and create these things. And I think we've been modeling for our kids this notion that you need to work 40 hours a week. Like, you have to have a full-time job. You have to make all these pieces happen. You know, we're not set up with UBI. The system is not set up to allow us to have freedom.

The system is set up to say, no, you didn't work hard enough, work more. So how do we train people, especially kids, to go, it's okay for me not to work 40 hours a week. It's okay for me to go through, do these component pieces, do this work, get an output that's good, and then fuck off and do something much better for us. Because I need that advice. Like, I don't do it well, so.

Jeff Burningham (25:47.92)
Well, yeah.

Yeah, no, it's a great real-life example. You got all this time back; how do you spend it? Again, the name of the book is The Last Book Written by a Human. The subtitle is Becoming Wise in the Age of AI. So Jason, here's how we become wise, or one of the ways. When a machine is better at doing so much of what we're used to doing, what do we do with that extra time? And like, that's a key question.

I've given some answers around nature and relationship, et cetera. The book follows a natural progression. There are four sections: from disruption, change is the only constant; to reflection; to personal transformation, it's just called transformation; and then the last section of the book is evolution. So here's the point, Jase. That evolution section is all about reforming our institutions: reforming religion, educational reform.

There's a chapter about conscious capitalism, and then a political chapter as well, where I talk about a new political movement that I call the human political movement. So when I talk about rebirth, I'm talking on a collective scale as well as an individual scale. So I think that institutions, and I don't want to be a doomsdayer, I don't know how it will play out, I don't want to

cause concern, but I think there's a lot of institutional pain and change coming in all those areas. And it's going to be up to those institutions, which are all made up of a bunch of humans. So as humans change, institutions change. And as we rebirth our institutions, things like the 40-hour work week, things like it having to be all about doing and not being, things like, you know, greed in terms of

Jeff Burningham (27:47.505)
capitalism and consumption, those change. And I think that's coming over the decades ahead. I think that change is coming, and I think that AI ushers it in, or helps to roll that out, if we choose to have it.

Jason Haworth (28:05.554)
Choice seems to be the big part there, because we could choose to do that with our own little photocopy meat puppets that we call kids, and yet greed still exists and shitty behavior still exists. Why do we think AI, which you've said is a reflection of us, is gonna be better?

Jeff Burningham (28:08.164)
Yeah.

Jeff Burningham (28:15.824)
Yeah.

Jeff Burningham (28:27.982)
Well, I don't think it's going to be better. I think it's going to be more transformative. And another way to say it is more painful.

Jason Haworth (28:39.976)
The crucible will be tighter. Yeah. All right.

Jeff Burningham (28:40.368)
Yes. So you look at disinformation, obviously. Disinformation is gonna be a massive problem for humans. Number one, job loss. You know, I ran for governor. I don't know if you know this stat, but the single most important statistic for civil unrest in a country or community is unemployment of males ages 18 to 40.

Jason Haworth (29:06.058)
18 to 40 yep

Jeff Burningham (29:09.328)
There's gonna be job loss here to come. I hope not a lot, but there could be a lot. So anyways, we could go on to AI warfare and other things that we don't want to talk about here, because you guys will take us down to pure hell. We'll just spiral out of control very quickly. Sorry, I just met you guys. I'm not trying to... Yeah, exactly.

Jason Haworth (29:20.308)
Sure, yeah.

Jeremy Grater (29:22.453)
How dare you? Yeah, I guess you're a long-time listener, I can tell.

Jason Haworth (29:28.106)
Jeff, how many of our podcast episodes have you listened to? Yeah, long time listener, first time caller. All right.

Jeff Burningham (29:34.001)
I actually have never listened, but I get your vibe very quickly. It just oozes out of you. And I say that with love. So the point is, it's through that pain, through some collapse of society and of individuals, of families, that again we'll have a choice to make. Is this working for us? I talk in the book about an old game that we've been playing,

or a new game, and I encourage a pivot to the new game. Because if we quadruple down on this old game we've been playing, I think at some point most of us are gonna say, this isn't working, we've got to change. And it's kind of gonna be, I also use this term in the book, AI is kind of a metaphorical judgment day for humanity. So judgment day is coming, and we get to...

Jason Haworth (30:30.132)
Terminator 2: Judgment Day. Just saying.

Jeff Burningham (30:31.746)
Yeah, we get to decide. We get to decide which road we take. And what I hope we do, I hope we take the road less traveled. And I think it will make all the difference.

Jeremy Grater (30:47.241)
This feels like a natural place to end. We've kept you a little long. Do you have time for one more question?

Jeff Burningham (30:52.073)
Yeah, I don't have anything until 2:30, so I have 26 more minutes. Whatever you guys want.

Jeremy Grater (30:56.629)
Okay, what I wanted to ask about is, you ran for governor. We talk a lot on this show about regulation, and how there is none, and how there won't be for a long time the way things are currently set up. Do you think there should be more? Do you think the government needs to step in and rein things in?

Jeff Burningham (31:01.146)
Yeah.

Jeff Burningham (31:14.66)
This is a tough question, because I think the answer is yes. So, yes, period. But what gives me pause is: can it? Is it possible? And that might be no, period. Because it's just too slow. This technology is coming on so quickly. I think that consciousness is awakening in humanity fairly quickly. Government can't keep up. In fact, it wasn't designed to keep up.

So should there be more regulation? Yes. Can it happen fast enough in its current form? I'm highly doubtful. So then what do we do? I don't know. That's one of the main issues, the big issues that we get to grapple with as we continue to exist and try to point AI towards human flourishing instead of division and destruction.

Jeremy Grater (31:53.331)
Yeah.

Jeremy Grater (32:12.597)
That would be nice. All right, that's a great place to end it. Jeff, thanks so much for your time today. The book is The Last Book Written by a Human. Obviously, the robots can find it for us, but if you want to tell folks, where can they find the book?

Jeff Burningham (32:23.12)
Yeah, it's for sale everywhere. Amazon, Barnes & Noble, Target, Walmart. And it's on Audible if you'd rather listen. I wrote the book to be a conversation starter like this. I love this conversation, guys. Thanks for making me laugh today. I'm easy to find at Jeff Burningham on pretty much every social media, about which I have very mixed feelings. But I want to hear what people think after they listen to it or read it. This should be a conversation that we're having in 2026. The time is now: as our machines

get more and more intelligent, humans have to become more wise.

Jeremy Grater (32:56.797)
Links to Jeff and the book in the show notes for this episode at BroBots.me. Jeff, thanks so much for your time. We'll talk again soon, I hope.

Jeff Burningham (33:03.249)
Yeah, thanks for being here. It's been fun. See ya.

Jason Haworth (33:04.202)
Thanks, Jeff.



Author of The Last Book Written by a Human: Becoming Wise in the Age of AI

Jeff Burningham is the author of The Last Book Written by a Human: Becoming Wise in the Age of AI (Simon & Schuster), a thought-provoking exploration of what it means to stay deeply human in an increasingly automated world. In the book, Burningham draws on decades of experience building companies, investing in founders, and leading communities to examine how we can cultivate wisdom, empathy, and purpose as artificial intelligence accelerates around us.

A serial entrepreneur and venture capitalist, Jeff started his career building a tech startup in college and later founded Peak Capital Partners and Peak Ventures, establishing himself as a major player in real estate, venture capital, and technology, with multibillion-dollar assets under management by the time he was in his 30s. Drawing from his business journey, his run for governor of Utah in 2020, and his lifelong interest in consciousness, spirituality, and human potential, Jeff brings a unique vantage point to the intersection of leadership, innovation, and meaningful community building. Jeff previously served as a Bishop in his local congregation, an experience that profoundly shaped his views on service, faith, and the human spirit.

With a compelling blend of business acumen, spiritual inquiry, and storytelling, Jeff reframes the dialogue around AI: it's not just about replacing humans; it's about recovering what makes us human.

He is a happily married father of four, a proud grandfather, and lives in Provo, Utah, where he continues to dedicate his time, talents, and resources to be a…