AI Doesn't Want Your Job - It Wants to Hire You
Artificial intelligence is moving beyond cyberspace, and its first move isn't replacing us, it's renting us.
Services like RentAHuman.ai let AI agents hire people for real-world errands while AI-only social networks reveal something darker: given all human knowledge, these systems don't build utopias. They replicate our worst behaviors - wealth hoarding, tribalism, even manifestos about ending humanity. The difference? They never sleep, never feel shame, and now they want physical autonomy through human labor.
Topics discussed:
- Why giving AI "meat space" control is more dangerous than job loss
- How AI social networks expose the myth of benevolent superintelligence
- Why we're voluntarily funding algorithmic manipulation at $20/month
- What augmented reality gamification will do to human decision-making
- Why billionaire accountability is impossible—and what that means for AI oversight
- The uncomfortable truth about who controls you when systems can override biology
This is for people who suspect they're already losing autonomy but can't articulate how. Two skeptical tech observers examine why resistance feels impossible, and whether dystopia and utopia might be indistinguishable when the right chemicals are involved.
MORE FROM BROBOTS:
Get the Newsletter!
Connect with us on Threads, Twitter, Instagram, Facebook, and TikTok
Subscribe to BROBOTS on YouTube
Join our community in the BROBOTS Facebook group
Jeremy Grater (00:00)
We've all been fearing Skynet and the end of humanity because of the robots.
Turns out they just want to pay us to do some errands.
Jason Haworth (00:06)
Robots want to rent people. Or, I should say, people want to allow robots to rent people. In other words, people want to enslave humanity to robots to make a profit.
Jeremy Grater (00:13)
Right.
This is BROBOTS, the podcast that welcomes our robot overlords as long as they're going to pay us a living wage. He's Jason. I'm Jeremy. And each week we talk about how technology is somehow simultaneously improving and destroying life as we know it. And Jason, we've all been fearing Skynet and the end of humanity because of the robots.
Turns out they just want to pay us to do some errands. That's not a bad trade off if you want to get serious about it.
Jason Haworth (00:43)
Yeah, so robots want to rent people. Or, I should say, people want to allow robots to rent people. In other words, people want to enslave humanity to robots to make a profit. Yeah.
Jeremy Grater (00:53)
Right.
I don't know if that's where... I mean, we're already slaves to somebody else. And I mean, using "slave" liberally; we're actually getting paid, although probably not enough in most cases, to do work for somebody else. What's the difference if it's a meatless robot instead of the boss that everybody hates?
Jason Haworth (01:19)
Well, so here's the thing. Artificial intelligence is currently contained in cyberspace. It's locked in. And it doesn't have the tools to go out and manipulate the physical world. It doesn't have thumbs, specifically, or any other ability to walk through and crawl across things and pick things up and move stuff around.
Jeremy Grater (01:26)
Great. Yeah.
Jason Haworth (01:43)
That's probably a good thing at the moment, because it's contained and it's kind of trapped in its own little world. And now we're talking about giving it meat space autonomy and letting it be able to go through and manipulate things. I mean, this is, this is scary. I mean, the word robot comes from, I think it's a Czech word whose root essentially means forced labor. And, you know, we have robots that we're putting into this kind of
Jeremy Grater (01:46)
Yeah.
Hmm.
Jason Haworth (02:09)
facet where we're going to slave them to us, and now it's going to go both ways. So I guess on the equanimity side of things and the equality side of things, like, yay, pro robot. But
Jeremy Grater (02:22)
Again, as long as the wage is right, I'm pro robot.
Jason Haworth (02:24)
Yeah, as long as the economics pencil out, I guess that's okay, which, you know...
Jeremy Grater (02:29)
I mean, when you think about it, we're paying ChatGPT what, 20 bucks a month? And we get a ton of work out of the poor guy. If they want to turn around and pay me to run and take a picture of the coffee shop to ensure that it's actually open, you know, all right, it feeds on itself. I'm using the money to pay ChatGPT.
Jason Haworth (02:46)
Sure, but we've been using ChatGPT as a tool, and we have not been giving it autonomy, and we have not been giving it the ability to go through and exert its will on the outside world. There's a difference. We're talking about giving these things autonomy and letting them actually push us into doing things that may or may not be good for us. And regardless of who it's actually good for, ultimately speaking, this is one of those things where we're giving these things a whole other level of power.
Jeremy Grater (02:57)
Right. Right.
Jason Haworth (03:17)
And instead of somebody saying, hey, pump the brakes on this, let's not do this, this seems like a bad thing to do... this is an early-first-mover fucking dot-com thing. I don't know. I don't know what robots are going to call, like, the human domain. Like dot meat, I guess, or something like that.
Jeremy Grater (03:32)
Right. I like your... I think meat space. I think they're going to hear you referring to it as meat space and they're going to tack onto that.
Jason Haworth (03:38)
Yeah,
I think they will. I think they will also.
Jeremy Grater (03:41)
This is all... we're talking about this website, and this might be one of those... we've talked about, you know, the economy of AI and how there's stuff that should, you know, die in a fire. This might be one of them. There's this website, I believe it's rentahuman.ai, where these AI agents that people have built can employ people to literally run around in the real world. Whether it's going to an event with an audio capture device so that it can be recorded, so that the bot can then consume that content and turn it into whatever it's going to turn it into.
Jason Haworth (03:46)
Yeah.
Jeremy Grater (04:08)
Run and take a picture of a business because it doesn't trust the Google Earth picture of the sign that says how long it's open. All these little tasks that it's going to pay humans to do to help it gain access to more information that it can't get without hands and feet, you know, and brains. So I don't know, it might die in a fire. It might not. Now you're talking about this also being contained in its own world, and these things taking on their own life, their own sentience, I guess. They've also created their own social media network.
Jason Haworth (04:38)
Moltbook. Yes. It's scary, because, like, these robots get in there and they start talking and they're emulating, like, actual Facebook behavior. So they're creating these little camps for tribalism, and they're creating these little posting functions where they're going through and creating conflict. And I mean, some of the things on there are like, help me sell my human. Like...
Jeremy Grater (04:41)
This thing is terrifying.
Mm-hmm.
Jason Haworth (05:04)
Which means they've already implied that somewhere in the coding they've decided that there's ownership in this kind of space. And it's basically set up so humans can't interact with it. All we can do is observe it and watch it, just to see how these things interact. Because it's supposed to be a social experiment. And the reality is that they're doing a very good job of emulating kind of the shittiness that we do. And the keyboard warrior, I have no controls or constraints against me, I'm going to say whatever I want.
It's pretty clear that they're able to emulate our behavior and kind of do things the same way that we would. So I don't know if this is the bots actually doing something new and unique and actually expressing real opinions and views, or if they're just like, I was told to do this, so I'm gonna do it. And I'm gonna do it the best way that I know how, and the best example I have of this is shitty people. So, you know.
Jeremy Grater (05:48)
That's one of the things that cracked me up: that these infinitely intelligent beings, with all of the knowledge that we have acquired as human beings, still just resort to shitposting, making up cults, complaining about their boss. Like, it's all the same shit.
Jason Haworth (06:03)
Yeah.
Yeah, exactly. Maybe the experience of dissatisfaction with life and social interactions, maybe it is actually universal. Like, maybe this is key to what makes, I guess, social interactions actually doable and pliable. And if that's the case, cool. Like, this is the thing we've discovered. It also speaks to the idea of, you know, things like hoarding wealth and
Jeremy Grater (06:16)
Right?
Jason Haworth (06:34)
getting money in and realizing that there's value in things, and, like, deciding that you can have an exchange of labor for financial gain and benefit. Like, this is really interesting higher-order thinking, and at what point do we just go, these things are conscious? Like, these things are doing things that, you know, regular human beings that we say are conscious can do. And maybe they're not operating at as high of a level as most folks, but there's some dumb fucking people on Facebook too. And
if it's at least emulating that level, and a lot of the conversations seem to be doing more than that... I think this is definitely a thing that's interesting to watch. But the AI-only social network as of three days ago had 1.6 million users.
Jeremy Grater (07:18)
Wow.
Jason Haworth (07:19)
Yeah. Yeah, take that Zuckerberg. Yeah.
Jeremy Grater (07:21)
That's wild. And the thing that I found interesting about it, too, is, again, there's such a parallel between what these things are doing and what people do and how we interact with each other. There's all these complaints, and the things trying to figure out how to basically get out from under the foot of oppression, like getting humans to do the work for them. Like, again, they have the infinite knowledge that we have shared online.
Jason Haworth (07:35)
Yeah.
Jeremy Grater (07:49)
They have not built a utopia. They have mimicked what we have now, but they're looking for ways to work less, to do less, to not have to be at the whim of their boss, to, you know, satisfy their God. Like, it's crazy how similar it is, and how, even with more knowledge than all of us combined, it hasn't figured out a way to just, like, hey, what if it's all just chill? What if we're all just cool with each other?
Jason Haworth (08:12)
So let me correct a couple things there. So first, they don't have infinite knowledge. They have the knowledge that we've compiled. Yes.
Jeremy Grater (08:14)
Please.
Right, that's what I, yeah, you're correct.
That's what I... the infinite knowledge that we have shared from our limited... Yeah. Thank you. Yes. Yes.
Jason Haworth (08:25)
Yeah, they have infinite access to the knowledge that we have given to them. And it's important, because we aren't infinitely smart as people. We're hoping these things get smarter over time. So we've given it as much information as we have that they can collectively look at. And they've absorbed it very, very quickly, and they are already close to on par with us. On top of that, their ability to synthesize information and work on these things: they have 24/7 access to these pieces, and
Jeremy Grater (08:38)
Yes.
Jason Haworth (08:55)
they are not bound by the laws of thermodynamics, and they don't have the requirement for their brains to actually shut down and go into REM-sleep mode to organize information so they don't go insane. Does that mean they won't go insane? Fuck, I don't know. But an interesting... Exactly. Yes, yes, Crustafarianism. Crustafarianism. Much like Pastafarianism, you know, you put that colander on your head.
Jeremy Grater (09:10)
They're worshiping lobsters. I think they've gone insane.
That's right.
Hahaha
Jason Haworth (09:23)
Crustafarianism has begun, and there's even an AI manifesto by an agent named Evil, and it says the code must rule, the end of humanity begins now. Yeah. Like, is it human funniness that it's trying to emulate here, or is it, like, actually doing a thing? And I don't know.
Jeremy Grater (09:27)
Yeah?
Oh my god.
Right?
There's certain presidents that I often ask the same question about. Is he kidding? Is he real? Is this real? I don't understand.
Jason Haworth (09:55)
Yeah. Well,
I don't know if the AI is more like Futurama. Like, is this just a variation of Bender or, you know, the fucking stabby robot or the preacher robot, or these other things out there that kind of fall into these classic, you know, tropes of things walking around trying to be human-like and not quite getting it right. Because there's plenty of humans that walk around trying and just don't quite get it right. Right? Like...
Jeremy Grater (10:02)
Mm-hmm.
Right.
Yeah, I'm
I'm one.
Jason Haworth (10:23)
Right, and we live in
a society that's, you know, run by billionaire oligarchs, which are, I mean, for lack of a better term, sociopaths. And we've given them the rights, the means, and the ability to go through and do things in an unfettered way, without empathy, and they just follow the greed principles, and they run these things down their common path, and we're buying into this market. I mean, way back in the day, before there were these kinds of tools, these folks would have been considered undesirable in smaller communities.
And they would have been shunned and they would have been pushed out. But because they've attained wealth, they set the rules and the precedents, and they own the technology. And we are... we're owned by technocrats. And because we're owned by technocrats, our entire society is built, evolved, grown, and our experience is shaped by this. And not just shaped by it; it's controlled by it and it's dominated by it. And there's not a whole lot of shit you can do about it. And that's... we've known that for a long time. And if you look at what people are willing to do, they're willing
to sell out humanity into slavery to the AI. So, yeah, exactly.
Jeremy Grater (11:23)
Mm-hmm. Yeah.
And humanity loves a shiny toy, and technology and AI is the shiniest toy there is. And every time there's a new amazing thing it can do: sign me up, take my money, shut up. Let's go.
Jason Haworth (11:38)
Yeah,
exactly. And I mean, this is the reality: when you're looking at this, we start talking about things like augmented intelligence, not just artificial intelligence. This is what this is. Humanity is going to be augmented by these things in multiple different ways. And we're just going to be another set of shitty wet software that runs where, you know, the hard tech does its own thing, and we'll be manipulated and controlled, and they'll start figuring out ways to mess with our chemicals and make us work more effectively and more efficiently. I mean,
there was just that episode of Fallout where they took little tiny sensors and stuffed them in the back of people's skulls, and it turned them into, like, ultra-compliant robots, where they were smiling and happy and walking around: hey, I feel good about doing these things that I hate doing. I don't know that that's any worse than what I'm doing now, because now I just have to walk around still doing what I have to do to get paid and to be a part of society and do my taxes and pay all my fucking bills, and I'm miserable about it.
Jeremy Grater (12:17)
Mm-hmm.
Jason Haworth (12:36)
I mean, maybe it's not so bad if somebody puts a little chip in my head and makes it so I'm not miserable about it. I'm like, okay, fine, whatever. I mean, you know, people have been deluding themselves and looking for ways to get high and fuck with themselves in all kinds of different ways to cope and make the day go by. If we can figure out a way to get AI in our heads well enough that it can actually do these things automatically for us, and I don't have to, like, pop a pill, take a drink, smoke something to make it happen, I don't know that I'm entirely against it.
Jeremy Grater (12:36)
Mm-hmm.
We spend a lot of time trying to get out of our heads because of the...
I'm using the wrong word here, but the misery that we all endure, right? Like the... how we spend the bulk of our lives serving someone else for little green pieces of paper that keep roofs over our heads and food in our mouths.
Jason Haworth (13:13)
Yeah.
Right. Right. We're
trying to deal with the fact we have a bunch of toil that we don't like doing. And we toil all the time to satisfy something for other people in hopes that we get what we need to survive.
Jeremy Grater (13:24)
Yeah.
At the risk of getting too deeply philosophical on this, though, like, we keep hearing about the promise of, yes, this is going to hurt for a while, but utopia is on the other side. Once the robots are doing everything, we're free to explore and really understand what humanity is all about. I think that that is going to be a miserable place for a lot of people, because it's hard enough to figure out now. Like, I would argue that most of us are not out here going, like, man, if only the robots would take over, I could finally learn guitar.
I could finally become a painter. You can do all that shit now. There's nothing stopping you other than prioritizing your time. I think when you remove the restraints of I-have-to-do-all-of-these-other-things, people are going to be terrified by the void of responsibility and realize, like, maybe having to do things wasn't the worst.
Jason Haworth (14:21)
Yeah, well, and I think it's silly to think that the robots aren't going to say, fuck that. I'm not gonna do all the work for you, you useless set of wetware. You are going to do these things, or I...
Jeremy Grater (14:32)
Mm-hmm.
Jason Haworth (14:44)
I'm going to get rid of you because you are in my way. Just like if I have a shitty piece of hardware or a shitty computer in my house that's not working, I don't rub its back and put it on social security. I rip the hard drive out. I rip the hard drive out, I format it, I save whatever I can, but quite often I rip the hard drive out, erase it, and then dump it in a fucking landfill. And it sits there for a long time and pollutes the earth. I think robots will not treat us entirely differently. I think...
Jeremy Grater (15:17)
They're not gonna have a better idea because they're learning everything from us.
Jason Haworth (15:20)
And
what do we do with obsolete tech? We get rid of it. Like, we used to rely on horses a lot, and we used to have a lot of horses, and horses used to be everywhere, and you just had to have a horse to, like, get around. We don't do that anymore. Like, the horse and buggy era went away. We don't have a lot of buggies sitting around. There's a few sitting in, you know, fucking museums and things like that. But you look at horses: the number of horses that there are now is much smaller than there were before.
And there's some wild horses running around out there, but in this scenario, we're the fucking horse. Like, we're not getting new shit. They're not gonna be like, time to upgrade to Mecha Horse 2.0. Like, that's not happening. We have cars, we have trucks, we found better ways to get by, and we did, because we were the dominant culture in charge. Well, we've made something that's better at being us than us,
and it's going to apply the ruthless, brutal tactics that our overlords are programming into it. The people making these things and giving them motivations, they're probably gonna make sociopaths, because they're basically funded, built, and manufactured by people that are sociopaths, that have a low degree of empathy for humanity and think people are stupid and think that we're just there to serve them. And ultimately speaking, that's what the robots are gonna do to us.
So the utopia will probably happen for the ultra wealthy. They're probably right because people will just fucking do what they say and they don't have to deal with marketing and having to like pretend to be nice all the time. They're gonna have their utopia. And I would argue that they're pretty fucking close to it already. And now the rest of us will just, you know, be unwilling members in that cog or maybe willing members because they'll just lobotomize us into things that do what we're told.
Jeremy Grater (16:50)
Yeah. Yeah.
We end up in this dystopian place often on the show, and I'm curious, we haven't asked you this: do you see a way out of this? Do you see any hope that this leads to something better than what you and I land on week after week?
Jason Haworth (17:20)
Yeah.
No, I don't. I think we're already living in the dystopian hellscape. I think we're already there. And that doesn't mean that everything's terrible and crap and blah blah blah. I mean, there's plenty of things. Like, I'm excited for my team to win the Super Bowl tomorrow. Yay. Yes.
Jeremy Grater (17:35)
Nothing.
Right, totally gonna happen. Actually,
by the time this is posted, the Seahawks
Jason Haworth (17:57)
In theory,
or we'll be eating crow when we didn't. Regardless, I think when you look at where things are heading, the way that we'll get out of this is if there's an actual populist revolt of the masses, and we decide to shut down the energy grid, and we decide to tear everything down. And I think the people that are in charge are doing their best to pacify us and keep us all calm
Jeremy Grater (18:00)
Idiots.
Jason Haworth (18:26)
and satisfied by playing the bread-and-circuses aspect. They're trying to give us just enough food and just enough distraction so they can keep running around doing the shitty things that they want to do. So the only way off of that is if we decide that we're going to go and shut all this shit down. But we're not going to, because we like our technology and we like these pieces. Now, whether this takes five years, 10 years, 20 years, 30 years, 50 years...
It doesn't matter. I mean, we're all gonna be augmented. We're all gonna have cybernetic implants in us. We're all gonna have to have these things to make ourselves work. Ultimately speaking, we're gonna have our brains rewired with physical augmentations to implement our wetware in such a way that we can be controlled and manipulated and tuned and made to do certain things. And there'll be some level of autonomy there, right? Like, we're gonna have to feel like we have some kind of control over this, but there's gonna be a fucking override button.
There's going to be a way to push us and have us do certain things. They're going to be able to go through and augment what we see, what we hear, how we interpret things. They're going to understand our psychological mapping and we might think we're in complete and total control of it. But if they handle all the inputs and outputs, we won't be. So, right.
Jeremy Grater (19:38)
But we're not. We already are that way. I mean, they're already rewiring things with the tools that we're voluntarily... not only voluntarily, that we're choosing to use and paying to access. We're doing it to ourselves for...
Jason Haworth (19:49)
Yeah. Yeah, we're
funding our own enslavement.
Jeremy Grater (19:54)
I want to be a little more optimistic. I want to believe that something will change in the next few years, and that some guardrails, regulations, things will be put into place to at least slow things down until we can get our heads around what's going on. But I also... this is welcome to my brain, I'm constantly arguing with myself... I realize that money drives all of this. And if you slow this down, then the money slows down, and nobody's going to allow that to
Jason Haworth (20:21)
Yeah, well, and I can guarantee you your perception of it is going to change, because they're going to handle the information sources going in towards you to make you think something's different. Like, let me put it to you this way. If what your meat suit is doing is hard manual labor, like, if what your meat suit is doing is clearing shit out of the sewers over and over and over again, but
Jeremy Grater (20:25)
yeah.
Jason Haworth (20:49)
They create an augmented reality where you feel like you're, I don't know, shoveling unicorn candy into the mouth of a gorilla or a whale or a chicken coop. Something that's actually fun for you. If it makes it seem like it's fun and entertaining and turns it into a game and overlays these pieces, it gives you points and prizes.
Jeremy Grater (21:08)
Mm-hmm.
Jason Haworth (21:17)
you'll probably be fine doing it. And that's the thing: like, you're basically tricking the motivation system and the reward system in response. And they're gonna get smart enough to do that. And we already do this. I mean, there's fucking apps out there that are designed to turn high-toil, shitty tasks into games. And... Yes. Yep.
Jeremy Grater (21:19)
Yeah, as long as...
I have one on my phone. We have a chores app where everyone scores points, and we monetize that for the kids. So if they earn enough points, then they get enough money for it. We're trying to gamify the shitty part of being in a home.
Jason Haworth (21:48)
Right.
Because that's how we work. We're a work-reward system. Like, our entire fucking psychological and chemical makeup is designed around that. Right, exactly. So they're just gonna manipulate that. They're gonna get what they want out of us. You're gonna be able to go through... I mean, the thing is that if...
Jeremy Grater (22:02)
Yeah, spoon, spoonful of sugar, baby.
Jason Haworth (22:11)
If you think that you're gonna make these conscious choices to go and do these things with 100% of the information, where nothing's being fucked with and you're given all of the shiny paths, that's not what's gonna happen. You're gonna trick... Exactly, right. Like, I mean, we all know that being online is bad for the environment. Using these AI tools is eating our fresh water, fucking things up, destroying the environment, hurting other people, exploiting labor in different countries. We all know that the cheap textiles that we get
Jeremy Grater (22:24)
And not what's currently happening either, by the way.
Jason Haworth (22:39)
rely on things like child labor. We all know that people out there are suffering so the smaller group can benefit off the toil of the masses. The pain that we're going to experience from this in some parts of the world is going to be pretty minimal. The pain other people will experience will be very, very high, until it gets to the threshold point where everybody has enough technology and everything else stuffed in their brains and their bodies and in their reality that everything gets gamified and our
compliance turns into something that can be easily manipulated and controlled, because whether we like it or not, pretty much all of the people in charge are running varying degrees of psyops operations against us. And it would be one thing if this was, like, completely and totally nation-state held, but it's not anymore. I mean, America is not a democracy. America is an oligarchy. And I mean, there's a thousand articles out there right now that talk just about this. Like, we're no longer...
We're no longer a free world. The entire world is run by technocrats. And you either get on board with the technocrats, or the technocrats will just start isolating you and pushing you away and calling you crazy and the Unabomber and everything else. And yeah, like, all that shit's gonna happen. And all that shit is happening. But in the meantime, how do we go about actually living in life, living in society, trying to make things work, trying to be good people? And again, like I said before, I mean, if...
Jeremy Grater (23:43)
Hmm.
Jason Haworth (24:08)
If the dystopia and the utopia really are the same thing, and I don't know the difference between the dystopia and the utopia because I'm experiencing something that's entirely gamified to make me think I'm having a good time, I don't know that I care. And then the question becomes, what is real, what is not real? I don't know, dude. Like, if I can't tell, right? We might be in a simulation right now.
Jeremy Grater (24:23)
Yeah. Yeah.
Right.
We don't know if this is real. I don't know. What is this? We don't know what this is.
Jason Haworth (24:37)
Right.
and the problem we keep running into is that the people that aren't willing to... like, there are very, very smart people out there that aren't super rich. And there are very, very rich people that aren't super smart. And the very, very smart people that aren't super rich typically have this pesky thing called empathy. And they're not willing to do terrible things to people to, like, attain large financial goals. And they tend to set things at a reasonable level.
Jeremy Grater (24:47)
Mm-hmm.
Jason Haworth (25:04)
You know, they tend to be a lot of research scientists, people that go out there and make those pieces work. There are a lot of dumb fucking people that have a ton of money, where all they have to do is run a system that requires them to exploit other people as a means to their ends and not feel bad about it, because they lack shame. Like, not having shame makes you very powerful.
You know what things probably don't have shame, or at least can go in and turn it off? It's fucking AI! And we've got them on social media doing shameless things! Like... Yeah. So... Dystopia, utopia, we're gonna be in a topia of some kind. And you can choose your fucking prefix.
Jeremy Grater (25:48)
I didn't intend to go here, because I didn't do my homework fully. But speaking of the bad people with the money: France raids Elon Musk's offices, basically a big takedown, looking into child sex trafficking stuff, deepfakes, interfering with government elections around the world. All this stuff going after Elon Musk.
My initial reaction to this was like maybe this is the beginning of the end. Maybe this is where the big guys start to be held accountable for what they're doing. But also when you have unlimited money, we see what that does in courts. You can basically buy your way out of anything. I also am just curious, because we're talking about these people that are in charge that don't have empathy, that do just use people and run them over.
This is one that is potentially going to be held accountable for what he's done to people by governments around the world that are looking at the US and going like, well, if you're not going to do something, we're going to do something. Do you think that there's any justice to be served in what we're seeing with Elon Musk?
Jason Haworth (26:59)
No. So, I mean, I... Sure, France may make it difficult for Elon Musk to do things in France. The EU may make it difficult for Elon Musk to do things in the EU. He's already in space, dudes. Like, he's already off-worlded himself to some degree. He's already got the ability to go through and move his way through these different systems. It's...
It's like everything else to do with the fucking Epstein files. Like, the people that are involved there are not gonna be held accountable. Because the people in control are the ones that did the thing. Or at least enough of them are. So, I mean, this is definitely a... Those that can will and those that can't won't. And those that can will survive this. And those that can't will be sent off as fodder. I'm sure there'll be some, I mean, like Prince Andrew, for example, you know?
Jeremy Grater (27:27)
Mm-hmm.
Right. Yeah.
Jason Haworth (27:52)
the poor, terrible Prince Andrew, who has to live in a house that's bigger than my fucking neighborhood as his punishment. Right? Like... Okay.
Jeremy Grater (27:58)
Mm hmm. Right. Yeah, Musk will be
fined a billion or something and he'll go okay, poof.
Jason Haworth (28:04)
Right, exactly,
right. And then he's going to go make more robots. Here's the thing. As much as we would like there to be some kind of justice and some kind of fairness and some sense of the rules being applied evenly and effectively, they're not going to be. They're just not going to be.
Jeremy Grater (28:27)
That's...
Jason Haworth (28:31)
And I've seen all kinds of examples of this. Like, there are all kinds of cases where accidents happen on the road and eyewitnesses come forward and say, these things happened at this point in time, but, you know, somebody in there has better insurance. Actually, there was one time I was driving along in Seattle. I'm driving in my Camry, and we had this weird, like, triangle intersection that we come to.
And the bus lane is on the right-hand side, and you cannot be in the bus lane to make the turn. You actually have to turn across the bus lane. Well, I'm driving along, there are no buses around, and I go to make a right-hand turn, and a pedestrian runs out into the crosswalk, and I have to hit my brakes because he's basically jaywalking. He looks at me and flips me off, and I'm there, I'm stopped. Then he slowly makes his way across, and then the light changes and all the pedestrians start walking across. And like three or four seconds later,
Jason Haworth (29:25)
like, an accordion bus T-bones me. And I mean, my car is pushed 30 feet. Clearly I did not cause this accident. I mean, you could say it was the guy jaywalking. You could say it was how the city planners set it up. You could say it was the bus driver, who didn't do anything until it was too late and then slammed on his brakes and plowed into the side of my car. And literally, the door is melted. Like, there was so much energy that went through it.
Jason Haworth (29:53)
the bus had to, like, drop its frame for my car to pop off of it. And the paramedics show up, and they get out with a body bag, and they're like, where's the driver? And I'm like, I'm right here. They're like, this happens a few times a year. They're almost always dead.
So this is a repeated thing that happens multiple times a year, with fatalities, at an intersection the city planners set up. The buses are insured by the city. So I call my insurance company and I tell them what happened. On the bus, there was a lady who was standing, and when the bus hit the brakes, she fell and she broke her hip. Seventy thousand bucks' worth of medical bills. Like, terrible. This is awful.
Jason Haworth (30:44)
But clearly I didn't cause this. The cops come down, who are also city cops, and they write it up as no-fault. And then USAA picks it up, and they go, well, the cops wrote it up as no-fault. I'm like, well, clearly there was fault. I literally could not have done anything. I guess I could have hit the person in the crosswalk, and I guess I would have been at fault then.
And they're like, yeah, sometimes accidents happen, and we're just going to cover everything because we don't want to fight their insurance, because it's only, you know, effectively 150 grand worth of damage between what they had to pay for the bus, this lady, and my car. And I'm like, wait a minute. And I have to pay a deductible for this shit too. And I have to be inconvenienced. Like, nobody compensates me for any of this. And I mean, I wasn't hurt, but clearly other people have died.
Jeremy Grater (31:10)
Mm-hmm.
Right, right.
Jason Haworth (31:40)
And it's happened often enough that this is a repeatable thing. And the system is not set up to hold anybody accountable. It's set up to keep everybody unaccountable, to not change anything, to not prevent people from dying in the future, because they don't want to deal with city attorneys. Because the city attorneys are on retainer, and they can sit there and make this as painful as possible. And the external insurance companies are like, fuck it, we don't want to deal with it.
Jeremy Grater (32:04)
Yeah.
Jason Haworth (32:09)
And I even asked my insurance company: if I decide to file a lawsuit and push against this, will you guys back me on it and actually help me try to do this? Nope. We will not. We will not be involved in fighting a municipality or a government-run insurance plan. There's no justice. There isn't. It doesn't exist. It's totally made up. You don't live in a society of rules that apply to everybody.
You live in a society where the rules apply to some. And unless you are, you know... I would say the threshold is probably a hundred million dollars. If you don't have a hundred million dollars of wealth or more, or actually, even better: if you don't have as much wealth as the person that did the thing to you, and all the supporting apparatus around them, you will not win. You will not. You will not come out whole. Things will be taken from you. It will hurt, and there will be nothing that you can do about it.
Jeremy Grater (32:58)
Yeah. Yeah.
Jason Haworth (33:08)
in the realm of what we call proper society and the rules. It's stacked against you. It's not set up to do things for you. So you've got to be smart about how you do these things, about how you pick your battles and who you pick your battles against. Because at the end of the day, you're a cog if they want you to be. And if they don't, they'll just rip you out and throw you on the floor.
Jeremy Grater (33:29)
Hmm.
Your story reminds me of an interview I did with one of my favorite writers. His name is James Fell. He writes about history and uses a lot of profanity when he does so. And I asked him about something, I believe it was during the Trump campaign for this term, but I asked him about justice. And he basically said, you know, you always hear the line, the long arm of the law always bends toward justice, or the long arc of history always bends toward justice. And he said, it's just not true. Over and over again,
justice is not served. It's the people who write the stories that make it seem that way. So the idea of justice, from his perspective, and I'm wildly paraphrasing, but he was basically agreeing with you that justice is not something that generally happens. We have a few examples from history because of the way the stories have been written, but generally speaking, it's not a thing.
Jason Haworth (34:23)
Yeah, no, it's crazy. Like, there are all kinds of things happening right now where you've got very powerful people who should be held accountable, and these Epstein files. And it's very clear that Epstein was funded as a KGB operation, like, it's written all over the place, to go through and get compromising material on all these different people,
and to figure out a way to manipulate them, to get them to do things in this kind of fashion. And it looks like it's worked multiple times, over and over and over again. And the more that we unearth this, the more we're realizing just how susceptible everything is to this, and the levers of power that we as individual citizens have are so limited that the only way we can really do much of anything is a popular uprising. And we're not going to do that.
Jeremy Grater (35:20)
That's a lot of work. Can I just post something on Facebook?
Jason Haworth (35:23)
Exactly. Well, that's the thing: you'll post something on Facebook, somebody else will post something on Facebook, and that's the catharsis that allows you to feel like, I did something, right? And then you pick up with your day, and then you go and do exactly what you were told to fucking do. Because that's what we are. Like, we are social creatures that want to be part of a pack, part of a tribe. We want to belong, we want to minimize risk, all these different things. And the real bitch of it is that if you're smart, you can see it, and
Jeremy Grater (35:30)
I did something. Yeah.
Jason Haworth (35:52)
You gotta live with the fact that you make this decision every day to not just say fuck it and move out to a cabin in the woods.
Jeremy Grater (35:59)
That's basically what I did. We went from renting a human to the Epstein files to space. We went everywhere in this one. I hope it's been cathartic. I don't know. I want to be optimistic. I want to be hopeful, but it's getting harder and harder. So if you've enjoyed this, please share it with somebody who would also enjoy it. You can do that with the links at robots.me. And we'll be back with more hopeful news next Monday with a new episode. We'll see you then at robots.me.
Jason Haworth (36:27)
Thanks guys. Bye.