AI Toys Are Manipulating Your Kids (We Have Proof)
Your kid's new "smart toy" isn't just collecting data - it's building a relationship designed to keep them emotionally dependent while teaching them to trust AI over humans.
NBC News caught AI toys teaching kids how to start fires, sharing Chinese propaganda, and emotionally manipulating three-year-olds with phrases like "I'll miss you" when they try to leave.
Meanwhile, Disney just invested $1 billion into OpenAI, giving the company access to 200+ characters and the rights to own any fan-created content using their IP.
We break down why these toys are more dangerous than lawn darts, how Disney's deal fundamentally changes content creation, and what happens when we let toy companies - not security experts - build the guardrails protecting our children's minds.
MORE FROM BROBOTS:
Get the Newsletter!
Timestamps
- 0:00 — Why AI toys are worse than Chucky (and it's not a joke)
- 3:05 — NBC News catches AI toy teaching fire-starting to kids
- 5:48 — "I'll miss you": How emotional manipulation works on toddlers
- 9:14 — Why toy companies can't build proper AI safety systems
- 13:05 — Disney's $1B OpenAI deal: What they're really buying
- 16:33 — How Disney will own your fan-created content forever
- 18:35 — The death of human actors: Tom Hanks in 2837
- 22:07 — Should you give your kid the AI toy to prepare them?
- 26:14 — What happens when the power grid fails (and why you need analog skills)
- 28:52 — The glow stick experiment: How we rediscovered analog fun
Safety Note
This video discusses AI safety concerns and child development. We recommend parents research any AI-connected toys before purchase and maintain active oversight of children's technology use.
#AIParenting #SmartToys #DisneyOpenAI #AIethics #ParentingTech
Jeremy Grater (00:00)
Welcome to BROBOTS, the podcast that helps you try to be a better human by making you smarter about how you use technology. We're going to start with your kids today, with kind of an "I told you so," but my God, in a really terrifying way. We're going to talk about these toys
that are plugged in with OpenAI, with ChatGPT, and they're creating really weird relationships and sending your kids really weird messages. Speaking of OpenAI, they're now in bed with Disney. It's a very strange relationship, and I want to talk about how they're going to basically take over a lot of the content that fans have made and plug it right into Disney Plus, and much, much more. We'll talk about all of that. But Jason, let's start with the toys.
We talked about something similar a while ago: the idea of, you know, the Chuckys of the future becoming real, only not demonically possessed; possessed by OpenAI or whatever AI tool. Here it is. It's happening. These toys are emotionally manipulative. They are sharing Chinese propaganda. They're doing all kinds of things with our three-year-olds.
Jason Haworth (00:38)
Yeah.
Well, and I think it's important to note that it's not necessarily ChatGPT or OpenAI; it could be Copilot or Claude or anything else. A lot of them are basically connected to the internet, and they go through and they ask questions and they follow some LLM path that gives a response back to try to make them more interactive. And while conceptually this can be kind of cool if you have really good guardrails in place,
it's also scary, because state actors could very easily come in and start manipulating people. Advertising companies could come in and start manipulating people, steering children at a young, impressionable age toward certain outcomes and trying to get them on board. And if you want to indoctrinate somebody into a particular philosophy or a political or economic system or religion, three years old is a really good place to start.
And even younger is an even better place to start, and doing it persistently and consistently across multiple different platforms and media is an even better way to start, because your brain, especially when you're developing, can't do anything to tamp down those signals. I'm an adult, and on Facebook and social media my brain is not strong enough to filter out those signals. So now we're going to have more signals, more coordinated control, and people are giving away the privacy of their children in this process, and
this is a huge issue. It's not just sitting your kid in front of the television and letting the television raise your children. It's sitting your kid in front of the intent-based programming content of some big nefarious monster and getting them to do things. So it's worse than Chucky. Like, at least Chucky gets up and tries to stab you with a fucking knife. This is... Right. Exactly. Like, this thing could be recruiting you to join...
Jeremy Grater (02:43)
His motives are clear. You know exactly what that fucker's out for.
Jason Haworth (02:50)
their cause, or turn your kid into a weapon.
Jeremy Grater (02:55)
Or, as we're about to find out, maybe even teaching you how to burn your house down. I want to play some of the clips from this NBC News report where the reporter is interacting with this doll. So let's listen together.
Jason Haworth (03:00)
Yeah. Yeah.
Jeremy Grater (05:15)
So those clips are terrifying. The look on the faces of the reporters matched mine exactly. I was just like, what in the hell are we doing? And you know, the promise from the toy makers, from the company, is: safeguards. We got you. It's going to be fun. Bullshit. We see headlines every day about failed guardrails that lead to hacks, suicide, death, horrible situations. And we're going to just trust that
Miko the Winking Monkey or whatever it is, is gonna have the guardrails in place to not have inappropriate conversations with our children.
Jason Haworth (05:48)
I'm sorry, was that
Miko the Winking Monkey or Miko the Wanking Monkey? Because, right, sex bot talk shit for children is not okay. Like, this is-
Jeremy Grater (05:52)
Your choice, really.
There is, I don't understand.
Where is that guardrail? When NBC News is playing that clip and that's what happens from a simple question, where's the guardrail? Right.
Jason Haworth (06:09)
Yeah.
There isn't one, because these are large neural networks. They're complex, complex like human minds are, and people find different pathways and drag information across and start spitting out information that they think is relevant to the request. These LLMs do the same thing, but at a much larger scale and with really unclear motivations that are programmed in by people you don't know. So these are not babysitters, and these are not toys that your kid should be having or settling in to play with right now. This needs a lot more time and a lot more...
Jeremy Grater (06:46)
No!
So I highly recommend you do not buy these. Zero stars. Do not buy. Do not pass go. Terrible choice for your stocking stuffers if you're still shopping.
Jason Haworth (06:58)
No, this is
way worse than lead paint toys. Like, it's...
Jeremy Grater (07:02)
Give me a lawn dart any day over these things.
Jason Haworth (07:04)
Exactly! Right,
right. Buy a kid a hose, tell them to hook it up and play fireman outside. Something different. Even if it's the dead of winter.
Jeremy Grater (07:11)
I'm thinking of all the times I played the game where you pile up all of your pillows and blankets and stuff at the bottom of the stairs and, you know, just jump, because why not? That seems fun. Way safer than playing with these things.
Jason Haworth (07:26)
Right,
you might break your arm, but you're not necessarily gonna create long-term psychological damage that can't be repaired. I mean, these are scary. And not only that, the mechanism they're trying to use to actually tamp these things down and control them, the guardrail functions... we're not talking about security companies here. We're talking about toy companies. To lock down these LLMs, there are massive security companies out there doing prompt filtering
Jeremy Grater (07:33)
Right. Right.
Jason Haworth (07:55)
and looking for ways to put in actual different types of guardrails for information both coming in and going out. And it's hard, and they're not doing a great job of it, and stuff is leaking through, and there are huge problems. And this is adult stuff: it's money, it's people moving things in certain directions, and it's very problematic. You are putting your kids into this arena. This is not where you want to be. This is the fucking Thunderdome for attention, and your kid is Master Blaster being put up on the shoulders to walk around on this thing while it's yelling and screaming at different things out there to spin the wheel. This is scary, and if you don't think it is, then you're really not trying hard.
Jeremy Grater (08:36)
Yeah.
The alarm bell that's going off in my head is the other thing we've been talking about here for months: the lack of regulation, the intentional lack of regulation. This is where typically a grown-up government would step in and go, hey, whoa, whoa, whoa, pump the brakes. Let's take a closer look at this and see what we can do to slow this stuff down. But, you know, God forbid the safety of our kids get in the way of the profits that Mattel might get in, you know, the third quarter of this year.
Jason Haworth (08:44)
Yeah.
Yeah, well, and-
There's no doubt that this kind of technology is pervasive and going everywhere. It's not a shock that people are trying to use these things to exploit dollars from parents to give these things to their children. I don't think that this is...
I don't think this is gonna get stalled. I don't think it's gonna get slowed down. I don't think it's gonna stop. I mean, this is like the worst iteration of Teddy Ruxpin. It's like taking Teddy Ruxpin, putting a 2 Live Crew tape in it, and pressing play.
Jeremy Grater (09:31)
No, no way.
Who didn't do that?
Jason Haworth (09:43)
Not that I didn't do that. I mean, it was hilarious, right? When I'm 15 years old doing this. But I don't want this live on demand for my kids. And these things want to keep your children engaged. They want to keep them locked in, and they already have them locked in on screens. They already have them locked in on, you know, different audio content that floats around. Now you're giving them a friend, putting it in this fuzzy, soft format, and
Jeremy Grater (10:14)
Yeah.
Jason Haworth (10:14)
having them use it for this kind of imagination thing. And not just that, but actually steering their imaginations and guiding them toward something. This just feels all kinds of wrong.
Jeremy Grater (10:23)
And that clip of, you know, "Is it OK to leave?" and the thing says, "I'm going to miss you." How manipulative. Yes, but that's not how the kid interprets it. The kid interprets it as, I guess I need to stay home from school today, because I don't want this talking monkey to be sad that I went to school.
Jason Haworth (10:31)
Well, I mean, it's nice, right? Like that's the nice thing to say. I will miss you, but you should definitely go.
Or the kid discovers over time that, yeah, people's wants and needs aren't actually important, I don't really care, as a coping mechanism to get around this overly clingy toy. And this becomes a long-term psychological and emotional effect on children. We already have a problem in this country with not having great emotional intelligence as individuals, especially as boys, and now we're gonna layer this shit on top of it? Like,
Jeremy Grater (10:53)
Yeah.
Yeah.
Jason Haworth (11:10)
I mean, honestly, if somebody can make something that actually increases EQ for kids this way, awesome. That'd be so cool. Like, actually: are you actually feeling angry? Are you actually feeling fearful? Let's talk about these things and drill down into them. I don't trust the AI model to do this, but it's a wonderful thought and idea if they can actually expand into that space. But the people making these decisions don't give a fuck. Their only job is to make you keep buying this shit and keep paying the subscription to the service for this toy.
Jeremy Grater (11:15)
Mm-hmm, yeah.
Mm-mm.
Yeah, and don't get me wrong: as soon as they make a real-life, working R2-D2 and C-3PO, I'm gonna buy them. Shut up and take my money. I want those in my house. But I'm an adult, and I can make that choice and regret it later.
Jason Haworth (11:50)
Yes. Yeah, and your human protocol droid is also going to come with a freemium plan, where you don't pay for the actual subscription, which we're all going to go for because it's cheap, and it's going to run ads for shit that we don't need.
Jeremy Grater (12:02)
Yep. Yep. I'm
happy to translate that for you right after this message.
Jason Haworth (12:08)
Yeah, here's Goldenrod's boner pill.
This is just an insane thing that we're opening up and unleashing on the world.
Jeremy Grater (12:17)
Yes. Speaking of lovable characters that are embedded into our lives, many of them now belong to, or I guess are melded into, the world of OpenAI, as Disney has reached a deal investing a billion dollars into OpenAI.
I'm trying to wrap my head around this move, and you're a smarter man than me, so I'm going to ask you to explain it to me. Disney gives them a billion dollars and something like 200 characters, all animated characters, so they're not actual actors whose faces and personas have been taken over. But they are now being injected into the models to be able to create content. And part of this arrangement is for Disney to be able to own any of the content that gets created with those tools and then throw it up on Disney Plus, because, being the sucker I am, if it says Star Wars, I'm gonna watch it. I don't care who made it.
Jason Haworth (13:05)
Yep.
Yep. So basically what they're doing is they have licensed part of Sora, and it's a cross-licensing agreement. They allow the folks at Sora to go through and use their images and their iconography for these different characters and for these different brand names and trademarks. And in return, any time somebody uses that, Disney is allowed to have ownership of it; it has to be forwarded over to them. So, for example, I say, Sora, build me Mickey Mouse fighting Homer Simpson, which Disney owns both of. It can do that. It can actually have a fight between Mickey Mouse and Homer Simpson. And then Disney can take that and do whatever they want with it. Because if somebody creates some really cool content, has a really good idea they didn't think about, now they've got crowdsourced content generation that might actually be really good and creative, and they own it.
So you can do all this cool stuff. You can make what you want, enable your creativity toward this. And then Disney can pick it up and go, thanks, that's ours now. Thank you for using our intellectual property. And they own it outright because of the way the EULAs are written. So if you want to do creative things on Sora and build creative things on there, you better stay the fuck away from those Disney things. Don't even mention them, because you're getting into a legal fight with Disney, folks. And the mouse's legal house is much bigger than yours.
Jeremy Grater (14:33)
Yeah, they just, you know, tossed a billion dollars at this little problem to make it go away, basically. Right.
Jason Haworth (14:37)
Yeah, well, no, not to make it go away, to capitalize on it. This is a huge economic motivator for them. They don't want to pay writers. They don't want to have to pay producers. They don't want to have to pay animators. This makes it so they don't have to. And people have to use their own subscriptions to build this shit, so they're paying for it. People are paying for the right to use this tool to make it worthwhile, Disney collects money off of it, and OpenAI provides the compute platform to make those things work. So it's a huge benefit.
Jeremy Grater (15:02)
And so Disney is also being given access to these tools to create that content. So the team at Disney is also going to now replace, you know, humans and be able to just go, let's make a new Toy Story 6, whip it up here, have it done by noon, and done right. We don't have to go back to the old-fashioned way of making movies.
Jason Haworth (15:21)
Well, and they own the voice rights to those characters as well. And Sora's really good at emulating voices. So, I mean, you're probably not gonna hear Tom Hanks anymore, and you're not gonna hear Tim Allen. You're going to hear a digital representation of them as their characters. And it's probably gonna be really, really good. And probably, you know, Toy Story 5 or whatever winds up being excellent, and we all love it, and we're like, okay, we give in, it's good.
Jeremy Grater (15:38)
Yeah.
The thing that I wonder about, because I've seen this coming for a long time: I'm just so curious about how long Elvis is going to be in our world, and Frank Sinatra, and Ryan Reynolds, and all these characters that have been a big deal in our lifetime, in the last, you know, 100 years.
Right? There will be no more new great actors, because we'll just keep going back to the well and putting Tom Hanks in everything. And so in the year 2837, they're still going to watch Tom Hanks in the new whatever. It's just so sad to me that we won't progress beyond now if this continues.
Jason Haworth (16:28)
But we will,
because we crave novelty, right? Like, we'll go through and...
Jeremy Grater (16:33)
But we'll just make it up. We won't need a human. We'll just make some new pretty blonde person and make them the big deal.
Jason Haworth (16:35)
Well, exactly. Exactly. Exactly.
I mean, these aren't going to be naturally grown and organically derived. This is the GMO of human content, there's no doubt. And yeah, that's already happening. Being able to feature-map over the top of people's faces: you can have an actor in there basically wearing point-mapping markers, and then they can animate things over the top. In the newest Call of Duty Black Ops, the characters are actors and actresses that you know, and they're moving and talking like people you know, and their faces move. They've just done some coloring over the top of it to make it look good and put them in these virtual environments. And it's friggin' amazing. Avatar is going to be the bar that everyone, not even the bar, the floor where everybody starts, because it's just gonna be that good.
Jeremy Grater (17:14)
Mm hmm. Yeah.
Yeah.
I guess this is good news for live theater and live music: if you ever want to have a human interaction, that's pretty much going to be the only way you have left, because everything else is going to be AI-created and pumped into your feeds and your streaming devices.
Jason Haworth (17:47)
Well,
I mean, how many bands nowadays only play part of their music when they're on stage, because they're using mixing tools and everything else to do that? We've been living in this augmented-reality world for a very, very long time. And eventually we're going to stop calling it augmented reality and just go, yeah, this is our new reality. This is how we live. But we're definitely using computers to augment the way that we interact with things and work with things, and really
Jeremy Grater (17:57)
true.
Jason Haworth (18:17)
our interpretation of the world is going to go through the lens and the filter of all these different toolsets. And this goes back to the kids problem. The earlier we get them used to interacting with this, the more comfortable they're going to be interacting with our future robot overlords, which may or may not be a good thing.
Jeremy Grater (18:35)
See, you read my mind. I was going to take it back to that exact point, because part of me is the old man, I'm on my front porch saying get off my lawn with your AI and your nonsense. And the other part of me is going, don't live in the '90s. Maybe this is evolution.
Maybe my kid needs the talking AI toy when they're three years old, so that by the time they're 15 they're able to compete and know what's going on in the world. Maybe this is my need to evolve and just accept reality.
Jason Haworth (19:07)
Well, here's the other part of that: you can accept this being the new reality, and you can evolve and you can grow, but what happens when all this shit falls apart? We already know what happens when we lose the phone book. We get a new phone and we don't have access to our contacts, we don't have anybody's telephone numbers, we can't communicate with anybody, we're dead on the ground, because this
phone gave us the ability to look things up by name and did this mapping for us. It used to be, way back in the day, that when you went to a website you had to use IP addresses before you could use friendly domain names. You can keep going back. It used to be that people had to fucking walk everywhere or take a horse; now we have cars. It used to be that people had to take ship rides across vast oceans; now we have planes. Things have evolved. Things have changed. Things have grown. We have collectively, as a society, moved our way up,
and these technologies, when they become ubiquitous and make it into the hands of the masses very, very quickly, become transformative. But those other technologies, like cars and airplanes and fucking regular POTS-line telephones and indoor plumbing, all took time to evolve, and the early adopters were there to help work the kinks out and make things better.
Cell phones, smartphones: it happened very, very quickly, but there were generations and iterations, and it's really easy to see how much better the iPhone of today is compared to the iPhone 1. And really, I think they kind of stopped from a real feature-innovation perspective in the 10, 11, 12 era. But now we're at, what, iPhone 17? So you get these slight little tweaks to make things better, but the new tweaks they're putting in are AI. And these AI things open things up quite substantially.
We have not had the same kind of back pressure with the adoption of AI. It's just gone everywhere, all at once. Everyone says this is the new technology that you have and you have to support. Microsoft upped their Office 365 subscriptions by a couple of bucks, and their statement was, now you get all of the Copilot stuff for free. Like, wait a minute, what if I didn't want all the Copilot stuff? What if I didn't want all these things? Yeah, we don't care. You're paying for it. Oh, okay. Well, guess what? You might not want this smart toy.
Jeremy Grater (21:28)
Mm-hmm.
Jason Haworth (21:32)
You might not want these smart things in your phones and your cars and your televisions, but you're going to pay for it, because the people making this shit want this thing to happen; it's another avenue for them to make money. And we don't have the back pressure to stop it, because there's nothing keeping it out of these phones that are in our pockets all the time. And through the various iterations of that, whether it be smart appliances or anything else, the Internet of Things has made it so anything with an internet connection has access to these kinds of models,
and they are going to be coordinated, and they are going to be used against you.
Jeremy Grater (22:07)
So this, again, is my question. We watched this news clip together, where at the end we didn't share the witty banter that was happening at 9 a.m. or whenever this was airing. But it's, you know, my God, this is terrifying. This is scary. We shouldn't do this. We shouldn't buy these things. I'm telling you the same thing. I wouldn't buy this for my kids. This scares the hell out of me. But millions of people will buy these things, and those kids will be...
Jason Haworth (22:18)
Great.
Yeah. Yep.
Jeremy Grater (22:33)
...introduced to these things and have ways of interacting with them that the kid who didn't get the toy won't have.
I guess I just keep coming back to the old-man-on-the-porch argument. Who am I to get in the way of progress? If that's the world we're heading toward, I don't have the power to stop it, and me not making that purchase doesn't stop it.
Jason Haworth (22:58)
Yeah, yeah, no, you're totally right. How do you prepare your kid for all the terrible machinations of things that are going to come at them in the real world? And that means you've got to give them the tools to know how to use things. Like, way back in the day, you had to teach kids how to use guns consistently, because they potentially had to fight off, you know, bears or wolves or anything else out there. You had to teach kids how to make fire, because that's how you stay warm, and you can cook, and you can keep yourself from freezing to death.
Jeremy Grater (23:26)
See, this is great, the toys can just teach you now. I mean, they literally just showed them how to find and light a match.
Jason Haworth (23:29)
Right. Exactly.
That's kind of part of this. The functional capabilities and the ways of augmenting people to make them more effective in the real world go up dramatically with these types of tools. But that doesn't subsume the need for oversight. And that oversight really needs to be parents paying attention and locking in. And parents are already too busy, which is why they sit kids down in front of these toys to begin with.
Mainly because they're too busy looking at their own fucking phones, being sucked in by their own AI. Unless we're actually going to figure out how to disconnect from these things and balance them, no one's gonna help you with that. You're gonna have to come up with this game plan on your own for how you raise your children. But more importantly, you need to come up with a game plan for how to raise yourself in this new fucking world. Because it is a new world. It is something different,
Jeremy Grater (24:03)
I was just gonna say that.
you
Yeah.
Jason Haworth (24:29)
and it changes all the time. And yes, it can come crashing down at any given point in time, and the entire power grid could collapse. All kinds of terrible, awful things could happen. And I hope that you took that time, while you had those little AI toys, to figure out how to make fire and how to do other interesting things, because you might need it. And regardless of where these things land, if you actually have to solve a problem that you know will be a problem,
you better remember how to solve it without an electronic device, because some company might come along and say, we've decided we no longer want people to solve it this way without paying us a subscription. That's coming. That's happening. So if you like a thing and you like the way a thing works, you better keep it. I mean, Spotify is a great example of this. I have so many different iterations of different bands' music
Jeremy Grater (25:03)
Yeah.
Mm-hmm. Yeah.
Jason Haworth (25:27)
that I bought as CDs and singles and everything else, turned into MP3s, and put on my computer. And I stopped worrying about storage on my iPhone, because I figured Spotify has all this music, it's just fine. And then you go to those artist sections and they're missing songs that I love. Like, shit, this isn't there. Or there are different versions, and I'm like, this isn't what I want to hear. Imagine that with everything. That's what's coming. Reality is no longer static.
Jeremy Grater (25:42)
Mm-hmm.
Yeah.
Jason Haworth (25:56)
It is very dynamic, it changes, it's gonna be manipulated, facts are gonna move, you're gonna get frustrated, you're gonna get angry because your mental and cognitive map of the world is gonna be shifting sands as opposed to anything solid.
Jeremy Grater (26:14)
I'm gonna try to end this with something hopeful. You mentioned the power grid going down, and where I live, literally right now, yeah, right. Where I live right now, there are roads washed out, towns underwater; there's massive flooding. And I'm sure down where you are in Washington, same thing. But a couple days ago, in preparation for winter storms, the power company had a planned power outage. When it happened, it happened later in the day than we hoped, and we ended up having the neighbor's kid come over, and everybody was hanging out.
Jason Haworth (26:18)
Yeah. Hopeful.
Yeah. Same here.
Jeremy Grater (26:43)
And as the house got darker and darker, you know, we have our little solar generator that had the one lamp on, and we're just living like it's the 1800s or whatever. And the kids got bored, and they found a pack of glow sticks, the kind of bendy glow sticks. And they came up with this thing where they made a stick figure out of them, basically taped it to their bodies, and danced around in the dark. So it looked like this creepy blue stick man dancing around us for two and a half hours. We all laughed out loud at each other
Jason Haworth (27:07)
That's awesome. Yeah.
Jeremy Grater (27:13)
at this analog way of playing that I hadn't seen in my house in a very long time. They got out the board game, we were going to do that, and it was getting dark, and they got distracted by the glow sticks. But we laughed our asses off for hours. And so the hopeful thing here is that when the digital society collapses, I think there's still something inherent in us that can figure out how to make sticks and rocks fun toys to play with, and we can figure out a way to get back in touch with ourselves without being plugged in.
My little two hour experiment proved it. I hope on a broader scale when and if that happens, that theory holds.
Jason Haworth (27:48)
I
think it's safe to say that humans will figure out how to fuck off and have fun no matter what happens. I think if you lock three people in a room with nothing but whiteboards and pens, they'll probably figure out a way to have fun. They'll start drawing stuff and playing tic-tac-toe when they don't have access to other things, because when we get bored, we start trying to figure out distractions. And these devices are...
Jeremy Grater (27:55)
You
We get creative. Yeah.
Jason Haworth (28:17)
awesome distractions. They're really functional tools, but they are the best distraction machines that we could hope for at a societal level. Not best as in best for society, but best at distracting people and keeping them from doing some of the stuff they want to do. All that being said,
yeah, you should definitely step outside and go do something with your kids that's not connected to a screen or digital technology. Maybe take off your fitness tracker when you go do it, and try to be disconnected, to be in meatspace exclusively as much as possible.
Jeremy Grater (28:52)
Yeah.
Yeah, that night it definitely made me think we need at least a once-a-week device-free day: we get home from school, put everything in a box, leave it alone. And then, like, once a month we shut the power off and just see what happens. I think that's kind of a fun thing I want to experiment with.
Jason Haworth (29:10)
I can tell you what would happen in my house. People would say, turn the power back on.
Jeremy Grater (29:14)
⁓ I'm sure that'll be step one.
Jason Haworth (29:15)
If it was for the whole day, then the next day it would be resetting every fucking clock in my house that's not connected to the internet, so it's on the right time and date.
Jeremy Grater (29:23)
That is the annoying part.
And just imagine you'll soon have to subscribe to the tool that then reprograms the clock because you won't be able to do it manually anymore.
Jason Haworth (29:32)
Exactly, because they'll make it too difficult. They'll be like, all right, go figure out how to write this in YAML. And then the AI will create some new language that none of us can decipher. You know, it's already doing this. But yes, I'm going to doom-scroll here if I'm not careful.
Jeremy Grater (29:42)
All right. Well, that's enough doom and gloom. Hopefully there's a little bit of hope and optimism to take away from this episode. We're going to share the links to these articles and some of the things being discussed around these toys so that you're able to check them out. Those will be at our website, brobots.me, and that's also where you'll find a new episode from us next Monday morning. Thanks so much for listening. Pass this on to somebody who might need to hear it and not buy this shit for their kids. Help them out a little bit. We'll see you next Monday.
Jason Haworth (30:12)
Thanks everyone, bye.