Jon Haidt Finds Our Youth Are the Most Unhappy
At least 600 published papers indicate that happiness follows a U-shaped trajectory as people age, while unhappiness exhibits a hump-shaped pattern. Essentially, individuals tend to reach peak happiness in youth, with their student years often representing the apex of their contentment, before a midlife dip, or midlife crisis, around age 50. This phenomenon has been observed across the globe, with data going back to the 1970s.
However, Jon Haidt, author of The Anxious Generation, reported that a significant shift has occurred in recent years. Since around 2017, the well-being of young adults, particularly young women aged 18 to 25, has plummeted. This contrasts sharply with previous patterns.
The U-shaped happiness trajectory and hump-shaped pattern of unhappiness seem to have vanished. Now, young adults are, on average, the least happy demographic group, with happiness increasing with age while unhappiness declines—a trend observed globally.
Jon’s post on his Substack walks through some of the data on this shift in the relationship between happiness and age. While more research still needs to be done, we can’t help but speculate that social media, addictive technology, and kids being exposed to mature content too soon are playing a role. In other words, there’s no risk in NOT exposing our children to addictive media that often includes mature themes their precious brains aren’t ready for.
The Washington Post’s Social Media Etiquette Tips – We Agree!
The Washington Post recently shared an article highlighting nine social media etiquette tips. While we aren’t huge fans of social media, we do appreciate and agree with their first tip: “Don’t overshare your kids.”
The post does a great job of reminding parents that the internet is permanent! Once you post, it’s out there. It also notes that “oversharing kids can be a form of exploitation or expose them to predators.” How many of us are grateful that our childhood photos are hidden in photo albums on dusty shelves, away from strangers?
They also offer some tips for posting about your kids safely. The first suggestion is to post privately. We like to consider this question: “Would you print out the pictures and physically hand them to your followers?” If not, then don’t post it!
We also love posting with permission. That means before you post, you ask your child if it’s okay to share the image. Getting their consent teaches them that pictures are important and that they have the power to say no.
Lastly, if you are sharing images of your kids, a great idea is to blur their faces or cover them with an emoji. To learn about another reason why we should be cautious about sharing images of our children, see the link below: https://bit.ly/3wMTR3
OpenAI Wants ChatGPT to Mimic Human Thinking
OpenAI, the creator of ChatGPT, disclosed that its new AI model is slated to supersede GPT-4, the technology behind the latest version of ChatGPT. This represents a significant step toward what’s called “AGI” (artificial general intelligence), a fiercely debated concept referring to computers attaining human-like cognitive abilities.
Furthermore, OpenAI unveiled the formation of a new safety and security committee, chaired by CEO Sam Altman. The company committed to issuing recommendations following a 90-day review of its technology.
While a new safety team sounds great, a new AI model that matches human thinking sounds concerning. AI has been at the center of so many issues this year. To learn more about how to keep your family safe, see our ultimate guide to AI and Deepfakes linked in the caption or description: https://bit.ly/3TVFoeK
Telegram Is a Playground for Misinformation
Pro-Russian disinformation groups are taking advantage of the “extremely lax” regulations on Telegram, a widely popular messaging app, and EU authorities have limited options to address the issue.
Minutes after Slovakia’s Russia-friendly Prime Minister was shot, Telegram exploded with conspiracy theories.
The claims were eventually debunked, but the theories had already gone viral on Telegram.
The messaging app has become a known tool for pro-Kremlin accounts to disseminate disinformation aimed at weakening support for Ukraine. Recently, Russian intelligence officers have allegedly recruited petty criminals for sabotage activities through Telegram.
These incidents reveal Telegram’s main issue: there’s no accountability. European officials, who have prioritized combating fake news ahead of continent-wide elections in June, are practically powerless against the flood of disinformation. Despite their new regulatory powers over online information, they find themselves largely unable to control Telegram.
“Disinformation is spreading openly and completely unchecked on Telegram,” Estonian Prime Minister Kaja Kallas told Bloomberg.
Not many kids use Telegram. But we’ve written about it for parents to learn more. Just tap the link in the caption or description: https://bit.ly/3yDwKZP
Nebraska Sues TikTok for Creating a Mental Health Crisis
Nebraska filed a lawsuit against TikTok, claiming the platform is intentionally designed to be addictive, and therefore harmful to children.
The lawsuit accuses TikTok of creating an algorithm that delivers a continuous stream of videos aimed at capturing and maintaining the viewer’s attention. The complaint compares the algorithm’s effect on behavior to that of a “sophisticated gambling machine” and alleges deceptive trade practices.
The lawsuit also claims that TikTok’s addictive nature has led to a mental health crisis among Nebraska children, including depression, anxiety, and eating disorders.
At Protect Young Eyes, we recommend parents keep their children away from social media until age 16 for these very reasons. The risks of addictive social media are so much higher than any of the potential rewards. #delayistheway
Trigger Warning – New Lawsuit Regarding a 2022 School Shooting
This topic is tragic, heartbreaking, polarizing, and can be very triggering. Feel free to skip ahead if you need to.
In 2022, there was a school shooting in Uvalde, Texas. The families of the children killed are now suing the makers of Call of Duty, Instagram, and Daniel Defense, the manufacturer of the AR-15 rifle used in the shooting.
The two lawsuits, filed in California and Texas, claim that Daniel Defense marketed the gun to the shooter before they turned 18, through Call of Duty games and Instagram ads.
Now, Call of Duty is rated 17+, so it’s not very surprising for a gun manufacturer to allegedly feature its products in such a popular and mature game. Instagram, on the other hand, is rated 12+, so if Daniel Defense did target minors in its advertising, that’s a problem.
Regardless, the shooting in 2022 was heartbreaking. Our prayers go out to the families that are still suffering. No one should ever have to endure this kind of loss. Whether or not Daniel Defense is at fault, we are certain of this: kids are being exposed to mature content ahead of schedule. It’s too much, too soon. Their young, developing brains simply aren’t ready.
If you’d like to learn more about how mature content affects developing brains, click here: https://bit.ly/3V3KV1V
How Big Tech Could Help Protect Kids
Protecting children at the device layer has long been the focus of PYE founder Chris McKenna, co-author of the nation’s only device-level legislation to pass (the Child Device Protection Bill: https://endsexualexploitation.org/device-protection-bill/).
Last week’s Substack post from Jon Haidt gave Ravi Iyer, a professor, data scientist, and moral psychologist, the chance to explore the idea of device-driven protections for children. It makes perfect sense: your iPhone already knows everything about you. A digital identification could be used to allow or disallow access to any number of digital spaces on the device. Imagine automatic blocking of explicit, violent, and extreme websites accessed via browsers and search engines, or social media apps that can only be downloaded after parental consent (after all, downloading an app creates a “contract,” which can’t be formed with a minor).
Currently, complex parental control settings buried deep in devices demand that caregivers keep up with constant updates, bugs, and confusing UX. The App Store is the “pinch point” for all access to anything children do on smart devices. Controlling who can use certain apps, and in what ways, based on known user data only makes sense. It’s well past time for Apple, Google, Amazon (Kindle), and Samsung to fulfill their duty of care to protect children and empower parents.
“Chris, help! We have a tech emergency!”
That’s why we created The Table, our PYE membership where you can get immediate help in an emergency, ask basic tech questions in our Tech Support space, become a specialist in YouTube setup, pornography education, or compulsive gaming, and interact with other like-minded parents. Abby is there to take care of you! Join free for 7 days, try it out, and then stick around for $7/month or $70 for the year. I can’t wait to meet you at The Table!
Hello,
I would like some help managing my daughter’s iPhone. I have parental controls on her phone through Screen Time, but I wonder if there’s a way to prevent her from deleting texts? She deletes certain texts so I don’t see them. I’m assuming she deletes them because she’s not following our rules for her phone usage.
Thank you!
Hi! Unfortunately, there’s no way to prevent her from deleting texts. It’s a huge weakness on iPhones.