Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
If your teen suddenly has a new lexicon of mental health terms, like “trauma response” and “major depressive disorder,” TikTok may be to blame. A poll by EdWeek found that 55% of students use social media to self-diagnose mental health conditions, and 65% of teachers say they’ve seen the phenomenon in their classrooms.
“Kids are all coming in and I’m asking them, ‘Where did you get this diagnosis?’” said Don Grant, national adviser for healthy device management at Newport Healthcare, in an interview with The Hill. Grant said he would get responses such as “Oh, there’s an [influencer],” “Oh, I took a quiz,” or “Oh, there’s a group on social media that talks about it.”
Social media can help kids understand their feelings and find ways to cope. The EdWeek poll found that 72% of educators believe social media has made it easier for students to be more open about their mental health struggles. And it makes sense that kids would turn to a space they know — social media and online groups — to get information, rather than finding a mental health professional first (or talking to their parents).
However, the topic gets tricky because social media sites don’t exactly verify that the people sharing medical advice are, in fact, medical experts. While there are plenty of experts sharing legitimate information online, there are also influencers who are paid to promote products that supposedly improved their anxiety or off-label medications that cured their depression.
Big picture: Self-diagnosing on social media is also problematic because recommendation algorithms can create a self-fulfilling prophecy. Most platforms, including TikTok, use a user’s activity to determine what appears next in their feed. If a teen thinks they have depression, they’ll see more content about depression, which may reinforce their self-diagnosis even if they aren’t clinically depressed.
As parents, it’s important to talk to your child about mental health, how to cope with big emotions, and what to do if they need a professional. But it’s also essential to know where they’re getting their mental health information and what they’re seeing on their social media feeds.
Don’t dismiss their feelings outright — be curious. Talk to your child about verifying their sources of information. If they’re getting medical advice from an online creator, are they an actual doctor or therapist? Or are they simply someone who’s popular online?
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
Gov. Ron DeSantis recently signed a bill that bans kids under 14 from creating social media accounts and requires parental consent for kids under 16. The bill requires that companies delete accounts belonging to 14- and 15-year-olds and implement age verification measures to ensure that kids aren’t lying about their ages.
Florida’s bill is the most restrictive social media ban in the nation, and that’s after DeSantis vetoed an earlier version of the bill that would have banned all accounts for kids under 16. At the bill-signing ceremony, Republican Speaker Paul Renner said, “A child in their brain development doesn’t have the ability to know that they’re being sucked into these addictive technologies and to see the harm and step away from it, and because of that we have to step in for them.”
Legal upheaval: The bill takes effect Jan. 1, 2025, pending any legal challenges. Tech industry groups have already come out against the bill, including NetChoice, an association that represents major social media platforms and is currently battling with the Supreme Court over a separate social media law.
“This bill goes too far in taking away parents’ rights,” Democratic Rep. Anna Eskamani said in a news release. “Instead of banning social media access, it would be better to ensure improved parental oversight tools, improved access to data to stop bad actors, alongside major investments in Florida’s mental health systems and programs.”
In our last issue, we covered Utah’s decision to repeal and replace its social media law after months of legal challenges that delayed the bill’s implementation. Although DeSantis and Renner have signaled that they’re ready to fight to keep Florida’s social media ban in place, time will tell whether Florida’s kids will have to wait until their sweet 16 to get on Snapchat.
How will you check in with your child about online safety this week? Save these conversation-starters for your next check-in.
Sleep can impact everything from brain performance to mood to mental and physical health. Many children aren’t getting enough sleep, and screens are one of the prime suspects. But how does screen time affect sleep?
Pinterest use is up among teens. Gen Zers are using the website as a canvas for self-expression and exploration. Learn more about how to keep your child safe on the site with Pinterest parental controls.
😮💨 What is the “mental load” of parenting, and how does it affect your emotions, sleep quality, and job performance?
🚩 What are the red flags that you need to worry about your child’s mental health? Save this list from Techno Sapiens.
🤝 Rules and restrictions aren’t the end-all, be-all to parenting in the digital age — you also need a healthy, emotionally rich relationship with your teen. Read more at Psychology Today.
📵 When it comes to protecting kids’ mental health, Florida’s social media ban won’t be that simple, writes David French for the New York Times.
Last year, Utah passed a first-in-the-nation law that prevented kids from accessing social platforms without parental consent, among other restrictions. Fast-forward to now, and Utah Gov. Spencer Cox signed a pair of bills into law that repeal and replace almost everything. What happened?
As a recap, the 2023 Social Media Regulation Act: “required parental consent before kids can sign up for sites like TikTok and Instagram, prohibited kids under 18 from using social media between the hours of 10:30 p.m. and 6:30 a.m., required age verification for anyone who wants to use social media in the state, and sought to prevent tech companies from luring kids to their apps using addictive features,” via NPR.
Following major Big Tech lawsuits, Utah’s legislature recently passed H.B. 464 and S.B. 194. The new bills maintain age verification but repeal the ban on addictive design features, only require platforms to obtain parental consent if a child attempts to change certain privacy settings, and don’t require platforms to enable parental controls unless the minor agrees.
Like the previous version, the new legislation creates a process where parents can take social media companies to court. Parents can sue for a minimum of $10,000 per incident if a child has an “adverse mental health outcome” as a result of excessive social media use.
Big picture: Utah’s about-face underscores both the importance and difficulty of implementing social media regulation. After signing the Social Media Regulation Act into law in 2023, Gov. Cox nearly dared critics to sue the state over the law — and they did. NetChoice alleged the restrictions violated First Amendment free speech protections, and Foundation for Individual Rights and Expression filed a second lawsuit claiming that the age verification requirement is unconstitutional. Florida’s own social media ban, which was recently signed into law, faces similar legal challenges and delays.
Utah’s “repeal and replace” version of the bill aims to address some of the concerns raised in the lawsuits, while still taking steps to protect kids online. Sen. Mike McKell, one of the revised bills’ authors, said that there is data to support — and justify — the state’s push to put guardrails around social media use at the state level.
“One of the bars that we have to overcome in legislating when we’re looking at First Amendment issues is whether there is a compelling state interest,” he told the Salt Lake Tribune. “We’re trying to tell the court explicitly why we’re passing this. Here’s the intent behind it. Here’s what we’re seeing in our state and why we’re passing this law.”
It’s another bad PR week for Meta: The Wall Street Journal reports that federal prosecutors are looking into whether Meta, the parent company behind platforms such as Facebook and Instagram, facilitates and profits from the online sale of drugs.
It’s alarmingly easy to find controlled substances for sale online. A 2021 report by the Tech Transparency Project found that it takes just two clicks for kids to find potentially deadly drugs for sale on social media. According to the Drug Enforcement Administration, “Drug traffickers have turned smartphones into a one-stop shop to market, sell, buy, and deliver deadly, fake prescription pills and other dangerous drugs” — which can easily contain deadly doses of fentanyl.
US prosecutors sent Meta subpoenas last year and have been asking questions as part of a criminal grand jury probe. They have also requested records related to drug content or illicit sale of drugs via Meta’s platforms.
Bottom line: Investigations don’t always lead to formal charges, but this report places even more scrutiny on social media companies and how accountable they are for the content posted on their platforms. In a statement, a spokesperson for Meta said, “The sale of illicit drugs is against our policies and we work to find and remove this content from our services. Meta proactively cooperates with law enforcement authorities to help combat the sale and distribution of illicit drugs.”
We advocate for regular conversations about tech use and online safety. But how do you start those chats? We’re launching a new section this week: conversation-starters to kick off important dialogues with your kiddo about their devices, online interactions, and more. How will you check in with your kid about online safety this week?
You check your child’s phone or get an alert from your monitoring app, and you learn they’ve been messaging friends about drugs or looking at drug-related content online. Here’s what to do next.
“Sexting” refers to sending or receiving sexually explicit videos, images, or text messages. Here are some tips to talk to your teen about sexting, including the potential consequences and a plan for safe texting practices.
🕒 TikTok ban update: H.R. 7521 is sitting in a Senate committee, which is kinda like the waiting room of bills. The measure would ban applications controlled by foreign adversaries of the United States that pose a national security risk, and it unanimously passed the House earlier this month. The vote was held following a closed-door security briefing about TikTok’s risks, and a bipartisan group of legislators are pushing to declassify that information and hold a public hearing. Sens. Richard Blumenthal and Marsha Blackburn said, “As Congress and the Administration consider steps to address TikTok’s ties to the Chinese government, it is critically important that the American people, especially TikTok users, understand the national security issues at stake.”
📵 The costs of a phone-based childhood are harming our kids, writes social psychologist Jonathan Haidt.
👻 Snapchat is rolling out a feature that makes the messaging experience more like texts. The messages won’t vanish, but both users have to opt in to the new setting.
👀 72% of teens feel peaceful without their smartphone, according to a new Pew Research Center survey — but 46% of teens say their parents are sometimes distracted by their phone when they’re trying to talk to them.
Today, the House overwhelmingly voted to pass a bill that would effectively ban TikTok in the United States. The bill now heads to the Senate, where its future is less certain. The measure, H.R. 7521, would ban applications controlled by foreign adversaries of the United States that pose a clear national security risk.
For years, US officials have dubbed TikTok a national security threat. China’s intelligence laws could enable Beijing to snoop on the user information TikTok collects. Although the US government has not publicly presented evidence that the Chinese government has accessed TikTok user data, the House vote was preceded by a classified briefing on national security concerns about TikTok’s Chinese ownership.
If H.R. 7521 becomes law, ByteDance will have 165 days to sell TikTok. Failure to do so would make it illegal for TikTok to be available for download in U.S. app stores. On the day of the vote, TikTok responded with a full-screen pop-up that prompted users to call their members of Congress and express their opposition to the bill. In a post on X, TikTok shared: “This will damage millions of businesses, deny artists an audience, and destroy the livelihoods of countless creators across the country.”
“It is not a ban,” said Representative Mike Gallagher, the Republican chairman of the House select China committee. “Think of this as a surgery designed to remove the tumor and thereby save the patient in the process.”
The bottom line: The bill passed the House Energy and Commerce Committee unanimously, which means legislators from both parties supported the bill. Reuters calls this the “most significant momentum for a U.S. crackdown on TikTok … since then-President Donald Trump unsuccessfully tried to ban the app in 2020.” The TikTok legislation’s fate is less certain in the Senate. If the bill clears Congress, though, President Biden has already indicated that he would sign it.
If your child uses TikTok, it’s natural that they may have questions about the ban (especially if they dream of becoming a TikTok influencer). Nothing is set in stone, and it’s entirely possible that TikTok would simply change ownership. However, this is a good opportunity to chat with your kids about the ban, how they use the app, and what a change in ownership could mean for them.
Set your child’s account to private, limit who can message them, and limit reposts and mentions. With a few simple steps, you can make Instagram a safer place for your kid. Here’s how to get it done.
Yikes — you found out that your child has been sending concerning videos, images, or messages to someone else. We break down some of the reasons kids send inappropriate messages and how to approach them.
🏛️ An update on Florida’s social media ban: as expected, Governor Ron DeSantis vetoed a bill that would have banned minors from using social media, but signaled that he would sign a different version anticipated from the Florida legislature.
📵 Nearly three-quarters (72%) of U.S. teens say they feel happy or peaceful when they don’t have their smartphones — but 44% say they feel anxious without them, according to Pew Research Center.
📖 Do digital books count as screen time? The benefits of reading outweigh screen time exposure, according to experts.
🗺️ How can parents navigate the challenges of technology and social media? Set limits, help your child realize how much time they spend on tech, and model self-restraint. Check out these tips and more via Psychology Today.
A Florida bill that bans minors from using social media recently passed the House and Senate. The bill, HB1, is now on Governor Ron DeSantis’ desk. He’ll have until March 1 to veto the legislation or sign it into law.
DeSantis has previously said that he didn’t support the bill in its current form, which bars anyone younger than 16 years old from creating new social media accounts — and closes existing accounts belonging to kids under 16. (DeSantis has called social media a “net negative” for young people, but said that, with parental supervision, it could have beneficial effects.) Unlike online safety bills passed in other states, HB1 doesn’t allow minors to use social media with parental permission: if you’re under 16, you can’t have an Instagram account.
Even if DeSantis vetoes the bill, the fact that such an aggressive bill passed both the House and Senate with bipartisan support signals that the conversation about online safety legislation is reaching a tipping point.
The Kids Online Safety Act (KOSA), which implements social media regulations at the federal level, also recently reached a major milestone: an amended version gained enough supporters to pass the Senate. If it moves to a vote, it would be the first child safety bill to get this far in 25 years, since the Children’s Online Privacy Protection Act passed in 1998.
If passed, KOSA would make tech platforms responsible (aka have a “duty of care”) for preventing and mitigating harm to minors on topics ranging from mental health disorders and online bullying to eating disorders and sexual exploitation. Users would also be allowed to opt out of addictive design features, such as algorithm-based recommendations, infinite scrolling, and notifications.
In a previous iteration of KOSA, state attorneys general were able to enforce the duty of care. However, some LGBTQ+ groups were concerned that Republican AGs would use the law to take action against resources for LGBTQ+ youth. The amended version leaves enforcement to the Federal Trade Commission — a move that led a number of advocacy groups, including GLAAD, the Human Rights Campaign, and The Trevor Project, to state that they wouldn’t oppose the new version of KOSA if it moves forward. (So, not an endorsement, but not-not an endorsement.)
What’s next? As of this publication, DeSantis has not signed or vetoed Florida’s social media ban. Plus, KOSA has yet to be introduced to the Senate for a vote, and it’s flying solo — there is no companion bill in the House, which would let both chambers consider the measure simultaneously.
However, the fallout from January’s Senate Judiciary Committee hearing — in which lawmakers grilled tech CEOs about their alleged failure to stamp out child abuse material on their platforms — may build momentum for future online safety legislation. We’ll keep our eyes peeled.
Spotify offers everything from podcasts to audiobooks — and with all of that media comes content concerns. The good news: both Spotify Kids and Spotify parental controls allow kids to enjoy their tunes while keeping their ears clean.
If you remember watching the pirate-themed anime series One Piece, you might be excited about the recently released live-action remake now streaming on Netflix and eager to share your love of the show with your kids. But is One Piece for kids?
🔒 Did you know that 90% of caregivers use at least one parental control? That’s according to a new survey from Microsoft.
📱 Social media is associated with a negative impact on youth mental health — but a lot of the research we have tends to focus on adults. In order to really understand cause and effect, researchers need to talk to teens about how they use their phones and social networks. Read more via Science News.
🛑 Meta announced the expansion of the Take It Down program, which is “designed to help teens take back control of their intimate images and help prevent people — whether it’s scammers, ex-partners, or anyone else — from spreading them online.”
If your child owns a smartphone, odds are high that they’re currently involved in at least one group chat.
From social media to text messages, group chats are the modern equivalent of cliques. However, just like cliques that cluster next to lockers and gossip that spreads through whispers, group chats come with their own set of issues. It’s crucial for parents to understand this digital landscape so they can guide and support their kids through the ups and downs.
When posting on social media, teens have to negotiate the dynamics of different audiences seeing their posts. But group chats can feel more private and protected, allowing kids to share inside jokes and video calls with a smaller group of friends. As opposed to passively scrolling through a feed, these more active types of behavior can support greater perceptions of social support and belonging. Being part of a group chat, and keeping up with it, can help teens express their identity and feel closer to their friends.
At the same time, group chats come with risks.
We’re big proponents of staying involved in your child’s digital life. That includes setting boundaries around device usage and regularly monitoring their text threads and social media inboxes.
It’s also important to keep the lines of communication open. Ask your kid who they’re messaging, and let them know they can come to you when problems arise. You can also use a text monitoring service like BrightCanary to keep tabs on their messages and step in when they encounter anything concerning.
You know your child best. Check in with them, start the conversation about personal safety, and discuss when it’s time to leave a chat — especially if things turn harmful or make them feel bad.
Since 2018, Instagram users have had the option to create a list of Close Friends, and use it to limit who could see their Stories. Recently, Instagram expanded this option to include posts and Reels — we break down why we love this for parents.
It’s a familiar scene of modern parenting: your kid, hunched over their iPhone, furiously texting. You, dying to know what they’re saying. But should parents read their child’s text messages? If you decide to monitor your kid’s text messages on iPhone, how do you do it?
🏛️ The problems with social media got a lot of attention late last month around the Senate Judiciary Committee hearing, in which lawmakers grilled five tech CEOs about concerns over the effect of technology on youths. Following the 3.5-hour hearing, some experts say that the momentum will help pass rules to safeguard the internet’s youngest users, while others say congressional gridlock will keep potential legislation in stasis.
💼 One takeaway from the Senate hearing: don’t mess with the APA because they will fact-check your claims. After Meta CEO Mark Zuckerberg claimed social media isn’t harmful to mental health, Mitch Prinstein, PhD, chief science officer of the American Psychological Association, clapped back and accused Zuckerberg of cherry-picking from the APA’s data.
🤖 How can AI help give teens protection and privacy on social media? Afsaneh Razi, assistant professor of information science at Drexel University, writes about how machine learning programs can identify unsafe conversations online (the same approach that BrightCanary takes!).
Last week, New York City Mayor Eric Adams issued a health advisory about social media due to its impact on children. Mayor Adams designated social media an “environmental health toxin,” stating, “Companies like TikTok, YouTube, [and] Facebook are fueling a mental health crisis by designing their platform with addictive and dangerous features. We can not stand by and let big tech monetize our children’s privacy and jeopardize their mental health.”
The health advisory aligns with findings from a recent survey published by Common Sense Media, which examined the state of kids and families in America in 2024. Based on responses from about 1,220 children and teens aged 12–17 nationwide, the survey found that adolescents today are more concerned about their mental health than previous generations were.
Some notable statistics from the survey:
Social media isn’t the only factor impacting the youth mental health crisis. In a conversation with Education Week, Sharon Hoover — co-director of the National Center for School Mental Health — pointed to a range of factors that could be contributing to declining mental health among kids and teens, such as housing insecurity and food insecurity. These issues were exacerbated by the pandemic, to the point that living in an area with more severe COVID-19 outbreaks was deemed a risk factor for youth mental health symptoms.
Last year, the U.S. Surgeon General issued a health advisory warning that social media is a concern for adolescents. Excessive social media use is associated with depression and anxiety, as well as downstream effects from negative impacts on sleep quality. The key word here is “excessive” — it’s important for parents to set guardrails around the level of access kids have online, including how much time they spend on social media (and screens in general).
Here are some places to start:
Social media monitoring refers to supervising your child’s activity on social networks, such as Instagram and TikTok. The most effective plan for monitoring a child’s social media accounts employs a mix of approaches. Here are some options to explore.
A frustrating number of parental control settings are designed in such a way that kids can easily bypass or even change them, rendering them all but useless. Luckily, there are options which allow parents to set boundaries and have some peace of mind.
👀 Do social media insiders let their kids use platforms like Instagram and TikTok? Parents working at large tech companies said they did not trust that their employers and their industry would prioritize child safety without public and legal pressure, the Guardian reports.
📱 Meta has rolled out a few updates for teen users: new “nighttime nudges” will remind teens to go to sleep when they use Instagram late at night, and Instagram will restrict strangers from sending unsolicited messages to teens who don’t follow them. Meta will also let guardians approve or deny changes that teens make to default privacy settings.
🏛️ The Florida House has passed a bill to ban social media accounts for users under age 16. The bill doesn’t list which platforms would be affected, but it targets any social media site that tracks user activity, allows children to upload material and interact with others, and uses addictive features designed to cause excessive or compulsive use.
📍 With just a few pieces of information, this TikToker can pinpoint your exact location — and it’s a great lesson in online safety.
In light of mounting claims that companies aren’t doing enough to protect the mental well-being of young people, Meta recently announced that it will restrict the type of content that teenagers can see on Facebook and Instagram.
The updated settings are designed to “give teens more age-appropriate experiences on our apps.” Meta will default all teen users to the most restrictive content settings, which make it more difficult for people to come across potentially sensitive content or accounts. Teens will receive a prompt to update their privacy settings and restrict who can contact them. Meta will also prevent teen users from seeing content that references self-harm, eating disorders, or nudity, even from people they follow.
In October, a bipartisan group of 42 attorneys general announced that they’re suing Meta, alleging that the company’s products are hurting teenagers and contributing to mental health problems. New York Attorney General Letitia James said, “Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem.”
In November, Meta whistleblower Arturo Bejar told lawmakers that the company was aware of the harm its products cause young users, but failed to fix the problems. Meta and other tech companies are incentivized to keep young users on their platforms — a recent study found that social media platforms (including Facebook, Instagram, and YouTube) generated $11 billion in advertising revenue from U.S. users younger than 18 in 2022 alone.
Our take: Even before this announcement, Meta already had protections in place for younger users. (We’re fans of Instagram’s Family Center.) However, many parents aren’t aware of these features, and it makes a lot of sense to automatically implement protections that are developed in alignment with experts in adolescent development. While we wish those protections were instituted sooner (before everyone started suing Meta), we’re still calling this a win.
At the same time, the success of these features depends on two things: whether your child listed their age correctly on their account, and the level of supervision parents have over their children’s accounts. If your child created their own online account, double-check that they’ve listed their age correctly.
And if you don’t already have a parental monitoring practice in place, now’s the time to start. We always suggest having regular tech check-ins with your kids to go through their phone together, creating space to discuss internet safety issues, and using a monitoring tool like BrightCanary to get alerts if your child encounters anything inappropriate.
So, you found out that your kid saw something definitely meant for adults. Don’t panic! Here’s how to talk to kids about inappropriate content they may encounter online.
Platforms that allow users to interact are prime places for predators to solicit kids. From YouTube comments to linked social accounts, there are still several ways strangers can talk to your kids that parents should know.
⚖️ The Washington Post covers the states looking to pass online safety bills in 2024, including California, Minnesota, Maryland, and New Mexico. In Florida, a bill that could ban minors from using social media is up for legislative consideration today.
📱 A new study says that almost half of British teens feel addicted to social media. Out of 7,000 respondents, 48% said they agreed or strongly agreed with the statement, “I think I am addicted to social media.” A higher proportion of girls agreed compared to boys (57% vs. 37%).
📖 How is growing up in public shaping kids’ self-esteem and identity? Pamela B. Rutledge, Ph.D., discusses how parents should actively engage with kids’ digital activities as guides, not intruders or spies. She also reviews Devorah Heitner’s new book about kids coming of age in a digital world: Growing Up in Public.
Does this sound familiar? You’re about to go to bed, but you quickly check Instagram — only to find yourself an hour later, still scrolling and sending parenting memes to friends. Or maybe your teenager, despite trying various apps and settings to curb screen time, inevitably gravitates back to their phone when they should be studying.
It’s a scene that plays out in households throughout the country, a testament to the addictive allure of social media and modern technology for both parents and kids. Enter: Unpluq. Founded in 2020, Unpluq promises to help you reclaim 78 minutes of your day from screens. It combines a screen time management app with a unique physical tag, which acts as a key to access blocked apps. That action is meant to change your screen time habits, making scrolling or opening apps a conscious choice.
We emailed with Caroline Cadwell, Unpluq CEO and co-founder, about the habit loops that keep us scrolling, how parents can model appropriate digital behavior for their kids, and the future of screen time management.
What inspired you to create Unpluq, and how do you believe it can play a role in fostering a healthier digital environment at home?
I realized that I was always connected and overworked. Walking my dog? Responding to a Slack message. Dinner with my partner? I just need to do one more thing on the computer. My cofounders, Tim and Jorn, had similar struggles. Tim lost sleep to endless scrolling and watching short videos before bed. Jorn was distracted from his studies. There was no intentionality to it. We didn’t have control of our time.
We noticed that existing tools like Screen Time on iOS and Digital Wellbeing on Android were too easy to bypass. With Tim’s background in interaction design and our understanding of Rational Override Theory — basically, making the things you don’t want to do, harder to do — we developed the Unpluq Tag.
It acts as a physical, wireless key to your addictive apps. By using physical interruption, we’ve been able to tap into our brain’s ability to make conscious decisions over automatic, addictive behavior.
As parents and as adults, we’re constantly setting examples for the kids around us. But trillions of dollars have been poured into keeping your attention on that screen. We know that more time with our devices makes us unhappy, and we don’t want that for the next generation — but it is so hard to break the behavior pattern without effective tools.
Many people equate mindfulness with reducing screen time, but in today’s digital age, complete disconnection isn’t always feasible. How does Unpluq strike a balance between necessary connectivity and mindful detachment?
Life goes fast with smartphones. We constantly have to process so much information at once. Some of us wish we could go back to the Nokia 5160 era, but we still rely on our phones for essential tasks like banking, work, and socializing.
That's why leaving our phones in another room, using airplane mode, and deleting social media apps aren't solutions that work for people. "Digital detox" is a popular phrase, but it's unsustainable. How many people do you know who announced they were leaving Facebook for a while, only to crop back up a few days later?
What Unpluq allows you to do is add a layer of intention to how you use your phone, and that makes it a lasting habit change. It makes it harder to slip into unwanted patterns, encouraging lasting habit changes and allowing you to follow through with your intention to read a book or spend quality time with your kids.
In an interview with GeekWire, you described Unpluq as something that helps people “overcome what has been engineered against the very biology of being human.” This is such a fascinating comment, particularly when considering research that shows babies react negatively when they see their parents engrossed in their phones. Can you expand more on your statement and explain how you believe Unpluq can help people be more present for their families?
For the past decade, as smartphones have become more commonplace, we've seen a rise in mental health issues — especially among young adults. Rates of anxiety, depression, and suicide have increased. It's no surprise that these outcomes strongly correlate with screen time.
The attention economy, fueled by features like the “endless scroll” and “like button,” exploits the parts of our brain associated with reward and pleasure. It’s no shock that these technologies and features can be labeled addictive. It’s unreasonable to tell a human brain, programmed to seek reward, to stop seeking it.
Since the launch of Unpluq, what feedback have you received from parents?
The happiest feedback I’ve gotten from families using Unpluq is that as parents, they were able to set a good example. They were able to be present with their children.
One customer said he’s been more present at dinner, when previously, he was still engaged with work. Another said, “My son, 13, is a big fan of Unpluq. He used to scroll TikTok and Instagram for hours and hours — now he does his homework.”
It's pretty universal that parents want to be the best they can for their kids, to raise them as well as they can, and to set them up for success. Much like any other important life skill, parents have to teach their kids what responsible phone and internet usage looks like, how to self-regulate, and how to moderate their intake. By and large, we're failing to do so because we lack effective tools.
A friend is an elementary school teacher. She asked her class what their parents could do better, and roughly 80% of the kids said some version of “stop using the phone and pay attention to me when I’m talking.” It’s heartbreaking — but understandable, too.
What has surprised you most about your work with Unpluq?
The lasting change that Unpluq helps people accomplish has surprised and delighted me. Most people continue to use it long-term, saving an average of one hour and 22 minutes a day. That means, in a year, they’ve saved 35 waking days of their life. How incredible is it to gain an extra month every year, all in your control?
Learn more about how Unpluq works. This interview has been edited for length and clarity.
Giving your kid their first phone can feel overwhelming. Will they talk to predators? Will they end up in a dangerous situation? It’s so important to stay engaged with what your kids do online — and that’s exactly what the BrightCanary iPhone parental monitoring app is designed to do. But how does BrightCanary work?
If you’re here, you want to keep your kids safe. BrightCanary provides better monitoring where most kids are facing risks, like text messages and Instagram. And a growing number of experts recommend monitoring your child’s online accounts, from the American Psychological Association to the U.S. Surgeon General.
In this guide, we’ll share how to set up BrightCanary. Good news: it’s easy and just takes a few steps. Let’s dive in.
BrightCanary monitors your kid’s accounts on Google, YouTube, Instagram, and TikTok to help you supervise what they watch, send, and share.
The BrightCanary app runs on your phone, not your kid’s phone. That means your child can’t easily delete it, and you don’t have to install anything extra on their device.
Once you download BrightCanary from the App Store, you connect the online accounts you want to monitor. (We’ll walk you through how to do that later.) Our app takes care of the rest.
We use advanced AI filters to track what your child is watching, posting, and searching. If your child runs into concerning content like violence, hate speech, or self-harm, we'll alert you that you need to get involved — so you don't have to review every message or post yourself.
Our advanced technology even summarizes text message threads, so you don’t have to read every message. Review detailed summaries with insights based on the American Psychological Association’s emotional communication guidelines, and get helpful parent coaching tips and conversation-starters — all from your phone.
And unlike other monitoring solutions, you don’t have to plug your child’s phone into your computer or sync with your home WiFi network. BrightCanary simply runs in the background on your iPhone.
We’ve covered how BrightCanary works, but how do you set it up? BrightCanary is designed to monitor your child’s Google, YouTube, TikTok, and Instagram accounts, plus text messages on Apple devices.
First, you’ll need to create your child’s online accounts if you haven’t already done so. Setting up with their own account is helpful for a few reasons:
Here’s how to create your child’s profile and add their online accounts.
From the home screen, select “Set up BrightCanary for your family.” Then, follow these steps:
That’s it! If you want to monitor multiple children, you can create multiple profiles by clicking the + button at the top of the screen. And if you don’t know your child’s logins, here’s how to ask your children for their passwords.
Here’s a more detailed breakdown of how to add your child’s accounts to BrightCanary, divided by platform:
With BrightCanary, you can monitor your child’s texts and iMessages on their iPhone, iPad, and other Apple devices.
What you’ll need to begin set up:
To connect your child’s account in BrightCanary:
New text messages may take some time to appear. BrightCanary updates with any new text messages every few hours.
First, see if your child has a Google account or if they need to create one. (If your child has their own YouTube account, use those same credentials to sign into their Google account.)
It’s easier to monitor your child’s activity when they have their own Google account. For example, if they use your account to watch YouTube and you sync it with BrightCanary, our app might flag your activity.
Once you connect your child’s Google account, you’ll be able to monitor what they’re watching and searching on Google and YouTube.
If they already have a Google account:
If you need to set up a Google account:
That’s it! Once you’ve created their Google account, make sure your child uses this information to log into Google and YouTube on their devices.
If you haven’t already, this is a great opportunity to talk to them about how and why you’re using BrightCanary to keep them safe online. A digital device contract can help lay out rules and expectations for both of you.
TikTok is wildly popular among tweens and teens: according to Pew Research Center, 67% of kids are on this social platform. Kids use TikTok for entertainment, messaging friends, and as creative inspiration. You can monitor their videos, posts, and direct messages with BrightCanary.
Pro tip: Set up TikTok Family Pairing to add extra parental controls on your child’s TikTok account, like content filters and time limits.
Sixty-two percent of kids use Instagram for everything from messaging their friends to following their favorite influencers. BrightCanary will show you their Instagram feed, posts, comments, and direct messages.
If your child used their Facebook account to create their Instagram (confusing, we know), tap “Continue with Facebook” instead. Enter their login information when prompted.
Once you’ve added your child’s accounts, BrightCanary’s dashboard will populate with an overview of their online activity. From your dashboard, you’ll be able to view information like how many videos they’ve recently watched and top keywords searched.
We use cutting-edge technology to monitor your child’s text messages and provide helpful summaries, so you don’t have to go through everything yourself. Get quick insights into the emotional tone of your child’s text threads and use our parental coaching prompts to have more meaningful conversations with your child.
Activity reports give you a detailed overview of activity in individual accounts, like Google or Instagram. Tap an account in your dashboard to view.
For example, if you tap into YouTube, you’ll see an overview of what your child has recently watched and searched.
If you specifically want to review concerning content, just tap the “Concerning” tab. BrightCanary will flag content with tags. For instance, “viewed image” means your child opened an image with inappropriate material.
You can customize what alerts you receive and how often BrightCanary alerts you. Here’s how:
Once you’ve tracked down your child’s passwords, keep them somewhere safe. Our Password Vault is designed to store your child’s login information in one secure location. To access the Password Vault, tap “More” and then tap “Password Vault.”
And if you’re looking for answers to your toughest questions about parenting in the digital age, we’ve got you covered. Our Ask the Canary AI chatbot is designed to give you helpful conversation-starters, advice, and resources, directly from your phone. It’s an anonymous service that you can use any time — just tap “More” and then “Ask the Canary.”
Unlike other parental monitoring apps, BrightCanary was designed for iOS devices. That means we offer features that other apps don't, like full text message conversations and online activity on social media. Here's how BrightCanary compares on Apple devices.
You don’t have to download anything on your child’s phone, so they won’t notice any interruptions on their accounts. However, we recommend that parents use BrightCanary in collaboration with their kids. Explain why you’re using BrightCanary, how you’ll use it, and why it’s important to help keep them safe. Some of our parents make BrightCanary monitoring a requirement if a child wants their own phone and add it to their family’s digital device contract.
Yup. We’re real people with families behind this app! BrightCanary was founded in 2022. Based in Seattle, we’re a small team of parents who want to help parents guide, protect, and connect with their kids as they learn how to navigate the digital world. Learn more about our story.
Parenting in the digital age can feel like an uphill battle. Fortunately, parental monitoring apps like BrightCanary are tools you can add to your parenting toolbox. Remember, parental monitoring is just one piece of the puzzle — it’s just as important to have regular conversations with your child about their online activity and how to handle what they see online.
For more parenting tips, here’s how to help your child use social media responsibly. Subscribe to our parenting newsletter for weekly advice, news, and resources.
If you’re a parent of a tween, you’ve likely heard about Roblox. But what is Roblox, exactly? And how can Roblox parental controls make your child’s experience safer?
To help you make sense of your kid’s Roblox obsession and navigate potential safety concerns, we’ve compiled this handy Roblox parent’s guide.
Roblox is a wildly popular online gaming platform with over 40 million games for users to choose from. (Yes, you read that number right.)
All of the games (aka “experiences”) on Roblox are user-generated, immersive, 3D worlds with open-ended play and the ability to interact with other players. Popular games allow users to do things like adopt and raise pets, work in a pizza parlor, and live in a fictional town.
Roblox uses a freemium model, meaning it's free to download and play. But upgraded features, such as special avatar outfits and unique abilities inside games, come at a price.
(Pro tip: Check out our section below on Roblox parental controls to prevent your kid from racking up unauthorized charges.)
If you’ve landed on this article, you’re probably curious if Roblox is safe for your kid to play. The answer is yes … and no. Like most things online, it comes down to how it’s used.
I personally allow my 8-year-old to play Roblox, and it would seem I’m not alone, considering over half of users are under the age of 13.
While there are no official age restrictions for using the platform, the open-chat feature and user-generated content have led Common Sense Media to rate Roblox as safe for ages 13 and up. It's possible for your child to encounter cartoon weapons, violence, and inappropriate language on Roblox.
Roblox also takes safety seriously and has a number of guardrails in place, such as automatic chat filtering for younger users and age recommendations for all content on the platform. These age categories are all ages, 9+, and 13+.
Despite the potential risks when playing Roblox, there are several big benefits. For one thing, the open-ended play and immersive worlds lend themselves very well to the way kids naturally play. Add to that the ability to design games and play online with friends, and it’s easy to see there’s plenty of wholesome value to be gained.
Given the benefits and the ability to customize the experience to fit the age and maturity of your child, Roblox is safe for kids with proper precautions.
Roblox features a robust suite of parental controls, which they refer to as Account Restrictions.
Because of the open-chat feature, user-generated content that could be unsuitable for children, and the existence of in-game purchases, we highly recommend parents take full advantage of these safety features.
Roblox has a comprehensive guide to using Account Restrictions on their website, which will lead you step-by-step through the process of securing your child’s account.
Here are some of the Roblox parental controls you can use:
The chat function and in-game purchases are two of the highest-priority settings to review. Roblox recently expanded its platform to encourage creators to make experiences for users ages 17+. Kids won't be able to engage with these experiences, but a higher proportion of adult users means that it's a good idea to limit how your child can interact with people they don't know.
Additionally, Roblox's in-game currency (called Robux) enables kids to get special objects in games, outfits for their avatars, and other perks — but Robux costs real money to purchase. You can set limits on your child's spending with Account Restrictions.
It’s worth noting that all of the Account Restrictions can be overridden by the user, unless there’s a parent PIN on the account — so pick a PIN your child won’t guess.
Using Account Restrictions is paramount to keeping your child safe on Roblox, but your responsibility doesn’t end there.
Roblox has users of all ages. As your child begins using the app, take the time to review basic online safety measures with them, including the importance of not sharing personal information online.
At BrightCanary, we always advise against a set-it-and-forget-it approach to your kid’s online activity. Keep an eye on their Roblox use and make it a point to regularly sit down with them to see what they’re playing. These regular check-ins will help you spot any problems that may sneak through the safeguards — and you get the bonus of some bonding time with your kiddo.
Roblox is a popular online gaming platform that offers many benefits to kids, from creativity to social bonding. Potential safety concerns can be effectively mitigated by taking advantage of parental controls, discussing safe use with your child, and practicing regular tech check-ins.