As a parent, you want your child to surround themselves with good influences. That’s true not only for who they spend time with in real life, but also for the people and ideas they’re exposed to on social media.
If you or your child are concerned about the content appearing in their feed, one beneficial step you can take is to help them reset their social media algorithm. Here’s how to reset your child’s algorithm on TikTok, Instagram, and other platforms.
Social media algorithms are the complex computations that operate behind the scenes of every social media platform to determine what each user sees.
Everything on your child’s social media feed is likely the result of something they liked, commented on, or shared. (For a more comprehensive explanation, check out our Parent’s Guide to Social Media Algorithms.)
Social media algorithms have a snowball effect. For example, if your child “likes” a cute dog video, they’ll likely see more of that type of content. However, if they search for topics like violence, adult material, or conspiracy theories, their feed can quickly be overwhelmed with negative content.
Therefore, it’s vital that parents actively examine and reset their child’s algorithm when needed, and also teach them the skills to evaluate it for themselves.
Research clearly demonstrates the potentially negative impacts of social media on tweens and teens. How it affects your child depends a lot on what’s in their feed. And what’s in their feed has everything to do with algorithms.
Helping your child reset their algorithm is a wonderful opportunity to teach them digital literacy. Explain why it’s important to think critically about what they see on social media, and how what they do on the site influences the content they’re shown.
Here are some steps you can take together to clean up their feed:
Resetting all of your child’s algorithms in one fell swoop can be daunting. Instead, pick the app they use the most and tackle that first.
If your kiddo follows a lot of accounts, you might need to break this step into multiple sessions. Pause on each account they follow and have them consider these questions:
If the answer is “yes” to any of these questions, suggest they unfollow the account. If they’re hesitant — for example, if they’re worried unfollowing might cause friend problems — they can instead “hide” or “mute” the account so they don’t see those posts in their feed.
On the flip side, encourage your child to interact with accounts that make them feel good about themselves and portray positive messages. Liking, commenting, and sharing content that lifts them up will have a ripple effect on the rest of their feed.
After you’ve gone through their feed, show your child how to examine their ad settings. These mostly influence sponsored content, but considering the problematic history of advertisers marketing to children on social media, it’s wise to take a look.
Every social media app has slightly different options for how much control users have over their algorithm. Here’s what you should know about resetting the algorithm on popular apps your child might use.
To get the best buy-in and help your child form positive long-term content consumption habits, it’s best to let them take the lead in deciding what accounts and content they want to see.
At the same time, kids shouldn’t have to navigate the internet on their own. Social platforms can easily suggest content and profiles that your child isn’t ready to see. A social media monitoring app, such as BrightCanary, can alert you if your child encounters something concerning.
Here are a few warning signs you should watch out for as you review your child’s feed:
If you spot any of this content, it’s time for a longer conversation to assess your child’s safety. You may decide it’s appropriate to insist they unfollow a particular account. And if what you see on your child’s feed makes you concerned for their mental health or worried they may harm themselves or others, consider reaching out to a professional.
Algorithms are the force that drives everything your child sees on social media and can quickly cause their feed to be overtaken by negative content. Regularly reviewing your child’s feed with them and teaching them skills to control their algorithm will help keep their feed positive and minimize some of the negative impacts of social media.
Just by existing as a person in 2023, you’ve probably heard of social media algorithms. But what are algorithms? How do social media algorithms work? And why should parents care?
At BrightCanary, we’re all about giving parents the tools and information they need to take a proactive role in their children’s digital life. So, we’ve created this guide to help you understand what social media algorithms are, how they impact your child, and what you can do about it.
Social media algorithms are complex sets of rules and calculations used by platforms to prioritize the content that users see in their feeds. Each social network uses different algorithms. The algorithm on TikTok is different from the one on YouTube.
In short, algorithms dictate what you see when you use social media and in what order.
Back in the Wild Wild West days of social media, you would see all of the posts from everyone you were friends with or followed, presented in chronological order.
But as more users flocked to social media and the amount of content ballooned, platforms started introducing algorithms to filter the flood of posts and surface what each user would find most relevant and interesting. The goal is to get users hooked and keep them coming back for more.
Algorithms are also hugely beneficial for generating advertising revenue for platforms because they help target sponsored content.
Each platform uses its own mix of factors, but here are some examples of what influences social media algorithms:
Most social media sites heavily prioritize showing users content from people they’re connected with on the platform.
TikTok is unique because it emphasizes showing users new content based on their interests, which means you typically won’t see posts from people you follow on your TikTok feed.
With the exception of TikTok, if you interact frequently with a particular user, you’re more likely to see their content in your feed.
The algorithms on TikTok, Instagram Reels, and Instagram Explore prioritize showing you new content based on the type of posts and videos you engage with. For example, the more cute cat videos you watch, the more cute cat videos you’ll be shown.
YouTube looks at the creators you interact with, your watch history, and the type of content you view to determine suggested videos.
The more likes, shares, and comments a post gets, the more likely it is to be shown to other users. This momentum is the snowball effect that causes posts to go viral.
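To make the snowball effect concrete, here is a minimal, purely illustrative sketch of an engagement-driven feed ranker. The function names, weights, and data shapes are all hypothetical — real platform algorithms combine hundreds of signals — but the feedback loop is the same: every interaction nudges a topic’s weight up, so the next feed shows more of the same.

```python
# Toy model of an engagement-driven feed. Hypothetical names and weights —
# not any platform's actual algorithm.

def engagement(post):
    # Comments and shares typically signal stronger interest than likes.
    return post["likes"] + 2 * post["comments"] + 3 * post["shares"]

def score(post, interests):
    # Popular posts rank higher, boosted further by the user's topic affinity.
    return engagement(post) * (1 + interests.get(post["topic"], 0))

def build_feed(posts, interests):
    # Highest-scoring posts appear first in the feed.
    return sorted(posts, key=lambda p: score(p, interests), reverse=True)

def record_interaction(interests, topic):
    # Each like/watch raises that topic's weight, so the next feed
    # surfaces more of the same — the snowball effect described above.
    interests[topic] = interests.get(topic, 0) + 1
```

In this sketch, a user who starts with an empty interest profile sees whatever is most popular overall; after “liking” a few cat videos, cat content outranks everything else — which is why a handful of clicks can reshape an entire feed.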
There are ways social media algorithms can benefit your child, such as creating a personalized experience and helping them discover new things related to their interests. But the drawbacks are also notable — and potentially concerning.
Since social media algorithms show users more of what they seem to like, your child’s feed might quickly become overwhelmed with negative content. Clicking a post out of curiosity or naivety, such as one promoting a conspiracy theory, can inadvertently expose your child to more such content. What may begin as innocent exploration could gradually influence their beliefs.
Experts frequently cite “thinspo” (short for “thinspiration”), a social media topic that aims to promote unhealthy body goals and disordered eating habits, as another algorithmic concern.
Even though most platforms ban content encouraging eating disorders, users often bypass filters using creative hashtags and abbreviations. If your child clicks on a thinspo post, they may continue to be served content that promotes eating disorders.
Although social media algorithms are something to monitor, the good news is that parents can help minimize the negative impacts on their child.
Here are some tips:
It’s a good idea to monitor what the algorithm is showing your child so you can spot any concerning trends. Regularly sit down with them to look at their feed together.
You can also use a parental monitoring service to alert you if your child consumes alarming content. BrightCanary is an app that continuously monitors your child’s social media activity and flags any concerning content, such as photos that promote self-harm or violent videos — so you can step in and talk about it.
Keep up on concerning social media trends, such as popular conspiracy theories and internet challenges, so you can spot warning signs in your child’s feed.
Talk to your child about who they follow and how those accounts make them feel. Encourage them to think critically about the content they consume and to disengage if something makes them feel bad.
Algorithms influence what content your child sees when they use social media. Parents need to be aware of the potentially harmful impacts this can have on their child and take an active role in combating the negative effects.
Stay in the know about the latest digital parenting news and trends by subscribing to our weekly newsletter.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
🚫 TikTok banned from operating in Canada, may remain in US: The controversial social media app was politely yet firmly asked to shutter its offices in Canada, although Canadians can still use and access TikTok at their own risk. CBC reports that the decision was based on national security concerns and the advice of Canada’s security and intelligence community — pretty similar to the reasons TikTok was heading toward a ban in the US. President-elect Trump has signaled that he will try to halt the US ban, though the attempt would likely face formidable political challenges and legal hurdles. Either way, odds are high that your child will still be able to record TikToks for the foreseeable future. Stay informed with our guide to TikTok parental controls.
👤 NYT profile uncovers the dark side of minors on social media: The New York Times recently profiled Jacky Dejo, a child influencer turned social media entrepreneur who grew up in the creator economy — and in close proximity to men who are sexually interested in minors. Jacky’s parents started social media accounts for her when she was 6, intending to chronicle her snowboarding prowess. But as Jacky grew older, she captured the attention of adult fans, ultimately leaning into their interest and charging for exclusive access to salacious posts and images. The profile is as enlightening as it is disturbing, but one of its more harrowing illustrations is the distinct role social media algorithms play in surfacing children’s images to men with a sexual interest in them — and how platforms fail to remove underage content that violates their terms of service. Parents, talk to your kids about the risks of online predators, grooming, and why they should keep their social profiles private.
🔐 How schools implement “away for the day” phone policies: More schools across the country are asking students to put away their phones, but are these policies really helping kids? A recent feature in the Seattle Times says yes. Robert Eagle Staff Middle School is one of 4,000 schools worldwide that use Yondr pouches, neoprene bags that lock away the child’s phone for the day. Staff at these schools say that students are more focused during class, spend less time dealing with conflicts from group chats or social media, and even have fewer disciplinary incidents (like vaping and alcohol use) because kids can’t use their phones to coordinate meetup locations. “The last two years, 70% of my job has been dealing with cellphones. It felt like whack-a-mole,” Principal Zachary Stowell said. “And now that’s removed. Now I’m talking to teachers about their teaching, which is cool.” Does your child’s school have a phone-free policy?
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
Is social media influencing your child’s eating habits and relationship to food? We spoke with Maiken Wiese, RD, about eating disorder warning signs. Here are some important conversation-starters based on her insights.
Is your child consuming an endless loop of bad news? This habit can expose your child to an endless stream of negative content and fuel their anxiety. Find out what to do about it.
Text message monitoring is a great way to keep your child safe. But if your child is deleting texts, your alarm bells might be going off. Is it harmless, or are they hiding something?
🤬 What’s the best way to stay cool, calm, and collected with your children? Check out these tips to improve self-regulation — your ability to keep your emotions in check — via Parenting Translator.
📵 Australia has proposed a ban on social media for kids under age 16, but it isn’t clear how the ban would be implemented. If the legislation becomes law, the platforms would have a year to figure out the details.
🧩 Looking for some screen-free activities you can do with your children this winter? Save this list from Techno Sapiens.
🎧 We were featured on the Calm Parenting Podcast in an episode about practical strategies for screen time and parental controls — give it a listen!
If you’ve ever spent hours mindlessly scrolling through a series of negative and pessimistic posts online, congrats: you’ve doomscrolled. But what is doomscrolling? The habit is particularly harmful for kids because it can damage their mental health and heighten their anxiety. It’s important for parents to help teens figure out how to stop doomscrolling. Here’s how you can help your child break the habit.
Doomscrolling refers to spending excessive amounts of time online viewing content that causes negative emotions, such as sadness and anger. It is, as the name implies, literally scrolling through the doom.
Some examples of doomscrolling include checking the news multiple times per day, compulsively looking at negative videos and posts about trending topics, and fixating on negative stories for hours — whether it’s about climate change, politics, or another controversial topic.
In order to help your teen stop doomscrolling, it’s useful to examine why they might be doing it in the first place. Here are some reasons why teens doomscroll:
With all the negative events happening in the world, teens turn to doomscrolling to feel more in control of their own lives. Staying informed about current events gives some teens a sense of safety that eases their fear of the unknown.
It’s easy to look at negative events in the world and feel helpless to change them. Doomscrolling can provide a false sense of taking action by exposing the viewer to the darker side of headlines and connecting them to a range of creators in the space.
As odd as it might sound that spending hours consuming content about war, natural disasters, and political unrest could be soothing, there’s some logic to it. Doomscrolling is an avoidance technique. People often use it to help them escape difficult emotions.
Kids can feel left out of conversations with peers if they aren’t informed about the biggest news of the day. To deal with this fear of missing out (FOMO), teens might doomscroll to make sure they don’t miss anything.
Given the content involved in doomscrolling, you probably won’t be surprised to know that the behavior can negatively impact your teen in a number of ways.
Doomscrolling can seriously affect your teen’s mental health. It can cause stress or anxiety about the state of the world or fear that similar things might happen to them.
Too much screen time in general can lead to a loss of sleep. Add to that worries about the negative news they’re consuming, and your teen’s slumber could really suffer.
Exposure to excessive amounts of bad news can desensitize your teen to violence and tragedy to the point where they become numb. This numbing can lead to a decrease in empathy for others.
Social media algorithms take a user’s behavior and serve up similar content. This can create an echo chamber on your teen’s feed. The more time they spend doomscrolling, the darker their feed is likely to become, skewing their perception of what’s happening in the world.
Here are some signs your teen may be doomscrolling:
If your teen needs help figuring out how to stop doomscrolling, here are some tips:
Doomscrolling can negatively impact teens in a number of ways, from their mental health to their sleep. To help your teen stop, keep an eye on their media consumption, help them reduce their screen time and app usage, and encourage them to seek out uplifting content.
BrightCanary can help you keep an eye on what your child is viewing online to watch for doomscrolling. The app’s advanced technology scans your child’s activity, alerting you when they encounter something concerning. Download the app and get a free trial today.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
📵 Predators are using TikTok to exploit minors. Minors are using TikTok’s live feature to perform sexually suggestive acts on camera in exchange for money and gifts, according to a report by Forbes and documentation from TikTok’s own internal investigation. NPR and Kentucky Public Radio also found that TikTok tweaked its algorithm to more prominently show attractive people, and the platform quantified how much time it takes for viewers to become addicted to the platform: 260 videos, or under 35 minutes. Even though minors aren’t allowed to livestream or receive gifts, it’s relatively easy for children to fib about their age when they sign up. Performing suggestive acts on camera in exchange for gifts is just one way predators can groom targets for sexual abuse and sextortion. TikTok says it has a zero tolerance policy for child sexual abuse material, and the platform does have parental controls — but they only work if your child sets their correct birthdate.
🤖 Social media companies aren’t doing enough to stop AI bots. That’s according to new research from the University of Notre Dame, which analyzed the AI bot policies and mechanisms of eight social media platforms, including Reddit, TikTok, X, and Instagram. Harmful artificial intelligence bots can spread misinformation and hate speech and carry out fraud or scams. Although the platforms say they have policy enforcement mechanisms in place to limit the prevalence of bots, the researchers were able to get bots up and running on all the platforms studied. If you haven’t talked to your child about the risks of bots, misinformation, and online scams, now’s the time — if your child has used any social platform, odds are high that they’ve encountered a bot already.
😩 Teens are stressed about their future, appearance, and relationships. A team of researchers surveyed US teens about the stressors they’re feeling. A majority (56%) are stressed by the pressure to have their future figured out, 51% feel pressure to look a certain way, and 44% feel like they need to have an active social life. While adults drove teens’ pressure to plan their futures and achieve, the pressures to maintain an active social life and keep up appearances were driven by social media, peers, and the teens themselves. Teens are struggling to reduce those stressors, too — time constraints, difficulty putting tech away, and feeling like rest isn’t “productive” enough all get in the way of practicing more self-care. Techno Sapiens breaks down things parents can do to help their stressed-out teen.
Helping your teen manage stress starts with open and honest conversations. Here are five conversation-starters designed to prompt meaningful chats about self-care, stress management, and healthy ways to navigate the pressures they face:
It’s spooky season! A Good Girl’s Guide to Murder is a popular young adult mystery thriller (and Netflix series) — but is it safe for kids? If your child is interested in this series, read this guide first.
Are your child’s group chats causing major drama in their friend group? Here’s what parents need to watch for when their child starts texting independently — and how to help your child handle it.
🤳 Instagram remains the most used social app among teens, followed by TikTok, according to a new report by Piper Sandler.
🎃 Halloween is next week! In Washington, where BrightCanary is based, the most popular Halloween candy is Reese’s Peanut Butter Cups. What’s the most popular treat in your state?
📍 We’re on Pinterest! Follow BrightCanary to keep up with our latest parenting tips, infographics, and resources.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
😩 Surgeon General says parental stress is a public health issue: In a new public health advisory, US Surgeon General Murthy called for policy changes that better support parents and caregivers. The advisory noted that 48% of parents report that their stress is completely overwhelming, compared to 26% of other adults. And even though the amount of time parents spend working has increased (+28% for moms, +4% for dads), the amount of time they spend engaged in primary child care has also increased (+40% among moms, +154% among dads). Murthy called for safe, affordable child care programs, predictable workplaces with understanding leadership, and community spaces (such as playgrounds and libraries) that give children room to play while fostering social connection among parents.
👎 Snap and TikTok sued for failures with child safety: The attorney general of New Mexico filed a lawsuit against Snap, the parent company of Snapchat, alleging that the company’s design features (namely, disappearing messages and images) facilitate sexual abuse and fail to protect minors from predation. Additionally, a U.S. appeals court has ruled that TikTok must face a lawsuit over the death of 10-year-old Nylah Anderson. The girl’s mother is pursuing claims that TikTok’s algorithm recommended a viral “blackout challenge” to her daughter.
📹 YouTube introduces content rules and new supervisory tools for teens: YouTube is limiting content that could be problematic for teens if viewed repeatedly, including content that promotes weight loss, idealized physical appearance, and social aggression. The platform also introduced new parental controls for teen users, allowing parents to link their account to their teen’s in order to view their YouTube activity. Parents will be able to see their child’s uploads, subscriptions, and comments — but not the content they watch. (For that, you’ll need a child safety app like BrightCanary.)
Snapchat can be risky for kids because of how easily strangers can contact them and messages can disappear. Here’s what parents need to know about the platform.
Many apps promise to help you monitor your child’s texts, but finding one that actually works well is an uphill battle. We’ve done the research to find the best of the best.
Even when your child’s social media feed doesn’t explicitly promote eating disorders, content can still encourage unhealthy behaviors or unrealistic body standards in subtle ways. Here are some conversation-starters to help you talk to your kids about content that promotes disordered eating behaviors and body negativity.
📵 “Just like it is impossible to train your child to drive a car without supervising from the passenger seat, you cannot train your child to be smart online if you are not privy to what he is doing in that world,” writes Melanie Hempe of ScreenStrong. Read more about your teen “earning” smartphone privacy.
📱 If your kid keeps getting around Apple Screen Time limits, what are your options? On the BrightCanary blog, we explain common workarounds and how parents can prevent kids from sneaking past their screen time boundaries.
👻 We’re adding a new, much-requested platform to the BrightCanary app in the coming weeks — stay tuned!
In the vast world of YouTube, it’s possible to find just about anything — from the sweet and the innocent to the … not-so-innocent. If you’re looking up how to block a YouTube channel, you may have discovered that your child has been watching something concerning. How you block a channel depends on what type of YouTube account your child has. To help you figure out which option is best, here’s how to block a channel on YouTube.
There are only two ways to truly block a YouTube channel for your child:
The process of blocking channels is the same for both of these account types and can be managed through YouTube parental controls. Follow these steps:
If your child doesn’t have a YouTube Kids or a supervised YouTube account, you can’t completely block them from accessing specific channels. However, parents can reduce the likelihood of certain channels appearing in their child’s recommended video feed. Here’s how:
While this prevents videos from that channel from being recommended to your child, they can still go directly to the channel and view its content. So, this option is less about blocking a channel and more about managing your child’s YouTube algorithm.
To block, or not to block? That is the question that has plagued parents since the advent of YouTube. (Apologies to Shakespeare.)
While blocking YouTube channels or restricting them from your child’s feed are valuable tools for parents, it’s important to recognize that these actions alone won’t solve everything. For every inappropriate channel you block, plenty of similar ones remain.
Parents should also take additional steps to monitor their child’s online activity, including setting parental controls and using a monitoring app like BrightCanary. This child safety app uses AI to help parents monitor their child’s online activity, including YouTube history, Google searches, text messages, and social media.
In addition to blocking YouTube channels, here are some actions you can take to ensure your child’s experience is safe and age appropriate.
Not all of the videos on YouTube are appropriate for kids. To keep your child safe on the platform, you can take steps such as blocking channels, resetting their YouTube algorithm, reviewing their feed together, and using a child safety app to keep an eye on the content they’re viewing.
Snapchat has become a ubiquitous part of teen culture. But is Snapchat better than texting for kids? Features like location sharing and vanishing messages have led to growing safety concerns among parents. This article will dig into how kids use Snapchat, its risks, and what parents can do to keep their kids safe.
Here’s what you need to know about the role this messaging app plays in kids’ lives:
Snapchat is so embedded in the social fabric of today’s teens that it’s the main way many kids communicate with friends. These are the ways teens socialize on the app:
The visual nature of Snapchat makes it particularly likely to create pressure to keep up with peers. Kids have near-constant access to what their friends are doing and who they’re spending time with, especially because users are incentivized to share as often as possible on the platform.
Real-time updates foster a fear of missing out (FOMO) — if your child sees their friends hanging out together without them or going to exciting places on the weekends, they may fall into a comparison trap.
When users Snap with each other at least once a day, they’re awarded a Snapstreak. Their overall engagement with the app is quantified by a number at the top of their profile known as a Snap Score. The social validation of maintaining Snapstreaks and Snap Scores can pressure kids to use the app more often — which is exactly the point of addictive, gamified features.
These features make Snapchat especially problematic for kids and difficult for parents to monitor:
Snapchat messages are designed to disappear as soon as all recipients view them, leaving no trail for parents who want to review their child’s online communication.
Stories let a Snapchatter share something with all their followers at once. Because stories are public, it’s important to talk to your kids about what’s okay to share online and help them set their privacy controls in the app to limit who can see and respond to their posts.
The Discover section displays content that’s been curated for the user. Its primary function is to keep users scrolling — a potentially addictive feature. And because the algorithm analyzes a user’s behavior on the app to serve them content, viewing a few harmful Snaps, like content promoting disordered eating, can kick off a vicious cycle of similar recommendations.
Snap Map allows Snapchatters to share their physical location, updated in real time. This feature allows you to see where your child is, but it also poses privacy and safety risks by broadcasting their whereabouts to a wide audience. (If you’re interested in location sharing, this feature is freely available with Apple Find My and Google Family Link.)
Lenses are filters that let users change their faces and the world around them. While many are pure fun, like turning yourself into a dancing turkey, “beauty” Lenses that do things like smooth skin, slim faces, or add a tan may also contribute to unrealistic beauty standards and body image issues.
Here are the top Snapchat risks parents should be aware of:
Here’s what parents can do to minimize the risks of Snapchat for their child:
While Snapchat allows users as young as 13, it’s a good idea to wait longer. Common Sense Media rates it as appropriate for 16+ (and we agree!).
Turn off your child’s location and maximize their privacy settings. Use Snapchat’s Family Center to see who your teens are communicating with and set content controls.
Talk to your kids about safe social media use, including what’s okay to share online. Remind them that anything can be saved and shared, and if anything or anyone makes them feel uncomfortable, they can always bring it to you (or another trusted adult).
Talk to your kids about how they’re using Snapchat. Sit down with them for regular safety check-ins and make it clear they should come to you with any problems, and you’ll support them through it.
Because of vanishing messages and location features, Snapchat is difficult for parents to monitor, and therefore more problematic than other social media and texting platforms. At BrightCanary, we’re committed to providing tools to keep your child safe online. That’s why we’re working toward a solution that lets you better monitor your child on Snapchat. Keep an eye on this space for an announcement and, in the meantime, download the app today to start monitoring your child’s text messages and their activity on YouTube, Google, Instagram, and TikTok!
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
Tech giants have some ‘splaining to do. First up: Google and Meta allegedly made a secret deal to target advertisements for Instagram to teens on YouTube, according to the Financial Times. The project, which began in early 2023, exploited a loophole to bypass Google’s own rules prohibiting ad targeting to users under 18.
The advertising agency Spark Foundry, working for Meta’s marketing data science team, was tasked with attracting more Gen Z users to Instagram, which has been losing ground to rival apps like TikTok. Evidence suggests that Google and Spark Foundry took steps to disguise the campaign’s true intent, bypassing Google’s policy by targeting a group called “unknown”—which just so happened to skew toward users under 18.
Jeff Chester, executive director of the Center for Digital Democracy, which advocates for child privacy, said, “It shows you how both companies remain untrustworthy, duplicitous, powerful platforms that require stringent regulation and oversight.”
Speaking of oversight … the Justice Department is suing TikTok and parent company ByteDance for violating children’s privacy laws. According to a press release, ByteDance and its affiliates violated the Children’s Online Privacy Protection Act (COPPA), which prohibits website operators from knowingly collecting, using, or disclosing personal information from children under the age of 13 without parental consent.
The complaint alleges that from 2019 to the present, TikTok knowingly allowed children under 13 to create accounts, collected their personal information without parental consent, and failed to delete those accounts when parents requested it.
These allegations come amid ongoing legal battles over a TikTok ban in the U.S. To add to the controversy, the Justice Department recently accused TikTok of gathering sensitive data about U.S. users, including views on abortion and gun control. The Justice Department warned of the potential for “covert content manipulation” by the Chinese government, suggesting that the algorithm could be designed to influence the content that users receive.
That’s a lot to take in: Indeed. We often talk to parents about the balance between trust and monitoring. We can trust our kids, but we can’t always trust Big Tech companies to protect them or prioritize their well-being.
Taking an active role in your child’s digital life is about more than just supervising their online activity — it also involves considering how these companies use children’s data and how they might influence what your child consumes.
If your child uses social media or YouTube, it’s a good idea to periodically check their feeds together. A child safety app like BrightCanary can help make this easier, but nothing beats having open conversations with your child about what they share and what they see.
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
Unfortunately, the popularity of parental control apps has attracted scammers eager to swindle unsuspecting parents. Here’s how to identify and avoid parental control scams on iPhone and Android, plus tips for selecting a reputable app that does what it claims.
Did you know that your kid could be using private browsing to hide their online activity from you? Despite this workaround, parents still have options for monitoring their child online. Here’s what you should know and how to talk to your kid about incognito mode.
Tech giants don’t have our children’s best interests at heart. Privacy is important, but so is staying informed and keeping our kids safe — parents need to understand what their children are consuming, both in their algorithms and through ads. If you’re worried about broaching the privacy conversation, a few prepared conversation starters can help.
🤖 Roblox recently released new resources to educate users about generative AI (think: ChatGPT, DALL-E, and Roblox’s own GenAI). Here’s the guide for families and one made for teens.
👑 Meghan Markle and Prince Harry have entered the child safety chat: The Parents’ Network, a new initiative from the Duke and Duchess of Sussex, is intended to support families whose children have been harmed or lost as a result of social media.
👻 Snapchat rolled out new safety features, including expanded in-app warnings, enhanced friending protections, and simplified location sharing. (We’re still not fans of Snapchat for younger kids, but if your teen uses Snap, it’s worth checking out the app’s parental controls.)
😔 Watching just eight minutes of TikTok focused on dieting, weight loss, and exercise content can harm body image in young women, according to a new study.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
Guess who’s back, back again? After facing an uncertain path, the Kids Online Safety Act (KOSA) passed the Senate last Thursday. The historic vote was overwhelmingly bipartisan (86 to 1 to take up the measure), but things will be less rosy in the House, where KOSA faces hurdles in the form of free speech concerns and Big Tech lobbyists.
KOSA would introduce the most sweeping child online safety reform since the now-archaic Children’s Online Privacy Protection Act (COPPA) passed in 1998. As a recap, KOSA would create a duty of care requiring platforms to prevent and mitigate harms to minors, default young users to the strongest privacy and safety settings, and give parents more tools to supervise their kids’ accounts.
While other child safety bills are also under consideration, KOSA is the closest to becoming law, although we won’t hear anything about its status until the House returns in September. Critics’ main concerns are that KOSA could stifle First Amendment-protected speech and cut vulnerable youth off from important information on social media.
At the same time, a growing number of experts are calling for stricter regulation of social media platforms for the sake of children’s mental health.
U.S. Surgeon General Vivek Murthy said that social media should carry a warning label, similar to the one required on tobacco products. A new report by the Biden-Harris Administration’s Kids Online Health and Safety Task Force urges the industry to make design choices that prioritize kids’ well-being, such as making privacy protections for youth the default and using data-driven methods to detect and prevent online harassment.
The kids are not alright, and child online safety legislation is overdue. If KOSA doesn’t pass, other options on the table include Sammy’s Law, which would require social media companies to integrate with child safety software, making it easier for parents to supervise their children’s online activities.
“Finalizing these safety bills has been a long and winding and difficult road, but one thing I’ve known from the start: It sure would be worth it,” Senator Chuck Schumer, Democrat of New York and the majority leader, said in a floor speech before the vote. “The message from these parents has been simple and consistent: It’s been long enough.”
What can parents do to make sure their teens aren’t texting personal information to strangers? Show this list of tips to your kiddo, or use it as a springboard for a conversation about texting safety.
VPNs are a popular way for kids to get around some parental control settings. Read on to learn about VPNs, how to know if your child is using one, and what you can do about it. (Psst: VPNs don’t impact BrightCanary monitoring.)
It’s important to give kids a degree of privacy, but it’s also important to guide, protect, and support them online and offline. Plus, it makes sense to be more hands-on when your kid first gets a phone or tablet, then give them more autonomy and independence as they grow older and more mature. All that to say, how do you talk to your child about privacy — especially when you start to introduce parental monitoring? Use these conversation-starters.
😫 How do you lay down ground rules about devices? What about tips to handle cyberbullying and online abuse? Teaching children to navigate the online world is a key part of modern parenting. In this Q&A, experts pass on tips to make it feel less overwhelming.
🔎 Meta has, historically, not been the most forthcoming with allowing researchers to review its data. But now, “after years of contentious relationships with academic researchers, Meta is opening a small pilot program that would allow a handful of them to access Instagram data for up to about six months in order to study the app’s effect on the well-being of teens and young adults.”
👉 We mentioned the Kids Online Health and Safety Task Force’s new report earlier in this newsletter, but we recommend taking some time to check it out — it’s packed with advice and conversation starters for parents and caregivers, plus free resources for parents of tweens and teens.