As a parent, you want your child to surround themselves with good influences. That’s true not only for who they spend time with in real life, but also for the people and ideas they’re exposed to on social media.
If you or your child are concerned about the content appearing in their feed, one beneficial step you can take is to help them reset their social media algorithm. Here’s how to reset your child’s algorithm on TikTok, Instagram, and other platforms.
Social media algorithms are the complex computations that operate behind the scenes of every social media platform to determine what each user sees.
Everything on your child’s social media feed is likely the result of something they liked, commented on, or shared. (For a more comprehensive explanation, check out our Parent’s Guide to Social Media Algorithms.)
Social media algorithms have a snowball effect. For example, if your child “likes” a cute dog video, they’ll likely see more of that type of content. However, if they search for topics like violence, adult material, or conspiracy theories, their feed can quickly be overwhelmed with negative content.
Therefore, it’s vital that parents actively examine and reset their child’s algorithm when needed, and also teach them the skills to evaluate it for themselves.
Research clearly demonstrates the potentially negative impacts of social media on tweens and teens. How it affects your child depends a lot on what’s in their feed. And what’s in their feed has everything to do with algorithms.
Helping your child reset their algorithm is a wonderful opportunity to teach them digital literacy. Explain to them why it’s important to think critically about what they see on social media, and how what they do on the platform influences the content they’re shown.
Here are some steps you can take together to clean up their feed:
Resetting all of your child’s algorithms in one fell swoop can be daunting. Instead, pick the app they use the most and tackle that first.
If your kiddo follows a lot of accounts, you might need to break this step into multiple sessions. Pause on each account they follow and have them consider these questions:
If the answer is “yes” to any of these questions, suggest they unfollow the account. If they’re hesitant — for example, if they’re worried unfollowing might cause friend problems — they can instead “hide” or “mute” the account so they don’t see those posts in their feed.
On the flip side, encourage your child to interact with accounts that make them feel good about themselves and portray positive messages. Liking, commenting, and sharing content that lifts them up will have a ripple effect on the rest of their feed.
After you’ve gone through their feed, show your child how to examine their ad settings. These settings mostly influence sponsored content, but considering the problematic history of advertisers marketing to children on social media, it’s wise to take a look.
Every social media app has slightly different options for how much control users have over their algorithm. Here’s what you should know about resetting the algorithm on popular apps your child might use.
To get the best buy-in and help your child form positive long-term content consumption habits, it’s best to let them take the lead in deciding what accounts and content they want to see.
At the same time, kids shouldn’t have to navigate the internet on their own. Social platforms can easily suggest content and profiles that your child isn’t ready to see. A social media monitoring app, such as BrightCanary, can alert you if your child encounters something concerning.
Here are a few warning signs you should watch out for as you review your child’s feed:
If you spot any of this content, it’s time for a longer conversation to assess your child’s safety. You may decide it’s appropriate to insist they unfollow a particular account. And if what you see on your child’s feed makes you concerned for their mental health or worried they may harm themselves or others, consider reaching out to a professional.
Algorithms are the force that drives everything your child sees on social media and can quickly cause their feed to be overtaken by negative content. Regularly reviewing your child’s feed with them and teaching them skills to control their algorithm will help keep their feed positive and minimize some of the negative impacts of social media.
Just by existing as a person in 2023, you’ve probably heard of social media algorithms. But what are algorithms? How do social media algorithms work? And why should parents care?
At BrightCanary, we’re all about giving parents the tools and information they need to take a proactive role in their children’s digital life. So, we’ve created this guide to help you understand what social media algorithms are, how they impact your child, and what you can do about it.
Social media algorithms are complex sets of rules and calculations used by platforms to prioritize the content that users see in their feeds. Each social network uses its own algorithm; the one on TikTok is different from the one on YouTube.
In short, algorithms dictate what you see when you use social media and in what order.
Back in the Wild Wild West days of social media, you would see all of the posts from everyone you were friends with or followed, presented in chronological order.
But as more users flocked to social media and the amount of content ballooned, platforms started introducing algorithms to filter through the piles of content and deliver relevant and interesting content to keep their users engaged. The goal is to get users hooked and keep them coming back for more.
Algorithms are also hugely beneficial for generating advertising revenue for platforms because they help target sponsored content.
Each platform uses its own mix of factors, but here are some examples of what influences social media algorithms:
Most social media sites heavily prioritize showing users content from people they’re connected with on the platform.
TikTok is unique because it emphasizes showing users new content based on their interests, which means you typically won’t see posts from people you follow on your TikTok feed.
With the exception of TikTok, if you interact frequently with a particular user, you’re more likely to see their content in your feed.
The algorithms on TikTok, Instagram Reels, and Instagram Explore prioritize showing you new content based on the type of posts and videos you engage with. For example, the more cute cat videos you watch, the more cute cat videos you’ll be shown.
YouTube looks at the creators you interact with, your watch history, and the type of content you view to determine suggested videos.
The more likes, shares, and comments a post gets, the more likely it is to be shown to other users. This momentum is the snowball effect that causes posts to go viral.
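To make the mechanics concrete, here is a toy sketch of engagement-weighted ranking in Python. It is a hypothetical illustration, not any platform's actual code (real recommendation systems are proprietary and vastly more complex), but it shows how just a couple of interactions can tilt an entire feed:

```python
# Toy model of an engagement-driven feed. Hypothetical illustration only;
# real platform algorithms weigh hundreds of proprietary signals.

def record_interaction(interests, topic, boost=1.0):
    """A like, comment, or share bumps the weight for that topic."""
    updated = dict(interests)
    updated[topic] = updated.get(topic, 0.0) + boost
    return updated

def rank_feed(posts, interests):
    """Order candidate posts by the viewer's accumulated topic weights."""
    return sorted(posts, key=lambda p: interests.get(p["topic"], 0.0),
                  reverse=True)

# Two "likes" on dog videos...
interests = {}
interests = record_interaction(interests, "dogs")
interests = record_interaction(interests, "dogs")

# ...and dog content now outranks everything else the viewer hasn't engaged with.
candidates = [
    {"id": 1, "topic": "cooking"},
    {"id": 2, "topic": "dogs"},
    {"id": 3, "topic": "sports"},
]
feed = rank_feed(candidates, interests)
print([p["topic"] for p in feed])  # prints ['dogs', 'cooking', 'sports']
```

The same loop explains the snowball: each viewed post becomes an interaction, which raises that topic's weight, which surfaces more of that topic to view.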
There are ways social media algorithms can benefit your child, such as creating a personalized experience and helping them discover new things related to their interests. But the drawbacks are also notable — and potentially concerning.
Since social media algorithms show users more of what they seem to like, your child’s feed might quickly become overwhelmed with negative content. Clicking a post out of curiosity or naivety, such as one promoting a conspiracy theory, can inadvertently expose your child to more such content. What may begin as innocent exploration could gradually influence their beliefs.
Experts frequently cite “thinspo” (short for “thinspiration”), social media content that promotes unhealthy body ideals and disordered eating habits, as another algorithmic concern.
Even though most platforms ban content encouraging eating disorders, users often bypass filters using creative hashtags and abbreviations. If your child clicks on a thinspo post, they may continue to be served content that promotes eating disorders.
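The filter-evasion problem is easy to see with a toy example. The blocklist below is hypothetical and deliberately simplistic (real content moderation uses far more sophisticated matching and machine learning), but it shows why exact-match filtering misses creative respellings:

```python
# Toy hashtag filter with a hypothetical one-entry blocklist, for
# illustration of why exact matching is easy to evade.

BLOCKED_TAGS = {"thinspo"}

def is_blocked(hashtag: str) -> bool:
    """Exact-match check against the blocklist, ignoring case and the '#'."""
    return hashtag.lower().lstrip("#") in BLOCKED_TAGS

print(is_blocked("#Thinspo"))   # True: exact match, only the casing differs
print(is_blocked("#th1nspo"))   # False: one swapped character slips past
```

A single substituted character defeats the filter, which is why users who want to share banned content can stay a step ahead of platform blocklists.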
Although social media algorithms are something to monitor, the good news is that parents can help minimize the negative impacts on their child.
Here are some tips:
It’s a good idea to monitor what the algorithm is showing your child so you can spot any concerning trends. Regularly sit down with them to look at their feed together.
You can also use a parental monitoring service to alert you if your child consumes alarming content. BrightCanary is an app that continuously monitors your child’s social media activity and flags any concerning content, such as photos that promote self-harm or violent videos — so you can step in and talk about it.
Keep up on concerning social media trends, such as popular conspiracy theories and internet challenges, so you can spot warning signs in your child’s feed.
Talk to your child about who they follow and how those accounts make them feel. Encourage them to think critically about the content they consume and to disengage if something makes them feel bad.
Algorithms influence what content your child sees when they use social media. Parents need to be aware of the potentially harmful impacts this can have on their child and take an active role in combating the negative effects.
Stay in the know about the latest digital parenting news and trends by subscribing to our weekly newsletter.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
📵 Predators are using TikTok to exploit minors. Minors are using TikTok’s live feature to perform sexually suggestive acts on camera in exchange for money and gifts, according to a report by Forbes and documentation from TikTok’s own internal investigation. NPR and Kentucky Public Radio also found that TikTok tweaked its algorithm to more prominently show attractive people, and the platform quantified how much time it takes for viewers to become addicted to the platform: 260 videos, or under 35 minutes. Even though minors aren’t allowed to livestream or receive gifts, it’s relatively easy for children to fib about their age when they sign up. Performing suggestive acts on camera in exchange for gifts is just one way predators can groom targets for sexual abuse and sextortion. TikTok says it has a zero tolerance policy for child sexual abuse material, and the platform does have parental controls — but they only work if your child sets their correct birthdate.
🤖 Social media companies aren’t doing enough to stop AI bots. That’s according to new research from the University of Notre Dame, which analyzed the AI bot policies and mechanisms of eight social media platforms, including Reddit, TikTok, X, and Instagram. Harmful artificial intelligence bots can be used to spread misinformation and hate speech and to carry out fraud or scams. Although the platforms say they have policy enforcement mechanisms in place to limit the prevalence of bots, the researchers were able to get bots up and working on all the platforms studied. If you haven’t talked to your child about the risks of bots, misinformation, and online scams, now’s the time — if your child has used any social platform, odds are high that they’ve encountered a bot already.
😩 Teens are stressed about their future, appearance, and relationships. A team of researchers surveyed US teens about the stressors they’re feeling. A majority (56%) of teens are stressed by the pressure to have their future figured out, 51% felt pressure to look a certain way, and 44% felt like they needed to have an active social life. While adults drove teens’ pressure to have their futures planned out and achieve the most, the pressures to have an active social life and keep up appearances were driven by social media, the teens themselves, and peers. Teens are struggling to reduce those stressors, too — time constraints, difficulty putting tech away, and feeling like rest isn’t “productive” enough were all barriers to practicing more self-care. Techno Sapiens breaks down things parents can do to help their stressed-out teen.
Parent Pixels is a biweekly newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. Want this newsletter delivered to your inbox a day early? Subscribe here.
Helping your teen manage stress starts with open and honest conversations. Here are five conversation-starters designed to prompt meaningful chats about self-care, stress management, and healthy ways to navigate the pressures they face:
It’s spooky season! A Good Girl’s Guide to Murder is a popular young adult mystery thriller (and Netflix series) — but is it safe for kids? If your child is interested in this series, read this guide first.
Are your child’s group chats causing major drama in their friend group? Here’s what parents need to watch for when their child starts texting independently — and how to help your child handle it.
🤳 Instagram remains the most used social app among teens, followed by TikTok, according to a new report by Piper Sandler.
🎃 Halloween is next week! In Washington, where BrightCanary is based, the most popular Halloween candy is Reese’s Peanut Butter Cups. What’s the most popular treat in your state?
📍 We’re on Pinterest! Follow BrightCanary to keep up with our latest parenting tips, infographics, and resources.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
😩 Surgeon General says parental stress is a public health issue: In a new public health advisory, US Surgeon General Murthy called for policy changes that better support parents and caregivers. The advisory noted that 48% of parents report that their stress is completely overwhelming, compared to 26% of other adults. And even though the amount of time parents spend working has increased (+28% for moms, +4% for dads), the amount of time they spend engaged in primary child care has also increased (+40% among moms, +154% among dads). Murthy called for safe, affordable child care programs, predictable workplaces and understanding workplace leadership, and community spaces (such as playgrounds and libraries) that can give children space to play while fostering social connection among parents.
👎 Snap and TikTok sued for failures with child safety: The attorney general of New Mexico filed a lawsuit against Snap, the parent company of Snapchat, alleging that the company’s design features (namely, disappearing messages and images) facilitate sexual abuse and fail to protect minors from predation. Additionally, a U.S. appeals court has ruled that TikTok must face a lawsuit over the death of 10-year-old Nylah Anderson. The girl’s mother is pursuing claims that TikTok’s algorithm recommended a viral “blackout challenge” to her daughter.
📹 YouTube introduces content rules and new supervisory tools for teens: YouTube is limiting content that could be problematic for teens if viewed repeatedly. This includes content that promotes weight loss, idealized physical appearance, and social aggression. The platform also introduced new parental controls for teen users, allowing parents to link their account to their teen’s in order to view their YouTube activity. Parents will be able to view their child’s uploads, subscriptions, and comments — but not the content they watch. (For that, you’ll need a child safety app like BrightCanary.)
Snapchat can be risky for kids because of how easily strangers can contact them and messages can disappear. Here’s what parents need to know about the platform.
Many apps promise to help you monitor your child’s texts, but finding one that actually works well is an uphill battle. We’ve done the research to find the best of the best.
Even when your child’s social media feed doesn’t explicitly promote eating disorders, content can still encourage unhealthy behaviors or unrealistic body standards in subtle ways. Here are some conversation-starters to help you talk to your kids about content that promotes disordered eating behaviors and body negativity.
📵 “Just like it is impossible to train your child to drive a car without supervising from the passenger seat, you cannot train your child to be smart online if you are not privy to what he is doing in that world,” writes Melanie Hempe of ScreenStrong. Read more about your teen “earning” smartphone privacy.
📱 If your kid keeps getting around Apple Screen Time limits, what are your options? On the BrightCanary blog, we explain common workarounds and how parents can prevent kids from sneaking past their screen time boundaries.
👻 We’re adding a new, much-requested platform to the BrightCanary app in the coming weeks — stay tuned!
In the vast world of YouTube, it’s possible to find just about anything — from the sweet and the innocent to the … not-so-innocent. If you’re looking up how to block a YouTube channel, you may have discovered that your child has been watching something concerning. The answer depends on what type of YouTube account your child has. To help you figure out what option is best, here’s how to block a channel on YouTube.
There are only two ways to truly block a YouTube channel for your child:
The process of blocking channels is the same for both of these account types and can be managed through YouTube parental controls. Follow these steps:
If your child doesn’t have a YouTube Kids or a supervised YouTube account, you can’t completely block them from accessing specific channels. However, parents can reduce the likelihood of certain channels appearing in their child’s recommended video feed. Here’s how:
While this prevents videos from that channel from being recommended to your child, they can still go directly to the channel and view its content. So, this option is less about blocking a channel and more about managing your child’s YouTube algorithm.
To block, or not to block? That is the question that has plagued parents since the advent of YouTube. (Apologies to Shakespeare.)
While blocking YouTube channels or restricting them from your child’s feed are valuable tools for parents, it’s important to recognize that these actions alone won’t solve everything. For every inappropriate channel you block, countless others are just as bad.
Parents should also take additional steps to monitor their child’s online activity, including setting parental controls and using a monitoring app like BrightCanary. This child safety app uses AI to help parents monitor their child’s online activity, including YouTube history, Google searches, text messages, and social media.
In addition to blocking YouTube channels, here are some actions you can take to ensure your child’s experience is safe and age appropriate.
Not all of the videos on YouTube are appropriate for kids. To keep your child safe on the platform, you can take steps such as blocking channels, resetting their YouTube algorithm, reviewing their feed together, and using a child safety app to keep an eye on the content they’re viewing.
Snapchat has become a ubiquitous part of teen culture. But is Snapchat better than texting for kids? Features like location sharing and vanishing messages have led to growing safety concerns among parents. This article will dig into how kids use Snapchat, its risks, and what parents can do to keep their kids safe.
Here’s what you need to know about the role this messaging app plays in kids’ lives:
Snapchat is so embedded in the social fabric of today’s teens that it’s the main way many kids communicate with friends. These are the ways teens socialize on the app:
The visual nature of Snapchat makes it particularly likely to create extra pressure to keep up with peers. Kids have near-constant access to what their friends are doing and who they’re spending their time with, especially because users are incentivized to share as often as possible on the platform.
Real-time updates foster a fear of missing out (FOMO) — if your child sees their friends hanging out together without them or going to exciting places on the weekends, they may fall into a comparison trap.
When users Snap with each other at least once a day, they’re awarded a Snapstreak. Their overall engagement with the app is quantified by a number at the top of their profile known as a Snap Score. The social validation of maintaining Snapstreaks and Snap Scores can pressure kids to use the app more often — which is exactly the point of addictive, gamified features.
These features make Snapchat especially problematic for kids and difficult for parents to monitor:
Snapchat messages are designed to disappear as soon as all recipients view them, leaving no trail for parents who want to review their child’s online communication.
Stories let a Snapchatter share something with all their followers at once. Because stories are public, it’s important to talk to your kids about what’s okay to share online and help them set their privacy controls in the app to limit who can see and respond to their posts.
The Discover section displays content that’s been curated for the user. The primary function is to keep users scrolling — a potentially addictive feature. And because the algorithm analyzes a user’s behavior on the app to serve them content, viewing a few harmful Snaps like content promoting disordered eating could lead to a vicious cycle.
Snap Map allows Snapchatters to share their physical location, updated in real time. This feature allows you to see where your child is, but it also poses privacy and safety risks by broadcasting their whereabouts to a wide audience. (If you’re interested in location sharing, this feature is freely available with Apple Find My and Google Family Link.)
Lenses are filters that let users change their faces and the world around them. While many are pure fun, like turning yourself into a dancing turkey, “beauty” Lenses that do things like smooth skin, slim faces, or add a tan may also contribute to unrealistic beauty standards and body image issues.
Here are the top Snapchat risks parents should be aware of:
Here’s what parents can do to minimize the risks of Snapchat for their child:
While Snapchat allows users as young as 13, it’s a good idea to wait longer. Common Sense Media rates it as appropriate for 16+ (and we agree!).
Turn off your child’s location and maximize their privacy settings. Use Snapchat’s Family Center to see who your teens are communicating with and set content controls.
Talk to your kids about safe social media use, including what’s okay to share online. Remind them that anything can be saved and shared, and if anything or anyone makes them feel uncomfortable, they can always bring it to you (or another trusted adult).
Talk to your kids about how they’re using Snapchat. Sit down with them for regular safety check-ins and make it clear they should come to you with any problems, and you’ll support them through it.
Because of vanishing messages and location features, Snapchat is difficult for parents to monitor, and therefore more problematic than other social media and texting platforms. At BrightCanary, we’re committed to providing tools to keep your child safe online. That’s why we’re working toward a solution that lets you better monitor your child on Snapchat. Keep an eye on this space for an announcement and, in the meantime, download the app today to start monitoring your child’s text messages and activity on YouTube, Google, Instagram, and TikTok!
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
Tech giants have some ‘splaining to do. First up: Google and Meta allegedly made a secret deal to target advertisements for Instagram to teens on YouTube, according to the Financial Times. The project, which began in early 2023, exploited a loophole to bypass Google’s own rules prohibiting ad targeting to users under 18.
The advertising agency Spark Foundry, working for Meta’s marketing data science team, was tasked with attracting more Gen Z users to Instagram, which has been losing ground to rival apps like TikTok. Evidence suggests that Google and Spark Foundry took steps to disguise the campaign’s true intent, bypassing Google’s policy by targeting a group called “unknown”—which just so happened to skew toward users under 18.
Jeff Chester, executive director of the Center for Digital Democracy, which advocates for child privacy, said, “It shows you how both companies remain untrustworthy, duplicitous, powerful platforms that require stringent regulation and oversight.”
Speaking of oversight … the Justice Department is suing TikTok and parent company ByteDance for violating children’s privacy laws. According to a press release, ByteDance and its affiliates violated the Children’s Online Privacy Protection Act (COPPA), which prohibits website operators from knowingly collecting, using, or disclosing personal information from children under the age of 13 without parental consent.
The complaint alleges that from 2019 to the present, TikTok:
These allegations come amid ongoing legal battles over a TikTok ban in the U.S. To add to the controversy, the Justice Department recently accused TikTok of gathering sensitive data about U.S. users, including views on abortion and gun control. The Justice Department warned of the potential for “covert content manipulation” by the Chinese government, suggesting that the algorithm could be designed to influence the content that users receive.
That’s a lot to take in: Indeed. We often talk to parents about the balance between trust and monitoring. We can trust our kids, but we can’t always trust Big Tech companies to protect them or prioritize their well-being.
Taking an active role in your child’s digital life is about more than just supervising their online activity — it also involves considering how these companies use children’s data and how they might influence what your child consumes.
If your child uses social media or YouTube, it’s a good idea to periodically check their feeds together. A child safety app like BrightCanary can help make this easier, but nothing beats having open conversations with your child about what they share and what they see.
Unfortunately, the popularity of parental control apps has attracted scammers who want to swindle and frustrate people. Here’s how to identify and avoid parental control scams on iPhone and Android, plus tips to select a reputable app that does what it claims.
Did you know that your kid could be using private browsing to hide their online activity from you? Despite this workaround, parents still have options for monitoring their child online. Here’s what you should know and how to talk to your kid about incognito mode.
Tech giants don’t have our children’s best interests at heart. Privacy is important, but so is staying informed and keeping our kids safe — parents need to understand what their children are consuming, both in their algorithms and through ads. If you’re worried about the privacy conversation, here are some conversation starters:
🤖 Roblox recently released new resources to educate users about generative AI (think: ChatGPT, DALL-E, and Roblox’s own GenAI). Here’s the guide for families and one made for teens.
👑 Meghan Markle and Prince Harry have entered the child safety chat: The Parents’ Network, a new initiative from the Duke and Duchess of Sussex, is intended to support families who have lost children to social media-related harm.
👻 Snapchat rolled out new safety features, including expanded in-app warnings, enhanced friending protections, and simplified location sharing. (We’re still not fans of Snapchat for younger kids, but if your teen uses Snap, it’s worth checking out the app’s parental controls.)
😔 Watching just eight minutes of TikTok focused on dieting, weight loss, and exercise content can harm body image in young women, according to a new study.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
Guess who’s back, back again? After facing an uncertain path in the Senate, the Kids Online Safety Act (KOSA) successfully passed the Senate last Thursday. The historic vote was overwhelmingly bipartisan (86 to 1 to take up the measure), but things will be less rosy in the House, where KOSA faces hurdles in the form of free speech concerns and Big Tech lobbyists.
KOSA would introduce the most sweeping child online safety reform since the now-archaic Children’s Online Privacy Protection Act (COPPA) was passed in 1998. As a recap, KOSA would:
While other child safety bills are also under consideration, KOSA is the closest to becoming law, although we won’t hear anything about its status until the House returns in September. Concerns about KOSA include stifling First Amendment-protected speech and preventing vulnerable youth from accessing information on social media.
At the same time, a growing body of experts are calling for stricter regulations on social media platforms for the sake of children’s mental health.
US Surgeon General Vivek Murthy said that social media should have a warning label, similar to the one required on tobacco products. A new report by the Biden-Harris Administration’s Kids Online Health and Safety Task Force urges the industry to make design choices that prioritize kids’ well-being, such as making privacy protections for youth the default and using data-driven methods to detect and prevent online harassment.
The kids are not alright, and child online safety legislation is overdue. If KOSA doesn’t pass, other options on the table include Sammy’s Law, which would require social media companies to integrate with child safety software, making it easier for parents to supervise their children’s online activities.
“Finalizing these safety bills has been a long and winding and difficult road, but one thing I’ve known from the start: It sure would be worth it,” Senator Chuck Schumer, Democrat of New York and the majority leader, said in a floor speech before the vote. “The message from these parents has been simple and consistent: It’s been long enough.”
What can parents do to make sure their teens aren’t texting personal information to strangers? Show this list of tips to your kiddo, or use it as a springboard for a conversation about texting safety.
VPNs are a popular way for kids to get around some parental control settings. Read on to learn about VPNs, how to know if your child is using one, and what you can do about it. (Psst: VPNs don’t impact BrightCanary monitoring.)
It’s important to give kids a degree of privacy, but it’s also important to guide, protect, and support them online and offline. Plus, it makes sense to be more hands-on when your kid first gets a phone or tablet, then give them more autonomy and independence as they grow older and more mature. All that to say, how do you talk to your child about privacy — especially when you start to introduce parental monitoring? Use these conversation-starters.
😫 How do you lay down ground rules about devices? What about tips to handle cyberbullying and online abuse? Teaching children to navigate the online world is a key part of modern parenting. In this Q&A, experts pass on tips to make it feel less overwhelming.
🔎 Meta has, historically, not been the most forthcoming with allowing researchers to review its data. But now, “after years of contentious relationships with academic researchers, Meta is opening a small pilot program that would allow a handful of them to access Instagram data for up to about six months in order to study the app’s effect on the well-being of teens and young adults.”
👉 We mentioned the Kids Online Safety and Health Task Force’s new report earlier in this newsletter, but we recommend taking some time to check it out — it’s packed with advice and conversation-starters for parents and caregivers, plus free resources for parents of tweens and teens.
Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
What if the next time your child signs up for a social media platform, they’re faced with a health warning — the same kind of label you see on cigarettes?
Surgeon General Dr. Vivek Murthy recently called for a warning label that states social media is associated with significant mental health harms for adolescents. The statement comes after Murthy issued a health advisory in May 2023, warning that social media is contributing to the youth mental health crisis.
What is a warning label? You’ve likely seen these labels on tobacco and alcohol products. A surgeon general’s warning label is a public statement that calls attention to a critical public health issue.
Warning labels can’t be implemented without congressional approval, but Murthy’s statement furthers a growing movement for regulation on social media to help keep kids safe and minimize the dangers of addictive design features. For example, New York recently passed a measure that bans social media platforms from algorithmically recommending content to children.
It’s not over: Murthy acknowledges that a warning label, on its own, wouldn’t make social media safer for young people. He also urges legislators to:
“There is no seatbelt for parents to click, no helmet to snap in place, no assurance that trusted experts have investigated and ensured that these platforms are safe for our kids,” he wrote. “There are just parents and their children, trying to figure it out on their own, pitted against some of the best product engineers and most well-resourced companies in the world.”
Parents can help, too — by creating more phone-free experiences at home and at school, supervising kids’ social media use, and delaying giving kids access to phones until after middle school. Stay involved, ask questions, and understand what your child is doing on their devices.
For today’s kids, digital literacy comes into play with everything from school projects to social media. When kids are skilled in digital literacy, they’re more capable of identifying reputable information and sources. Here’s how to raise digitally literate kids.
Smishing — phishing’s younger sibling — is an increasingly common form of cyberattack and one parents need to know about so they can help their kids stay safe. But what is smishing? Read on to learn what this scam entails and how to prevent it from happening to your child.
You know you should talk to your child about what they’re doing on their phone, but it can feel awkward and intrusive. Here are some ways to start the conversation:
🎮 Is your child developing an unhealthy relationship with video games? Melanie Hempe of ScreenStrong shares a video game addiction test you can use today.
🔨 Apple recently announced a fix to a problematic Screen Time bug that allowed kids to view explicit content. (If your child is getting around Apple Screen Time, here are some troubleshooting tips.)
🎉 BrightCanary is now free for school teachers, counselors, and mental health professionals! Learn more in this letter from our CEO Karl Stillner.
Today’s kids are more online than any generation before them. The internet is inextricably linked to nearly every part of their lives. That constant presence means it’s more important than ever to teach your child about digital literacy. But what is digital literacy, and why does it matter?
Digital literacy refers to both the technical and cognitive skills needed to navigate our online world. An important part of digital literacy is the ability to find information online and evaluate the reliability of that information. It also involves knowing how to make smart decisions about sharing information online.
For today’s kids, digital literacy comes into play with everything from school projects to YouTube videos to social media. When kids are skilled in digital literacy, they’re more capable of identifying reputable information and sources.
How we seek and share information has changed dramatically in recent years. In addition to resources like books, mainstream newspapers, and network news — all of which go through some form of validation or fact-checking process — we now have endless streams of information from anyone with an internet connection.
While many of these new online sources are reliable, plenty of others come from self-proclaimed “experts” who don’t know their facts. Intentional disinformation, including deepfakes, is also a growing concern. And the rapid rise of artificial intelligence has further muddied the waters, generating information that sounds credible but oftentimes isn’t.
Here are several ways digital literacy skills help kids navigate this new information landscape:
The ability to find valid information online is not only useful for schoolwork. It’s a skill that will benefit kids for the rest of their lives as they seek information about topics like their health, finances, employment, and news.
If a person posts a video where they speak authoritatively on a subject, it’s easy to take it at face value. Digital literacy helps kids evaluate the reliability of the people and information that come across their feeds. For example, if someone is spouting mental health advice but isn’t actually a trained professional, your child should recognize that their information may not be entirely accurate.
Digital literacy is an important component of behaving responsibly in a digital world. It’s the difference between sharing credible information and misinformation. If your child recognizes that a source may not be trustworthy, they’ll know to tread carefully if friends are sharing conspiracy theories or other faulty information.
Digital literacy is such a broad concept that it can be daunting to know how to talk about it with your kids. Start small and build on their learning as they get older.
Here are some tips to get you started:
Teach your kids to look for sites that are backed by reputable organizations and run by people with expertise. It’s also a good idea to check the date on the article to make sure the information is current. Check out these additional tips on evaluating internet resources from Georgetown University Library.
Fake videos created by artificial intelligence (aka deepfakes) are on the rise. Teach your kids to be on the lookout for things like odd facial movements or pixelation — a few red flags that the video may be artificially generated.
Pausing to consider the validity of a post before sharing it helps prevent the spread of disinformation. If something online seems unbelievable, there’s a good chance it’s not trustworthy.
This also goes for original posts that your child makes. Explain that vague posts and unclear online communication can easily cause confusion and conflict. Learning what’s okay to share online and what’s not is a major part of developing digital literacy.
Help your child recognize why certain things come across their feeds. If they begin engaging with fringe theories and inflammatory content creators, they’ll see more of that content on their social media.
Stay involved in your child’s online life so you can continue to guide them toward greater digital literacy. Regular tech check-ins and using a child safety app like BrightCanary are great ways to stay in the loop about what your child is up to online.
Digital literacy isn’t only about evaluating others’ actions. It’s also about learning to be a responsible member of the online community. Help your children learn to live their values online by being intentional about their behavior.
PBS LearningMedia’s Be MediaWise is a series of digital literacy lessons geared toward kids. The videos are short, fun, and informative. Check them out with your child to continue the conversation on digital literacy.
Like reading and writing, digital literacy is a core skill today’s kids need to succeed in the modern world. Help your child learn to be internet wise by teaching them how to check online sources, protect their privacy, and be a good digital citizen.