Welcome to Parent Pixels, a parenting newsletter filled with practical advice, news, and resources to support you and your kids in the digital age. This week:
📍 Instagram and Snapchat make location sharing easier: Meta now allows users to share their live location in Instagram DMs for up to one hour or pin a spot on the map for easy sharing. The feature is off by default, and only people in the specific chat can see the shared location. In related news, Snapchat debuted location sharing through Family Center, the app’s parental control hub. Once set up, parents and teens can share their locations with each other. Parents can also see who their teen is sharing their location with and receive travel notifications when their teen arrives at places like school or home. It’s refreshing to see both apps include some safety considerations, like Snapchat restricting location sharing to accepted friends only. Still, location sharing raises concerns, such as teens sharing their whereabouts in a group DM that might include strangers. Remind your teen to only share their location with people they know in real life and to always prioritize their privacy. Now’s a good time to talk about when they might want to use this feature (like coordinating pickups after an event) and when they shouldn’t (like meeting up with someone they haven’t met in person before).
🕳️ YouTube pushes eating disorder videos to young teens, report suggests: The Center for Countering Digital Hate (CCDH) found that YouTube’s algorithm recommended eating disorder content to minors, including videos that violated the platform’s own terms of service. Researchers created simulated 13-year-old users who watched eating disorder-related videos, and YouTube’s algorithm responded by serving more harmful content, such as an “anorexia boot camp” and other videos that had accrued an average of over 388,000 views each. YouTube failed to remove, age restrict, or label the majority of videos the CCDH researchers flagged as harmful, and even profited from ads placed next to the content. This rabbit hole of negative content isn’t exclusive to YouTube; it’s a risk on any platform that uses engagement metrics to recommend videos without factoring in age or safety. Parents, here’s how to talk to your kids about the risks of eating disorder content on video platforms and on social media.
😬 Guys, TikTok might actually get banned: A federal appeals court upheld the January 19 deadline for TikTok to be sold or face a ban in the United States. As a recap: earlier this year, President Joe Biden signed a law requiring ByteDance, TikTok’s parent company, to sell the app to an approved buyer due to national security concerns or face a ban. ByteDance had asked the Supreme Court to review the statute, but unless the Court intervenes, the ban will take effect as scheduled. If your child is asking about the ban, here are some helpful talking points.
Parent Pixels is published biweekly. Want this newsletter delivered to your inbox a day early? Subscribe here.
Let’s talk about location sharing. On the one hand, it’s a helpful way to make sure your child gets where they need to go. On the other hand, without boundaries in place, it can make your child feel like they lack your trust. Here’s how to start a conversation with them about it.
A kid’s first phone is a big step, but with proper planning, you can set them up for success by teaching healthy tech boundaries. Here’s what we recommend.
Most US teens use iPhones, which means it’s important to find a parental monitoring app that’s effective on Apple devices. Here are a few of the best options for parents in 2025.
🙈 A majority (62%) of social media influencers don’t verify information before sharing it with their audiences, highlighting their vulnerability to misinformation. If your child gets all their news and updates from influencers, this is your reminder to talk to them about digital literacy.
⌛ Most popular social platforms have a minimum age of 13, but 22% of 8–17-year-olds fib about their age on social media, according to a report from UK media regulator Ofcom. Although apps like Instagram and TikTok have safety measures in place for underage accounts, those go out the window if kids pretend to be adults online.
⚠️ A proposed bill in California would require social media platforms to display warning labels cautioning users about the platforms’ potential impact on youth mental health. This initiative echoes US Surgeon General Vivek Murthy’s proposal to include tobacco-like warning labels on social networks, aiming to raise awareness about the risks of prolonged exposure to these platforms.
⚖️ Despite bipartisan support, last-minute changes, and an endorsement from Elon Musk on X, the Kids Online Safety Act is unlikely to pass this year.