Facebook is working on a new version of its popular app targeted at children under 13.
Lawmakers have a message for Facebook: Stop the kids’ version of Instagram before it starts.
Sens. Ed Markey (D-MA) and Richard Blumenthal (D-CT) as well as Reps. Kathy Castor (D-FL) and Lori Trahan (D-MA) released a statement urging Facebook to halt its plans to launch an Instagram app built for children under the age of 13.
“Facebook has a clear record of failing to protect children on its platforms,” the lawmakers said in the statement on Tuesday. “When it comes to putting people before profits, Facebook has forfeited the benefit of the doubt, and we strongly urge Facebook to abandon its plans to launch a version of Instagram for kids.”
The lawmakers join a growing number of critics who say that Facebook should not roll out such an app, citing the company’s own record and concerns about child well-being. Earlier in May, 44 attorneys general from US states and territories also called for Facebook to halt its plans, telling the company in a letter that “Facebook is not responding to a need, but instead creating one.”
Public health experts had previously urged Facebook to leave behind its plans for a new version of Instagram targeted at kids under 13. Such a plan, these groups said in a letter sent in April, would “put young users at great risk,” arguing that Facebook isn’t ready to introduce and oversee an app that could have such a powerful influence over young children.
The new app, which Facebook has said will not include ads, is being designed for children under the minimum age for Instagram, which is 13. Facebook also says it’s trying to find new methods, including using artificial intelligence, to confirm that users on the main Instagram platform aren’t under 13. That age restriction is a product of a 1998 law called the Children’s Online Privacy Protection Act (COPPA), which establishes more stringent requirements and potential financial liabilities for online platforms that collect personal information about users under 13 without their parents’ consent. Child safety experts worry that social media poses additional threats to young children, too.
“Instagram’s focus on photo sharing and appearance makes the platform particularly unsuitable for children who are in the midst of crucial stages of developing their sense of self,” the organizations, which include the Campaign for a Commercial-Free Childhood and ParentsTogether Action, told Facebook CEO Mark Zuckerberg in the letter. “Children and teens (especially young girls) have learned to associate overly sexualized, highly edited photos of themselves with more attention on the platform and popularity among their peers.”
The public health experts and child advocacy groups who signed the letter also argue that social media built for kids could violate young people’s privacy and create an increased risk of depression, among a wide variety of other potential harms.
“During the pandemic, I have heard countless stories from parents of elementary-aged children about high-drama and problematic interactions happening over social media that kids weren’t developmentally ready for,” said Jenny Radesky, a pediatrics professor at the University of Michigan’s medical school, in an April statement. “An Instagram for kids is the last thing they need.”
Members of Congress have expressed concern that these apps have become addictive, are harmful to young people’s mental health and self-esteem, and endanger children’s privacy. At the same time, tech companies are grappling with the reality that kids under 13, who are technically not allowed on their platforms, manage to gain access anyway.
The debate over kids on social media was reignited following a BuzzFeed News report in March that Facebook was in the early stages of building an under-13 Instagram app.
Facebook has defended its Instagram-for-kids plan, arguing that it’s an effort to keep young people off of its main service. The company also told Recode in April that the new version of Instagram is being designed in consultation with child development and mental health experts as well as privacy advocates, a process that the company expects will take several months.
“We’ve just started exploring a version of Instagram for younger teens,” said Facebook spokesperson Stephanie Otway. “The reality is that kids are online. They want to connect with their family and friends, have fun, and learn, and we want to help them do that in a way that is safe and age-appropriate.”
Otway added that Facebook did not have more specifics to share regarding how it will approach content moderation for its kids-focused platform. One particular concern is the prospect of adults interacting with children on Instagram. In March, Instagram added new features to restrict direct messages between teens and adults they do not follow, and said it’s looking into how to make it more difficult for adults with “potentially suspicious behavior” to interact with young people.
In a letter sent to lawmakers in late April, Facebook emphasized its commitment to children’s safety, including its content moderation efforts and work with researchers to study young people’s well-being on the internet. But lawmakers said on Tuesday that Facebook’s response wasn’t enough, saying the company “refused to make meaningful commitments about how it will ensure that its proposed Instagram Kids app does not harm young users’ mental health and threaten their privacy.”
Previous attempts by tech and media companies to reach a large number of young children online have run into problems, and the Federal Trade Commission has been involved in several cases related to tech platforms and children’s privacy. In 2017, Facebook launched a kids’ version of its Messenger app. Two years later, Facebook fixed a technical flaw in its system that had made it possible for kids to enter group chats with strangers their parents hadn’t approved. The company now says there are more than 7 million monthly active accounts on the Messenger Kids service.
YouTube has also run into problems with its app for young people, YouTube Kids, which it launched in 2015. The company has had to crack down on inappropriate videos being displayed to young people. Earlier this month, the House Subcommittee on Economic and Consumer Policy told YouTube CEO Susan Wojcicki it was investigating YouTube Kids, hammering the service for low-quality content, a high degree of product placement, and insufficient content moderation.
YouTube announced in May that it was changing how the autoplay feature works in its kids’ app after Recode asked about the tool and amid growing concern in Congress over kids watching an endless loop of algorithmically recommended videos. In April, Viacom, Disney, and 10 ad-tech firms reached a settlement in a lawsuit accusing these companies of embedding tracking software in children-focused apps without the consent of the kids’ parents.
So, while we don’t know when a kids’ version of Instagram will launch, it’s clear that lawmakers and child safety experts are not happy with tech platforms targeting children. And if and when the app does launch, past problems with kids-focused platforms, as well as with Instagram itself, suggest that the new app could run into trouble of its own.
Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
Author: Rebecca Heilweil