An Instagram for kids won't help tweens already on social media, parents say: NPR
Rick Bowmer / AP
Federal privacy law bars social media companies from letting children under 13 sign up. But parents like Danielle Hawkins can tell you a different story.
“She got on Instagram and Snapchat without my approval when she was around 12,” Hawkins, a mother of four who lives near Detroit, said of her eldest daughter.
Tech companies are well aware of this problem. Facebook CEO Mark Zuckerberg told a congressional hearing in March that his company knew children were bypassing age limits on apps like Instagram, Facebook’s photo-sharing network.
“There are clearly a lot of people under 13 who would want to use a service like Instagram,” he said.
Now Facebook is working on a solution for underage users: “We are exploring a version of Instagram that allows under-13s, because we are concerned that kids may find ways to lie and evade some of our systems,” Zuckerberg told legislators. “But if we create a safe system with proper parental controls, maybe we can get people to use that instead.”
The project, which Facebook calls Instagram Youth, would likely give parents the ability to monitor and limit what their kids do on the app. Facebook hasn’t released concrete details or a timeline, but that hasn’t appeased critics.
Parents say difficulties with social apps start at a young age
Parents say Zuckerberg is right: Many kids go on social media, despite the 13-year age limit set by apps like Instagram, Snapchat and TikTok.
Charity White-Voth, a mother from San Diego, said the struggle began long before her daughter’s 13th birthday.
“I was the last holdout among her friends’ parents when it came to Snapchat,” she said. Her daughter told her that all of her friends were already using the popular disappearing-messages app.
“She wasn’t kidding,” White-Voth said. “They were on it and they were using it. And I was like, ‘I don’t feel comfortable. I don’t think this is the right thing to do.’”
She gave in once her daughter turned 13, but still worries that her daughter is too young to understand that what she posts online will be on the internet forever.
“I’m worried that she’s 13, that she has poor impulse control, that the hormones are raging… just this inability to think long-term,” White-Voth said. “I worry about her sending something inappropriate that will end up in someone else’s screenshot.”
Another source of unease for many parents is the emphasis on likes, followers and selfies which is particularly pronounced on visual platforms like Instagram, TikTok and Snapchat.
“Body image, who you are, how accepted you are, is a very important part of becoming a teenager,” said Hawkins, the Detroit-area mom. “Being able to have people on the other side of a screen… tell you who you are or how good you are? You really can’t figure out what that really does to the psyche.”
Her eldest daughter, who signed up for Instagram and Snapchat last year at the age of 12, is no longer allowed to use social media.
“We had to pull back on the reins. We just realized that it really wasn’t good for her development, for her emotional state,” Hawkins said.
Growing concern that social media use may be linked to mental health issues
These concerns about the role screen time in general and social media in particular play in the well-being of children are well founded, said Blythe Winslow, co-founder of Everyschool.org, a non-profit organization that advises schools on how to use technology.
“Children have more anxiety and depression… Empathy is on the decline. Creativity is on the decline. Suicide rates among children aged 10 to 14 tripled” between 2007 and 2017, she said, citing a 2019 report from the Centers for Disease Control and Prevention.
“Parents are concerned that social media is linked to a lot of these issues,” she said.
As a mother of two pre-teens, Winslow knows firsthand how difficult these choices are for parents.
“My 11-year-old daughter has been pushing for social media probably since she was 8 or 9,” she said. “Most of her friends have TikTok, and they love TikTok.”
Researchers say the risks children face on platforms where they can interact with adults are urgent. A recent report from the nonprofit Thorn, which develops technology to defend children against online sexual abuse, found that more than a third of children aged 9 to 12 reported having had a “potentially harmful online experience” with someone they believed was 18 or older. Nineteen percent said they had had a sexual interaction online with someone they believed to be an adult.
Many are skeptical that a social media network just for kids would ward off malicious adults.
“If you build a community for kids, the adults who really want to get into it will find a way in, too,” said Julie Cordua, CEO of Thorn.
Critics seize on these fears to pressure Facebook
These fears don’t just trouble parents. They’re fueling a backlash against Instagram Youth from child safety advocates, members of Congress and 44 attorneys general, who are urging Facebook to drop the idea altogether.
Critics cite concerns about online predators, links to depression and body image issues, and fears for children’s privacy. And, they say, Facebook just doesn’t have a good track record when it comes to user protection.
Jim Steyer, CEO of the advocacy group Common Sense Media, describes Facebook’s tactic as “the classic brand-marketing approach, which is to hook kids in as early as possible” – a strategy that benefits the company by ensuring a pipeline of users.
“One, you get their loyalty from the cradle to the grave. And two, if you’re lucky, you bring their parents along with them,” he said.
Facebook says Instagram Youth is still in the early stages and that it is prioritizing safety and privacy.
“As every parent knows, kids are already online. We want to improve this situation by delivering experiences that give parents visibility and control over what their children are doing,” Facebook spokesperson Liza Crenshaw said in a statement.
“We will develop these experiences in consultation with experts in child development, child safety and mental health, and privacy advocates. We also look forward to working with lawmakers and regulators. In addition, we’re committed to not showing ads in any Instagram experience we develop for people under 13,” she said.
“We are not asking social media to raise our children”
Some parents NPR spoke to said they would be interested in letting their kids use a version of the app with more limited content and the ability to monitor what they’re doing.
But San Diego father Buyung Santoso said his children, aged 11 and 13, would not go for it.
“My daughter said yesterday that she didn’t think it would work,” he said, “because kids can do whatever they want, whether they need permission or not.”
Critics say that instead of building new apps for kids, tech companies should focus on making their existing products safe for the kids they already know are using them.
“The most important thing platforms can do is not turn a blind eye, but deeply acknowledge how their platforms will be abused and build for that – to make them safe for their most vulnerable users,” said Cordua, Thorn’s CEO.
Titania Jordan works for Bark, an Atlanta-based software company that helps parents monitor their children’s online activity, and is mom to a 12-year-old son who loves TikTok and Snapchat.
“We are not asking social media to raise our children,” she said. “It’s not their job. Don’t make our job harder.”
Editor’s note: Facebook is among NPR’s financial supporters. TikTok helps fund Planet Money videos produced by NPR that appear on the platform.