Instagram head Adam Mosseri has confirmed that a version of the popular photo-sharing app for kids under 13 is in the works, BuzzFeed News reports. The Facebook-owned company knows that many children want to use Instagram, Mosseri said, but there is no “detailed plan” yet, according to BuzzFeed News.
“But part of the solution is to create a version of Instagram for young people or kids where parents have transparency or control,” Mosseri told BuzzFeed News. “It’s one of the things we’re exploring.” Instagram’s current policy bars children under 13 from the platform.
Kids are increasingly asking their parents if they can join apps that help them keep up with their friends. A version of Instagram where parents have control, like we did with Messenger Kids, is something we’re exploring. We’ll share more down the line.
– Adam Mosseri (@mosseri) March 18, 2021
“Kids are increasingly asking their parents if they can join apps that help them keep up with their friends,” Facebook spokesperson Joe Osborne said in an email to The Verge. “Right now there aren’t many options for parents, so we’re working on building additional products, as we did with Messenger Kids, that are suitable for kids and managed by parents. We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more.”
BuzzFeed News obtained a post from an internal message board in which Instagram vice president Vishal Shah said the company had identified a “youth pillar” project as a priority. His community product group will focus on privacy and safety issues “to ensure the safest possible experience for teens,” Shah wrote in the post. Mosseri will oversee the project along with vice president Pavni Diwanji, who oversaw YouTube Kids while at Google.
Instagram published a blog post earlier this week describing its work to make the platform safer for its youngest users, but it made no mention of a new version for children under 13.
Targeting online products at children under 13 raises not only privacy concerns but also legal ones. In September 2019, the Federal Trade Commission fined Google $170 million for tracking children’s viewing histories to serve ads on YouTube, a violation of the Children’s Online Privacy Protection Act (COPPA). TikTok’s predecessor Musical.ly was fined $5.7 million in February 2019 for violating COPPA.
In 2017, Facebook launched an ad-free version of its Messenger chat platform for children between the ages of 6 and 12. Children’s health advocates criticized it as harmful to kids and called on CEO Mark Zuckerberg to discontinue it. When a 2019 bug in Messenger Kids allowed children to join group chats with strangers, thousands of kids were left chatting with unauthorized users. Facebook quietly shut down the affected chats, which it said involved only a small number of users.
Update March 18, 19:46 ET: Added tweet from Adam Mosseri.