On Wednesday, Meta announced a new set of tools designed to protect young users, an overdue response to widespread criticism that the company doesn’t do enough for its most vulnerable users.
Parents, tech watchdogs and lawmakers alike have long called for the company to do more to keep teens safe on Instagram, which invites anyone older than 13 to sign up for an account.
To that end, Meta is introducing something it calls “Family Center,” a centralized hub of safety tools that parents will be able to tap into to control what kids can see and do across the company’s apps, starting with Instagram.
The new set of supervision features lends parents and guardians some crucial transparency into young users’ Instagram habits. The tools will allow parents to monitor how much time a kid spends on the app, stay updated on which accounts they’ve recently followed and who has followed them, and receive notifications about any accounts they’ve reported.
Those tools roll out today on Instagram in the U.S., and are on the way to Meta’s VR platform in May and to the rest of Meta’s apps (remember Facebook?) sometime in the coming months, including for global users. The company characterized the tools as the “first step in a longer-term journey,” though why it took so long to take these initial measures to protect teens from the unsavory side of its software is less clear.
For the time being, teenage Instagram users will have to enable the safety tools from within their own accounts, though the company says the option for parents to initiate the supervision mode will be implemented by June. Instagram also plans to build out more controls, including a way for parents to restrict app usage to certain hours and a setting that would allow multiple parents to co-supervise an account.
By TechCrunch