Meta Restricts Content Shown to Teens

Meta will hide inappropriate content, such as posts about self-harm and eating disorders, from teenagers.

A teen browsing the web. Photo by Freepik.

What kind of content will be hidden from teenagers?

Meta has announced that it will start hiding inappropriate content from teenagers' Instagram and Facebook accounts. This includes posts about sensitive topics such as self-harm and eating disorders. The company wants to make sure that teens don't see content that isn't right for their age.

Under the new policy, such content won't appear in teenagers' feeds, even if it's shared by an account they follow. Teens on Facebook and Instagram who entered their real age when they signed up will automatically have their accounts set to the strictest settings.

They also won't be able to search for harmful terms like "self-harm". Instead, the company will hide those results and point users to resources where they can get help. Meta will also limit who can tag, mention, or repost a teenager's content. Account settings will be adjusted so that only followers can message teens, and offensive comments will be filtered out.

Results when someone searches for self-harm-related content. Photo by Meta.
Why did Meta make this decision?

The social media company says it wants kids to have fun and safe experiences on its apps. It has built rules and tools to help kids and their parents use the apps safely. Meta's latest announcement also comes at a time when many US states have sued the company.

Why have the states sued Meta?

Several US states have sued Meta, claiming that the company puts teenagers at risk by building addictive features into Instagram and Facebook. The states argue that this addictiveness harms teenagers' mental health.

© 2024 Newsahoot Media Pvt. Ltd. All rights reserved.