
Meta introduces stricter messaging settings to protect teens on Instagram and Facebook


Meta announced on January 25 that it is introducing further measures to protect teenagers on Instagram and Facebook, including stricter default settings for private messages.

Previously, Meta restricted adults aged 19 and above to sending a single text-only message to teens who don’t follow them on Instagram.

Now, by default, teens will not be able to receive direct messages (DMs) from anyone they don’t follow or aren’t connected to, including other teens.

Meta stated, “Under this new default setting, teens can only be messaged or added to group chats by people they already follow or are connected to, helping teens and their parents feel even more confident that they won’t hear from people they don’t know in their DMs.”

Teens with supervised accounts will require parental permission to alter this setting.

These changes apply to users under 16 (or 18 in some regions), who will receive a notification in their Feed informing them of the updated setting.

Similar settings will be applied for teenagers using Messenger, limiting incoming messages to Facebook friends or people in their phone contacts.

Moreover, Meta revealed plans for a feature to protect teens from “seeing unwanted and potentially inappropriate images in their messages” from known contacts, likely through pop-up alerts.

Further details on this feature will be disclosed later in the year.

Meta is also enhancing its parental supervision program, prompting parents to approve or deny requests when teens modify their default safety and privacy settings.


Meta clarified, “As with all our parental supervision tools, this new feature is intended to help facilitate offline conversations between parents and their teens, as they navigate their online lives together and decide what’s best for them and their family.”

These updates come amid allegations from U.S. authorities and the European Commission that Meta has not done enough to protect children on its platforms.

Recently disclosed documents from a lawsuit in New Mexico revealed Meta’s historical reluctance to prioritize child safety on its platforms.

Despite this, Meta claims to employ advanced technology, child safety experts, and collaboration with law enforcement to address safety concerns.

As lawmakers increase pressure on child safety issues, Meta continues to update its tools and safeguards for younger users.

Source: CNBC
