Instagram will block adults from messaging teens who don’t follow them

Instagram is implementing a new set of policies aimed at making the platform a bit safer for its younger users.

The most important update is that adults will no longer be able to message users under the age of 18 who don't follow them.

But we're getting ahead of ourselves because Instagram, like every online platform, has a problem – children.

As you might be aware, you need to be 13 years old to create an account on Instagram, but knowing for certain that a person is 13 or older relies largely on trust and, unfortunately, people lie.

To address this, rather than banning users who are under 13, Instagram is "developing new artificial intelligence and machine learning technology to help us keep teens safer and apply new age-appropriate features".

The first of those features is the aforementioned restriction on adults messaging young people who don't follow them.

“This feature relies on our work to predict peoples’ ages using machine learning technology, and the age people give us when they sign up. As we move to end-to-end encryption, we’re investing in features that protect privacy and keep people safe without accessing the content of DMs,” explains Instagram.
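To make the rule concrete, here is a minimal, hypothetical sketch in Python of the gating logic as we understand it from Instagram's description. The names (estimated_age, can_send_dm), the adult age cut-off, and the way the declared and predicted ages are combined are all our own assumptions, not Instagram's actual implementation or API.

```python
from typing import Optional

# Hypothetical sketch: an adult's message to an under-18 user is blocked
# unless the teen follows them. Ages stand in for whatever Instagram derives
# from the sign-up date of birth and its machine-learning age prediction.

ADULT_AGE = 18

def estimated_age(declared_age: int, predicted_age: Optional[int]) -> int:
    """Combine the self-declared age with an ML prediction, if one exists.

    How Instagram actually weighs the two signals is not public; taking the
    maximum is purely an illustrative assumption.
    """
    return max(declared_age, predicted_age or 0)

def can_send_dm(sender_age: int, recipient_age: int,
                recipient_follows_sender: bool) -> bool:
    """Return True if the DM should be allowed under the new policy."""
    if sender_age >= ADULT_AGE and recipient_age < ADULT_AGE:
        return recipient_follows_sender
    return True  # all other pairings are unaffected by this rule

if __name__ == "__main__":
    # A 35-year-old messaging a 15-year-old who doesn't follow them: blocked.
    print(can_send_dm(35, estimated_age(15, 16), recipient_follows_sender=False))  # False
    # The same pair, but the teen follows the adult: allowed.
    print(can_send_dm(35, estimated_age(15, 16), recipient_follows_sender=True))   # True
```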

What strikes us as odd is that Instagram isn't using this AI and machine learning to identify accounts belonging to users under 13 and ban them. Of course, kicking a bunch of young people off the platform might not be good news for advertising revenue if they make up a large proportion of its users.

What we are curious about is whether this analysis of account data falls under the Children's Online Privacy Protection Rule in the US. That rule is highly prescriptive about how data from minors may be processed, and we will be watching to see whether Instagram's new features comply with it.

Moving on, however, Instagram will also prompt teens to be more cautious when interacting with adults in direct messages.

“Safety notices in DMs will notify young people when an adult who has been exhibiting potentially suspicious behavior is interacting with them in DMs. For example, if an adult is sending a large amount of friend or message requests to people under 18, we’ll use this tool to alert the recipients within their DMs and give them an option to end the conversation, or block, report, or restrict the adult. People will start seeing these in some countries this month, and we hope to have them available everywhere soon,” wrote Instagram.
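As a rough illustration of the trigger Instagram describes, below is a hypothetical Python sketch: if an adult has recently sent an unusually large number of follow or message requests to under-18 accounts, their conversations with teens get flagged so a safety notice is shown. The threshold, the time window, and every name here are our assumptions, not Instagram's implementation.

```python
from dataclasses import dataclass

# Assumed threshold: requests sent to minors within some recent window.
REQUEST_THRESHOLD = 20

@dataclass
class AdultAccount:
    user_id: str
    recent_requests_to_minors: int = 0

def should_show_safety_notice(adult: AdultAccount, recipient_is_minor: bool) -> bool:
    """Decide whether a teen recipient should see a safety notice for this adult,
    giving them the option to end the conversation, block, report, or restrict."""
    return recipient_is_minor and adult.recent_requests_to_minors >= REQUEST_THRESHOLD

if __name__ == "__main__":
    pushy_adult = AdultAccount("adult_123", recent_requests_to_minors=45)
    ordinary_adult = AdultAccount("adult_456", recent_requests_to_minors=2)
    print(should_show_safety_notice(pushy_adult, recipient_is_minor=True))    # True
    print(should_show_safety_notice(ordinary_adult, recipient_is_minor=True)) # False
```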

Further to this, teen accounts will no longer appear in Suggested Users for adults, and content created by teens will be harder to find in Reels and Explore.

Instagram is also trying to encourage teens to make their accounts private, though once again, this seems strange to us.

To our mind, the reason there are so many young folks on Instagram and similar apps is that they are in pursuit of fame. TikTok creators are earning real money from their fame: Charli D'Amelio, who is just 16 years old, earns $83,602.41 per sponsored post, according to Yahoo Finance. So it makes sense that folks younger than 13 want a slice of that fame.

We'd argue that Instagram should be stricter about verifying a person's age, but of course that takes time and money. Instead, you could, we don't know, build an automated system that studies yet more user data in a bid to identify their ages and limit their interactions, so that those users can remain on the platform and provide eyes for advertisers. But that's just us theory-crafting.

Our comments aside, this is a good move from Instagram, and it's welcome news that the platform is taking the protection of minors seriously. Whether lawmakers will agree that this is the best way to do that, however, is something we're very curious to find out.

[Source – Instagram]
