Social media CEOs can’t seem to agree on how they protect children online

  • The chief executive officers of Meta, Snap, Discord, X, and TikTok testified before the Senate Judiciary Committee on Wednesday.
  • While the matter being discussed was serious, US lawmakers once again showed how little they know about tech and geography.
  • The hearing revolved around the provisions social media platforms make or don’t make to protect children online.

On Wednesday, the chief executive officers of Meta, Snap, Discord, X and TikTok testified before the Senate Judiciary Committee. The hearing revolved around the child exploitation that has taken place, to varying degrees, on the platforms owned by these companies, and their inability to properly protect children online.

These sorts of hearings, where over-confident tech CEOs answer terrible questions posed by senators and US lawmakers, have become meme fodder. Despite the seriousness of the matter at hand, this hearing was no different.

As an example, Senator John Kennedy presented a particularly baffling line of questioning to Evan Spiegel, CEO of Snap.

You also have to feel bad for TikTok CEO Shou Zi Chew, who, despite being from Singapore, repeatedly has to field questions about whether he is beholden to China.

Putting the memeable questions and responses aside, senators leveled serious accusations at all of the CEOs, alleging that their platforms had been used in some capacity for child exploitation.

As reported by Engadget, Evan Spiegel, Jason Citron, and Linda Yaccarino only appeared before the committee after they had been subpoenaed, and securing Citron's attendance required US Marshals to visit Discord's headquarters.

The CEOs were met by a hearing room filled with parents whose children had been victims of exploitation, some of whom held up photos of their children.

During the proceedings, the senators highlighted how all platforms had been used to target young people and children. Senator Lindsey Graham went so far as to tell Zuckerberg and all of the other CEOs that they had blood on their hands.

The hearing took place as part of efforts to reshape existing policies meant to keep children safe online. There are reportedly as many as six online safety bills currently being eyed in the US, and the CEOs seem to support these bills in part. They include the Kids Online Safety Act, an update to the Children’s Online Privacy Protection Act, and proposed bills such as the EARN IT and STOP CSAM acts.

However, the tech companies couldn’t seem to agree with the contents of these bills. This is problematic because these companies spend heavily on lobbying, which in turn prevents the bills from being passed into law, leaving social media firms beholden to nobody.

“It’s been 28 years since the internet. We haven’t passed any of these bills … The reason they haven’t passed is because of the power of your companies, so let’s be really, really clear about that. What you say matters. Your words matter,” Senator Amy Klobuchar said according to NBC News.

That’s concerning because Meta alone is staring down a lawsuit filed by 41 US states accusing the firm of harming the mental health of teens, turning a blind eye to minors using its platforms, and doing very little to prevent adults from sexually harassing minors. One would think that supporting bills that prevent those actions would be in its interest, as it would ultimately prevent such lawsuits from being filed in the first place.

While some of the CEOs, including Zuckerberg, apologised to parents for the harm caused by their platforms’ failure to protect children, the apologies sat jarringly alongside the numerous examples of how Meta’s platforms are exploited.

However, the problems these CEOs have with the proposed bills aren’t without cause.

“Online child safety is a complex issue, but KOSA [Kids Online Safety Act] attempts to boil it down to a single solution. The bill holds platforms liable if their designs and services do not ‘prevent and mitigate’ a list of societal ills: anxiety, depression, eating disorders, substance use disorders, physical violence, online bullying and harassment, sexual exploitation and abuse, and suicidal behaviors. Additionally, platforms would be responsible for patterns of use that indicate or encourage addiction-like behaviors,” the Electronic Frontier Foundation wrote in May.

“Deciding what designs or services lead to these problems would primarily be left up to the Federal Trade Commission and 50 individual state attorneys general to decide. Ultimately, this puts platforms that serve young people in an impossible situation: without clear guidance regarding what sort of design or content might lead to these harms, they would likely censor any discussions that could make them liable. To be clear: though the bill’s language is about ‘designs and services,’ the designs of a platform are not causing eating disorders. As a result, KOSA would make platforms liable for the content they show minors, full stop. It will be based on vague requirements that any Attorney General could, more or less, make up,” the non-profit organisation added.

It’s unclear how Wednesday’s hearing will influence the multitude of bills on the floor at the moment. However, with more eyes on social media firms, we expect to see some changes, at the very least when it comes to how children are protected online.
