Content Warning: This article contains content relating to self-harm and suicide.
Have you ever seen gore or explicit content on social media get past “safety” filters?
On the 1st of February, the US Senate held a hearing with the chief executives of TikTok, Snapchat, X (formerly known as Twitter), Meta, and Discord about what these leaders are doing to protect children online.
This four-hour hearing comes after several parents across the country claimed that their children had self-harmed or taken their own lives after viewing content on these social media platforms. One of several lawsuits claims that TikTok allowed various accounts to share depressing and violent content directed towards children, which was then spread by TikTok’s algorithm. The parents of Chase Nasca claim that this is what led their son to take his own life.
Democrats and Republicans were in rare agreement during the hearing, both expressing concern for the safety of children on social media.
Mark Zuckerberg, the CEO of Meta (formerly known as Facebook), even turned around to apologize to the families who have been affected by his social media platforms, as seen in this video of the Senate hearing.
Even though the hearing’s primary focus was the protection of children from exposure to sexual content and exploitation, the senators directed a wide range of questions at the social media leaders.
Shou Zi Chew, the CEO of TikTok (owned by the Chinese company ByteDance), was repeatedly asked whether US TikTok data is shared with the Chinese government, which he denied. He was also questioned about his background: US Senator Tom Cotton repeatedly asked Chew about his citizenship and whether he was affiliated with the Chinese Communist Party. Even though Chew is Singaporean, the senator kept pressing these questions.
Another notable moment of the hearing came when Republican Senator Ted Cruz displayed an Instagram prompt regarding sexual content involving children. The prompt warns users about potentially explicit content involving children but gives the user the option to “see results anyways.”
All of the social media leaders said that they are continually improving their platforms’ safety tools and their efforts to protect minors.
This senate hearing brings an important issue to the world’s attention—the safety of minors on social media and the general effect social media has on the world. From addictive algorithms to gore and sexual content, there are clearly holes in the “safety tools” of social media platforms, and more needs to be done to ensure that social media can be a safe platform for young people to express themselves in healthy ways.