TikTok, Snapchat, and YouTube Come Under Fire from Senate Subcommittee…They’re Harming America’s Youth

Photo: Rawpixel.com/shutterstock.com

Are social media platforms safe for children? Sen. Edward Markey of Massachusetts doesn’t think so. He told a Senate Commerce subcommittee on consumer protection, “The problem is clear: Big Tech preys on children and teens to make more money.”

Damning testimony against Facebook from a former senior employee popped the top off a can of worms. Children are worth money to Big Tech. Big money. And that’s why they’re targeted.

The testimony revealed how photo sharing on Instagram, which Facebook owns, has harmed certain teens. Some privately swap risque photos while openly bullying others, and the victims of that bullying have sometimes gone on to harm themselves.

In light of this revelation, the subcommittee has widened its net to include other social media platforms that aim their services at young people and compete for their loyalty, often to destructive effect.

Referring to YouTube, TikTok, and Snapchat, Sen. Richard Blumenthal from Connecticut said, “We’re hearing the same stories of harm. This is for Big Tech a big tobacco moment… It is a moment of reckoning. There will be accountability. This time is different.” 

Michael Beckerman of TikTok, Leslie Miller of YouTube, and Jennifer Stout of Snapchat were all asked whether they would get behind bipartisan legislation adding privacy rights for kids. They were also asked whether they would be willing to ban all ads that target children.

All three executives claimed their companies already have measures in place that meet what they’re being asked to do, but the subcommittee didn’t buy it. Blumenthal was the first to point a finger, telling them, “This is talk that we’ve seen again, and again, and again.” Their talk, he said, was “meaningless” without appropriate action.

Beckerman responded, “Sex and drugs are violations of our community standards; they have no place on TikTok.” But he was also quick to point out the tools TikTok already offers. Its screen-time management features let parents limit how long their children spend on the platform, and keyword filters let them block content they don’t want their children to see.

TikTok is fully owned by the Chinese company ByteDance. Since the platform launched five years ago, it has accumulated over one billion registered users, with its largest base consisting of young children and teenagers.

Stout was asked for her take on the death of a 19-year-old who took counterfeit medication purchased over Snapchat. “We’re absolutely determined to remove all drug dealers from Snapchat,” she told the subcommittee. She pointed to the detection measures Snapchat already has in place but conceded that drug dealers are often able to evade them.

Stout said that, as far as she knows, Snapchat is the only platform that uses live human moderators to review content. The others, she said, all rely on artificial intelligence, or “bots,” which can only do what they’ve been programmed to do. Still, there’s something Snapchat got all wrong.

Because the platform lets users share pics and videos that quickly disappear from sight, kids can keep snoopy parents from seeing what they’ve been up to and what they’re sharing. Individual user accounts keep no record of previous activity.

There is also the undetectable danger, on all of these platforms, of false identities. Kids are naive, and predators know it. While it’s tough enough for a human to detect this disguised activity, artificial intelligence is completely incapable of it.

Perhaps this time around the platforms will heed the warnings they’ve ignored in the past, but don’t go placing money on it, especially with Chinese-owned TikTok. Big Tech is no friend of America’s children. To them, our kids are numbers that equate to money. If a few get harmed along the way, oh well, it’s just the cost of doing business. You understand. Don’t you?