Social media giants make billions advertising to our youth but fail to protect them online 

Every day, technology is advancing at a rate that we consumers can barely keep up with.  

With each update, our children become more vulnerable to exploitation, but our protections aren’t keeping up. Our government is woefully behind the times, operating on inadequate safety solutions established in the dot-com boom, and the tech industry is doing the minimum required — which isn’t much — to protect its users. 

And worse, these tech companies that are failing to adequately protect our children are actually making money off them: A recent Harvard study revealed that in 2022, social media platforms generated nearly $11 billion in revenue from advertising directed at youth under 17 years of age.

That’s an astronomical number, especially when compared to the $31.2 million that Congress allocated to the Internet Crimes Against Children Task Force Program in 2022. The investment in the program working to protect children amounts to less than a third of one percent of the money companies are making off of them.

Families in the U.S. now have access to an average of 20 internet-capable devices, and the victims of sex trafficking and drug crimes are only getting younger. This expansion of internet access and communication means investigators face an exponential increase in the volume of data they must search to find victims. Bad actors lurk around every dark corner (or should I say in every chatbox and social platform). To say our children don’t stand a chance against predators would not be an exaggeration.

The use of different platforms varies across ages, but the truth remains that where our children have access to the internet, they can be susceptible to exploitation by these platforms and the offenders that lurk there. With poor moderation, the absence of age or identity verification and inadequate or missing safety mechanisms, we are leaving our children unprotected.  

Due to a lack of funding, law enforcement is now reactive to tips about online activity instead of proactive. Officers are unable to mount the very operations designed to target the most dangerous offenders before they strike.

The National Center for Missing and Exploited Children’s (NCMEC) CyberTipline is the nation’s centralized reporting system for the online exploitation of children. In 2022, NCMEC received 32 million CyberTipline reports. Further, Crimes Against Children Task Force commanders report a lack of quality and uniformity in the data tech companies submit, and only 5 percent of those reports result in arrests. The data investigators do receive forces them to comb through unactionable or incomplete information, distracting them from the most egregious offenders.

Shielding our children from abuse and exploitation with the current resources is untenable. As thousands of experts in this space put it, we “cannot arrest our way out of this problem.” The current landscape of legislation and resources for identifying victims and offenders is failing our law enforcement.


Our law enforcement teams across the country are fighting as best they can to protect children from online exploitation, but they face a serious dearth of funding. It is an especially brutal reality when juxtaposed with the billions in profits that technology companies are raking in.

We have let technology companies police themselves for too long. In today’s hearing on child sexual exploitation, I hope to see our officials hold these company CEOs accountable; moving forward, I am eager to see how Congress calls on technology companies to do more to protect our children online. 

John Pizzuro is the CEO of Raven, a 501(c)(4) organization dedicated to protecting children from victimization by raising awareness of the threat of online child exploitation, increasing resources and funding for law enforcement, and lobbying for policy changes at the local and federal levels.

Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.