As a software company founder, I spent the majority of 2017 collecting feedback from teens, pediatricians, church leaders, and school administrators about the trends they were seeing in the United States related to sexting and sextortion.
Bark Technologies, which monitors the text, email, school, and social media accounts of over 5 million teens, says that “texting is the new first base” for this generation. Pediatricians confirmed that this behavior crosses all socioeconomic, religious, and ethnic lines and is incredibly common. Unfortunately, they also report that parental denial, and an unwillingness to play a gatekeeper role in the digital lives of their children, is just as prevalent.
“I see a lot of mental health issues resulting in a visit to the ER/ICU that are related to sexting,” says Dr. Free Hess, the pediatrician and child safety expert behind PediMom. “Children as young as 11 and 12 years old are attempting suicide as a result of this issue.”
It has been 13 years since the floodgates of free online streaming pornography were opened by the overturning of the Child Online Protection Act (COPA). The average age at which a child receives a first phone is now 10.3 years, and cell phones do not ship with parental controls enabled. We are now seeing the downstream effects of a generation raised on content that was never intended for their eyes.
This trend does not affect genders evenly. The CDC reports that the suicide rate among teenage girls continues to rise and hit a 60-year high in 2015: “Suicide rates doubled among girls and rose by more than 30 percent among teen boys and young men between 2007 and 2015.”
Anonymity undermines accountability and enables victimization
The current trend in social media is toward full encryption. On the surface this seems like a wise course of action, but to hold child predators accountable, online companies must be able to determine whether illegal content is being traded on their platforms — activity that violates their terms of service, in addition to being a felony.
“Last year, tech companies reported over 45 million online photos and videos of children being sexually abused — more than double what they found the previous year,” the New York Times reported. We should all be concerned about what happens if tech companies and social media platforms suddenly lose the ability to fulfill their congressional mandate to report child sexual abuse material found online. Are we creating a playground for pedophiles?
People’s right to safety must be preserved
The tech industry is not the first to stumble in finding the right balance of government regulation, but it is uniquely shielded from it by Section 230 of the Communications Decency Act (CDA) of 1996. CDA 230 protects third-party platforms from liability for what is posted on their sites. The first modifications to this law came in 2018, when President Trump signed the Stop Enabling Sex Traffickers Act (SESTA) and the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), which removed liability protections for platforms that “knowingly” facilitate the crime of human trafficking on their services.
The most visible outcome of this legislation was Backpage.com losing its CDA 230 protections, which had, until 2020, prevented victims from holding the site accountable for profiting from their abuse.
In March, the first case against a large social media platform was allowed to proceed based upon a 2:1 ruling from the Texas Supreme Court, in a suit alleging the site did not do enough to warn children about the risk of being groomed for human trafficking on its platform. This will set a precedent for other victims seeking damages from the tech industry.
There is a strong need to find the optimal balance between privacy and safety. Striking that balance — protecting both privacy rights and safety rights — will require collaboration among users, regulators, tech providers, and law enforcement. The question is: what would this balance look like?
The next steps are unclear but necessary. One solution could be to issue social media permits with mandatory training for new users, mirroring the process of obtaining a driver’s permit. Applicants would learn social media safety just as they would road safety, verify their identity, and obtain a parent or guardian’s permission. Once granted a permit, the holder would be monitored by an adult who is knowledgeable about the rules and regulations of social platforms.
The internet is over 20 years old, and we are at a tipping point: hoping the tech industry can regulate itself no longer serves society. With new competitors entering the market every day, an enforced legal framework is needed to create a fair playing field for industry and a safer space for society. The first automobiles did not include safety features like seat belts, but eventually there was enough traffic on the road that mitigating risk to consumers required safety laws.
Our key recommendations
1. Raise the age threshold in COPPA from 13 to 16, with mandatory compliance validation. This expands the regulation to protect children during their most vulnerable and impressionable stage of life, when they are most likely to fall victim to human trafficking, sextortion, or cyberbullying. Age and identity validation is currently a weak point that must be addressed.
2. Fund digital safety education campaigns targeting both children and adults. Education is needed to convey the risks and responsibilities, as it has been for the tobacco, alcohol, driving, and firearms industries.
3. Recognize privacy as a basic human right. Making privacy a right must be accompanied by education, responsibility, and the oversight necessary to protect against abuses. Additionally, for adherence and compliance to be practical, privacy laws must be consistent across the states.
4. Empower law enforcement to detect and investigate crimes, without undermining privacy/security for all:
- Develop better targeted, point-capability technology for investigations and surveillance, such as tools that decrypt an individual smartphone rather than weakening encryption for everyone.
- Solutions must be developed for efficient victimization reporting and takedown.
Contributing author: Matthew Rosenquist, CISO, Eclipz.io.