LOS ANGELES — The article by writer Nicholas Kristof, published December 4 in the New York Times, tells a heart-wrenching story, but it focuses narrowly on one company and one issue that in fact plagues many giant tech companies, across both adult entertainment and mainstream platforms, including but not limited to Facebook, YouTube, Twitter, Instagram, TikTok, WhatsApp, XVideos, Pornhub and virtually every platform that accepts user-generated content (UGC) on a massive scale.
This is illustrated concisely in a tweet by Carrie A. Goldberg, a prominent victims' rights attorney who specializes in online abuse and revenge porn:
I’m a victims rights lawyer. For every 1 case involving a rape tape on Pornhub, I have 50 involving rape and CSAM being disseminated on Insta and FB. Pornhub is far from perfect. But mainstream big tech is far worse and have a built-in mechanism for harassing victims directly.
— Carrie A. Goldberg (@cagoldberglaw) December 10, 2020
With the invention of any revolutionary, world-changing and globally beneficial technology, including the internet, comes a plethora of societal negatives; this is simply a fact of life.
That does not mean society should just accept these harms. Rather, society needs to support resources to identify, study, mitigate and, where possible, eliminate them.
A Pioneering Child Protection Organization
In 1996, during the early years of the internet and at a time when the digital professional adult entertainment industry was in its infancy, XBIZ founder and publisher Alec Helmy founded ASACP (Association of Sites Advocating Child Protection). In a truly visionary way, he recognized the need for the professional adult entertainment industry to protect children in this space and to do its utmost to fight and mitigate these societal harms.
ASACP was the first to create a way for the industry to report and fight the heinous crime of child sexual exploitation on the internet. Over more than two decades of ASACP's work, the ASACP Child Sexual Exploitation Tipline has processed over one million reports.
ASACP has built industry-leading membership and sponsorship programs; educated parents, policymakers and the professional adult entertainment industry; created an industry-leading code of ethics and best practices (including for UGC); and served on government initiatives and think tanks such as the Financial Coalition Against Child Pornography and the Internet Safety Technical Task Force at Harvard's Berkman Institute.
On November 6, 2006, ASACP created the international award-winning RTA (Restricted To Adults) metadata label, which allows parental filtering technology to more accurately identify and block age-restricted content. At the time, parental filtering technology was new and had significant accuracy issues that RTA helped solve.
For creating the RTA label, ASACP was awarded the 2008 Associations Make a Better World Award by the American Society of Association Executives (ASAE).
These are only some of the accomplishments of ASACP, which is materially and financially supported by the professional adult entertainment industry. In short, the professional industry not only cares deeply about these issues but has also been proactive at every turn in trying to mitigate and eliminate the impact this technology has on our most vulnerable citizens: our children.
A great many of the individuals who work in this space are parents and grandparents themselves, and they share the feelings and concerns the rest of society has when any child is victimized in such an abhorrent way.
The Task of Moderation
It is also important to recognize that UGC represents only a fraction of a much larger adult entertainment industry; most platforms, including cam sites, clip sites and pay sites, publish only content featuring consenting, professionally verified models who have provided the required government proof-of-age documentation prior to being hired.
These sites do not contain unverified UGC, and the companies behind them hire independent professional custodians of records to maintain these documents so that all models on their platforms can be verified to relevant authorities as consenting adults.
As Harvard lecturer Evelyn Douek puts it, “If you’re going to have users generating content, you’re going to have users generating harmful content.”