Google’s commitment to removing child porn less than committed

Google has insisted time and again that it has a “zero tolerance policy” on child pornography. In 2013, Google’s director of communications stated:

When we discover child abuse imagery [child porn] or are made aware of it, we respond quickly to remove and report it to the appropriate law enforcement authorities.

Google claims to be so eager to report and remove child pornography on its services that the company scans Gmail accounts for such images, handing over the user information of offenders to the authorities. This resulted in the highly publicized arrest of John Skillern last year.

So it would appear as though Google is truly following its unofficial slogan, “Don’t be evil,” and doing its best to protect children.

There’s just one problem: Google doesn’t actually offer any way for users to “make them aware” of child pornography (aka images of child sexual abuse) on its services. When users find child porn, Google Support only offers links to national governmental authorities with which users can file reports.

The question is: If Google is so interested in protecting children, why is there no way for users to inform Google of the presence of child porn on its services? It’s not like Google is some two-bit operation that doesn’t have the ability to program a reporting feature. For goodness’ sake, they’re pioneering Sci-Fi dystopian-level shit in facial recognition technologies. You’d think they would have the resources and know-how to remove child porn from their search results. It just doesn’t add up.

The government authority Google Support offers for users to report child porn in the U.S. is the National Center for Missing and Exploited Children (NCMEC). From there, users are able to file a CyberTipLine Report identifying the location of the child porn they find online. Through the CyberTipLine program, the Internet Crimes Against Children (ICAC) Taskforce is then dispatched to do the actual work of removing the images from the Internet and finding the culprit responsible for posting them. There is one Taskforce in each state.

I spoke with the head of the New York ICAC Taskforce, Captain Michael Tietz, and asked him whether his criminal justice department was responsible for “making Google aware” of child porn on its services. Captain Tietz told me that the ICAC Taskforce has no way of contacting Google regarding images reported to them. They are only able to communicate with Google by issuing a subpoena for the removal of the content.

So, in reality, it sounds less like Google wants to be made aware of child pornography and remove it, and more like the company has ensured it is required to do absolutely nothing until legally forced to take action by governmental agencies.

Time is the key factor in cases of Internet child porn — once it is posted, it can be reposted again and again to the point where it becomes very difficult to remove and has already been saved to an unknown number of personal hard drives. Thus Google denies justice for victims of child porn by requiring this lengthy legal process.

Take, for example, the case of the girl known to the media as “Jada,” a 16-year-old Texas resident who went to a party and was drugged and raped while unconscious. She did not report the crime until weeks later, when the photos her rapist had taken of her naked and unconscious body and posted online were texted to her. Thousands of men had access to these photos and were able to use her body for sexual titillation without her consent. Jada was, in a way, raped again and again, en masse. These photos went viral in Jada’s community and beyond in a matter of days.

You might think that if her photos had been reported to the ICAC Taskforce in her state sooner, they would not have gone viral to such an extent. But it turns out that the ICAC Taskforce would not have aided Jada at all, let alone in a timely manner, because the ICAC rarely acts on behalf of teenage girls.

Why won’t they provide protection to these victims? The answer I found turned out to be a disturbing, unwritten rule of law enforcement that no one is talking about.

The legal definition of child pornography is the depiction of anyone under the age of 18 in a sexually explicit situation. This is the international legal norm for many countries, including the U.S., the U.K., Canada, and Australia. However, Captain Tietz told me that the ICAC Taskforce only acts if the image in question appears to be an “obvious child” — so pubescent and post-pubescent teenage girls are not a priority. Tietz says he receives dozens of calls every day from the parents of teen girls whose pornographic images have been posted online, but that the taskforce does nothing to help them.

This means that as soon as a girl even resembles a woman, she loses the right to legal protection. She could be as young as twelve years old, but if she’s begun to develop breasts or a curve to her hip, her body is no longer hers. This facet of the criminal justice system is utterly mind-boggling — the actual reality of the victim is not taken into account, just her appearance. It speaks to a disturbing, deep-seated cultural misogyny regarding female bodies: If we look like women, we deserve sexual violence. Or, at the very least, it’s a normal and acceptable occurrence.

Furthermore, this legal practice is a blatant case of unequal protection on the basis of sex, as female children go through puberty at an earlier age than males and can easily appear much older, visually, than male children of the same age.

In Jada’s case, when authorities refused to help take down the images, she went a different route. She sought media attention and gave interviews about her ordeal. Jada started the social media campaign #IAmJada so people could stand in solidarity with her against Internet child porn.

Once Jada started to get publicity, websites began to delete her image, but why should she have to go through this media circus at all? The authorities should have helped her from the beginning. Now she’s forever branded by this horrible event, and you better believe her Google results will be forever linked to it.

A 16-year-old minor and a 19-year-old adult were arrested for gang raping Jada in December 2014. The adult was charged with two counts of sexual assault of a child, but was not charged with the creation or dissemination of child pornography. (The details of the minor who participated in the gang rape are unreported due to his minor status.) The failure of authorities to charge him with this crime shows justice isn’t served even when the victim gains wide sympathy and support in the media and online.

I can still easily find Jada’s child porn in Google’s search results. I wish I could report it to Google, but there’s no way to do so.

Why has our criminal justice system turned its back on girls and young women facing this new, pervasive form of sexual violation? Why is this “obvious child” policy an unspoken norm rather than something written into law, where it could be contested?

How many victims would still be with us today had they been given protection under the law regarding the removal of their images? Could the tragedy of 17-year-old Rehtaeh Parsons have been prevented had she not been forced to live in a state of perpetual sexual violation every time the photo of her rape was re-posted for the world to see?

Recently, Google announced it will remove “revenge porn” from search results on request. In the coming weeks, the company will be creating a web-form* for users to report these images. Google’s new policy on revenge porn is being compared to its policy on child porn, in that the company will remove the content from search results once made aware of it. As we can see, however, Google’s policy on child porn in practice does nothing close to what it claims.

And so, I have this to say to Google:

We don’t need your creepy dystopian Sci-Fi facial recognition technologies — stop putting money into developing that. Don’t you have enough ways to gather surveillance on us? We also really don’t need Google Glass sitting on all our faces. You could even cut down on the “Google Doodles” if you’re really tight on funds and resources over there. What we do need is for you to create an expedient and reliable method of reporting and removing child porn on your services and in your search results.

Children’s lives are at stake.

 

*Update: Google’s web-form to file a removal claim for revenge porn is now live. Unfortunately, there’s already a glaring misstep in Google’s implementation of the feature. The subject of the revenge porn must provide their full legal name, which is then shared with the webmasters of all sites where the revenge porn is hosted. This means that any victims who have had their images go viral will now have to forfeit their anonymity and take public ownership of the images in order to remove them. It also puts sensitive personal information in the hands of the very people who perpetrated the crime against the victim in the first place.

Meghan Murphy

Founder & Editor

Meghan Murphy is a freelance writer and journalist from Vancouver, BC. She has been podcasting and writing about feminism since 2010 and has published work in numerous national and international publications, including The Spectator, UnHerd, Quillette, the CBC, New Statesman, Vice, Al Jazeera, The Globe and Mail, and more. Meghan completed a Master’s degree in the department of Gender, Sexuality and Women’s Studies at Simon Fraser University in 2012 and is now exiled in Mexico with her very photogenic dog.