The Internet Still Lacks Key Protections for Women

In the past few weeks, two different female journalists I know were the targets of online abuse. Unfortunately, this isn’t a new thing for either of them, but it served as a reminder that there really aren’t any reliable systems in place to protect people from online bullying or hate speech.

I reported some of this abuse — which appeared on Facebook — to Facebook’s abuse team. At first, they claimed that the posts didn’t violate Facebook’s community guidelines. But after enough people reported the same issue, we began to get notices that the content indeed violated Facebook’s guidelines and had been removed. What changed? Why should it take a certain amount of reporting to get Facebook’s attention? I’ve heard some of the technical reasons, such as “these reporting features can be misused if they’re too responsive,” but where does that leave people who face online abuse but can’t amass a large enough response?

I also tried reporting an abusive article targeting one of my colleagues to the service hosting the website where it appeared. Knowing that web hosts typically only take action if they believe someone is breaking the law, I told them a) the site was using someone’s copyrighted photos without permission, b) it was using a well-known person’s name and likeness without permission, and c) the article constituted harassment.

The host, in this case, is DreamHost. Here’s their response:

Thank you for writing. I am afraid that we are unable to censor or otherwise assert editorial control over our customers' site content. We do not tell customers what to write, or how to exercise their speech. If you have concerns over the content of their site, I am afraid that you will need to contact the site owner and resolve it directly with them. If you are unable to come to an amicable resolution with the site owner, I would then encourage you to take action as the law affords you. We will work expediently with any subpoenas, injunctions or other court orders that come in from law enforcement with appropriate jurisdiction over our business.

If you believe that a DreamHost customer is engaging in the unauthorized distribution of copyrighted material, please have the copyright owner (or their legal representative) file a formal notification of claimed infringement as described in the Digital Millennium Copyright Act (512(c)(3)(A)(i-vi)). The copyright owner should be sure to provide detailed and specific URLs/links to the content in question, not including any non-infringing material. Make sure the notification contains all the elements of a proper notification as described in the law.

Once it is drafted, the DMCA Notification (text only and no attachments) should be sent to us at abuse@dreamhost.com or in response to this message. Upon receipt of a valid DMCA Notification, we will commence with the removal of such content in an expeditious manner.

I'm sorry we couldn't be of more immediate assistance, but we are only able to respond to notifications sent to us from the impacted copyright owners themselves. You may, of course, feel free to contact them and have them contact us directly. If you have any questions, please let us know.

There are a few things here worth noting:

  1. They suggest I reach out to the site owner. Given that the site owner allowed this content to be published in the first place, it seems unlikely that they would agree to take it down.
  2. They suggest I take legal action against the site owner, even though I wouldn’t have legal standing to do so, since I wasn’t the one harmed.
  3. The only law they seem to care about is copyright law.

The whole scenario reminds me of Sarah Jeong’s book The Internet of Garbage, in which she looks at issues such as online harassment, doxing, threats and abuse. She describes the case of Cindy Lee Garcia v. Google; Garcia was an actress who appeared in that weird YouTube movie Innocence of Muslims, which earned her all sorts of online hatred and threats. She sued, and the case wound up in the Ninth Circuit Court of Appeals. Jeong writes:

Cindy Garcia had been tricked into acting in the film The Innocence of Muslims. Her dialogue was later dubbed over to be insulting to the prophet Mohammed. Later the film’s controversial nature would play an odd role in geopolitics — at one point, the State Department would blame the film for inciting the attack on the Benghazi embassy.

Garcia had first tried to use the DMCA. YouTube wouldn’t honor her request. Their reasoning was simple. The DMCA is a process for removing copyrighted content, not offensive or threatening material. While Garcia’s motivations were eminently understandable, her legal case was null. The copyright owner of the trailer for The Innocence of Muslims was Nakoula Basseley Nakoula, not Garcia. Garcia pressed the theory that her “performance” within the video clip (which amounted to five seconds of screen time) was independently copyrightable, and that she had a right to issue a DMCA takedown. YouTube disagreed, and their position was far from unfounded — numerous copyright scholars also agreed.

Cindy Garcia went straight to the DMCA because it was the “only” option she had. But it was also the “only” option in her mind because 16 years of the DMCA had trained her to think in terms of ownership, control, and deletion.

When you assume that your only recourse for safety is deletion, you don’t have very many options. It’s often very difficult to target the poster directly. They might be anonymous. They might have disappeared. They might live in a different country. So usually, when seeking to delete something off the Web, wronged individuals go after the platform that hosts the content. The problem is that those platforms are mostly immunized through Section 230 of the Communications Decency Act (described in detail below). The biggest gaping hole in CDA 230, however, is copyright. That’s where most of the action regarding legally-required deletion on the Internet happens, and all of that is regulated by the DMCA.

“The censorship of the early Internet has revolved around copyright enforcement, rather than the safety of vulnerable Internet users. And so we now tackle the issue of gendered harassment in a time where people understand policing the Internet chiefly as a matter of content identification and removal — and most dramatically, by unmasking users and hounding them through the courts,” Jeong writes. “Yet an anti-harassment strategy that models itself after Internet copyright enforcement is bound to fail.”

(To read the whole chapter, go here.)

As it stands, the United States doesn’t have a federal cyberbullying law. Instead, that’s been left up to the states. While most states now have laws on the books, in many cases those laws only apply to juveniles, meaning adults can’t use these laws to fight bullying and harassment.

Both of these women — as well as the many others who’ve been targeted by GamerGate or other forms of online abuse — have had their work and online lives seriously affected by this treatment. Fighting it took a whole lot of effort for not much effect. Women and other abuse targets need much clearer paths and legal protections on their side. Otherwise, the Internet winds up truly serving only those who are targeted and harassed the least.

Journalist, editor, author, opinionator. Bylines: Guardian, New Yorker, Vice, Mother Jones, Wired. Much more at www.bethwinegarner.com.
