
net.wars: Robocops

by Wendy M Grossman | posted on 07 September 2012


Great, anguished howls were heard on Twitter last Sunday when Ustream silenced Neil Gaiman's acceptance speech at the Hugo awards, presented at the World Science Fiction Convention. On Tuesday, something similar happened when, Slate explains, YouTube blocked access to Michelle Obama's speech at the Democratic National Convention once the live broadcast had concluded. Yes, both one of our premier fantasy writers and the First Lady of the United States were silenced by over-eager, petty functionaries. Only, because said petty functionaries were automated copyright robots, there was no immediately available way for the organizers to point out that the content identified as copyrighted had been cleared for use.


TV can be smug here: this didn't happen when broadcasters were in charge. And no, it didn't, because a large broadcaster clears the rights and assumes the risks itself. By opening up broadcasting to the unwashed millions, intermediaries like Google (YouTube) and Ustream have to find a way to lay off the risk of copyright infringement. They cannot trust their users. And they cannot clear, or even check, the rights manually for millions of uploads. Even rights holder organizations like the RIAA, MPAA, and FACT, who are the ones making most of the fuss, can't afford to do that. Frustration breeds market opportunity, and so we have automated software that crawls around looking for material it can identify as belonging to someone who would object. And then it spits out a complaint and down goes the material.

In this case, both the DNC and the Hugo Awards had permission to use the bit of copyrighted material the bots identified. But the bot did not know this; that's above its pay grade.
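To make the failure concrete, here is a minimal, purely illustrative sketch in Python of the logic these bots appear to implement. Every name in it is hypothetical; the commercial systems are proprietary, and this is not any vendor's actual code. The point is structural: the matcher can answer "does this stream contain a registered work?" but has nowhere to look up "was this use cleared?":

    # Purely illustrative sketch, not any vendor's real logic: an automated
    # matcher that can detect registered works but has no record of permissions.
    # All names and fingerprints here are hypothetical.

    REGISTERED_WORKS = {
        "fp-91x2": "Doctor Who excerpt",  # hypothetical fingerprint -> work
    }

    # The missing piece: no way for organizers to register "we have permission".
    CLEARED_USES = set()

    def review(stream_id, fingerprint):
        work = REGISTERED_WORKS.get(fingerprint)
        if work is None:
            return "allow"   # no match, nothing to report
        if (stream_id, fingerprint) in CLEARED_USES:
            return "allow"   # the check today's bots never get to make
        return "block"       # match found; permission is above the bot's pay grade

    # The Hugo Awards stream lawfully played the excerpt, but with no way to
    # record that clearance, the bot can only block:
    print(review("hugo-awards-2012", "fp-91x2"))   # -> block

With a second lookup like that in place, "this content has been cleared for use" becomes a one-line check; without it, a takedown is the bot's only possible answer.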

This is all happening at a key moment in Europe: early next week, the public consultation closes on the notice-and-takedown rules that govern, among other things, what ISPs and other hosts are supposed to do when users upload material that infringes copyright. There's a questionnaire for submitting your opinions; you have until Tuesday, September 11.

Today's notice-and-takedown rules date to the mid-1990s and two particular cases. One, largely but not wholly played out in the US, was the years-long fight between the Church of Scientology and a group of activists who believed that the public interest was served by publishing as widely as possible the documents Scientology preserves from the view of all but its highest-level adherents, which I chronicled for Wired in 1995. This case, and other early cases of claimed copyright infringement, led to the passage in 1998 of the Digital Millennium Copyright Act, which is the law governing the way today's notice-and-takedown procedures operate in the US and therefore, since many of the Internet's biggest user-generated content sites are American, worldwide.

The other important case was the 1997 British case of Laurence Godfrey, who sued Demon Internet for libel over a series of Internet postings, spoofed to appear as though they came from him, which the service failed to take down despite his requests. At the time, a fair percentage of Internet users believed - or at least argued - that libel law did not apply online; Godfrey, through the Demon case and others, set out to prove them wrong, and succeeded. The Demon case was eventually settled in 2000, and set the precedent that ISPs could be sued for libel if they failed to have procedures in place for dealing with complaints like these. Result: everyone now has procedures and routinely operates notice-and-takedown, just as cyber rights lawyer Yaman Akdeniz predicted in 1999.

A different notice-and-takedown regime is operated, of course, by the Internet Watch Foundation, which was founded in 1996 and recommends that ISPs remove material that IWF staff have examined and believe is potentially illegal. This isn't what we're talking about here: the IWF responds to complaints from the public, and at all stages humans are involved in making the decisions.

Granted that it's not unreasonable that there should be some mechanism to enable people to complain about material that infringes their copyrights or is libellous, what doesn't get sufficient attention is that there should also be a means of redress for those who are unjustly accused. Even without this week's incidents we have enough evidence - thanks to the detailed record of how DMCA notices have been used and abused in the years since the law's passage, continuously compiled at Chilling Effects - to see the damage that overbroad, knee-jerk deletion can do.

It's clear that balance needs to be restored. Users should be notified promptly when content they have posted is removed; there should be a fast-turnaround means of redress; and there clearly needs to be a mechanism by which users can say, "This content has been cleared for use".

By those standards, Ustream has actually behaved remarkably well. It has apologized and is planning to rebroadcast the Hugo Awards on Sunday, September 9. Meanwhile, it has pulled its automated copyright policing system while it works out what went wrong. To be fair, the company that supplies the automated copyright policing software, Vobile, argues that its software wasn't at fault: it merely reports what it finds. It's up to the commissioning company to decide how to act on those reports. Like we said: above the bot's pay grade.



Wendy M. Grossman’s Web site has an extensive archive of her books, articles, and music, and an archive of all the earlier columns in this series. Readers are welcome to post here, at net.wars home, follow on Twitter or send email to netwars(at) skeptic.demon.co.uk (but please turn off HTML).