
Friday, April 08, 2011

Hit Lists: Cyber Incitement, Cyber Threats

    As a resident of Gainesville, Florida, incitement has been on my mind lately.  Is the Internet a game-changer for the law of incitement and/or "true threats"? When Gainesville pastor Terry Jones recently burned a Quran and put the video on the Internet, it was specifically foreseeable that violence would result, even though inciting violence was not his purpose. And First Amendment law makes it almost impossible to hold Jones legally responsible for the violent response of his audience. First Amendment law typically assumes (regardless of evidence to the contrary) that audiences will behave rationally and not leap to violence when confronted with offensive speech.  Instead, offended audience members will engage in counterspeech to drive "noxious doctrine" from the marketplace of ideas.  The Jones incident, however, raises the question whether the ideals that underlie current First Amendment doctrine are foundering on the shoals of the new reality the Internet creates. 

   Another recent case raises more directly than the Jones incident the question whether First Amendment principles and doctrines governing incitement and true threats need to be adapted  in light of the unique dangers of Internet speech. In December 2010, blogger and occasional radio talk show host Hal Turner was convicted of threatening to assault or murder three federal judges based on a blog post stating that they "deserved to die" for affirming dismissal of a challenge to a handgun ban. "The postings included photographs, phone numbers, work address, and room numbers of these judges, along with a photo of the building in which they work and a map of its location." [FBI Press Release]  Turner's attorney evidently plans to appeal.

 Although the Turner case was tried as a "true threats" case, the speech involved fits at least as squarely into the legal definition of "incitement."  The line between true threats and incitement is not always clear.  In Virginia v. Black, 538 U.S. 343 (2003), a plurality of the US Supreme Court defined "true threats" to "encompass those statements where the speaker means to communicate a serious expression of an intent to commit an act of unlawful violence to a particular individual or group of individuals. The speaker need not actually intend to carry out the threat." True threats are not protected by the First Amendment because they engender fear and intimidation and disrupt the lives of victims.  Incitement, by contrast, involves advocacy "directed to inciting or producing imminent lawless action" that is "likely to incite or produce such action." Incitements are unprotected because they create a likelihood of violent actions, not because of the fear they engender.

    Put (overly) simply, the distinction between a threat and an incitement is as follows. A threat involves a speaker saying to a victim: "I will do you harm."  An incitement, by contrast, involves a speaker saying to third parties: "You ought to harm someone (or some thing)." This distinction gets blurred, however, in a case like Turner's.  Turner's statement was arguably designed to create fear and intimidation in the three federal judges against whom it was directed and to cause them to change how they ruled in future cases.  However, it was not clear that Turner contemplated personally doing violence to the judges.  Instead, his speech was aimed at persuading a third party to do violence to the judges "on his behalf," so to speak. His speech deserves censure (moral certainly, legal arguably) because it magnifies the risk of violence by unidentified third parties, and the risk is undoubtedly greater because the speech took place on the Internet.

     But would it meet the constitutional test for unprotected incitement?  Brandenburg v. Ohio arguably would prevent convicting a defendant like Turner for incitement, unless the contours of current doctrine are dramatically altered to fit the Internet context. Brandenburg provides strong protection for advocacy of violence by radical dissidents like Turner, and it is a proud pillar of American First Amendment jurisprudence precisely because it sets an extremely high bar to imposing liability in incitement cases. The speech in Brandenburg, though, was completely despicable. There, the Supreme Court defended the right of a hooded Ku Klux Klan speaker to exhort his audience to "[s]end the Jews back to Israel," and to "[b]ury the niggers." This speech took place at an "organizers' meeting" of the Klan, at which some of the attendees were clearly armed. The Supreme Court nonetheless found the speech to be protected by the First Amendment.

    In striking down Ohio's prosecution of the Klansmen for advocating criminal activity, the Court stated that the First Amendment does not allow "a State to forbid or proscribe advocacy of the use of force or of law violation except where such advocacy is directed to inciting or producing imminent lawless action and is likely to incite or produce such action."  In order for a speaker to be prosecuted for incitement, therefore, the State must show (1) intent to incite another; (2) to imminent violence; and (3) in a context that makes it highly likely such violence will occur.  Brandenburg's test appreciates the fact that the State is likely to over-predict violence from speech, and it seeks to ensure that suppression is not based on fear or dislike of radical ideas or speakers.

    The main obstacle to convicting Internet speakers under Brandenburg is the imminence requirement.  Brandenburg's imminence requirement was designed around the speech situation it presented: a firebrand speaker trying to rally a crowd in a physical setting.  Brandenburg contemplates liability for speakers in those rare instances where a "mob mentality" is especially likely to take hold and lead to violent action.  The paradigm case for Brandenburg, then, is a speaker exhorting an angry torch-wielding mob on the courthouse steps to burn it down immediately. It is only in such cases, where there is no time for "evil counsels" to be countered by good ones, that advocacy of violence crosses the line into incitement.

    Brandenburg's sanguine attitude toward the prospect of violence rests on an assumption about the audiences of radical speech.  Brandenburg assumes that most citizens (even Ku Klux Klan members) simply are not susceptible to impassioned calls to violent action by radical speakers.  In fact, Brandenburg represents the fruition of a libertarian theory of free speech planted by Justices Oliver Wendell Holmes and Louis D. Brandeis in a series of mostly dissenting opinions in cases brought against social radicals following World War I. As I've discussed elsewhere, that theory makes several assumptions about the likely "audiences" of potentially inciting speech. The most fundamental assumption is that these audiences are typically composed of rational beings who will not leap to violence simply because radical speakers urge them to do so. Not only is the audience assumed to be rational and skeptical, but it is also assumed to be willing and motivated to engage in public discourse to refute dangerous falsehoods or "noxious doctrine."

Cyber incitement represents a challenge to the rational audience assumption underlying incitement doctrine (and much of First Amendment law).  Audiences in cyberspace arguably differ from audiences in "real space" in ways that justify changing our assumptions.  How is cyber incitement different from incitement in "real space"? One of the main problems identified by scholars is audience size.  Initially, it might seem that size ought not to matter, but here is what the size complaint really means: if you magnify the potential audience, you magnify the chance that the speech will reach an audience member who is NOT rational and NOT willing to listen to counterspeech that defuses the dangers of the violent advocacy.  This prospect is heightened by the technology of search; cyber audiences searching out violent advocacy on the Internet may be seeking confirmation of their own violent plans or projects and may be especially impervious to counterspeech even if it were immediately available, which it is not.  [It is also worth mentioning that Internet speech crosses geographical borders into communities where counterspeech is not the norm.]

A related argument is that cyber audiences may be more susceptible to indoctrination and exhortations to violence than the audience envisioned by Brandenburg.  [Let's remember, however, that Brandenburg involved a Klan rally!]  Certainly the Internet enables subcommunities of hate to flourish, and interactions within these subcommunities may serve to "normalize" violence.  These communities are supported by the anonymity the Internet enables, and the speed of Internet communications allows speakers to reach individual audience members at the point when they are most vulnerable to calls for violent action. 

Finally, audience members in "real space" are "connected" to one another and thus can exert a restraining influence on the individual who is spurred to violent actions by the words of a fiery speaker. Brandenburg contemplated the dark side of crowd behavior and specified that incitement can occur when a mob mentality is likely to take hold; the flipside of "mob mentality," however, is that audiences--even audiences of Klansmen--rarely react with immediate violence to impassioned rhetoric, which sends a signal to those individuals who would undertake violence if left to their own devices.  The moderating influence of crowd response cannot take place in cyberspace, which is yet one more reason that cyber incitement may indeed justify a different legal response than incitement in real space.

That said, I am generally skeptical of legal doctrines that assume the worst of audiences, especially in light of the tendency of governments to overstate the link between speech and violence.  If the imminence requirement is to be replaced in cyber incitement cases, it should be replaced by a requirement that still tips strongly against suppression of threatening hyperbole directed toward public institutions or public officials.

 

Posted by Lyrissa Lidsky on April 8, 2011 at 08:45 AM in Blogging, Constitutional thoughts, Criminal Law, First Amendment, Lyrissa Lidsky, Weblogs | Permalink


Comments

Turner's case sounds like the "Nuremberg Files" case from about 10 years ago, in which a civil judgment against an anti-abortion group, based on its threats, was affirmed. The web site included calls to justice, provided phone numbers and addresses of doctors who perform abortions, and (most famously) crossed out or grayed out names of doctors who had been killed or injured.

Posted by: Howard Wasserman | Apr 8, 2011 10:30:29 AM
