Thursday, November 17, 2011
Yesterday, the House of Representatives held hearings on the Stop Online Piracy Act (it's being called SOPA, but I like E-PARASITE tons better). There's been a lot of good coverage in the media and on the blogs. Jason Mazzone had a great piece in TorrentFreak about SOPA, and see also stories about how the bill would re-write the DMCA, about Google's perspective, and about the Global Network Initiative's perspective.
My interest is in the public choice aspect of the hearings, and indeed the legislation. The tech sector dwarfs the movie and music industries economically - heck, the video game industry is bigger. Why, then, do we propose to censor the Internet to protect Hollywood's business model? I think there are two answers. First, these particular content industries are politically astute. They've effectively lobbied Congress for decades; Larry Lessig and Bill Patry among others have documented Jack Valenti's persuasive powers. They have more lobbyists and donate more money than companies like Google, Yahoo, and Facebook, which are neophytes at this game.
Second, they have a simpler story: property rights good, theft bad. The AFL-CIO representative who testified said that "the First Amendment does not protect stealing goods off trucks." That is perfectly true, and of course perfectly irrelevant. (More accurately: it is idiotic, but the AFL-CIO is a useful idiot for pro-SOPA forces.) The anti-SOPA forces can wheel out a simple argument of their own - censorship is bad - but that's somewhat misleading, too. The more complicated, and accurate, arguments are that SOPA lacks sufficient procedural safeguards; that it will break DNSSEC, one of the most important cybersecurity moves in a decade; that it fatally undermines our ability to advocate credibly for Internet freedom in countries like China and Burma; and that IP infringement is not always harmful and not always undesirable. But those arguments don't fit on a bumper sticker or in the lede of a news story.
I am interested in how we decide on censorship because I'm not an absolutist: I believe that censorship - prior restraint - can have a legitimate role in a democracy. But everything depends on the processes by which we arrive at decisions about what to censor, and how. Jessica Litman powerfully documents the tilted table of IP legislation in Digital Copyright. Her story is being replayed now with the debates over SOPA and PROTECT IP: we're rushing into decisions about censoring the most important and innovative medium in history to protect a few small, politically powerful interest groups. That's unwise. And the irony is that a completely undemocratic move - Ron Wyden's hold, and threatened filibuster, in the Senate - is the only thing that may force us into more thorough consideration of this measure. I am having to think hard about my confidence in process as legitimating censorship.
Cross-posted at Info/Law.
Posted by Derek Bambauer on November 17, 2011 at 09:15 PM in Constitutional thoughts, Corporate, Culture, Current Affairs, Deliberation and voices, First Amendment, Information and Technology, Intellectual Property, Music, Property, Web/Tech | Permalink
To complicate things further, I think censorship is but one flavor of prior restraint - and a particularly potent flavor at that. One advantage to ex post penalties, rather than ex ante interdiction, is that we can take the risk of challenging a speech-limiting law. We may succeed, and the law is invalidated, or we may fail, and pay the price. What's pernicious, potentially, about on-line censorship is that we can never reach the content, and we never have the chance to decide for ourselves if that speech is worth the legal price we'd have to pay.
So, I agree that a rule prohibiting speech about X or Y constitutes censorship, but we have a lot of those: you can't speak falsely about someone if it damages their reputation, or create / distribute an obscene film. What makes on-line filtering difficult is that the government prevents the undesirable action directly, rather than through penalties after the fact for those who contravene the rules.
Posted by: Derek Bambauer | Nov 19, 2011 2:39:11 PM
We are talking past each other.
The lawyer is "priorly restrained" from cursing at the judge. I gave an example of something where "speech is judged beforehand" to be wrong. A judge can say beforehand that using one of Carlin's seven words to curse at the judge is wrong. I'm unsure how this doesn't fit the meaning of "prior restraint." As cited in NYT v. U.S.:
"Any system of prior restraints of expression comes to this Court bearing a heavy presumption against its constitutional validity."
It isn't absolute.
Posted by: Joe | Nov 19, 2011 9:11:41 AM
Joe, those all might be valid ways to use the word "censorship." I was using it in the way it was used in the original post.
"I am interested in how we decide on censorship because I'm not an absolutist: I believe that censorship - prior restraint - can have a legitimate role in a democracy."
Posted by: Andrew MacKie-Mason | Nov 18, 2011 1:19:51 PM
"I don't think the First Amendment ought to be read to allow the government to tell someone they can't speak because of the risk that their speech will violate the law, or to judge before speech is made whether that speech is in violation."
I think "censorship" is generally applied to include putting someone in jail for criticizing the government, even if the government doesn't try to seize the book before it is released. The net result is censorship.
Again, at certain times, the government says people cannot speak. Let's take a lawyer. A lawyer is obligated by law to protect the privacy of client conversations in various ways. "They can't speak" about such things in a way that violates the rules. If someone cannot speak, they are in some fashion being "censored." A lawyer also has to follow certain guidelines in court. Can't curse at the judge. This is a "prior restraint" on his/her actions. Saying "[expletive] you, judge" can readily be said beforehand to violate the rule.
"Censorship" has a certain flavor that doesn't really apply there but if we want to be literal, the lawyer is being "censored."
Posted by: Joe | Nov 18, 2011 1:02:14 PM
Thanks for the link; I'll try to read it later today or this weekend. Just to be clear on where I am before I read it (feel free to filter my thoughts afterwards through this lens), I'm suspicious of a system like this that would seem to enable majority suppression of minority dissent with relative ease.
I'm also a bit skeptical of the idea that any of the four things you cite can coexist with a censorship regime, since evaluating them would seem to require exactly the information that the censorship is designed to block. (How do we know whether a gov't is being open about their censorship if we don't see the things they're censoring...and if we can see them, what's the point of censorship?)
But anyways, I look forward to the article and seeing how you address those concerns.
Posted by: Andrew MacKie-Mason | Nov 18, 2011 12:59:54 PM
Andrew: I think we're in agreement, actually. My view is that Internet censorship should be possible, but carefully cabined and limited. I don't define whether censorship is permissible / legitimate based on the content that's targeted - I think that is a dead end, because societies differ greatly on that. (Saudi Arabia blocks material on religious faiths, such as the Bahai, that would be robustly protected here in the States - and this blocking is highly popular there.) Rather, I think it's better to assess how censorship decisions get made.
I wrote an article on this, called Cybersieves, but reading law review articles is kind of painful. Let me give you one paragraph from the shortened version (link below) that outlines my approach:
Cybersieves offers a new approach that looks not to what is blocked but at how the decision to filter is made. The Article suggests four touchstones for this analysis: openness, transparency, narrowness, and accountability. Openness probes whether a state admits to and explains its reasons for engaging in censorship; transparency measures a country’s disclosure of the types of material it blocks and how it evaluates suspect content; narrowness checks filtering’s precision in targeting information; and accountability assesses citizens’ involvement in censorship policy. This methodology, which the Article calls “the Framework,” focuses on process. It is designed to be compatible with different normative positions on filtering—whether you support or oppose blocking religious dissent, you’ll find it useful to know how forthright Saudi Arabia is in filtering that material and how involved its citizens are in that decision.
Posted by: Derek Bambauer | Nov 18, 2011 11:12:25 AM
First, let me say that when I was reading this on my phone I didn't realize there was a link to a paper defending the position -- I'll definitely give that a read.
I understand censorship in the way it was used above: prior restraint. I think speech can be removed after a process demonstrating that it violates the law, and speakers can at times be punished, but I don't think the First Amendment ought to be read to allow the government to tell someone they can't speak because of the risk that their speech will violate the law, or to judge before speech is made whether that speech is in violation.
Posted by: Andrew MacKie-Mason | Nov 18, 2011 10:50:54 AM
Well, Andrew, what is "censorship" in your opinion?
The author can answer, but let's say we are dealing with child pornography or other categories of speech now deemed properly prohibited: is not "censorship" allowed?
Posted by: Joe | Nov 18, 2011 9:05:09 AM
Could you elaborate on when you think censorship is warranted? I hold basically the opposite opinion, so I'd be interested to hear what you have to say.
Posted by: Andrew MacKie-Mason | Nov 18, 2011 12:49:02 AM