Wednesday, September 17, 2008
Mind Games and Privacy Risks
In case you missed it in the Neuroethics & Law blog earlier this year: A company called Emotiv is getting ready to roll out a video game that you control with your mind. As a recent CNN article describes it:
“The Emotiv EPOC headset – the first Brain Computer Interface (BCI) device for the gaming market . . . detects and processes real-time brain activity patterns (small voltage changes in the brain caused by the firing of neurons) using a device that measures electrical activity in the brain. In total, it picks up over 30 different expressions, emotions, or actions.”
Other companies offer (or intend to offer) similar technology: NeuroSky offers a series of devices that can convert states of attention, meditation, or other mental states associated with specific brainwave patterns into commands for computers, robots, appliances, or whatever the user wishes to control. Other products of this kind include OCZ’s “neural impulse actuator” and Brain Actuated Technologies, Inc.’s “cyberlink headband” and “brainfingers” technology. (The YouTube video I’ve attempted to embed above provides a demonstration of the headband.)
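For readers curious about the mechanics, here is a minimal sketch (in Python, using synthetic data) of the basic idea behind these devices: estimate the power in a particular EEG frequency band and translate a sustained change into a command. None of these companies publish their signal-processing pipelines, so the beta-band-as-attention heuristic, the threshold, and the helper functions below are illustrative assumptions of mine, not any vendor’s actual method or API.

```python
import numpy as np

SAMPLE_RATE = 256          # Hz; in the ballpark of consumer-EEG sampling rates
BETA_BAND = (13.0, 30.0)   # beta waves are a common (rough) proxy for focused attention

def band_power(signal, rate, band):
    """Estimate average power in a frequency band via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

def to_command(attention_score, threshold=1.5):
    """Map a normalized attention score to a (hypothetical) game command."""
    return "FIRE" if attention_score > threshold else "IDLE"

# One second of synthetic "EEG": noise plus a 20 Hz component standing in
# for the stronger beta activity of a concentrating player.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
eeg = 0.5 * np.random.randn(SAMPLE_RATE) + np.sin(2 * np.pi * 20 * t)

# Normalize against a noise-only baseline, then decide.
baseline = band_power(0.5 * np.random.randn(SAMPLE_RATE), SAMPLE_RATE, BETA_BAND)
score = band_power(eeg, SAMPLE_RATE, BETA_BAND) / baseline
print(to_command(score))   # almost certainly "FIRE": the 20 Hz component boosts beta power
```

Real headsets are far more sophisticated (multiple electrodes, trained classifiers, per-user calibration), but the pipeline is the same in spirit: raw voltages in, inferred mental state out, command issued.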
This is all really intriguing – and I can’t wait to see whether we use this technology to retool law school exams – or perhaps even oral arguments in the Supreme Court – so that they take the form of video games, where you can zap monster-shaped questions and challenges to your client’s claims by conjuring up brilliant arguments in your mind’s eye.
On a more serious note, though, I wonder whether this technology raises graver privacy concerns than archived Google searches and recorded Web use do. That’s not to say every record we produce by mental effort must remain hidden from public view. When our minds directly generate only the same kind of conscious action that once required an intermediate muscle movement, a mind-controlled video game doesn’t seem to compromise privacy any more than the old-fashioned kind of Web-based video game would (if it makes sense to call any such game “old-fashioned”). We probably don’t reveal much more about ourselves when we fire a virtual laser gun with a brainwave pattern than when our mind directs our hand to press a button that fires the same gun. But as the CNN article notes, the EPOC headset picks up not only brain activity associated with muscle movements but also emotions – and perhaps emotional (or other mental) activity you’re not fully aware of as you experience it. Put this dimension of video games’ future together with another trend – gamers increasingly seeking entertainment in online interactive games generated by computer servers far beyond their homes – and you have a recipe for what could be a serious privacy problem: a world where everyone who plays a Web-based video game, or issues mental commands in other Web-based activities, can do so only by sharing unconscious feelings or other states of mind with the outside world, and perhaps offering them up for recording and archiving.
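To make the worry concrete, consider (purely hypothetically – no vendor publishes such a payload, and every field name below is invented) what a mind-controlled online game’s client might transmit to its server with each action:

```python
import json
import time

# Hypothetical telemetry a BCI game client might send upstream with each command.
# The "command" field is all the server needs to run the game; everything under
# "affect" is incidental mental-state data the headset happens to detect.
packet = {
    "player_id": "example-player",
    "timestamp": time.time(),
    "command": "FIRE",            # the intended, conscious action
    "affect": {                   # states the player may not know they revealed
        "frustration": 0.72,
        "excitement": 0.41,
        "boredom": 0.08,
    },
}
print(json.dumps(packet, indent=2))
```

Nothing in the game requires the “affect” block; the privacy question is whether it gets sent, logged, and retained anyway.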
To be sure, it would not be entirely unprecedented for us to find our privacy threatened by technology that records evidence of unconscious thoughts or feelings: experts analyzing a surreptitiously recorded conversation might infer things about our emotional state from our tone of voice. A public video camera on the street might capture facial expressions, patterns of movement, or other behaviors that betray feelings or mental states we’re not even aware of. But I somehow find it more unnerving that technology is emerging that officials, businesses, or busybodies might use (perhaps inaccurately or dishonestly) to read and record the mental states even of those who stay stone-faced and silent.
That doesn’t mean I think there’s anything inherently bad about this technology – or about its use on the Web or in other electronic environments. On the contrary, apart from providing us with new forms of entertainment, such technology can give some disabled individuals the power to control computers with mental signals when they cannot do so with their hands. It provides biofeedback devices people can use to improve concentration or manage stress. And it might also be a boon for those of us who are curious about what kind of electrical brain activity corresponds to specific experiences, thoughts, or feelings. (It would be fascinating, for example, to see a record – even a relatively crude one – of the brain activity patterns generated during an intense nightmare or other dream.)
What would be bad, however, is a world where – thanks to our near-constant connection to the Internet – we can get the benefits of these new “neurotechnology killer apps” (as Adam Kolber has aptly described them) only by sharing private feeling states with the rest of the world. At a minimum, then, I think we should be thinking about how we can reap the benefits of these (and similar) technologies at minimal cost to privacy. One excellent starting point for such thinking is Neil Richards’s forthcoming Texas Law Review article, Intellectual Privacy, which I had the pleasure of commenting on at a recent conference. As the article notes, since “we have come to rely on computers and other electronic technologies to live our personal and professional lives . . . ,” we find ourselves generating “a record of our intellectual activities – a close proxy for our thoughts – in unprecedented ways and to an unprecedented degree.” This need to leave a trail of intellectual records behind makes us vulnerable not just to garden-variety intrusions on our privacy, but to intrusions on our First Amendment freedom of thought. And while the article focuses on the threats of this sort that exist here and now – the record we leave when we enter Google search terms or link from Web site to Web site – our freedom of thought may be even more vulnerable in a world where we leave behind not only records of our intellectual choices but snapshots of unfiltered emotional and other mental states. Thus, I think Neil is right that we have to tackle (with more clarity and vigor than we have before) the challenge of building robust intellectual privacy protections into our legal regime, and perhaps also into the design of new technologies.
Posted by Marc Blitz on September 17, 2008 at 04:26 PM in Information and Technology