Neuroskeptic and the folks he’s discussing appear to be excessively naive concerning the mind and its vulnerabilities:
An interesting new paper by Swiss researchers Marcello Ienca and Roberto Andorno explores such questions: Towards new human rights in the age of neuroscience and neurotechnology
Ienca and Andorno begin by noting that it has long been held that the mind is “a kind of last refuge of personal freedom and self-determination”. In other words, no matter what restrictions might be put on our ability to speak or act, or what coercion is used to force us to behave in a certain way, our thoughts, beliefs and emotions are free and untouchable.
Yet, the authors go on to say, “with advances in neural engineering, brain imaging and pervasive neurotechnology, the mind might no longer be such unassailable fortress.”
I have no problem discussing the impact of new technologies on questions of ethics and morality – but one should start from a clear understanding of what’s come before. The very statement that the mind is a refuge betrays a breathtaking ignorance, not to mention an implicit philosophical confusion.
Every time we interact with someone else, there is an opportunity to breach this supposedly invincible citadel of refuge. From the most mundane and mild exchange to the prisoner threatened with torture, from advertising (as I discussed a few days ago) to the preacher in the pulpit promising hellfire and damnation, every exchange carries with it the possibility, and very often the intention, of changing some element of your being. For example, consider a happy person who has a child. Someone intentionally kills that child. Now that person is very unhappy. The citadel is not in the least inviolable.
Nor should it be. We are not free to execute any random action that occurs to us, for there are consequences to actions, from trivial to terminal; this is, in fact, part of simple evolutionary theory. And the emotions? They are part and parcel of the evolutionary process; they cannot be independent of the outside world, for they are the primary pathway for interpreting that outside world, an older mechanism than the rationality we rattle on about. Such a refuge would be an oddity in the evolutionary chain; indeed, on another day, I might make the case that pathological individuals exhibit these characteristics, but they are evolutionary outliers.
I’ve not yet mentioned the more obvious method of breaching the citadel, because it carries with it the philosophical confusion I see here – mind/body dualism. Simple physical damage to the brain can severely damage the mind, and thus this refuge: some folks lose their short-term memory, some have amnesia, and so on. By denying this linkage through the assertion that “… beliefs and emotions are free and untouchable,” they indulge in the old mistake of mind/body dualism.
The point of the paper – and Neuroskeptic’s post – is to discuss civil rights and how they relate to new technology that may, in the not-too-distant future, credibly read and even write the mind via fMRI. I think the first step is to recognize the conceptual differences, if any, between these new methodologies and those already covered by law. After all, we’ve employed many methodologies in the search for truth and intentionality, from phrenology to polygraphs (both failures). Perhaps the key element here is voluntariness? But what about the mundane collection of evidence? Such collection is rarely voluntary on the part of a suspect, and yet it remains the only just way to convict a person of a real crime.
Consider even fingerprints: they cannot be withheld, at least in the United States, as they do not fall under Fifth Amendment protections. From Quora:
No; your 5th Amendment right against self-incrimination applies only to statements that you might make during a custodial interrogation. It does not apply to physical evidence that the police may take from you, such as performing a breathalyzer test, taking your fingerprints, or taking your DNA through a mouth swab or other non-invasive method. [Cliff Giley]
The protection of one’s thoughts, as deduced from an fMRI scan, may indeed require a new law, as the forcible reading of one’s thoughts is not the same as a statement under what appears to be current law. I will note in passing that there will be at least two sources of error which jurors and law-enforcement personnel will need to consider – that inherent in the machine doing the reading, and that inherent in the thought processes of the subject. Consider the work of Dr. Elizabeth Loftus on false memories as just one example of the latter problem. One’s mind is not a perfect reflection of reality.
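To make the interplay of those two error sources concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it is a hypothetical assumption chosen purely for illustration – not a measured property of any real scanner – and the function names are mine, not anyone’s actual method; the point is only that a seemingly accurate reading, applied to a rare trait and filtered through imperfect memory, yields far weaker evidence than the raw accuracy figure suggests.

```python
# Back-of-the-envelope sketch: how the two error sources compound.
# All figures below are hypothetical assumptions for illustration only.

def positive_predictive_value(base_rate, sensitivity, specificity):
    """Probability that a person flagged by the scan actually has the trait."""
    true_positives = base_rate * sensitivity
    false_positives = (1.0 - base_rate) * (1.0 - specificity)
    return true_positives / (true_positives + false_positives)

# Error source 1: the machine. Assume the scan is 95% accurate in both directions,
# and that 1% of the screened population actually has the trait in question.
ppv = positive_predictive_value(base_rate=0.01, sensitivity=0.95, specificity=0.95)

# Error source 2: the subject. Assume only 90% of "true" readings rest on an
# accurate memory or belief in the first place (cf. Loftus on false memories).
memory_reliability = 0.90

print(f"Flagged person actually has the trait: {ppv:.1%}")
print(f"...and the underlying memory is itself accurate: {ppv * memory_reliability:.1%}")
```

Under these particular assumptions, most flagged readings would be false positives – precisely the sort of arithmetic jurors and law-enforcement personnel would need walked through.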
Neuroskeptic sets up a question:
So let’s suppose that it’s the near future and this technology really works. You are applying for a sensitive job and you’re asked to take this scan to prove that you have no attraction to children. No fMRI scan, no job. Would that policy be a violation of your rights?
It’s one to ponder. My inclination is that it would be a violation – not because it would be unfair to discriminate against someone merely for having a desire, but because no one has a right to know my desires (or beliefs, or thoughts) except me. If I act on a desire, then I’ve made it important to others, but the desire per se is no one else’s business.
What if, through other means, a manager learns that a prospective employee plans to use the job to embezzle from the company? Is it ethical to deny employment? What if the manager learns of it via the fMRI? Or are actions really the only basis on which to make a judgment?
What if it’s murder rather than embezzlement? Or pedophilia?
The problem here is that there’s a difference between desires, intentions, and actions. Is it right to take action against a latent pedophile, one who has never acted on that desire and does not plan to? What about the one planning to use his position as a camp counselor to victimize children?
The trick is to find the proper balance, one that does justice both to potential victims and to the potentially innocent. Can an fMRI, in the near future, make that distinction? Should its proposed use be outlawed until it can reliably distinguish desire from intention from action?