Brain scans and neurotrash

It's the ultimate branding strategy. Just slap "neuro" before a word and the goofiest speculation becomes respectable science. 
Denyse O'Leary | Nov 24 2009 | comment  

The notion that "the mind is what the brain does" is catching fire in academia, especially in the trendy area of neuroscience. In other words, you -- your personality, your most intimate self, your dreams, your convictions -- are electrical circuits sparking in your gray matter. Recently, New York Times pundit David Brooks informed us that:
When you go to an academic conference you expect to see some geeks, gravitas and graying professors giving lectures. But the people who showed up at the Social and Affective Neuroscience Society's conference in Lower Manhattan last weekend were so damned young, hip and attractive. The leading figures at this conference were in their 30s, and most of the work was done by people in their 20s. When you spoke with them, you felt yourself near the beginning of something long and important.
That did not fill me with confidence. We'd notice the same thing on a fashion runway. Fashion runways are long and (commercially) important too. But is the consequence good, bad, or irrelevant?

To me, quite honestly, the whole "neuroscience" attempt to explain humanity sounds like a fad. I was involved in a big neuroscience project that lasted several years. So I know what I am talking about when I say that the use of neuroscience to cure social ills is not a workable idea.

In fact, it sounds like the century-old eugenics craze (in that case, a desire to fix the world by controlling who is allowed to have babies). What am I to make of statements like this?

I'm free to speculate that this work will someday give us new categories, which will replace misleading categories like 'emotion' and 'reason.'
Why are those categories misleading if they have worked for humans for many thousands of years?

First, everyone believes in something, whether it is the workplace union, the local football fan club or the Catholic Church.

A recent neuroscience study found no important difference between the brains of "believers" and "unbelievers."

Among humans, there aren't any unbelievers, unless they are genuinely so stupid that you couldn't tell them from cows. But that's unlikely.

Oh wait! Even a cow believes in something. She believes, totally, utterly, adoringly, in fresh green grass and is not slow to testify to her belief either. She just scarfs it up - and just try leading her away from it.

Humans believe in ideas, not grass. That should make a difference.

If the young, hip neuroscientists can't explain that, then they have not made the connection that would justify the praise they currently receive.

However, one interesting new development deserves attention: even "skeptical" publications are becoming justifiably skeptical of all this neuro-nonsense -- a healthy response, surely, under the circumstances.

For example, here at New Humanist, in an article titled "Neurotrash", Raymond Tallis (dubbed by The Economist one of the world's top living polymaths) rallies the neuroskeptics as follows:

Hardly a day passes without yet another breathless declaration in the popular press about the relevance of neuroscientific findings to everyday life. The articles are usually accompanied by a picture of a brain scan in pixel-busting Technicolor and are frequently connected to references to new disciplines with the prefix "neuro-". Neuro-jurisprudence, neuro-economics, neuro-aesthetics, neuro-theology are encroaching on what was previously the preserve of the humanities. Even philosophers - who should know better, being trained one hopes, in scepticism - have entered the field with the discipline of "Exp-phi" or experimental philosophy. Starry-eyed sages have embraced "neuro-ethics", in which ethical principles are examined by using brain scans to determine people's moral intuitions when they are asked to deliberate on the classic dilemmas. Benjamin Libet's experiments on decisions to act and the work on mirror neurons (observed directly in monkeys but only inferred, and still contested, in humans) have been ludicrously over-interpreted to demonstrate respectively that our brains call the shots (and we do not have free will) and to point to a neural basis for empathy.
Well, exactly. Neurotrash.

We like to sing to our children, "The cow jumped over the moon." Reality: Cows never even considered putting a cow anywhere near the moon. People did consider putting a man on the moon. And then those people had a lot of work to do, work that cows wouldn't understand. And the people did it.

Unfortunately, neurotrash may not always be harmless nonsense confined to marketing departments pondering what color of car people choose. Increasingly, in the form of neurolaw, it is catching on in the legal profession, much as lie detector tests did decades ago. What happened there was that some people learned to fake results -- people who may well have committed serious crimes. And who knows how many innocent people were damaged by false results?

A serious ethical question also erupts as to why the accused's brain should be scanned anyway. It is not a crime to think about a crime, only to act outside the law. Even if a brain scan showed the accused was thinking about it, that would never prove he did it. Lots of employees hate their boss and wish the boss would just drop dead. If you scanned their brains... well, let's say it's better not to go there. Very few employees actually commit a violent crime against the boss, so the brain scan evidence -- even if reliable, which it probably isn't -- is not worth gathering.

Also, we must consider traditional principles of law. Under English common law, a person cannot be convicted without external evidence of intent and actions. Period. It is too bad if the prosecution team loses, even when morally certain that the accused is guilty. But that is an incentive to improve their procedures in normal ways.

The really scary prospect for neuroscience is that it will be used to "improve" us. You can already find policy wonks speculating about second-generation social engineering projects under the guise of "progressive humanism". Now that we "know" that morality is hard-wired into our neurons, perhaps we can use science to make us all nicer people. One of Tony Blair's advisors recently contended that "brain and behaviour research is reframing political debates".

I'd like to propose a radical idea. Why don't the neuroscientists and progressive humanists stop hyperventilating and chill out for a while? Let them reflect on the fact that a hundred years ago phrenology, the "science" of analysing behaviour by putting a tape measure around a skull, was all the rage. Until they can account for the difference between the mind and the brain, their research might not be worth a hill of beans. In fact, it might just be, in the words of Raymond Tallis, neo-phrenology.

Denyse O'Leary is co-author of The Spiritual Brain.

This article is published by Denyse O'Leary and under a Creative Commons licence. You may republish it or translate it free of charge with attribution for non-commercial purposes following these guidelines. If you teach at a university we ask that your department make a donation. Commercial media must contact us for permission and fees. Some articles on this site are published under different terms.
