Steven Novella dispatches dualism – whether it be the religious “woo” of Chopra and Egnor, or the philosophical zombie-inspired speculations of Chalmers:

The “easy problem” and “hard problem” of consciousness are more meaningfully described as the scientific questions and philosophical questions of consciousness. The context of my prior article was the scientific question – what causes consciousness. The materialist hypothesis – that the brain causes consciousness – has made a number of predictions, and every single prediction has been validated. Every single question that can be answered scientifically – with observation and evidence – that takes the form: “If the brain causes the mind then…” has been resolved in favor of that hypothesis.

For example, if the brain causes the mind then: there will be no documented mental function in the absence of brain function; altering the brain biologically will alter the mind functionally; mental development will correlate with brain development; and mental activity will correlate with brain activity (this holds up no matter what method we use to look at brain activity – EEG to look at electrical activity, PET scanning to look at metabolic activity, SPECT scanning to look at blood flow, and functional MRI to look at metabolic and neuronal activity).

The whole subjectivity and qualia stuff just annoys me. How do you build a brain? The easiest way (though it’s a bit time-consuming) is to evolve one. Start with simple sensor-effector nets. Add in some feedback loops to allow for memory, comparison, simulation, and learning. Why does pain “feel” the way it does? Because of the way sensory nerves are wired up to the pattern recognizers in the brain. Why does redness “feel” the way it does? Because of the way the various processing feedback loops are wired to pattern recognizers in the brain. Could it be different? Perhaps – but there’s definitely some efficiency associated with making these systems stable and predictable, so some of the low-level stuff is likely to get hard-wired. And that’s about it. No qualia. It has to “feel” like something because of the way it’s wired up; the way it feels is simply the result of that wiring and the way we interpret the resulting pattern. No “what is it like to be an X?”; it is what it is, the result of brute contingency. Nothing ineffable. There are no philosophical zombies. We have no need of that hypothesis.
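Purely as a toy illustration of that “evolve a sensor-effector net” recipe, here is a minimal sketch; the agent, the task (tracking a running average of its inputs, which only a feedback loop makes possible), and the mutation scheme are arbitrary choices for the sake of the example, not anything from Novella:

```python
# Toy sketch: evolve a tiny sensor-effector net with one feedback (memory)
# connection. The task is arbitrary: the agent senses a value and should
# output the running average of everything it has sensed so far; the
# feedback weight is what lets it remember anything at all.
import random

class Agent:
    def __init__(self, w_in, w_mem):
        self.w_in, self.w_mem = w_in, w_mem  # sensor weight, feedback weight
        self.state = 0.0                     # recurrent "memory" signal

    def step(self, sensor):
        # effector output depends on the current input AND the fed-back past state
        self.state = self.w_in * sensor + self.w_mem * self.state
        return self.state

def fitness(agent, trials=20):
    # reward agents whose output tracks a running average of the inputs
    error, avg = 0.0, 0.0
    agent.state = 0.0
    for t in range(1, trials + 1):
        x = random.random()
        avg += (x - avg) / t
        error += abs(agent.step(x) - avg)
    return -error

def evolve(generations=200, pop_size=30):
    pop = [Agent(random.uniform(-1, 1), random.uniform(-1, 1))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)   # best first
        survivors = pop[:pop_size // 2]
        # mutate the survivors to refill the population
        pop = survivors + [Agent(a.w_in + random.gauss(0, 0.1),
                                 a.w_mem + random.gauss(0, 0.1))
                           for a in survivors]
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("evolved weights:", round(best.w_in, 2), round(best.w_mem, 2))
```

Nothing in the sketch “feels” anything, of course; the point is only that memory, comparison, and learning-like behaviour fall out of wiring plus selection, with no extra ingredient required.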

The fall-out from this is significant, of course. No souls. No life after death. (What could survive?) Ergo no ultimate reward or punishment, so no system of morality based on cosmic threats and promises. Ultimately, no philosophy of mind. (Sorry, Dan.) No transubstantiation, demonic possession, or metempsychosis. No reincarnation. No out-of-body experiences. And as Pinker and others have noted, a bunch of folk-psychology notions bite the dust.

But we get to grow up.

8 Responses to “The brain causes the mind. What’s so hard about that?”
  1. Cool. Since you know the answer to this one, tell me: what is this “consciousness” of which you speak?

  2. Karen says:

    My most compelling argument against the brain being designed is this: the operating system is buggy, the hard drive randomly loses data, and acquired programming clogs up the filters on the input media so that what I think I see or hear ain’t necessarily so. Even Win95 worked better. :-)

  3. geoff says:

    Except for quoting Novella, I didn’t use the word “consciousness”. And he only used it in identifying Chalmers’ (in)famous “hard problem of consciousness”. I am happy to defer to Wikipedia if you want a definition; I’m more interested in the functionality than the terminology. One of the best short introductions to the complexity of the term is that of Jakob Hohwy of Monash University in Australia.

    One problem with a naturalistic view of consciousness is that the skeptics often demand that you justify your reduction all the way down, from subjective feelings of “blueness” to particle physics (and sometimes beyond, as if quantum effects could come into play). I would concur with Novella that there are some areas in which we have to separate the “does?” question from the “how does?”. Consciousness is by no means unique in this regard; we face it with all forms of living systems, from animals to plants. In large measure, this is because we have to inject a layer of information-based interpretation, and we’re not (yet) very good at making the jump from physical systems to information-based ones. Evo-devo is giving us good tools for thinking about such systems, and I’m not worried about the outcome.

  4. Edmond says:

    Geoff, it is possible to be an atheist materialist and still find an ontological place for sensory experience. One is not committed to dualism. See my latest book ‘Narrative, Perception, Language, and Faith’ (Palgrave Macmillan, 2005, just out in paperback) and the volume I have just edited, coming out this summer, ‘The Case for Qualia’ (MIT Press), in which 18 philosophers and psychologists argue for qualia as undismissable elements of the real. We do have one or two dualists among us, but also those who believe that commitment to qualia does not make one, as Dennett claims, a believer in fairies or things divine. One of the articles is actually called ‘Subjective Physicalism’ — and that is not a self-contradiction. We endeavour to show that the old arguments against so-called ‘sense-data’ are no longer to be regarded as valid, and as a result a whole new 21st-century way of considering philosophy emerges.

  5. geoff says:

    I’m definitely interested in the book, and I’ll find a copy ASAP. My objection to qualia is in the way they are identified as “over and above”, as in Chalmers’ (crudely simplified) “zombies + qualia = people”. Qualia used to describe aggregated patterns of brain activity are unexceptionable…

  6. hisham says:

    Novella writes “there will be no documented mental function in the absence of brain function”. Chalmers doesn’t deny that, since he believes mental functions like perception and memory are brain activities. He doesn’t believe qualia are brain activities/functions, however, but he still believes that “there will be no documented [qualia] in the absence of brain function.” Chalmers thinks that absent qualia are physically and naturally impossible, so no one will document them. He just doesn’t think they’re inconceivable. He believes they’re conceivable, and therefore logically coherent. If it is ideally conceivable that A is not B, then A is not B. If it is conceivable that A can occur without B, then A is not B.

    Geoff, you write “And as Pinker and others have noted, a bunch of folk-psychology notions bite the dust.” Funny you say that, considering that Pinker agrees with Chalmers that there is a hard problem.

    (Corrected and reformatted by moderator.)

  7. Here’s my question. You say the “way we INTERPRET.” But why should we have the power to INTERPRET at all? A computer does the calculation, and we INTERPRET the results. But what is interpreting? A computer cannot interpret information.

    Babelfish, for example, translates (barely), but it does not pontificate of its own accord.

  8. Steve Riley says:

    Any Turing-compatible machine can model the system you are talking about. A computer could, of course, simulate the laws of physics and simulate a brain. Given enough tinker toys and rubber bands configured correctly, you could simulate your brain. Would this tinker-toy, rubber-band creation feel pain and see red? Of course it would act like it did on all levels. There are so many conceivable Turing-compatible configurations that it is inconceivable to suggest they would all generate “red” because of the apparent inputs and outputs and internal processes. The configuration of information does not generate “red”.

    The other implication is that it is tied both to the configuration of information and to the substrate on which it’s built. That is, we have brains with cells that are configured in a certain way. In that sense you could never know whether a computer feels anything at all. Not knowing why “red” is red or “pain” is pain, except that it’s a configuration of the brain, you wouldn’t know when someone else’s brain was different enough that they felt these things in different ways.

This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.