The brain causes the mind. What's so hard about that?

Steven Novella dispatches dualism – whether it be the religious “woo” of Chopra and Egnor, or the philosophical zombie-inspired speculations of Chalmers:

The “easy problem” and “hard problem” of consciousness are more meaningfully described as the scientific questions and philosophical questions of consciousness. The context of my prior article was the scientific question – what causes consciousness. The materialist hypothesis – that the brain causes consciousness – has made a number of predictions, and every single prediction has been validated. Every single question that can be answered scientifically – with observation and evidence – that takes the form: “If the brain causes the mind then…” has been resolved in favor of that hypothesis.
For example, if the brain causes the mind then: there will be no documented mental function in the absence of brain function; altering the brain biologically will alter the mind functionally; mental development will correlate with brain development; and mental activity will correlate with brain activity (this holds up no matter what method we use to look at brain activity – EEG to look at electrical activity, PET scanning to look at metabolic activity, SPECT scanning to look at blood flow, and functional MRI to look at metabolic and neuronal activity).

The whole subjectivity and qualia stuff just annoys me. How do you build a brain? The easiest way (though it’s a bit time-consuming) is to evolve one. Start with simple sensor-effector nets. Add in some feedback loops to allow for memory, comparison, simulation, and learning. Why does pain “feel” the way it does? Because of the way sensory nerves are wired up to the pattern recognizers in the brain. Why does redness “feel” the way it does? Because of the way the various processing feedback loops are wired to pattern recognizers in the brain. Could it be different? Perhaps – but there’s real efficiency in making these systems stable and predictable, so some of the low-level stuff is likely to get hard-wired. And that’s about it. No qualia. It has to “feel” like something because of the way it’s wired up, and the way it feels is simply the result of that wiring and of how we interpret the resulting pattern. No “what is it like to be an X?” – it is what it is, the result of brute contingency. Nothing ineffable. There are no philosophical zombies. We have no need of that hypothesis.
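If that “evolve one” recipe sounds like hand-waving, here is roughly what it cashes out to – a toy, hypothetical sketch of my own, not anyone’s published model. It evolves a one-sensor, one-effector net with a single feedback loop, and that feedback loop is exactly what lets it do something a pure stimulus-response net can’t: echo the previous input. The names (TinyNet, fitness, evolve) and the echo task are illustrative choices, nothing more.

```python
# Toy neuroevolution sketch (illustrative, not a model of any real brain):
# evolve a one-sensor, one-effector net whose single feedback loop serves
# as a one-step memory, by plain mutate-and-select.
import math
import random

random.seed(0)  # reproducible toy run

class TinyNet:
    """One sensor, one hidden state with a self-feedback loop, one effector."""
    def __init__(self, w_in, w_fb, w_out):
        self.w_in, self.w_fb, self.w_out = w_in, w_fb, w_out
        self.state = 0.0                                # the feedback loop's memory

    def step(self, sensor):
        # new state mixes the current input with the fed-back previous state
        new_state = math.tanh(self.w_in * sensor + self.w_fb * self.state)
        effector = math.tanh(self.w_out * self.state)   # effector reads the *old* state
        self.state = new_state
        return effector

def fitness(weights, trials=200):
    """Reward nets whose output tracks the previous input (needs memory)."""
    net = TinyNet(*weights)
    prev, error = 0.0, 0.0
    for _ in range(trials):
        sensor = random.choice([-1.0, 1.0])
        out = net.step(sensor)
        error += (out - prev) ** 2
        prev = sensor
    return -error / trials

def evolve(generations=200, pop_size=30, sigma=0.3):
    """Crude evolution: keep the best fifth, refill with mutated copies."""
    pop = [[random.gauss(0, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=fitness, reverse=True)[: pop_size // 5]
        pop = elite + [
            [w + random.gauss(0, sigma) for w in random.choice(elite)]
            for _ in range(pop_size - len(elite))
        ]
    best = max(pop, key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    best, score = evolve()
    print("best weights:", [round(w, 2) for w in best], "fitness:", round(score, 3))
```

The particulars don’t matter. The point is that memory – and everything downstream of memory – falls out of nothing more exotic than a wire fed back on itself plus selection pressure.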
The fall-out from this is significant, of course. No souls. No life after death. (What could survive?) Ergo no ultimate reward or punishment, so no system of morality based on cosmic threats and promises. Ultimately, no philosophy of mind. (Sorry, Dan.) No transubstantiation, demonic possession, or metempsychosis. No reincarnation. No out-of-body experiences. And as Pinker and others have noted, a bunch of folk-psychology notions bite the dust.
But we get to grow up.