A defense of non-epiphenomenal consciousness and free will.

[NOTE – this is a re-post from the original incarnation of this blog.]

Non-epiphenomenal consciousness and free will are two different, but related, issues. Both are disputed by those of a physicalist persuasion, and both find themselves lacking any place within our current scientific understanding of the world. Indeed, they not only have no place, but also run contrary to a key precept of modern science: that there is no such thing as an uncaused cause.

In the classical Newtonian picture of physics, the processes that lead to a particular brain state are governed by deterministic laws of nature. If in principle we could perfectly describe a starting brain state, then by extrapolation using those laws, we could predict with certainty any subsequent brain state. Quantum mechanics overthrows this view, revealing that fundamentally, all processes are probabilistic in nature. Instead of predicting with certainty, we have only a probability that one result will win out over another (even if in macroscopic systems there are so many quantum elements that the law of averages means the probability is very high indeed). This introduces a random element to the possible evolution of systems over time, but doesn’t necessarily help with defending free will. A random result is not necessarily a free one.
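
As a toy illustration in Python (nothing here models real physics – the rules and weights are arbitrary, purely to make the contrast vivid), compare a rule that always produces the same successor state with one that only fixes the odds:

    import random

    # Deterministic (Newtonian-style) evolution: the same start always yields the same end
    def evolve_deterministic(state):
        return state * 2          # a toy "law of nature"

    # Probabilistic (quantum-style) evolution: only the outcome probabilities are fixed
    def evolve_probabilistic(state):
        return random.choices([state * 2, state * 3], weights=[0.99, 0.01])[0]

    print(evolve_deterministic(1))   # always 2
    print(evolve_probabilistic(1))   # usually 2, occasionally 3 – certainty is gone

With enough toy "elements" averaged together, the probabilistic rule looks deterministic for all practical purposes, which is the macroscopic situation described above.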

This fundamentally random, but practically deterministic state of affairs is what we observe in every area of nature we’ve ever cared to study. Physical processes alone are sufficient to explain the evolution of systems in time. So what role could mental processes have, if they exist at all? And even if there is a role, by what conceivable mechanism could a mental process affect a physical process? This is the problem of defending non-epiphenomenal consciousness.

Beyond questions of the efficacy of conscious systems looms the even more unlikely notion of traditional incompatibilist free will: a concept seemingly so contrary to what we know about nature that most philosophers and scientists appear to have abandoned it altogether. And it’s not difficult to see why. The suggestion appears to be that not only does the mind play a role in the evolution of brain states, but that it can also derail the chain of cause and effect by somehow tipping the probabilities in favour of what would otherwise be vanishingly unlikely alternative options.

Given those facts, how can defenders of causally efficacious mind and free will construct a believable argument for their existence?

To be taken seriously, both non-epiphenomenal consciousness and free will are desperately in need of a viable mechanism. Without it, both are rightly open to attack as being only explainable by supernatural forces. And to be viable, I would argue that any proposed mechanism would have to both conform to our current best-fit scientific theories and be robust enough to be considered mainstream.

Some may claim that such questions are outside the scope of science altogether, since the evidence for their existence is purely subjective and therefore unverifiable by the scientific method. With most such phenomena I would agree. For instance, believers in gods may try to claim that their experience of the divine counts as evidence, while others use subjective experience to underpin all sorts of dubious pseudoscience and quackery. So right away, I should make it clear that I consider non-epiphenomenal consciousness and free will worthy of explanation for one reason alone: they are – at first blush at least – subjectively universal phenomena. Even the most ardent physicalist must admit that, without further reflection, we appear to have both. That of course is not proof – appearance often misrepresents reality – but it is, I think, at least reason to investigate as best we can with an open mind.

An axiom attributed to the ancient Greek philosopher Parmenides, and later made famous in the modern Western world by William Shakespeare in King Lear, says that “nothing comes from nothing”. The antithesis of this idea is creation ex nihilo, or “out of nothing”. The gods of many religious traditions are supposed to have pulled off such a trick at the beginning of the universe, and – unfortunately for defenders of non-epiphenomenal consciousness and free will – it’s a trick that agents seemingly also need to perform every time they exercise free will. They have to introduce or create some new event that is neither random nor wholly dependent on prior physical causes.

However, modern science has put that axiom under pressure, leading us to question whether it really is such a self-evident truth. It’s not that science has shown that matter or energy can be created ex nihilo (indeed, that would violate another key idea in physics: the conservation of energy enshrined in the first law of thermodynamics), but rather that modern science now suggests the very concept of nothingness may be meaningless.

The quantum fields that make up the universe, such as the electromagnetic field and the Higgs field, all have a ground state – a lowest possible energy configuration – slightly above zero, making them subject to quantum fluctuations. This is the case even in a complete vacuum, hence the name vacuum energy, although the property as applied to each field is known as zero-point energy. But a vacuum is the only physical (i.e. non-abstract) definition of nothingness that makes sense within the bounds of the universe, so physically speaking there is no such thing as nothing.
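
To make that concrete with a textbook result (standard quantum mechanics, not anything specific to this argument): each mode of frequency ω in a quantum field behaves like a harmonic oscillator, with allowed energies

    E_n = \hbar \omega \left( n + \tfrac{1}{2} \right), \qquad n = 0, 1, 2, \dots

so even the lowest state, n = 0, retains an irreducible zero-point energy of ħω/2 – slightly above zero, just as described above.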

Because excitations in quantum fields are one and the same as the point particles of the standard model, this vacuum energy manifests as the creation of virtual particle/antiparticle pairs that briefly pop into existence and immediately annihilate each other. This applies not just to empty space, but to every part of the universe. Vacuum energy can be pictured as the fizzing surface of a liquid, each bubble being a brief pair of particles bursting into existence only to almost immediately pop out of it again. It is important to note, though, that this energy is usually both unmeasurable and unavailable to macroscopic processes – it is not some mystical energy field one can use to justify belief in dubious phenomena!

In technical terms, the time–energy form of Heisenberg’s uncertainty principle allows an energy fluctuation ΔE to persist only for a time of order ħ/ΔE – far too briefly for these particles to be measured or substantiated in the physical world. Hence the label virtual particles, as opposed to actual particles we can measure.
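
In symbols, the relation is (a standard result, quoted here only as a worked illustration):

    \Delta E \, \Delta t \gtrsim \frac{\hbar}{2}

So a virtual electron/positron pair “borrowing” ΔE ≈ 2 × 0.511 MeV ≈ 1.6 × 10⁻¹³ J can persist only for roughly Δt ~ ħ/ΔE ≈ 6 × 10⁻²² s before the energy books must be balanced.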

However, just because they are virtual, one shouldn’t imagine that they play no role in the physical world. Not only have experiments shown them to be the most likely explanation for proven phenomena such as spontaneous emission, the Casimir effect, and the Lamb shift, but they are also generally thought to mediate the interactions of real particles in quantum field theory – for example, the exchange of virtual photons underlies the electromagnetic interaction between electrons.

The only way these virtual particles can achieve actualisation and gain any kind of permanence is to draw on the energy in the surrounding environment, whilst avoiding mutual annihilation.

One situation in which this is thought to be possible is in the extreme environment of a black hole. These gravitational sink-holes bend space so severely that even the fastest moving objects in the universe – photons of light – do not have sufficient escape velocity to avoid falling into their clutches. This results in the formation of a boundary, or event horizon, from which no matter or energy can escape.

Now consider a particle/antiparticle pair that forms at the event horizon of a black hole. In simple terms, if one of the pair forms inside the event horizon and the other outside, then they will not be able to interact and annihilate, and, drawing on the gravitational energy of the black hole, they actualise. The escaping partner is witnessed by observers outside the horizon as emitted radiation, while its twin falls into the interior. This is known as Hawking radiation, after physicist Stephen Hawking, who first conjectured its existence.

As previously stated, this isn’t really ex nihilo creation of matter or energy, because the creation process is driven by the intrinsic zero-point energy of quantum fields, plus the energy of the surrounding system. Thus the principle of conservation of energy also means that the system involved must lose some of its own energy, or in the case of black holes the equivalent mass. In this way, black holes starved of infalling matter are thought to slowly but surely evaporate.

Another consequence is that the energy available to a system bounds the mass of the particles that can be emitted: a pair can only actualise if the system can supply its rest energy. For black holes the scaling is counterintuitive – the Hawking temperature is inversely proportional to mass – so the hypothesized micro black holes, produced primordially in the early universe and perhaps still existing today, would radiate at far higher temperatures than their astrophysical cousins, yet even their Hawking radiation would consist mostly of low-mass particles like electron/positron pairs and photons (which are massless and their own antiparticles). In normal, heavier black holes, the radiation is colder still and dominated by photons.
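
For reference, the temperature of the radiation follows Hawking’s formula (standard physics, included here only to make the scaling explicit):

    T_H = \frac{\hbar c^3}{8 \pi G M k_B}

Note the mass M in the denominator: a solar-mass black hole radiates at a frigid ~6 × 10⁻⁸ K, while a far lighter primordial black hole would be correspondingly hotter.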

But black holes are not the only situation where this type of particle creation can occur. In theory, any energetic phenomenon that forms an event horizon can perform the same trick.

One such phenomenon is known as the Unruh effect, a logical consequence of Einstein’s realisation that gravity is equivalent to acceleration. Here, an observer undergoing constant acceleration perceives the vacuum as a warm bath of radiation: from the point of view of the accelerating reference frame, particle/antiparticle pairs actualise before annihilating. And just like the black hole case, because an accelerating system creates an event horizon (the reason for which is beyond the scope of this piece), the equivalent of Hawking radiation is also witnessed by observers outside that horizon.
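
The temperature of the Unruh bath is given by another standard formula:

    T = \frac{\hbar a}{2 \pi c k_B}

which is extraordinarily small for everyday accelerations – even a = 10²⁰ m/s² yields a bath of only ~0.4 K – giving a feel for how extreme the conditions must be before such effects become significant.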

In both examples, we have the formation of an event horizon creating a one-way barrier between an enclosed volume of space (the interior of the black hole, or the region hidden behind the accelerating observer’s horizon) and the rest of the universe.

So, returning to consciousness, we have – superficially at least – an interesting parallel. In both the physical systems and the conscious mind, the external environment can influence the enclosed internal world via the flow of information into it, but from within that enclosed world one is only able to observe the external universe rather than interact directly with it. However, via a phenomenon such as Hawking radiation, the internal world is able to exert a physical influence back on its environment. By analogy, these phenomena suggest the outline of a mechanism for non-epiphenomenal consciousness.

Now, I’m certainly not suggesting that consciousness resides in microscopic black holes – I’ll leave that to Romulan starships! Nor am I saying that the Unruh effect is responsible. I simply don’t have enough knowledge of physics or mathematics to surmise or calculate how small objects at short distances may or may not produce the acceleration necessary or an event horizon local enough. And I strongly doubt there is anything in mainstream neuroscience to offer as a framework for such effects in the brain.

I’m only suggesting that such seemingly ex nihilo creation would, not so long ago, have been thought impossible without supernatural intervention in the world, but that zero-point energy opens up the possibility of a variety of effects that might – at least conceivably – be exploited by evolved systems.

In that speculative light, the mere possibility of horizon-induced particle creation in connection with consciousness and the brain would provide a high-level explanatory mechanism for non-epiphenomenal consciousness. And if such creation could be directed and (perhaps chaotically) amplified, one might see how such internally-produced nudges might pave the way for free will.

At such low energies, any such creation would have to be in the form of massless particles like photons. Whilst this might bring its own problems in accounting for how they might deliver the needed nudges to existing processes, on the other hand such effects should in principle be measurable, and therefore testable. It should be noted that there is already some speculation about the role of photons in the brain, though it must be stressed that this is not mainstream.

Of course, even if there is something in my speculation, many issues might remain unresolved, such as the hard problem of consciousness and how the mental domain might manage to muster and direct its will; not to mention under what ontology and laws consciousness itself might operate internally.

Also, there is danger here in stepping too far with speculative ideas. Scientists and rational thinkers are wary of any non-physicalist speculation on consciousness, I suspect because to do so opens the door to all sorts of religious and pseudoscientific nonsense that is neither objectively testable nor subjectively universal. So it’s important not to speculate more than a single step beyond our current knowledge, and to do so without any preconceptions of where one is heading.

But with that caution in mind, I still think it’s fair to say that this class of phenomena in physics at least shines a light into the domains in which we should search for clues. And even if such speculation proves fruitless, it serves to illustrate how science continually surprises us with unexpected phenomena. So while admitting that the existence of non-epiphenomenal consciousness and free will remain improbable, we should not lose hope. Closing the door on what are our most universal and all-encompassing experiences of reality – that our minds interact with and affect the physical world – is premature.

The imaginary number at the heart of quantum mechanics.

[NOTE – this is a re-post from the original incarnation of this blog.]

I’ve no idea how many books, articles, podcasts and videos I’ve digested over the years on the subject of quantum mechanics, but it’s a lot.

What they all have in common is that they are aimed at the layperson, and therefore try to describe counterintuitive features of the theory such as superposition, the uncertainty principle, and entanglement using experimental examples and everyday analogies. Almost none of them take even the briefest of toe-dips into the actual mathematics behind the theory.

And that’s not surprising. After all, as Stephen Hawking wrote in A Brief History Of Time, “Someone told me that each equation I included in the book would halve the sales”. No-one outside academia likes to try to get their head around baffling equations, least of all those with no more tools at their disposal than high school maths. Like me.

So, some time ago when I made an attempt to dig a little deeper by watching a series of Quantum Mechanics lectures by Leonard Susskind, I wasn’t expecting to get much out of it. I couldn’t have been more wrong.

OK, I couldn’t follow all the intricacies of the maths, and I certainly wouldn’t be able to do any of the calculations myself, but what I did gain was a good overview of the subtler concepts behind the theory, and an understanding of how the maths models and corresponds to those features, including the more counterintuitive ones, like superposition and the uncertainty principle.

The biggest revelation for me was the significance of i, which confusingly is also sometimes written as j, mainly by engineers. i is the symbol that represents the imaginary unit – the square root of -1, a number that does not exist anywhere on the real number line.

This concept is at the mathematical heart of the theory, yet is almost never mentioned in the layperson’s literature, let alone explained. I first encountered i years ago, before I started reading about quantum mechanics, when a friend who works at a nuclear site recounted how a scientist there had told him about it. We were both slightly incredulous that the maths behind quantum mechanics – the theory that underpins all our nuclear know-how and much of modern technology – is fundamentally based on something we don’t, and can never, know.

However, it’s not really quite that scary, and quantum mechanics remains by far the most experimentally accurate theory science has ever produced. As with most discoveries, the physical phenomena were uncovered first by experimentation, and only then were mathematical models found to fit what was being seen. These models were then used to calculate the outcomes of further experiments, proving the theory and enabling the practical use of the phenomena. The theory isn’t founded on the maths; the maths models the phenomena to give predictive power to the theory. (This is not to say that physical reality is more fundamental than mathematics or vice versa – that’s a philosophical question for another day – but rather that this is just the way scientific discovery usually works.)

So how is i used, and what does it represent? Well, for starters, the concept was not created for use in QM, but pre-existed it and is used in classical physics. The following short clip from icedave33 (who I should also thank for clearing up my many misconceptions whilst writing this piece!) explains very well how i can be used to model 2D rotation on a 1D line:

The imaginary number i and multiples thereof are used in a series (i, 2i, 3i etc.) to create an extra “imaginary axis” on a graph, set at right angles to the “real axis” of ordinary real numbers, as shown below:

[Image: the complex plane, with its real and imaginary axes]

Points plotted against the real and imaginary axes then take the form of one real number (e.g. 2 or -2) and one imaginary number (e.g. 3i or -3i), making a composite that is known as a complex number (e.g. 2+3i or 2-3i or -2+3i or -2-3i). This full graph on which both real and complex numbers can be plotted is known as the complex plane.
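
For readers who like to experiment, complex arithmetic is built into Python, and a few lines (with arbitrary example values) show the key geometric fact that multiplying by i rotates an arrow a quarter-turn around the complex plane without changing its length:

    # The complex number 2 + 3i (Python writes i as j)
    z = 2 + 3j

    print(z * 1j)                # (-3+2j): rotated 90 degrees anticlockwise
    print(z * 1j * 1j)           # (-2-3j): two quarter-turns, i.e. multiplied by -1
    print(abs(z), abs(z * 1j))   # 3.605..., 3.605...: the arrow's length is unchanged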

In quantum mechanics, a system, when it is observed, can only be found in one of a limited set of observable states. Before observation, the system is in superposition (a combination of these states), and each of the states is assigned a complex number.

A complex number can be represented as an arrow in the complex plane, as shown above. The “size” of the complex number is given by the length of the arrow, and the probability of finding the system in the particular state that complex number represents is given by the square of that length – its squared modulus. This is known as the Born rule.
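
As a minimal sketch of that rule in Python (the amplitudes below are arbitrary example values, chosen to be normalised):

    import numpy as np

    # Two complex amplitudes, one for each observable state of a two-state system
    a = 1 / np.sqrt(2)        # amplitude for state 1 (a real number)
    b = 1j / np.sqrt(2)       # amplitude for state 2 (a purely imaginary number)

    # The arrows point in different directions but have the same length...
    print(abs(a), abs(b))                # 0.707..., 0.707...

    # ...and squaring the lengths gives the probabilities, which sum to 1
    print(abs(a) ** 2, abs(b) ** 2)      # 0.5, 0.5
    print(abs(a) ** 2 + abs(b) ** 2)     # 1.0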

One of the ways to represent a state in quantum mechanics is as what’s known as a quantum state vector. A state vector is just a list of complex numbers, one for each possible value of some parameter of the state – for example quantum spin. The state vectors corresponding to the possible observable states are known as eigenstates.

In the maths, the equivalent of observing a parameter is to apply to the state vector a special type of mathematical operator known as a Hermitian operator. This will produce a new state vector.

If the operator is applied to a state vector that is already in an observable state – an eigenstate – then the new state vector is just a multiple of the original. That multiplying factor is a real number on the real axis, and such factors are known as the operator’s eigenvalues.

However, if the operator is applied to a state vector in superposition, then the new state vector will not simply be a multiple of the old one. Physically, measurement collapses the system into an observable state, and again we obtain a real eigenvalue.
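
The whole story fits in a few lines of Python, using the textbook example of a spin measurement along the z axis (nothing here is novel – it’s just the standard formalism in miniature):

    import numpy as np

    # A Hermitian operator: the Pauli-Z matrix. Its eigenvalues, +1 and -1,
    # are the two possible outcomes of a spin measurement along z.
    sz = np.array([[1, 0], [0, -1]], dtype=complex)

    up = np.array([1, 0], dtype=complex)        # an eigenstate ("spin up")
    print(sz @ up)                              # [1, 0]: just (+1) times the original

    # A superposition is not simply rescaled by the operator...
    down = np.array([0, 1], dtype=complex)
    psi = (up + down) / np.sqrt(2)
    print(sz @ psi)                             # [0.707, -0.707]: not a multiple of psi

    # ...instead, measurement yields one real eigenvalue, with Born-rule probabilities
    evals, evecs = np.linalg.eigh(sz)
    print(evals)                                # [-1. 1.]
    print(np.abs(evecs.conj().T @ psi) ** 2)    # [0.5 0.5]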

Intuitively this seems wrong, because in the maths of real numbers it is somewhat like saying that adding one to a number doesn’t always increment its value by one, but that instead it depends on which number you begin with. To use an analogy with real whole numbers and fractions, it’s like saying that two plus one equals three, but also that any fraction between two and three (2.1, 2.5, 2.999, etc.) plus one also equals three.

Another analogy might be a clock’s second-hand that cannot stop between seconds because the mechanism doesn’t allow it. The hand is always at a whole-number value. Except that in the case of quantum mechanics, we know that the second-hand can reside between ticks; it’s just that whenever we measure the time by observing it, we always find that the second-hand has ticked into place.

So although superposition is unintuitive, the mathematics of imaginary and complex numbers and their operators models it precisely. Similarly, it’s the use of operators that allows the maths to model other features of QM such as the uncertainty principle. This states that it is impossible to measure two parameters of a system’s state to a high degree of accuracy at the same time; for example position and momentum, or time and energy. This happens because in quantum mechanical systems, certain mathematical operations do not commute. With real numbers, this would be like saying that two times three does not equal three times two, but that they yield two different answers. This corresponds to the way that measuring position then momentum, or momentum then position, can result in different answers.
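
This too is easy to see concretely. Here is a short Python sketch using two standard spin observables (the Pauli matrices for spin along x and along z), whose order of application matters:

    import numpy as np

    # Observables for spin along x and along z
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)

    print(sx @ sz)            # [[0, -1], [1, 0]]
    print(sz @ sx)            # [[0, 1], [-1, 0]]: a different answer!

    # The non-zero commutator is what the uncertainty principle quantifies
    print(sx @ sz - sz @ sx)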

So if one can see what the maths is doing, then is it true to say that one has a picture of what might be happening physically?

For instance, to me at least, the maths suggests that quantum states in superposition are unobserved in the real world because they are at least partly “off somewhere else” on the imaginary axis, and that the act of measurement seems to “snap” them into existence in the “real” observable world by applying the Hermitian operator.

This is not to suggest that by “off somewhere else” I mean somewhere supernatural. Rather, I’m suggesting that the state they are in is not realizable in the external objective world that we experience and measure. Perhaps, like virtual particles, they reside in some sub-Planckian world between the ticks of the clock I used in my earlier analogy.

However, this is where science meets philosophy, and interpretation is everything. Not always in physics do the apparent properties of the maths correspond to how things are physically. The use of complex numbers in classical wave mechanics described above is a good example, but there, at least as I understand it, the use of the extra dimension introduced by i is more like a shortcut to avoid doing harder, more regular maths. In quantum mechanics I don’t believe that’s the case. The i is a mandatory part of the theory.

Additionally, there’s the possibility (or almost certainly the inevitability) that quantum mechanics will one day be superseded by an even more accurate theory that has different mathematics and thus revises the physical picture again. One thing’s for sure, like i itself, nothing in science is set in stone, but I wish I’d had at least a little introduction to the maths in some of the popular books I’d read previously.

No god-smuggling or wizardry here.

[NOTE – this is a re-post from the original incarnation of this blog.]

I sometimes find myself having to defend my insistence on wanting to save consciousness and free will from the ravages of staunch physicalism, and thus having to define my reasons for doing so.

One worry I have is that my views might be seen as surreptitiously trying to smuggle-in pseudoscience or religion, so let me explain why I have no interest in defending either.

I am only interested in phenomena that are either backed up by traditional peer-reviewed empirical evidence, or are universally accepted by subjective agents to be real (at least at first sight).

So for me, both consciousness and free will qualify for explanation by the second method. Even Dan Dennett, before he had a good think about it, would I’m sure admit to believing that both were genuine phenomena.

However, the concepts of a god or gods, and the ideas of pseudoscientific disciplines like homeopathy qualify by neither method.

They are phenomena that could be real but have no convincing evidence in their favour, either objectively by scientific inquiry or subjectively by universality.

The complexity of the universe might be seen by some as evidence for god by virtue of the watchmaker argument. But this is not first-hand evidence of a phenomenon, but rather a deductive argument based on other independent phenomena.

So, no god-smuggling here. Or wizardry.

Strong AI, quantum biology and consciousness.

[NOTE – this is a re-post from the original incarnation of this blog.]

Today’s Guardian is running a piece on the possibility of strong AI (also known as AGI) by physicist David Deutsch entitled “Philosophy will be the key that unlocks artificial intelligence“.

Despite the title, Deutsch isn’t arguing that philosophy is needed to speculate how future science might unlock the hard problem of consciousness. Instead he refers to further interesting philosophical questions regarding what constitutes a person, and what rights we might confer on an AGI.

I suppose it’s no surprise that my first thoughts are usually geared towards explaining consciousness itself and the explanatory gap; after all, I’m not happy with the current physicalist position. For me, asking how the physical make-up of the world we perceive through science can be reconciled with the subjective experience we all attest to having is a priority.

However, for pragmatic people actually doing the science who either don’t know, or for practical reasons don’t care, about the metaphysical questions (or for perfectly contented physicalists) my priority might not even count as a problem.

So I happily read and enjoyed the article for what it was supposed to be. But one comment and link did stick out to tweak my metaphysical funny bone:

“Some have suggested that the brain uses quantum computation, or even hyper-quantum computation relying on as-yet-unknown physics beyond quantum theory, and that this explains the failure to create AGI on existing computers. Explaining why I, and most researchers in the quantum theory of computation, disagree that that is a plausible source of the human brain’s unique functionality is beyond the scope of this article.”

I clicked the link to find an inter-disciplinary paper from the University of Waterloo, Canada entitled “Is the brain a quantum computer“, where they argue the case that it is not.

One of the co-authors is from the philosophy department, yet the paper makes several statements that seem to me to simply presuppose the physicalist view, and totally ignore the hard problem. For instance:

“In our wing analogy, it is unnecessary to refer to atomic bonding properties to explain flight. We contend that information processing in the brain can similarly be described without reference to quantum theory.”

The problem with likening phenomenal consciousness to a wing is that the function and make-up of the phenomenon of wings is completely explained via normal emergence, whereas phenomenal consciousness is not.

In other words, one can in principle deduce how wings come to exist given full disclosure on their microphysical make-up. So up from the binding of quarks by the strong force and the binding of electrons by the electromagnetic force, out pops the emergent solidity of matter and chemistry, and on upwards through biology to the make-up of a wing. And similarly, one can deduce why they exist using principles from evolution, where we get a satisfying story of how large self-replicating systems of matter interact with the environment in such a way that functions like flight emerge.

However, the qualities of subjective phenomenal consciousness – our experienced internal world – cannot be explained in this way. Nothing from current microphysics up to current neuroscience even gives us a hint of what constitutes cognition or qualia. This is what philosopher David Chalmers calls the difference between normal weak emergence, and the strong emergence of phenomenal consciousness.

And again similarly, evolution doesn’t tell us why our functionality, which can in principle be perfectly simulated on a computer, additionally gives the system a felt experience. It seems an unnecessary epiphenomenon, so why did it evolve?

Now of course, one might suggest that phenomenal consciousness is an illusion or that some future discovery will solve the hard problem, but that’s not the same as simply sweeping these issues under the carpet with an analogy that doesn’t seem to work.

The paragraph continues:

“Mechanisms for brain function need not appeal to quantum theory for a full account of the higher level explanatory targets.”

Here again, this sentence is only true if one ignores the “target” of the hard problem, and instead only aims at the functional aspects of consciousness.

The authors go on to explain why they feel quantum processing could not be a factor in processing:

“…quantum-level events, in particular the superpositional coherences necessary for quantum computation, simply do not have the temporal endurance to control neural-based information processing.”

“…it could perhaps be argued that extremely short quantum events somehow “restructure” neurons or neural interactions, to effect changes at the timescale of spiking, these speculations are hampered by the significant biological plausibility problems we explore in the next section.”

I do not know enough about the subject to refute the first statement, or to press for an alternative in which there is more than one type of processing going on in the brain, but the admission in the second statement certainly allows us to substitute “control” (in the first) with “influence”, which makes the first statement much less plausible.

This is particularly effective since the objection to their own admission – the biological implausibility explored in the next section – turns out to be very shaky. To be fair, that’s not the fault of the authors, because the paper was written six years ago. Since then, science has discovered several real and potential biological quantum phenomena, and the field of quantum biology is burgeoning and in the news.

From photosynthesis to the magnetic sensing of robins, nature seems to have found a way for quantum effects to not have a problem with the “warm, wet” conditions in biological systems.

[Image source: photons in my back garden, bouncing off Robbie]

One argument in the paper that does strike me as interesting is this:

“Even if quantum computation in the brain were technically feasible, there is a question about the need for such massive computational efficiency in explaining the mind. It is technologically desirable that a quantum computer should factor a large number into primes in polynomial time, but there is no evidence that brains can accomplish this or any similarly complex task as quickly.”

Here the word “mind” is used, but again subjective experience seems to be ignored. But the point does seem to have some power against cognition. This might suggest that any quantum processing is limited to qualia, which might include cognitive phenomenology, but not cognition itself.

The final part of the paper is introduced with this:

“Moreover, as we argue next, there is no compelling evidence that we need quantum computing or related processes to explain any observed psychological behavior, such as consciousness.”

The problem here is that subjective consciousness is not synonymous with “observed psychological behaviour”.

The section goes on to attack Penrose’s Orch-OR model of quantum collapse, its references to Gödel’s incompleteness theorem, and Hameroff’s suggestions as to how Orch-OR might work in the case of the brain.

The main argument against Orch-OR seems to mostly rest on it requiring revisions to current scientific understanding, and the lack of evidence. I’d deny neither of those, but point out that it’s supposed to be a speculative idea to solve a problem, not a fully fledged theory. To take such ideas out of context by ignoring the problem they seek to solve doesn’t really show us anything.

I can’t argue the case regarding the use of Gödel’s and Hameroff’s ideas within Orch-OR, but their objections do appear to be on more solid ground there. As I say, the whole thing is highly speculative, as Penrose himself would admit – an alpha version of a model, if you will.

In summary the authors admit the possibility of quantum computing (or quantum effects on classical computing) in the brain but suggest that it is less plausible than classical physics doing all the work.

I’d suggest in my own summary that after six years the jury is still out, but that the idea has become a lot less implausible. Additionally, it’s not helpful to make reference to phenomenal consciousness in a paper, and then ignore it in favour of other uses of the word in some of your arguments.

For both reasons, I don’t think the paper should have been referenced.


UPDATE Feb 2013

For more on the burgeoning field of quantum biology, see this collection of videos from the recent Quantum Biology workshop at the University of Surrey.

Problems with physicalism.

[NOTE – this is a re-post from the original incarnation of this blog.]

Having recently written about panprotoexperientialism as an alternative metaphysics, I’m now going to say a few words as to why one might want to look for alternatives in the first place.

On first sight our current scientific understanding of the universe seems consistent with a physicalist viewpoint, so what’s the problem? Why introduce novel ideas without any objective evidence to warrant such a move?

Well, science doesn’t just collect evidence. The whole endeavour starts by observing phenomena in the world that require explanation, and then asking relevant questions. Only then do we formulate a hypothesis and devise experiments to collect evidence.

The phenomena to be explained here are those of the mind; specifically qualia, or the qualitative nature of experience. This covers sensory experience (for example, what it is like to see the colour red), and can arguably be extended to include cognitive phenomenology, or what it is like to have thoughts, beliefs, desires or anything else.

Imagine trying to explain why one person likes broccoli and another does not. We might talk about the physical mechanics of taste and differences in sensitivity. That’s a useful answer for many reasons, but is it really telling us about the subjective contents of the mind?

We might have discerned how and why the subject is experiencing an unpleasant taste, but that’s simply because we are able to liken their experience to those of our own (for example, we might imagine what it’s like to taste something that’s too bitter).

We still have no idea how the broccoli tastes to them if we like it ourselves, because the experience in question isn’t tractable by analogy, just like we would not be able to explain the qualitative experience of sight to a blind person or hearing to the deaf.

This is related to what philosophers call the explanatory gap, and in itself it presents a potential problem for physicalism. Just how do you explain how objective physical causes, like the firing of neurons in the brain, give rise to subjective personal experience?

Physicalist responses include reductive ideas like eliminativism where conscious phenomena are entirely imaginary (mind states simply are brain states and nothing more), and non-reductive ones like emergentism, where conscious phenomena emerge as novel things in the world from lower-level physical facts (much like the wetness of water emerges from molecules of H2O).

However, most non-reductive versions of physicalism are epiphenomenal in nature, meaning that consciousness is an unnecessary by-product of physical activity in the brain. This view is consistent with physicalist ideas about the non-existence of free will, because although brain states cause mind states, mind states cannot cause brain states in return. But it raises questions regarding how and why consciousness evolved in the first place.

Another argument against materialism, and one that seeks to counter eliminativism and epiphenomenalism, is the knowledge argument, also known as Mary’s room.

Imagine a great scientist called Mary who has lived her whole life in a black and white room with only black and white objects (ignore the practical difficulties of the scenario – this is a thought experiment!). Mary has access to all the scientific knowledge of an advanced civilization, so she knows absolutely everything there is to know about colour vision.

Now imagine that Mary is allowed to leave the room, and sees the colour red for the first time. The argument says that despite knowing everything about the physical world – all its properties and laws – Mary has still learnt a new fact about the world: what it’s like to see red. According to the argument, this shows that qualia are both real and non-physical.

I would also argue that this thought experiment suggests that consciousness is causally efficacious, as Mary could be instructed to perform an action based on whether she knows what it’s like to see red. (Note however that this does not count as evidence for free will).

Again in response, physicalists might say that Mary hasn’t actually learnt any new facts about the world, but just learnt a new skill. Others might argue that a truly comprehensive understanding of the physical world would include such knowledge in the first place, although this opens physicalism up to alternative monistic approaches such as the various flavours of panpsychism.

There are other arguments against physicalism that I won’t go into here, such as David Chalmers‘ interesting conceivability argument involving philosophical zombies, and arguments revolving around inverted spectrums of colour vision.

My main point is that these issues are not resolved, and that the physicalist position is far from set in stone. It may be the dominant view, but it is not the only one. And given the huge gaps in our scientific understanding at the most fundamental level of microphysics, it’s not even the most likely. The odds are even because it’s an entirely open question.

Next up, maybe I’ll write some words on what I think are sometimes unspoken motivations in the physicalist viewpoint, and how and why it has become the dominant position amongst scientists and intellectuals. On the other hand I might go into the science itself to illustrate just where those gaps in our understanding are widest and deepest.

I’ll sign off with a related quote from Brian Davies in his book Why Beliefs Matter:

“Scientists… often do not realize that their ‘common sense realism’ is a philosophical position, and that resolving a variety of serious criticisms of it is no small task”

Some words on panprotoexperientialism.

[NOTE – this is a re-post from the original incarnation of this blog.]

Panprotoexperientialism is a long word. Type it, and your spell-checker is going to take exception, even if you spell it right.

It may or may not be found with hyphenation between the pan (meaning everywhere) and the proto (meaning potential), or the proto and the experientialism (meaning the subjective qualitative experience of consciousness, or qualia).

But hyphenation or not, it basically conveys the metaphysical idea that all fundamental constituents of the universe (strings, loops, processes or whatever they may be) carry properties that pertain to their ability – when bound within a complex system like the brain of a conscious being – to subjectively experience the objective universe.

In dividing the world into two using terms like subjective and objective, a duality comes to mind, perhaps of the type imagined by Descartes. His was a substance dualism, where the stuff of matter and the stuff of mind are two entirely different sorts of stuff. Body and soul, and all that stuff. And although panprotoexperiential ideas can be read that way, many modern philosophers prefer to talk of a monism – a oneness – where matter and mind are made of the same stuff, but where that same stuff has dual-aspects that carry two different types of properties: material and experiential.

For this reason, these versions of panprotoexperientialism are often referred to as dual-aspect monistic theories, or property dualism, as opposed to Descartes’ substance dualism. One might think of the two types of properties in one substance as the two sides of a coin.

What panprotoexperientialism is not, is the assertion that rocks feel pain or thermometers think. Hence the proto. (Even the stronger version – panpsychism – doesn’t necessarily assert those things, although it comes closer, suggesting that such objects may have experience, albeit in immeasurably diminished quantity and quality compared to living beings.)

Exactly how this scheme works, and explanations of why science has thus far failed to find this aspect of matter in its probing of the microphysical, are sketchy at best, but no-one would claim the idea as a well-grounded theory. This is philosophy and this is metaphysics. The point is to look at unexplained or unclear phenomena – like consciousness, free will or the details of microphysical substance and causality – and suggest logically consistent ideas that fit the gaps. Note the emphasis: this is what distinguishes metaphysics from pseudo-scientific quackery. These are philosophical ideas, not scientific theories, and for most people who think about such things, nor are they beliefs.

Personally, panprotoexperientialism is one area I like to consider and research, because in relation to various problems posed for physicalism (such as the hard problem of consciousness) the idea has explanatory potential.

I hope to write some more regarding physicalism next time.

One final thing. Please note that I am not a philosopher, nor am I a scientist. I am a lazing dropout (seconded). I’ve been interested in philosophy for a few years and science for a few years longer, and just like to read a lot. So please do not take my views as fact or amuse yourself too deeply when you spot my mistakes and misunderstandings. I would prefer a simple correction from those who know better!

Why the depreciation of the amateur is just plain wrong.

[NOTE – this is a re-post from the original incarnation of this blog.]

“She’s an economist when not competing in the modern pentathlon,” says the BBC commentator, as I try to get my head around yet another sport I’ve never seen in my life, but inexplicably seem to be enjoying a lot more than I’d anticipated.

It’s a timely reminder for me, on the final day of the Olympic Games, that many of the stars we’ve all watched and admired out there get no financial reward for participating, and don’t get paid for pursuing their chosen sport at all. In fact, many have regular day jobs.

Of course, the idea that the modern games were ever a purely amateur affair is a bit of a myth. There have always been sponsorships, scholarships, gifts, and competitors from communist states who were really full-time sportspeople. It’s a phenomenon they used to call “shamateurism”, and in 1971, acknowledging the situation, the IOC officially sanctioned professional athletes for all but a handful of sports.

I don’t know what the mix of amateurs and professionals is at the 2012 games, but the mere presence of those who are doing it just for the love of their sport and personal satisfaction is enough to remind me of a bugbear of mine in regards to modern phraseology.

Here’s some etymology for you:

amateur 1784, “one who has a taste for (something),” from Fr. amateur “lover of,” from L. amatorem (nom. amator) “lover,” agent noun from amatus, pp. of amare “to love”

(from etymonline.com)

From the same Latin root as the English word “amorous”, to be an amateur – whether it be in reference to a sport, a hobby, a job, or any other task or activity – is to be nothing more than a person doing something for the love of it.

Being of the status “amateur” says nothing about whether or not you are any good at something, or dedicated to doing it, yet the word has been grossly depreciated by ridiculous expressions such as “amateur hour” or “a bit of an amateur”, while the word “professional” has been unjustly elevated to mean the opposite, in expressions like “very professional”.

The truth is that there are good and bad amateurs, and good and bad professionals in every area of life. As much as it’s true that the quality of some jobs may benefit from having people paid to do them, many of these will be jobs that people would never do for the love it. Where there’s a job or activity that spawns a large community of amateurs, chances are that they’ll produce results on par with, or better than, people who may just be doing it for the cash.

One great example is software, where freeware and shareware created by amateurs is sometimes far superior to commercial alternatives. The amateurs are usually making something they’ll actually use themselves, so will write it as well as they possibly can, without regard for the kind of commercial considerations that may hold professional software writers back.

That all said, there is one way in which the expressions make sense, and that’s in regards to jobs or activities requiring specialist equipment. An example would be astronomy, where all the skill and enthusiasm in the world can’t make up for not having mountain-top 8.2 metre telescopes at your disposal. However, luckily for astronomy, the nature of the discipline probably means that there are very few professionals who don’t love their job as much as the amateurs, and the amateurs themselves still contribute massively to the field. Other professions are not so lucky.

So perhaps amateurs lacking professional equipment was the original impetus behind these expressions. It’s just unfortunate that the unthinking have over-extended the scope of the phrases, simply because they don’t know what the word “amateur” actually means.