Hayek’s Speculative Psychology, The Neuroscience of Value Estimation, and the Basis of Normative Individualism

Here’s the opening paragraph of Don Ross’ paper from Hayek in Mind: Hayek’s Philosophical Psychology.

Philosophers of mind who re-visit Friedrich Hayek’s The Sensory Order almost sixty years after its publication should feel humbled, perhaps sheepish, on behalf of their discipline. The book is essentially an exercise in abstract speculative mental architecture construction, the kind of project that has dominated the philosophy of mind since it began to reflect the rise of cognitive science in classic texts such as Dennett’s Content and Consciousness (1969) and Fodor’s Language of Thought (1975). Remarkably, Hayek’s effort is less in need of revision today, despite the mountain of intervening empirical work and technical refinement, than any of these works in its most obvious comparison class that were written by philosophers.

The Mind as Neural Software?

Here is a superb paper by Gualtiero Piccinini that brings much-needed clarity to a longstanding issue. A penultimate ms can be found here.

As a consequence, when the behavior of ordinary computers is explained by program execution, the program is not just a description. The program is also a (stable state of a) physical component of the computer, whose function is to generate the relevant capacity of the computer. Programs are physically present within computers, where they have a function to perform. Somehow, this simple and straightforward point seems to have been almost entirely missed in the philosophical literature. (p. 295).

Bravo Gualtiero!

Clark’s reply to Fodor

This is hot off the press. Jerry Fodor, you may recall, reviewed Andy Clark’s latest book, Supersizing the Mind, in the London Review of Books. In the latest issue, Clark uses the Letters section to respond. Since the link goes to the general letters page, I paste Clark’s letter in below.



Vol. 31 No. 6 · Cover date: 26 March 2009

Where is my mind?

From Andy Clark

Jerry Fodor’s amusing, insightful, but fatally flawed review of my book, Supersizing the Mind, seems committed to the idea that states of the brain (and only states of the brain) actually manage to be ‘about things’: to ‘have content’ in some original and underived sense (LRB, 12 February). ‘Underived content,’ he says, ‘is what minds and only minds have.’ That’s why, as Fodor would have it, states of non-brainbound stuff (like iPhones, notebooks etc) cannot even form parts of the material systems that actually constitute the physical basis of a human mind. But just how far is he willing to go with this?

Let’s start small. There is a documented case (from the University of California’s Institute for Nonlinear Science) of a California spiny lobster, one of whose neurons was deliberately damaged and replaced by a silicon circuit that restored the original functionality: in this case, the control of rhythmic chewing. Does Fodor believe that, despite the restored functionality, there is still something missing here? Probably, he thinks the control of chewing insufficiently ‘mental’ to count. But now imagine a case in which a person (call her Diva) suffers minor brain damage and loses the ability to perform a simple task of arithmetic division using only her neural resources. An external silicon circuit is added that restores the previous functionality. Diva can now divide just as before, only some small part of the work is distributed across the brain and the silicon circuit: a genuinely mental process (division) is supported by a hybrid bio-technological system. That alone, if you accept it, establishes the key principle of Supersizing the Mind. It is that non-biological resources, if hooked appropriately into processes running in the human brain, can form parts of larger circuits that count as genuinely cognitive in their own right.

Fodor seems to believe that the only way the right kind of ‘hooking in’ can occur is by direct wiring to neural systems. But if you imagine a case, identical to Diva’s, but in which the restored (or even some novel) functionality is provided – as it easily could be – by a portable device communicating with the brain by wireless, it becomes apparent that actual wiring is not important. If you next gently alter the details so that the device communicates with Diva’s brain through Diva’s sense organs (piggybacking on existing sensory mechanisms as cheap way stations to the brain) you end up with what David Chalmers and I dubbed ‘extended minds’.

There is much more to say, of course, about the specific ways that non-implanted devices (iPhones and the like) might or might not then count, in respect of some enabled functionality, as being appropriately integrated into our overall cognitive profiles. Fodor seems to believe that such integration is impossible where parts of the extended process involve what he describes as the ‘consultation’ (and then the explicit interpretation) of an encoding, rather than the simple functioning of that encoding to bring about an effect. This kind of consideration, however, cannot distinguish the cases in the way Fodor requires. Think of the case where, to solve a problem, I first conjure a mental image, then inspect it to check or to read off a result. Imagining the overlapping circles of a Venn diagram while solving a set-theoretic puzzle, or imagining doing long division using pen and paper and then reading the result off from one’s own mental image, would be cases in point. In each case we have a process that, while fully internal, involves the careful construction, manipulation and subsequent consultation of representations whose meaning is a matter of convention.

As a final real-world illustration, consider the trials (at MIT Media Lab) of so-called ‘memory glasses’ as aids to recall for people with impaired visual recognition skills. These glasses work by matching the current scene (a face, for example) to stored information and cueing the subject with relevant information (a name, a relationship). The cue may be overt (consciously perceived by the subject) or covert (rapidly flashed and hence subliminally presented). Interestingly, in the covert case, functionality is improved without any process of conscious consultation on the part of the subject. Now imagine a case in which the same cueing is robustly achieved by means of a hard-wired connection to the brain. Presumably Fodor would allow the latter, but not the former, as a case of genuine cognitive augmentation. Yet it seems clear that the intervention of visual sensing in the former case marks merely an unimportant channel detail. The machinery that makes minds can outrun the bounds of skin and skull.

Andy Clark
University of Edinburgh