I’d like to generate some discussion around probably the best-supported materialist theory of mind: computationalism. In particular, I’d like to discuss the problems with this theory and (probably in a later post) how a Thomistic response might be put forward. Note: what follows is substantially paraphrased from Edward Feser’s book “The Last Superstition”, which I would recommend.
Simply put, the computationalist theory of mind holds that thoughts are “symbols” whose medium is neural states or firing patterns, just as the word “cat” written on a page is a symbol whose medium is ink. Thinking, on this view, is the transformation of one or more symbols into others according to the rules of an algorithm. For an example that differs in degree but not in kind, a calculator manipulates the symbols “2”, “+”, “2”, and “=” to produce “4” according to an algorithm encoded in the device. These symbols get their meaning through cause-and-effect associations with objects outside the brain. For instance, a certain brain state counts as a symbol for “there’s a cat” if it is caused by cats appearing to the observer’s sense organs.
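The calculator example can be made concrete with a toy sketch (purely illustrative, not from Feser): the program below transforms the tokens “2”, “+”, “2” into “4” by blindly following rules, without any grasp of what the symbols mean. The function name and rule table are my own invention.

```python
# Toy illustration of purely syntactic symbol manipulation:
# the algorithm maps input symbols to an output symbol by rule,
# with no understanding of what "2" or "+" stand for.
def evaluate(tokens):
    # Rule table: each operator symbol is associated with a transformation.
    rules = {"+": lambda a, b: a + b, "-": lambda a, b: a - b}
    left, op, right = tokens
    result = rules[op](int(left), int(right))
    return str(result)  # the output is itself just another symbol

print(evaluate(["2", "+", "2"]))  # prints 4
```

Note that any meaning here (“these squiggles denote numbers and addition”) is supplied by us, the programmers and readers, not by the machine; that observation is exactly what the reductio below turns on.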
Now, there are problems with each step in this process:
1. The idea of certain thoughts being considered as symbols
2. The idea of unconscious algorithms constituting thinking
3. The meaning given to symbols by causal relationships
At the moment, I would like to concentrate on 1. The difficulty of trying to define thoughts as symbols can be loosely expressed in the following reductio:
1. All thoughts are physical symbols in the brain. (assumption, for reductio)
2. Symbols are only considered symbols of things (i.e. representative of things) by intentional agents. For instance, the word “cat” is only a symbol for an actual cat because English speakers have associated the two. In the same way, a drawing of a cat is only considered symbolic of an actual cat by an intentional observer.
3. From 2., the physical arrangements of matter which are said to be symbols don’t inherently point beyond themselves to the thing being symbolized (according to materialism). For instance, the word “cat” written on a page is, without a mind to interpret it, just squiggles of ink.
4. So if thoughts are physical arrangements of matter, they do not inherently point to, or represent, anything.
5. Only an intentional mind can interpret physical arrangements as symbols.
6. The theory therefore presupposes an intentional mind to interpret the physical states of the brain and assign them their status as “symbols”, but (under materialism) this leads to a regress of internal “minds” (see the Homunculus Argument).
7. Therefore 1. is false: some thoughts are not physical symbols in the brain.
The conclusion at 7. suggests that there are intentional states of mind which cannot be reduced to brain states; and since the computational theory of mind is probably the most plausible way of explaining thought under materialism, this is a problem for materialism generally.
Look forward to your comments.