Languages Magazine

How Worried Are You By The Symbol Grounding Problem?

By Andrew D Wilson @PsychScientists
Imagine you're a mental representation. You are a computational symbol system, and your job is to contain knowledge that is about the world and that can help your organism interact with that world (Newell, 1980). The 'aboutness' thing is the most important part of you - you are an intentional system, which means you have content that is meaningful.

So where did your content come from? (I'd like to know your thoughts, so please help by answering the questions at the end!)


This question is the issue of symbol grounding, first posed as such by Searle (1980), who made it famous with the Chinese Room thought experiment, and then analysed in depth by Harnad (1990).


The problem is that you can have a system that deals in nothing but syntax (the form and structure of a communication transaction) but that will pass the Turing Test, i.e. appear to trade in semantics (meaning), even though that syntax is definitely not grounded in any real semantics.
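The point can be made concrete with a toy sketch (my own hypothetical illustration, not something from the literature): a program that answers questions by pure symbol lookup. The rule table is invented for the example; nothing in the system has any access to what the strings mean, yet its replies can look meaningful from the outside.

```python
# A toy "Chinese Room": input symbols are mapped to output symbols by rule
# lookup alone. The system trades purely in syntax; any appearance of
# semantics is supplied by the observer, not by the machine.
# (Hypothetical rule table, invented for this sketch.)

RULES = {
    "What colour is the sky?": "The sky is blue.",
    "Is water wet?": "Yes, water is wet.",
}

def room(question: str) -> str:
    # Pure syntax: match the input string, emit the paired output string.
    return RULES.get(question, "I do not understand the question.")

print(room("What colour is the sky?"))  # The sky is blue.
```

The lookup table could be arbitrarily large and still never contain any meaning; that is exactly the worry.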

There is currently no solution to the problem of endowing a mental representation (a symbol system) with content/meaning/intentionality that doesn't require that meaning to have come from somewhere else. If the meaning is not intrinsic to the system's form (Bickhard, 2009, calls this being 'internally related'), then the meaning has to come from something else; but then how did that thing get its meaning, and so on... it quickly becomes turtles all the way down. This means that mental representations cannot do the things they need to do in order to play their role in our cognitive economy, i.e. make us functional, intentional beings rather than philosophical zombies.
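The regress can also be sketched in toy form (again a hypothetical illustration of my own): if every symbol's 'meaning' is just another symbol, then chasing the chain of definitions never reaches anything outside the symbol system.

```python
# Each symbol's "meaning" is defined only in terms of other symbols, like
# dictionary definitions chasing one another. Following the chain either
# loops or never terminates in anything non-symbolic.
# (Hypothetical definition table, invented for this sketch.)

DEFINITIONS = {
    "dog": "canine",
    "canine": "member of the dog family",
    "member of the dog family": "dog",  # the circle closes
}

def ground(symbol: str) -> str:
    seen = []
    while symbol in DEFINITIONS:
        if symbol in seen:
            return "circular: " + " -> ".join(seen + [symbol])
        seen.append(symbol)
        symbol = DEFINITIONS[symbol]
    return symbol  # the "grounded" meaning, if one existed

print(ground("dog"))  # circular: dog -> canine -> member of the dog family -> dog
```

The loop detection is just there to stop the program; the philosophical point is that nothing in the table ever bottoms out in the world.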

This has always struck me as an absolute disaster for the standard cognitive approach. But my question here is, do other people worry about this?

I would love it if people would comment below and answer the following questions:

  1. What flavor of cognitive scientist are you? (psychologist, philosopher, enactivist, representationalist, Jerry Fodor in the actual flesh, etc)
  2. Do you know about the symbol grounding problem?
Then, if you do,
  1. Are you concerned by the implications of the symbol grounding problem for mental representations?
  2. Do you think the problem has already been solved? If so, how?
Obviously I have opinions, but this time I am very much interested in yours!
References
Bickhard, M. H. (2009). The interactivist model. Synthese, 166(3), 547-591.

Harnad, S. (1990). The symbol grounding problem. Physica D: Nonlinear Phenomena, 42(1), 335-346.

Newell, A. (1980). Physical symbol systems. Cognitive Science, 4(2), 135-183.

Searle, J. R. (1980). Minds, brains, and programs. Behavioral and Brain Sciences, 3(3), 417-424.
