Add psychology to the list
A fairly common response to our theory post was 'here's my theory, which is designed to replace and fix all the others'. But everyone having their own entirely separate theory that doesn't talk to any other work in the field is a symptom of the problem I was describing, not a solution (see above). One of my personal goals in science is to not be that guy: I want to see cognitive science become more integrated, not more fragmented. We have also been asked, quite sensibly, what we think the solution to our problem is. The question, then, is how to propose a theoretical approach for psychology and cognitive science without simply reinventing the wheel.

Sabrina and I have been working on this for, well, the entire blog. It has been a place for our "brave attempt to think out loud about theories of psychology until we get some" since day one; we've been identifying problems but, just as importantly, solutions the whole time. The theory post identified the big-picture problem we see in psychology; it's time to lay out some solutions.
Step one is to present a map of the blog, organised thematically to guide new readers to work we've already done here. This should also help map out the gaps in the approach, so we can focus on things to do next; feel free to point us to problems we can't yet address! (And yes, we know about episodic memory and language - we're working on it.) This post is not a comprehensive summary of past work - it's a map for you to use to find what we've done so far.
To summarise: in essence, and some minor details aside, we are advocating for Chemero's (2009) radical embodied cognitive science, with the addition of some elements he was missing (network science and task specific devices). Cognition is embodied, extended and held together by the direct perception of affordances and events; the result is a complex, nonlinear dynamical system that must be analysed as such. The brain is not the sole source of our behaviour, nor is it representing the world; it clearly plays a critical role in this system, though, and we propose that we'll need the tools of network science to describe what it's actually up to (Sporns, 2010). Methodologically, we must carefully characterise the task, the resources available to solve the task (which include brain, body and environment), and the information these resources create which can sustain the formation and control of an embodied solution. This method is Bingham's (1988) task specific device approach (the main piece Chemero was missing, I think). This approach applies to any and all behaviour you want to explain, including the hard stuff like episodic memory and language.
Critically, this approach, while new (and uncommon in insisting on a role for Gibson's ecological approach), isn't something we just invented: all these elements are active parts of modern cognitive science. The only new part is bringing them all under one roof, with the goal of getting on and getting some decent normal science under our belts.
Here's what we've covered so far. If you want more details on any point, click on the links!
Cognition is embodied
The first claim we want to defend is that cognition is embodied. Embodied cognition is not the hypothesis that the contents of cognition can be affected a bit by our bodies (as implied in this study). Embodied cognition is actually the fairly radical hypothesis that the brain is not the sole resource we have available to us to solve problems. We perceive and act in very particular ways so as to generate information and solve problems non-computationally (for example, fielders catch fly balls by moving so as to cancel out either the optical curvature or the optical acceleration of the ball's motion, which brings them to the right place at the right time). The bodies we move are built in very specific ways; our hands, for example, are built as if they are implementing certain computations that are required to control them. This 'morphological computation' isn't actually computation; it's more like the Watt governor (van Gelder, 1995). A great example of this idea in action is Big Dog, one of the many awesome robots built by Boston Dynamics.
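To make the fly ball example concrete, here's a toy simulation of the 'optical acceleration cancellation' strategy (a sketch with invented numbers, not a validated model). The fielder never predicts a landing point; they simply speed up or slow down so that the optical acceleration of the ball stays near zero, and that control law carries them to roughly the right place by the time the ball comes down.

```python
import math

# Toy sketch of optical acceleration cancellation (all parameters invented).
DT, G = 0.01, 9.8
ball_x, ball_y, ball_vx, ball_vy = 0.0, 1.0, 18.0, 28.0   # hypothetical fly ball
fielder_x, gain, max_speed = 85.0, 100.0, 9.0              # hypothetical fielder

tan_history = []
while ball_y > 0.0:
    # Ball flight: simple projectile motion, no air resistance.
    ball_x += ball_vx * DT
    ball_vy -= G * DT
    ball_y += ball_vy * DT

    # The optical variable: tangent of the ball's elevation angle at the fielder.
    separation = max(fielder_x - ball_x, 0.1)
    tan_history.append(ball_y / separation)

    # Control law: run backward if that variable is accelerating, forward if it
    # is decelerating. No trajectory prediction happens anywhere in this loop.
    if len(tan_history) >= 3:
        optical_acc = (tan_history[-1] - 2 * tan_history[-2] + tan_history[-3]) / DT ** 2
        speed = max(-max_speed, min(max_speed, gain * optical_acc))
        fielder_x += speed * DT

print(f"ball lands at x = {ball_x:.1f} m; fielder ends up at x = {fielder_x:.1f} m")
```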
Embodiment changes what 'cognition' will end up looking like. By changing the job description (e.g. what resources we have available to solve problems) we end up proposing entirely different solutions to tasks. An excellent recent book on this topic is Barrett (2011), Beyond the Brain: How body and environment shape animal and human minds. If you allow yourself bodies, behaviour and perception, then you typically don't end up needing complex computational solutions implemented in the brain.
Cognition is extended
A logical extension to embodied cognition is the claim that cognition is extended (Clark & Chalmers, 1998). This is the claim that things in the environment literally form part of the cognitive process. This can be summarised in Clark & Chalmers' 'parity principle':
If, as we confront some task, a part of the world functions as a process which, were it done in the head, we would have no hesitation in recognizing as part of the cognitive process, then that part of the world is (so we claim) part of the cognitive process.

(Clark & Chalmers, 1998, p. 2)

There is still debate about how well this idea works, mostly coming from Adams & Aizawa (2010). They believe the hypothesis is grounded in a confusion between coupling and constitution: while we are, indeed, coupled to things in the world, those things need not then constitute part of our cognition. We've had various arguments with Ken Aizawa about this (summarised here); I think the main problem with their argument is that there is no need for all the parts of a cognitive system to have 'the mark of the cognitive' if we're happy that the system as a whole is cognitive. This works, I think, because of the nature of the coupling that goes on when we interact with the world: objects literally become part of us when we interact with them, and the kind of ongoing perception-action loops that support this run deep.
In order to solve a given task, then, we use a wide variety of resources; some of these are neural, but not all. Some of the resources are our bodies (our visual system is composed of mobile eyes in a mobile head on a mobile torso equipped with legs, for example), while some are objects and other people in our environments. A theory of psychology must therefore include all these resources.
The role of perception
Extended, embodied cognition requires impressive perception. Typically, perception is seen as the end point of a complex process, taking impoverished input and enriching it until it is good enough to be useful. Cognition then becomes a computational process of adding knowledge and structure to our experience. If, however, cognition is the solving of problems using resources distributed beyond the brain (as above), then this account isn't good enough.
We already have a theory of perception that is up to the task of providing the kind of access to the world that we need: James J. Gibson's ecological approach to perception (Gibson, 1979; see the reading group posts on this book). Gibson's book begins with the environment: what is available to the perceiving organism that it might be interested in using? Starting there, rather than with the anatomy of the eye, led Gibson to propose his two key ideas: affordances and information.
Affordances
Affordances are the opportunities for behaviour the world offers to a given organism; a handle affords grasping to an organism with a hand, for example. Technically, they are dispositions of the environment: salt is disposed to dissolve in water, for example, but doesn't dissolve until placed in water. Affordances are dispositions supporting behaviour, but that behaviour doesn't show up until a matching organism comes by. This way of thinking of affordances makes them real properties of the world which persist in the absence of organisms. (Chemero (2009) advocates treating affordances as relations (see here and here); I talked about this debate here and here, and summarised it here. Long story short, I think Chemero is confusing affordances and information; the latter is relational and does every relational thing Chemero wants affordances to do, without the problems.)
Some terminology (based on Turvey, 1992, and Turvey, Shaw, Reed & Mace, 1981): affordances are complex dispositional properties, composed of combinations of anchoring properties. These anchoring properties are things like the composition of surfaces - their size, shape, and so on. Some organisms have complementary anchoring properties (e.g. a hand of the right size and shape) and can therefore actualise the affordance. Which properties of organisms matter is still a matter of debate: some people have proposed body scale as the key property (e.g. Warren, 1984), while other researchers feel we need something more like ability (Chemero, 2009) or effort (Proffitt, 2008). These latter options are probably the right path, but are, as yet, poorly defined.
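To make the body-scale idea concrete, here's a trivial sketch (with invented leg lengths and riser heights) based on Warren's (1984) stair-climbing work, where the critical riser-to-leg-length ratio was roughly 0.88: the same riser can afford climbing for one body and not another.

```python
# Body-scaled affordance sketch: climbability depends on the ratio of riser
# height to leg length, not on riser height alone. The 0.88 critical ratio is
# the approximate value reported by Warren (1984); the other numbers are invented.
CRITICAL_RATIO = 0.88

def affords_climbing(riser_height_m, leg_length_m):
    return riser_height_m / leg_length_m <= CRITICAL_RATIO

for leg in (0.75, 0.95):            # hypothetical short and tall climbers (m)
    for riser in (0.60, 0.80):      # hypothetical riser heights (m)
        print(f"leg {leg:.2f} m, riser {riser:.2f} m -> "
              f"climbable: {affords_climbing(riser, leg)}")
```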
Information
Affordances are easy to define; the real question is whether there is perceptual information available to an organism for those affordances. The most detailed explanation of how affordances give rise to information is Turvey et al (1981), who lay out the concept of ecological laws to expand on Gibson's (1979) account (see this post, this post and this post on Chapter 5, and this post and this post on Chapter 6 for Gibson's description). These laws govern how the anchoring properties of affordances interact with energy, such as light, to create structure in that energy; this structure, by virtue of the law, is specific to the affordance. The laws are ecological in the sense that they have a limited scope: they do not apply universally, but only in the kinds of niches we find ourselves in. Within that scope, however, ecological optics explains how affordances structure light to create information.
Because this information specifies the affordance (i.e. there is a 1:1 mapping between the optics and the world), detecting the optical information is equivalent to perceiving that property of the world. Perception is therefore direct: unmediated by any internal states. As things currently stand, direct perception requires this law-based specification relationship. Whether specification is actually required is a topic of debate, and we'll be getting into that soon.
The one thing information needs to have done to it in order to be useful for the control of action is calibration. Information variables are unitless: optical information is all angular, for example, and in order to use information to act in space you must apply a metric to the measurement. This does not require internal states or processing; calibration arises from making your perceptual measurement with a ruler marked off in action-relevant units (these could be body scale, or effort, etc; in other words, the units are the organism's complementary anchoring properties). The intuition here is simple: you can measure the same amount of space with a ruler marked up in, say, inches or centimetres - it's the same amount of space, but the numbers that come out are quite different. If you instead measured that space using, say, the length of your arm as a unit, you would have a number whose units tell you something directly about your ability to cross that space with your arm; this is useful for, say, reaching to grasp an object. You can directly measure all kinds of things if you have a measuring device marked up (i.e. calibrated) in the appropriate way: one example is the polar planimeter, which directly measures area (Runeson, 1977).
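Here's that measuring example as a minimal sketch (the gap and arm length are invented numbers): the same gap expressed with three different rulers, and only the body-scaled number says anything directly about reaching across it.

```python
# Calibration as a choice of ruler: same gap, three units. Only the arm-length
# unit is action-relevant. All numbers here are invented for illustration.
gap_m = 0.65   # hypothetical gap between actor and object, in metres
arm_m = 0.72   # hypothetical arm length of this actor, in metres

print(f"gap = {gap_m / 0.0254:.1f} inches")
print(f"gap = {gap_m * 100:.0f} centimetres")
ratio = gap_m / arm_m
print(f"gap = {ratio:.2f} arm lengths -> "
      f"{'within reach' if ratio <= 1.0 else 'out of reach'}")
```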
Dynamical systems
The hypothesis that we have embodied, extended minds which rely on perception to establish the required couplings means that cognition is a complex, nonlinear dynamical system. Dynamics is the mathematical language of change over time, and provides just the right formal tools to model the kinds of systems we are. An excellent example of using dynamics to model a perception-action system is Bingham's model of coordinated rhythmic movement; this simple task is an ideal model system and test bed for the ideas laid out so far.
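For a feel of what a dynamical model of coordination looks like, here's a minimal sketch. This is not Bingham's perception-action model; it's the simpler Haken-Kelso-Bunz relative phase equation, dφ/dt = -a·sin(φ) - 2b·sin(2φ), included only to show the style of explanation: stable coordination patterns fall out of the dynamics, and anti-phase (180°) coordination loses stability as movement speeds up (smaller b/a).

```python
import math

# Minimal HKB relative phase sketch (not Bingham's model; parameters invented).
def settle(phi0_deg, a, b, dt=0.01, steps=20000):
    """Integrate d(phi)/dt = -a*sin(phi) - 2b*sin(2*phi) and return the end state."""
    phi = math.radians(phi0_deg)
    for _ in range(steps):
        phi += (-a * math.sin(phi) - 2 * b * math.sin(2 * phi)) * dt
    return math.degrees(phi) % 360

# Start near anti-phase; large b/a plays the role of slow movement, small b/a of fast.
print("slow movement:", round(settle(170, a=1.0, b=1.0)), "degrees")   # stays near 180
print("fast movement:", round(settle(170, a=1.0, b=0.2)), "degrees")   # collapses to ~0
```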
Dynamical systems theory is not, in itself, a theory of behaviour; treating it as one is an error a lot of researchers make, and the data do not support it. It is, however, the right analytical tool for the job.
No mental representations
The big 'negative' thing we're going to insist on is ruling out computational mental representations as entities you can invoke in your explanations. The reason is fairly simple: there is no limit to what representations can do. Whenever you come across a problem in your explanation, say a potential lack of perceptual access to some required information, you can simply claim that the gap is filled by a representation of just the right size and shape. Because representations can be anything you need them to be, they cease to have any explanatory power.
We've been critiquing representations since day one; Sabrina has summarised a lot of the issues here. Because representations aren't good explanations, and because when you embrace embodiment they tend to become unnecessary, we are strong advocates of the 'radical' hypothesis that we do not trade in mental representations.
Cognition does not have to be representational: the standard cry of 'what else could it be?' has been answered. van Gelder (1995) described a device, the Watt steam governor, that does a job which today would typically be solved using an algorithm, and does it efficiently, reliably and stably in the face of noise and perturbations. The moral is simple: while there is an algorithm that describes the solution, it's not how the device actually works, nor would the device work well if it was. Another useful device for metaphors is the polar planimeter (Runeson, 1977); this device directly measures the 'higher order' variable area, without implementing the algorithm 'take two measurements of length and combine them via multiplication'. The most recent defence of a non-representational approach is Chemero's book, Radical Embodied Cognitive Science. So while you may not agree with the idea, a non-representational cognitive science is, at least, a viable option, and we believe one justified by taking embodiment seriously.
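To make the governor point vivid, here's a toy simulation in the same spirit (simplified equations and invented parameters; this is not van Gelder's exact formulation). The flyball arm and the engine are continuously coupled: the spinning engine raises the arm, the raised arm closes the throttle, and the speed settles to a steady value without anything ever measuring, storing or comparing a represented speed.

```python
import math

# Toy Watt-governor-style feedback loop (simplified; all parameters invented).
DT, STEPS = 0.002, 20000
G_OVER_L, DAMPING, GAIN, LOAD = 9.8, 8.0, 5.0, 1.0

theta, theta_dot, omega = 0.2, 0.0, 0.0   # arm angle (rad), arm velocity, engine speed
for _ in range(STEPS):
    # Arm dynamics: spinning flings the arm outward, gravity and friction pull it back.
    theta_ddot = (omega ** 2) * math.cos(theta) * math.sin(theta) \
                 - G_OVER_L * math.sin(theta) - DAMPING * theta_dot
    theta_dot += theta_ddot * DT
    theta += theta_dot * DT
    # Engine dynamics: a raised arm closes the throttle, a lowered arm opens it.
    throttle = GAIN * (math.pi / 4 - theta)
    omega += (throttle - LOAD) * DT

print(f"engine speed ends near {omega:.2f}, arm angle near {math.degrees(theta):.1f} degrees")
```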
(I would love to see an inventory of all the representations that have ever been invoked in cognitive science; I would guess it's a tangled, incoherent mess of entities each designed to fix just one local problem.)
What about the brain?

What is the brain doing, if not representing? Are ecological psychologists really committed to the idea that brains don't matter? No, of course we aren't; we only look like we are because, instead of plunging off the deep end with rampant speculation about what the brain is up to, we've spent our time working out what it has to work with. First things first!
The brain is clearly important; it's just not representing anything. Rather, the brain is in a constant state of change in response to its environment. I'm inclined right now to treat it as the fast-responding resource which coordinates the assembly of task specific devices, as well as a system that can implement embodied solutions to computational problems.
The most promising approach to neuroscience I've seen recently is the one described in Olaf Sporns' book Networks of the Brain (Sporns, 2010). In it, Sporns describes how the mathematics of networks is being applied to neuroscience datasets to uncover structure, extended over both space and time, within the endless modelling and remodelling of neural connections. This seems to me to be the right toolset for neuroscience; combined with our radical, embodied cognitive science it could be a powerful approach, and we're waiting to hear about funding for a project to set this idea in motion.
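As a small taste of that toolset (and only that; this is not the project we're waiting to hear about), here's the classic small-world comparison one might run on a parcellated connectivity matrix, using the networkx library: compared with a size-matched random graph, a small-world network keeps short path lengths while being far more clustered.

```python
import networkx as nx

# Illustrative network measures on toy graphs (sizes and parameters invented).
n, k, p = 100, 6, 0.1
nets = {
    "random": nx.gnm_random_graph(n, n * k // 2, seed=1),
    "small-world": nx.watts_strogatz_graph(n, k, p, seed=1),
}

for name, g in nets.items():
    # Path length is computed on the largest connected component, in case the
    # random graph happens to be disconnected.
    giant = g.subgraph(max(nx.connected_components(g), key=len))
    print(f"{name:12s} clustering = {nx.average_clustering(g):.3f}, "
          f"mean path length = {nx.average_shortest_path_length(giant):.2f}")
```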
Summary
There's a lot of work to do. But these are our core theoretical commitments: when we try to explain our data, we must characterise the task, the resources available to solve it (which can include the brain, body and environment), and the information supporting the coupling of those resources into task specific devices which solve the current task. The tools exist, and there are plenty of problems to study. Psychologists will need to get a little better at physics, biology and maths, and we'll need help from experts in these fields. But I truly believe that taking this strong theoretical stance will allow psychology to apply itself to a coordinated programme of research that, right or wrong, will produce a wealth of data and drive our understanding forward. After all, that's what a theory is for.
References
Adams, F., & Aizawa, K. (2010). The Bounds of Cognition. Wiley-Blackwell.
Barrett, L. (2011). Beyond the Brain: How body and environment shape animal and human minds. Princeton, NJ: Princeton University Press.
Bingham, G. P. (1988). Task-specific devices and the perceptual bottleneck. Human Movement Science, 7(2-4), 225-264.
Chemero, A. (2009). Radical Embodied Cognitive Science. Cambridge, MA: MIT Press.
Clark, A., & Chalmers, D. (1998). The extended mind. Analysis, 58(1), 7-19. DOI: 10.1111/1467-8284.00096
Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.
Proffitt, D. R. (2008). An action-specific approach to spatial perception. In R. L. Klatzky, M. Behrmann, & B. MacWhinney (Eds.), Embodiment, ego-space, and action (pp. 179-202). Mahwah, NJ: Erlbaum.
Runeson, S. (1977). On the possibility of "smart" perceptual mechanisms. Scandinavian Journal of Psychology, 18(1), 172-179. DOI: 10.1111/j.1467-9450.1977.tb00274.x
Sporns, O. (2010). Networks of the Brain. Cambridge, MA: MIT Press.
Turvey, M. (1992). Affordances and prospective control: An outline of the ontology. Ecological Psychology, 4(3), 173-187. DOI: 10.1207/s15326969eco0403_3
Turvey, M., Shaw, R., Reed, E., & Mace, W. (1981). Ecological laws of perceiving and acting: In reply to Fodor and Pylyshyn (1981). Cognition, 9(3), 237-304. DOI: 10.1016/0010-0277(81)90002-0
van Gelder, T. (1995). What might cognition be, if not computation? The Journal of Philosophy, 92(7), 345-381.
Warren, W. (1984). Perceiving affordances: Visual guidance of stair climbing. Journal of Experimental Psychology: Human Perception and Performance, 10(5), 683-703. DOI: 10.1037/0096-1523.10.5.683