James A. Anderson Professor of Cognitive, Linguistic and Psychological Sciences

I study how brains and computers differ in the way they compute. These differences arise in large part from basic physical differences in their hardware. I research brain-like computation using "neural networks," simplifications of the complexities of real brains. Besides serving as models that offer clues to brain function, neural networks have practical applications. For example, I applied a simple neural network model, originally derived to explain how humans form concepts based on experience, to the problem of "understanding" a complex radar environment. More recently, I have been working on a set of models for the intermediate-level organization of the nervous system. Scientists know a great deal about individual neurons. We also know a good deal about behavior and the functioning of very large groups of neurons. But we know almost nothing about how groups of a hundred, a thousand, or even a hundred thousand neurons cooperate to compute, or perceive, or think, or behave. One question I am now trying to answer concerns scaling. Under what conditions can similar computational functions be performed by networks of greatly differing size?

I became interested in brains and computers as a graduate student in psychology and neuroscience at M.I.T. I wanted to know how I worked. And, if we learn enough about brains, perhaps we will eventually be able to build machines that work the way we do.

scholarly work

JA Anderson, P Allopenna, GS Guralnik, D Sheinberg, JA Santini, Jr., D Dimitriadis, BB Machta, and BT Merrit (in press). Programming a Parallel Computer: The Ersatz Brain Project. In W Duch, J Mandzuik, and JM Zurada (Eds.), Challenges to Computational Intelligence. Springer: Berlin.

JA Anderson. A brain-like computer for cognitive software applications. Proceedings, 2005 IEEE Conference on Cognitive Informatics, University of California, Irvine, CA. IEEE Press.

JA Anderson. Arithmetic on a parallel computer: Perception versus logic. Brain and Mind, 2003, 4, 169-188.

JA Anderson and E Rosenfeld. Talking Nets: An Oral History of Neural Network Research. Cambridge, MA: MIT Press, 1998, Paperback edition, 2000.

JA Anderson. Seven times seven is about fifty. In D Scarborough and S Sternberg (Eds.), Invitation to Cognitive Science, Volume 4. Cambridge, MA: MIT Press, 1997.

JA Anderson and J Sutton. If we compute faster do we understand better? Behavior Research Methods, Instruments, and Computers, 1997, 29, 67-77.

JA Anderson. Introduction to Neural Networks. Cambridge, MA: MIT Press, 1995.

JA Anderson, A Pellionisz, and E Rosenfeld (Eds.). Neurocomputing 2: Directions for Research. Cambridge, MA: MIT Press, 1990.

JA Anderson and E Rosenfeld (Eds.). Neurocomputing: Foundations of Research. Cambridge, MA: MIT Press, 1988.

JA Anderson and GE Hinton. Parallel Models for Associative Memory. Hillsdale, New Jersey: Erlbaum Associates, 1981. Revised Edition, 1989.

research overview

Jim Anderson does research in the areas of cognition and cognitive development; theoretical and computational models; computational models of learning, memory and neural development; theory of computation; and artificial intelligence and robotics.

research statement

Jim Anderson's research concentrates on applications of neural networks to cognitive science. An appropriately designed network can do many pattern recognition functions in ways reminiscent of human performance. Neural networks have practical applications and can also serve as models for human behavior.
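One simple network of this kind is a linear associative memory, a model closely associated with Anderson's early work. The sketch below is illustrative, not taken from any specific publication: pattern pairs are stored as a sum of outer products, and recall is a single matrix-vector multiplication. With orthonormal input patterns, recall is exact.

```python
import numpy as np

def train(pairs):
    """Build the weight matrix W as a sum of outer products g f^T
    over stored (input f, output g) pattern pairs."""
    f0, g0 = pairs[0]
    W = np.zeros((len(g0), len(f0)))
    for f, g in pairs:
        W += np.outer(g, f)
    return W

def recall(W, f):
    """Recall an output pattern by one matrix-vector multiplication."""
    return W @ f

# Orthonormal input patterns give perfect recall.
f1 = np.array([1.0, 0.0, 0.0])
f2 = np.array([0.0, 1.0, 0.0])
g1 = np.array([1.0, 2.0])
g2 = np.array([3.0, 4.0])

W = train([(f1, g1), (f2, g2)])
print(recall(W, f1))  # -> [1. 2.]
print(recall(W, f2))  # -> [3. 4.]
```

When the input patterns are merely similar rather than orthogonal, recall becomes approximate and the stored associations interfere, which is one reason such networks behave in ways "reminiscent of human performance."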

Anderson's group does research in several areas. Networks have been applied to models of human concept formation, to speech perception, and to low-level vision, for example the way local motion signals can be integrated to determine global object motion or the direction of self-motion. A current project involves the study of elementary arithmetic, a problem that is surprisingly hard for both humans and neural networks. The study of elementary mathematics also raises questions about how a neural network can be designed to perform more general mathematical operations effectively.

Recent work has considered how intermediate-level structure in the nervous system might be configured, and how it might be detected in experimental data, as well as what kind of computations it might perform. In light of data from both multiple unit recordings and functional MRI, a model using a network of local networks is being studied.
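One candidate dynamic for such a local network is Anderson's Brain-State-in-a-Box (BSB) model, in which a state vector is repeatedly fed back through a weight matrix and clipped to the unit hypercube, so that noisy inputs settle into stored "corner" attractors. The sketch below is a hypothetical illustration; the pattern, gain, and iteration count are assumptions, not values from the research described here.

```python
import numpy as np

def bsb_step(x, W, alpha=0.2):
    """One BSB update: add feedback through W, then clip the state
    to the [-1, 1] box so it moves toward a corner attractor."""
    return np.clip(x + alpha * (W @ x), -1.0, 1.0)

# Store one pattern via its normalized autocorrelation (outer product).
p = np.array([1.0, -1.0, 1.0, -1.0])
W = np.outer(p, p) / len(p)

# Start from a noisy, attenuated version of the pattern and iterate.
x = 0.4 * p + np.array([0.1, -0.05, 0.2, 0.05])
for _ in range(50):
    x = bsb_step(x, W)

print(x)  # settles at the stored corner [ 1. -1.  1. -1.]
```

A "network of local networks" model composes many such modules, with the inter-module connections carrying the intermediate-level structure discussed above.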

funded research

Small Business Innovation Research (SBIR) Phase II, "The Ersatz Brain Project," supported by the Air Force Research Laboratory, Rome, N.Y. (Advanced Computer Architectures). SBIR contract issued to Aptima, Inc., 12 Gill St., Suite 1400, Woburn, Mass.
$750,000, starts May 2006, ends May 2008.