Psychologist Gardner, an impressive expounder of his own fields of interest (creativity, developmental psychology, styles of thinking; Frames of Mind, 1983), is also an artful chronicler of social-science history. Here, he traces the entwined histories of computer science and cognitive studies, setting forth the confusions and paradoxes as well as the accomplishments, challenges, and prospects. For Gardner, "cognitive science" describes a loose union of disciplines that includes philosophy, psychology, artificial intelligence (AI), linguistics, anthropology, and neuroscience. Their common ground: how human beings come to know the world and themselves, operating within it using language and reason. Needless to say, the philosophical roots of these questions are millennia old. Only in the 19th and 20th centuries, however, were the separate social sciences carved out to undergo their own evolution and change. Gardner says that by the late 1940s the seeds of revolution were well-sown--what cognitive psychologist Ulric Neisser termed "a paradigm shift." By that time, behaviorism was dying, the neurosciences were beginning to burgeon, and experts in psychology, anthropology, and linguistics were talking to each other. Soon they would be using a common tool--the computer--to explore new terrain.

In Part I, Gardner charts the pivotal symposia and the figures who pioneered the new wave: Lashley in neuropsychology; von Neumann, Shannon, and Wiener in computers, information theory, and cybernetics. This early wave led, in the 1960s, to braggadocio, with pronouncements (some lingering) that computers would reach new heights in modeling human brain function. That naive optimism passed, and in its wake has come a new sophistication in each discipline. Each has moved ahead at an impressive pace, sometimes abandoning simple brain-computer analogies, sometimes developing more neurobiologically based programs.
In Part II, Gardner sketches the developments in each of the six fields, highlighting the work and critics of dominant figures such as Minsky and McCarthy in AI; Chomsky in linguistics; and Eric Kandel, David Hubel, and Torsten Wiesel in neuroscience. Today the horizon is adorned with still-active masters and students who build ingenious computer simulations of visual perception, design experiments to demonstrate that mental images are real, document evidence that the ways people classify (not necessarily name) colors may be hard-wired into the brain, and demonstrate that people behave in certain predictably illogical ways. These examples are drawn from Part III, which examines current controversies and the degree of cross-fertilization taking place. Gardner argues for redefining the disciplines by focus: let the philosophers, psychologists, neuroscientists, et al. who are interested in language get together; so, too, with those interested in perception or mental constructs. At the same time, he believes that cognitive science should not be limited to these areas but should also include studies of emotions, development, creativity, learning. . . . Here, Gardner has eloquently described how the pioneers have marched bravely across the border, but it is clear they have far to go.