All of a sudden comes a spate of books glorifying neural networks. Do we sense a paradigm shift here? Down with the old reductionist approach of artificial intelligence? Up with the biologically more relevant parallel processing network models? So it would seem, but don't expect consensus. Unlike other recent writers in the field who argue for a specific theory (e.g., Gerald M. Edelman in Bright Air, Brilliant Fire, p. 297), Jubak (former editor of Venture magazine) provides a broad survey of current work in academia and industry, leaving it up to the reader to judge. Jubak is an enthusiast who's on top of developments in speech and pattern perception, robotics, organizing principles in brain development, and so on--but it's not easy to convey the structure and behavior of computer networks whose builders are themselves uncertain about what happens in the "hidden" layers connected to an input layer (responding to light signals, for example) and to an output layer (identifying a letter or other pattern). Overall, the models try to emulate features of the human brain in its connectivity and its ability to learn. "Learning" is often defined in terms of the Hebb synapse--a strengthening of the connection between neurons that fire together. Some basis for the Hebb synapse is revealed late in the book in the discovery of the NMDA receptor--one of two kinds of receptors at neuronal synapses that may be responsible for long-term potentiation. At this microcosmic level of brain science, we learn, the nerve cell itself may be a master microprocessor, computing its behavior from multiple inputs summed over time and space and subject to its own feedback circuits. Jubak's useful if demanding survey reveals that the state of the science is such that the more we know, the less we know; but that what the brain does is absolutely thrilling--and beautiful.
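
For readers curious about the Hebb synapse the review mentions, the rule--neurons that fire together strengthen their connection--can be sketched in a few lines of Python. The learning rate and the toy activity values below are illustrative assumptions, not anything from Jubak's book:

```python
def hebb_update(weights, x, y, eta=0.1):
    """Hebbian update: each weight grows in proportion to the product
    of pre-synaptic activity x[i] and post-synaptic activity y.
    eta is an assumed learning rate."""
    return [w + eta * xi * y for w, xi in zip(weights, x)]

weights = [0.0, 0.0, 0.0]
x = [1.0, 0.0, 1.0]   # pre-synaptic firing pattern (toy values)
y = 1.0               # the post-synaptic unit fires

weights = hebb_update(weights, x, y)
print(weights)  # only the connections where both sides fired are strengthened
```

Note that only the first and third weights grow: the second input was silent, so its synapse is left unchanged--the "fire together, wire together" idea in miniature.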