New Yorker staff writer Kolbert (Field Notes from a Catastrophe: Man, Nature, and Climate Change, 2006, etc.) returns with a deft examination of the startling losses of the sixth mass extinction occurring at this moment and the sobering, underlying cause: humans.
Although “background extinction” continuously occurs at varying, slow rates among species, five major mass extinctions mark the past. Scientists theorize that all of these—from the extinction of the Ordovician period, which was caused by glaciation, to the end of the Cretaceous, caused by the impact of a celestial body on the Earth’s surface—were the results of natural phenomena. Today, however, countless species are being wiped out due to human impact. Global warming, ocean acidification and the introduction of invasive species to new continents are only a few ways that we are perpetrating harsh new realities for those organisms unable to withstand radical change. Kolbert documents her travels across the globe, tracing the endangerment or demise of such species as the Panamanian golden frog, the Sumatran rhino and many more. The author skillfully highlights the historical figures key to the understanding of the planet’s past and present turmoil, including Charles Darwin and Georges Cuvier, the first to theorize extinction as a concept. Throughout her extensive and passionately collected research, Kolbert offers a highly readable, enlightening report on the global and historical impact of humans, “one weedy species” that may make valiant efforts to save endangered species but continues to cause vast, severe change. Kolbert also weaves a relatable element into the at-times heavily scientific discussion, bringing the sites of past and present extinctions vividly to life with fascinating information that will linger with readers long after they close the book.
A highly significant eye-opener rich in facts and enjoyment.
Part science writing and part memoir, this adventurous fact-finding romp takes readers across the landscape of ideas about the universe, calling on the expertise of the biggest names in science—and also the author's lifelong partner in her pursuit of the meaning of everything: her father.
Gefter, an MIT Knight Science Journalism fellow and founding editor of CultureLab at New Scientist, is a crafty storyteller and journalist; she describes how she jump-started her career by crashing physics conferences and faking her way into interviews with world-famous physicists. Fueled by an insatiable curiosity about how the universe could be at once governed by the laws of cosmology (which define large-scale properties of the universe) and also by the laws of quantum mechanics (which define the behavior of microscopic particles), the author embarked on a scientific scavenger hunt while chasing leads across time and space. Gefter makes even the most esoteric concepts—and there are a lot of them in this book—lucid and approachable. From string theory to the multiverse to the holographic principle, the author's exuberance for physics and the possibility that cutting-edge theories may lead to a new understanding of "reality" is evident in her passionate prose. Underlying the joys of scientific pursuit is the author's formative relationship with her father, who first asked the big question—"How would you define nothing?"—that inspired her yearslong quest to define how "nothing" and "everything" can be explained by the forces that govern the universe. What she discovered about the new frontier of quantum cosmology and the importance of the role of the individual observer is astonishing and awesome, and Gefter's book is a useful presentation of this thrilling ontological shift for a general audience.
Beautifully written and hugely entertaining, this book is a heartfelt introduction to the many mind-bending theories in contemporary physics.
Photography, not computers, ushered in modern astronomy. Here, its bumpy evolution is in the expert hands of Harvard College Observatory associate Hirshfeld (Physics/Univ. of Mass. Dartmouth; Eureka Man: The Life and Legacy of Archimedes, 2009, etc.).
In this highly illuminating history, the author “explores the decades-long bridge of innovation that transformed Victorian-era visual astronomy into the scientific discipline that is observational astrophysics.” Although revolutionary when it appeared around 1600, the telescope is simply an amazing extension of the eye, not designed to function in dim lighting or make a permanent record. The daguerreotype dazzled the world in 1839, and an early photograph of the moon, unimpressive by modern standards, created a sensation in 1851, but stars and planets remained off limits until film sensitivity vastly increased with the dry plate in the 1870s. Equally essential to astronomers was the simultaneous maturing of the spectroscope. Splitting light into innumerable hues and lines, it allowed not only the discovery that stars were similar to the sun, but also the identification of their precise chemical makeup and movements. By the 1880s, “what had been a noisome, exasperating art had become a predictable mainstream technology that would eventually recast the telescope as an adjunct of the camera” and spectroscope. Until that decade, Hirshfeld emphasizes brilliant but now-unknown amateurs (Andrew Common, William Bond, William Huggins, Isaac Roberts) who fell in love with astronomy and had no objection to the clunky new technology. Afterward, they were replaced by academically trained but equally obsessive scientists who oversaw the creation of the massive 20th-century observatories (George Ellery Hale) and revealed an unimaginably immense, expanding universe (Edwin Hubble, Harlow Shapley).
A delightful, detailed chronicle of great men (and a rare woman) whose fascination with the night sky and the technology necessary to study it led to today’s dramatic discoveries.
Think you've heard it all about the grueling, fatigue-driven years suffered by interns and residents once they get their degrees? Think again.
Holt (In the Valley of the Kings: Stories, 2009) came 20 years later to medicine than most of his peers, choosing a writing career first. Whatever the reasons for that latter-day commitment, the result is a beautiful, riveting book that puts readers on the spot in the ward, in the ICU, making the rounds, talking to families, making hospice calls and participating in the “bedlam” of a “Code Blue” resuscitation. What Holt set out to do was to convey the “un-narratibility” of hospital life (“too manifold, too layered, too many damn things happening one on top of the other”) in parables that would condense and transform the experience, as he himself was transformed. To that end, he uses composites of many different cases. In the process, he has created unforgettable portraits of the gravely ill or dying: the obese woman hospitalized for a “tune-up” to rid her body of excess fluids; the young woman who should have died from too many Tylenols but was saved by a liver transplant; the hospice patient whose face was covered by a surgical mask to conceal the loss of most of her lower face to cancer. “Nothing happens in these pages that doesn’t happen every day in a variety of ways in hospitals everywhere,” writes the author. “I have had to simplify what defied narrative form, and alter or suppress whatever might have compromised the respect patients deserve. But in making sense of residency within the constraints of narrative form and human decency, I have hewed as closely as possible to the lived reality of the hospital.”
Holt says that he wrote the book over a period of 10 years. Let’s hope for a shorter duration before we next hear from this gifted writer/physician.
Living Planet Books co-founder Horwitz chronicles an ongoing collision of epic proportions between the U.S. Navy, intent on protecting its submarine warfare program, and environmental activists, who fight to save whales from extinction.
The author begins in March 2000, when, over several days, “the largest multispecies whale stranding ever recorded” occurred across 150 miles of beach in the Bahamas. Rescue efforts led by Ken Balcomb, a researcher who was conducting a census of whales in the area, were mostly unsuccessful, but he was able to preserve their bodies for later forensic examination. Having served as a naval sonar expert, Balcomb surmised that training exercises involving a top-secret “Sound Surveillance System,” developed during the Cold War to monitor Soviet nuclear submarines, were likely responsible. The use of high-decibel, low-frequency sonar signals by the Navy would have overwhelmed the whales' biosonar system and caused physiological damage as well. This was not the first such incident of whale strandings in the vicinity of naval exercises—nor, unfortunately, the last. The author reports on the battle led by Balcomb and Joel Reynolds—a senior lawyer for the Natural Resources Defense Council—to force the release of the forensic evidence and the attempts by the Navy’s top brass to stonewall any serious investigation that could lead to curtailment of their activities. The battle led to a court ruling against the Navy for overriding environmental law. The Bush administration overturned the court decision by executive order on grounds of national security, and the NRDC countered legally, asking for a ruling on the administration's action. The case went to the Supreme Court, which ruled that the administration was within its rights, but it opened the door for the requirement of “comprehensive Environmental Impact Statements” in advance of any future naval maneuvers.
Based on years of interviews and research, Horwitz delivers a powerful, engrossing narrative that raises serious questions about the unchecked use of secrecy by the military to advance its institutional power.
“Innovation occurs when ripe seeds fall on fertile ground,” Aspen Institute CEO Isaacson (Steve Jobs, 2011, etc.) writes in this sweeping, thrilling tale of three radical innovations that gave rise to the digital age. First was the evolution of the computer, which Isaacson traces from its 19th-century beginnings in Ada Lovelace’s “poetical” mathematics and Charles Babbage’s dream of an “Analytical Engine” to the creation of silicon chips with circuits printed on them. The second was “the invention of a corporate culture and management style that was the antithesis of the hierarchical organization of East Coast companies.” In the rarefied neighborhood dubbed Silicon Valley, new businesses aimed for a cooperative, nonauthoritarian model that nurtured cross-fertilization of ideas. The third innovation was the creation of demand for personal devices: the pocket radio; the calculator, marketing brainchild of Texas Instruments; video games; and finally, the holy grail of inventions: the personal computer. Throughout his action-packed story, Isaacson reiterates one theme: Innovation results from both “creative inventors” and “an evolutionary process that occurs when ideas, concepts, technologies, and engineering methods ripen together.” Who invented the microchip? Or the Internet? Mostly, Isaacson writes, these emerged from “a loosely knit cohort of academics and hackers who worked as peers and freely shared their creative ideas….Innovation is not a loner’s endeavor.” Isaacson offers vivid portraits—many based on firsthand interviews—of mathematicians, scientists, technicians and hackers (a term that used to mean anyone who fooled around with computers), including the elegant, “intellectually intimidating,” Hungarian-born John von Neumann; impatient, egotistical William Shockley; Grace Hopper, who joined the Navy to pursue a career in mathematics; “laconic yet oddly charming” J.C.R. Licklider, one father of the Internet; Bill Gates, Steve Jobs, and scores of others.
Isaacson weaves prodigious research and deftly crafted anecdotes into a vigorous, gripping narrative about the visionaries whose imaginations and zeal continue to transform our lives.
A shimmering narrative about how the human and natural worlds coexist, coadapt and interactively thrive.
Prolific essayist and naturalist Ackerman (One Hundred Names for Love, 2011, etc.) offers absorbing commentary on both the positive and negative effects of human consumption and innovation on the Earth. We are an ever increasing population of “nomads with restless minds,” she writes, and her well-researched, substantiated observations take us from the outer reaches of space, surveying the world’s sprawling cities, to the Toronto zoo, where the Orangutan Outreach initiative “Apps for Apes” improves the lives and expands the perceptions of primates whose population is declining. Humans have become “powerful agents of planetary change,” she writes, creating wildly fluctuating weather patterns and irreversible global warming, evidenced in our backyards and in the stratosphere and reflected in the migratory patterns of the animal world. Thankfully, Ackerman’s ecological forecast isn’t completely bleak; hope springs from fieldwork with geologists studying the fossilized record of the “Anthropocene” (the age of human-ecological impact), tech scientists creating bioengineered body organs from 3-D prints, and a French botanist whose research demonstrates the ability to “reconcile nature and man to a much greater degree” by rebalancing the delicate ecosystems damaged by invasive species. Ackerman optimistically presents innovations in “climate farming,” the exploding popularity of rooftop farming and the urban-landscaped oasis of Manhattan’s High Line. She also examines European attempts to harness everything from body heat to wind energy. Ackerman is less certain about the longevity of the animal world or the true charm of the robotic revolution, but whether debating the moral paradoxes of lab chimeras or the mating rituals of fruit flies, she’s a consummate professional with immense intelligence and infectious charm.
Through compelling and meditative prose, Ackerman delivers top-notch insight on the contemporary human condition.
Best-selling author Johnson (Where Good Ideas Come From: The Natural History of Innovation, 2010, etc.) continues his explorations of what he calls the “hummingbird effect,” unforeseeable chains of influence that change the world.
An innovation, writes the author, typically arises in one field—chemistry, say, or cryptography. But it does not rise alone—“ideas are fundamentally networks of other ideas,” and those tributary ideas likely came from many sources and disciplines, conditioned by the intellectual resources available at the time. Da Vinci aside, the author notes that even the most brilliant 17th-century inventor couldn’t have hit on the refrigerator, which “simply wasn’t part of the adjacent possible at that moment.” A couple of centuries later, it was, thanks to changes in our understanding of materials, physics, chemistry and other areas. Johnson isn’t the first writer to note that such things as the can opener were game-changers, but he has a pleasing way of spinning out the story to include all sorts of connections as seen through the lens of “long zoom” history, which looks at macro and micro events simultaneously. Sometimes he writes in a sort of rah-rah way that, taken to extremes, could dumb the enterprise down intolerably, as when he opines, “silicon dioxide for some reason is incapable of rearranging itself back into the orderly structure of crystal.” Take out “for some reason” and replace with “because of the laws of physics,” and things look brighter. However, Johnson’s look at six large areas of innovation, from glassmaking to radio broadcasting (which involves the products of glassmaking, as it happens), is full of well-timed discoveries, and his insistence on the interdisciplinary nature of invention and discovery gives hope to the English and art history majors in the audience.
Of a piece with the work of Tracy Kidder, Henry Petroski and other popular explainers of technology and science—geeky without being overly so and literate throughout.
Leroi (Evolutionary Developmental Biology/Imperial Coll. London; Mutants: On the Form, Variety and Errors of the Human Body, 2003) calls on his expertise and his experience as a BBC science presenter to explain why Aristotle's writings on science are still relevant today.
The author introduces readers to Aristotle's work in the field of biology and shows where it accords with modern understanding and where it is wildly off-base. Although Aristotle is best known as a philosopher, Leroi explains that the major body of his work (much of which has been lost) dealt with natural science. In his search for the causes of change, the philosopher embarked on an ambitious project. “By the time he was done,” writes the author, “matter, form, purpose and change were no longer the playthings of speculative philosophy but a research programme.” Aristotle based his groundbreaking efforts to discover the workings of nature on a wide variety of sources, including his own observations. In addition to humans, a whole host of animals came under his purview and led him to classify different species, thus anticipating Carl Linnaeus in the 18th century. Leroi shows how Aristotle pondered the common features of all living creatures, as well as their divergence, and attempted to account for their functional differences. According to the author, Aristotle’s line of thinking led him to attempt to understand the operation of “five interlocked biological systems”—the nutritional system, thermo-regulation, perception and cognition, and inheritance—and indirectly influenced Darwin's discovery of the theory of natural selection. Leroi dismisses critics who fault Aristotle for being unscientific because he did not conduct experiments using controls. Many of Aristotle's assumptions proved to be wrong, but this is to be expected in a new field. Leroi compares Aristotle's effort to assemble a huge volume of data to the practices of current scientists in the “age of Big Data.”
Freelance journalist and author Stark (Leaving Mundania: Inside the Transformative World of Live Action Role-Playing Games, 2012) has both fully researched her subject and poured out her heart in this blend of history, science and memoir.
As the family tree in the book’s front shows, cancer, and the threat of cancer, has plagued the author’s family for generations. When she underwent genetic testing and learned that she had inherited her mother’s BRCA1 mutation, which greatly raises the risks of both breast and ovarian cancers, Stark was well aware of its significance. After coping with the hassles of close monitoring, she made the tough decision to have a preventive double mastectomy while still in her 20s. The story of that decision and all that follows from it is enough to make a book in itself, but the author goes much further. She provides a capsule history of breast surgery, from the pre-anesthesia days through William Halsted’s now-outdated radical mastectomy to today’s less disfiguring procedures, and she profiles geneticist Mary-Claire King, whose work led to the identification of the BRCA genes. In her discussion of the controversial issue of gene patenting, Stark presents all sides of the argument. Most impressive, she tells her personal story with considerable frankness and flashes of humor. The weekend before her breast-removal surgery, she and her husband threw a “goodbye to boobs” party for their closest friends. That lighthearted moment is followed by less sunny ones as Stark was forced to adjust to her new body and face the questions of whether to bear children, possibly passing on the gene mutation, and when to have her threatened ovaries removed. The book is a must-read for women questioning whether to be tested for the BRCA mutations and for women considering their options after testing positive.
A gutsy, deeply revealing account that more than fulfills the promise of the subtitle.
Stewart (The Management Myth: Why the Experts Keep Getting It Wrong, 2009, etc.) delivers a penetrating history of an American Revolution not yet finished and a stirring reassertion of the power of ideas unbound by the shackles of superstition.
Meticulously annotated and informed by imposing erudition, the book is a lively chronicle of the years leading up to the signing of the Declaration of Independence, especially noteworthy for detailing the unsung contributions (in word and deed) of such revolutionary figures as Ethan Allen and Thomas Young. It is also an admirably lucid survey of radical philosophical thought on the nature of man and the cosmos, a guiding principle grounded in reason and transmitted from Epicurus via the poet Lucretius, further developed by the great philosophical minds of the 17th century and embraced by the Founding Fathers. Stewart's capacity to render undiluted the complex deliberations of these thinkers glows on the page, notwithstanding the occasional Möbius strip of esoterica. The author locates these ideas in the heterodox, deist origins of the Republic, with a focus on corporeal reality, not spiritual mysteries. In doing so, he reveals the true and enduring significance of the American experiment: not merely as a revolt against an imperial monarch, but against the global reach and oppressive artifice of supernatural religion. Stewart gives the simplistic “common religious consciousness” and much presumed wisdom a fair hearing, then demolishes them utterly, though without dismissing what is useful in faith. By closely analyzing the writings of Jefferson, Young, Franklin, Paine et al., he quashes the delusion that America was established as a “Christian” nation.
In affording a fresh perspective on the difficult but exhilarating birth of this country, Stewart shows that the often superficially misunderstood words of the Declaration of Independence are even more profound than they appear.
Sallying forth to take on the benighted creationists, novelist and Esquire contributing editor Storr (The Hunger and the Howling of Killian Lone, 2014, etc.) takes pause and realizes that his way of thinking is not all that different from what is being presented from the pulpit of the church. Yes, his chosen approach is that of a rationalist, but how biased and compromised is it? What, really, does he know about the nitty-gritty of evolution, unmediated by the fine reasoning of a Darwin or a Dawkins? And where do our beliefs come from? It is unproductive and deluding to simply dismiss a belief as stupid; intelligence does not militate against odd beliefs, for some clearly bright people hold some curious, complex, elusive notions. So Storr ventures with new eyes into their territory, to the outlandish and the heretical, all the while exploring theories of the brain and how it perceives the world. As he notes, each of us is a concoction of sensory pulses that fashions a unique vision: “Cognitive dissonance, confirmation bias, the brain’s desire to have the outer, real world match its inner models—it takes us part of the way there,” he writes. “It tells us that a properly functioning brain cannot be trusted to think rationally….” The author presents superb stories of visiting with voice-hearers, smug skeptics, sufferers of the Morgellons itch, Holocaust deniers and recovered-memory confabulators, and he combines these stories with his often humorous personal tale—which included experiencing his own murder through the process of hypnosis. Storr’s piercing narrative is piquant and full of surprises and reversals of circumstance, as well as plenty of undeniably valuable information.
“The mind remains, to a tantalizing degree, a realm of secrets and wonder,” writes the author, and so, too, does the world around us, which he entertainingly scours for the possibility of crucial anomalies.
Journalist Teicholz combs the science, or lack thereof, to learn how the fats in the American diet grew horns and cloven hooves.
“Almost nothing we commonly believe today about fats generally and saturated fats in particular appears, upon close examination, to be accurate,” writes the author. Appallingly, those are still fighting words when it comes to the mandarins who fashion our national health agenda, those crazy pyramids that flip on their heads now and again like the magnetic poles. Like a bloodhound, Teicholz tracks the process by which a hypothesis morphs into truth without the benefit of supporting data. The author explores how research dollars are spent to entrench the dogma, to defend it like an article of faith while burying its many weaknesses and contradictory test results. In this instance, Teicholz zeroes in on the worries over skyrocketing heart-disease figures in the 1950s. Some (flawed) epidemiological work suggested that serum cholesterol deposited plaque in arteries, leading to coronary disease. This type of associative simplicity is that spoonful of sugar: the easy fix everyone wants when long-term, clinical tests are needed to appreciate the complex processes involved. This desire to corner the bogeyman targeted the world of fats, and it has stayed that way despite all the evidence and advancements in medical science, especially endocrinological studies, that have pointed to other biomarkers. Galling, though hardly unexpected, is the role played by money and the power we let it bestow. There were reasons the food industry wanted to stick with trans fats as opposed to saturated fats, and Teicholz ticks them off, and there are reasons that the health dangers of the next great hope, vegetable oils, have been hidden rather than heralded. Sixty years after the fat attack, “a significant body of clinical trials over the past decade has demonstrated the absence of any negative effect of saturated fat on heart disease, obesity, or diabetes.”
Solid, well-reported science in the Gary Taubes mold.
Science journalist Vince chronicles a two-year journey around the globe to evaluate warnings that we face an ecological tipping point.
“Deserts are spreading…forests are dying and being logged….Wildlife is being hunted and dying because of habitat loss,” writes the author, who also notes that we currently use 30 percent more natural resources per year “than the planet can replenish.” Geologists are calling this the Anthropocene epoch due to “the changes humans are making to the biosphere.” As the author acknowledges, we are the first species “to knowingly reshape the living Earth's biology and chemistry. We have become the masters of our planet and integral to the destiny of life on Earth.” Despite this dim picture, the author found grounds for optimism on her travels. Vince takes the hopeful view that we will act in a timely fashion to “preserve nature or master its tricks artificially.” In China and India, she chronicles government efforts to address atmospheric pollution and looming water shortages. Her main interest, however, is the inventiveness of people at the local level dealing with these problems. Vince believes that they are ushering in “an extraordinary new human age…creating artificial glaciers to irrigate their crops, building artificial coral reefs to shore up islands, and artificial trees to clean the air.” The author was most impressed by the cumulative effect of small changes in heretofore-inaccessible mountain regions that now generate electricity using microhydropower; these areas have also gained access to the Internet and improved sanitation. She discusses the work of “[h]ydrologists in Peru [who are] building tunnels to drain an Andean glacial lake” as a way to control disastrous flooding. On a smaller scale, in India's Ladakh region, a local engineer is leading a project to convert mountain wastewater into a series of man-made miniglaciers connected to irrigation canals. Everywhere she traveled, Vince continued to see great promise in human creativity.
A well-documented, upbeat alternative to doom-and-gloom prognostications.