Despite manifest shortcomings, the American health-care system has come a long way over the past 250 years.
Retired surgeon and public-health professional Rutkow (James A. Garfield, 2006, etc.) provides an anecdotal overview of the theory and practice of medicine in the United States from the colonial period to the present. “The evolution of American medicine has often closely mirrored the nation’s history,” he writes, from the early days when “bleed, blister, puke, and purge” therapies were advocated by America’s leading doctor, Benjamin Rush, to the nation’s superpower status after World War II. By the mid-1800s, prevailing wisdom recommended avoiding doctors, but the invention of the stethoscope in 1816 by a Parisian physician was an important step toward modern diagnostics, especially when coupled with the French practice of measuring pulse rate. Infection remained the major cause of death during the Civil War and even in 1881—after the role of germs had been established—when President Garfield died from an infection in the aftermath of a gunshot wound. The passage of New York City’s Metropolitan Health Bill in 1866, which established an effective sanitation code controlling raw sewage and other harmful materials, was “a major triumph in the history of public health and American medicine,” allowing contagious diseases such as cholera and typhus to be brought under control. Major advances continued during WWII—the development of blood plasma storage, orthopedic procedures, operating techniques—and in the postwar period with open-heart surgery and organ transplants. Rutkow is optimistic about the future role of new biotechnologies (“cellular scanners, gene therapies, robotic surgeries, wireless monitoring”), but he recognizes the difficulty of navigating the tradeoffs between private service and public good.
A useful primer for policy wonks and medical practitioners.