Before January 2009, Cass Sunstein was a legal scholar best known for books like Nudge: Improving Decisions About Health, Wealth, and Happiness and Infotopia: How Many Minds Produce Knowledge, and for “good government” reforms like reducing paperwork and simplifying tax forms.
After January 2009, he was conservative media’s Public Enemy No. 1.
When the Obama transition nominated Sunstein to head the White House Office of Information and Regulatory Affairs, which oversees information technology policy and streamlining of federal regulations, the right got out the long knives over some of Sunstein’s writings in academic journals on animal welfare and gun rights.
“I didn’t think of myself as particularly partisan in any way,” Sunstein says. “I thought of myself as pretty moderate. Then I was nominated by the president for a Senate-confirmed position and was astounded to find I was being portrayed as a Marxist, a Communist and a Nazi.”
Glenn Beck, then of Fox News, named Sunstein “the most dangerous man in America.” The NRA’s Wayne LaPierre called him “a radical animal rights extremist who makes PETA look like cheerleaders with pooper-scoopers.” The American Conservative Union set up a “Stop Sunstein” website. The Senate eventually confirmed him after months of delays and on a largely party-line vote.
To understand the source of conservative ire at Sunstein, it helps to appreciate the difference between working in academia (where ideas are the point of the exercise) and working in government (where ideas are partisan volleyballs). It also helps to distinguish his politics (moderate to center-left) from his pragmatism (creative, quixotic, and often radically inventive).
I talked recently with Sunstein, who returned to Harvard Law School after three years in the Obama administration, about his new book, Conspiracy Theories and Other Dangerous Ideas, a collection of some of his more controversial essays.
You’ve heard the phrase: “The fox knows many things, but the hedgehog knows one big thing.” Do you think of your writing and ideas as a reflection of many interests, or is there something that ties it all together?
That comes from an Oxford philosopher named Isaiah Berlin who said there are two different kinds of thinkers — a fox and a hedgehog. I don’t think he invented it, but he’s been a popularizer.
The idea of a nudge, or the behavioral-economics approach, is much more of a hedgehog-like idea. Behavioral economics has a lens on human behavior, which is that people make mistakes and we can actually specify the mistakes and make lives better accordingly, and a nudge is certainly a framework that has a hedgehog-like quality. By nature, though, I am more fox than hedgehog in the sense that the range of ideas that seem useful to me is very eclectic, and the range of topics worth exploring and, if you’re lucky, trying to make a contribution to is wide.
During your work in the Obama administration, did you get a chance to see how some specific things you wanted to do actually worked in practice? Did they work the way you thought they would?
No academic should go into government thinking, “Now I’m going to see my ideas realized.” You go in thinking you’ll serve the American people and address the problems Americans are facing.
Like many others, I had thought about things like the fuel-economy label on cars being insufficiently informative, and we produced a better label. Like many others, I had been concerned about the old food pyramid, which was hard to understand, and we replaced it with the food plate. Like many others, I had been concerned that seniors under the [Medicare] prescription drug plan weren’t given enough help in figuring out which plan would be best for them, and we worked to help seniors make that decision in a way that would suit their needs a bit better. With many others in behavioral economics, I thought that automatic enrollment in retirement plans would help give people more security when they leave the workforce and worked on helping that to happen.
The lead piece in the book is about how a conspiracy theory can find an audience even when there’s no evidence to support it. Can you give me a good illustration of that?
The idea that the U.S. or Israel was behind the 9/11 attack is an example of a conspiracy theory. So far as the evidence can be found, it’s a very false conspiracy theory.
How did that find an audience?
It was a combination of things: people in some countries who don’t like the United States or Israel very much were partly motivated by their feelings about those two countries, and they seized on apparent ambiguities about what happened. If a person gives an account that appears credible to non-experts, then the idea can get traction.
What’s interesting about the conspiracy-theory question is that people who believe false conspiracy theories are often completely rational and intelligent people. It’s just that what they hear from the people around them whom they trust is not consistent with the truth. There are some people who are drawn by their nature to conspiracy theories—because they are generally distrustful or angry—but any one of us could believe a false conspiracy theory if people we trust believe it too.
And some conspiracy theories, of course, turn out to be true. With Nixon and Watergate, some people initially thought that the White House couldn’t be involved in that kind of stuff. Well, it turned out to be true.
A lot of what you write about conspiracy theories strikes me as applying more broadly to other widely held beliefs that have no factual basis.
You’re completely right. Conspiracy theories are interesting, and they can be bad and dangerous when people believe them, but the mechanisms by which conspiracy theories spread are the same mechanisms by which falsehoods of all kinds spread. For children (I hope little kids don’t read Kirkus), the belief that Santa Claus is real spreads by the same mechanism.
Spoiler alert! It’s the same mechanism. It’s kind of illuminating. What makes the conspiracy-theory topic more interesting is that it’s a window onto how beliefs of all kinds get formed.
One thing about false beliefs has always mystified me: I’m a Tennessee football fan; when they’re terrible, I know they’re terrible. Then you read accounts of election night in 2012, and a lot of the Romney people sincerely thought he was going to win despite overwhelming polling to the contrary. How do you square the clear-eyed assessment in one case with the totally crazy rationalizations in the other?
With Romney people, I think it was a combination of things: One was motivated reasoning and the other was plausible ambiguity. I think that’s a very powerful phenomenon in a political contest where people think their person is more likely to win than reality warrants.
With Romney, the basic problem was that they didn’t believe the polls because they thought that—as I recall—certain people weren’t going to turn out. They had a projection with an error in it; it wasn’t an idiotic error, but they probably wouldn’t have made it had they not cared about the outcome. The caring about the outcome can jeopardize accuracy.
With Tennessee football, or for me the Chicago Bears, there’s no ambiguity in the situation. If reality is staring you in the face, you have to have an extremely strong motivation to ignore it, and there has to be some ambiguity to seize on.
Scott Porch is an attorney and writer in Savannah, Georgia. He writes frequently about books for Kirkus Reviews, Salon.com, and Huffington Post, and he is writing a book about social upheaval in the 1960s and '70s.