Volume 49, Number 2, Summer 2000
by Robert Kanigel
As a writer of popular magazine articles and books about science, I'd all along fancied I was doing something worthwhile: You take a reader weary of political scandal, mindless commercials, and maybe a rotten job, and lead him into a world of brilliant men and women doing important work at the edge of human knowledge. Worthy endeavor I thought it was.
So imagine how I felt when I realized that many scientists viewed my metier quite otherwise. Meyer's cartoon colleague, after all, takes deep satisfaction in never having "stooped" to it.
Science popularization covers a broad territory, from news items in the paper and on TV, to magazine articles, to books aimed at a general readership. But the common ground is that, as writer, you make hard work easy. Of course, the hard work you make easier is that of the reader; achieving that means brutally hard work on your part.
I feel I'm popularizing whenever the subject of the writing is itself difficult. Science and technology are the readiest sources in our culture for intellectually daunting material. But consider music theory, law, health policy, intellectual history, literary theory, philosophy. All these, and many other fields, present most of the same problems: They deal with difficult ideas on which the reader is apt to gag unless they're presented in palatable form. Doing just that, making the complex simple and the obscure familiar, is what I mean by popularization.
Within this broad range, consider a whole spectrum of popularized treatments. You could write about a would-be miracle drug for the National Enquirer. Or for the New York Times. Or, conceivably, for the old Mosaic magazine. Until a few years ago, Mosaic was published by the National Science Foundation. If the National Enquirer represents the outer limit of science popularization, where there's almost nothing left of intellectual substance or nuance, Mosaic might be thought to represent a kind of inner limit.
Who read Mosaic? Scientists. Smart, hard-driving scientists with Ph.D.s, and big labs, and ambitious research agendas. Why include it within the precincts of popularization? Because the scientists who read it were reading outside their own fields. Mosaic was for particle theorists reading about genetics. For parasitologists with no more knowledge of number theory than, well, the average reader of the National Enquirer.
I wrote a couple of articles for Mosaic and they were among the most challenging I ever did. To a reader of the National Enquirer, or maybe even the New York Times, neither might seem particularly "popular." And yet, writing them demanded many of the same skills and sensibilities I'd use to write on the same subject for the other two publications: They demanded that I step out of the world of the expert, and move to the mind of a reader presumed largely ignorant of what that expert knows.
Here, then, is the essential step, the one that makes popularizing what it is and makes it so difficult: It's the change of frame. It's seeing a subject not through the eyes of a neuroscientist, chessmaster, or choreographer, but through the sometimes vacant eyes of the rest of us.
To popularize means never writing from one insider to another. The fact that information may be new to a group of readers is not the deciding element: If they share a deep common base of knowledge, a common frame, then there's little or no context to establish. That's not popularization.
Neither is writing for readers who, while they may have little specialized knowledge of the subject, need to learn it. As popularizer, you don't teach your reader to analyze a heart-monitor trace, navigate through a piece of software, or pass an exam. Your reader has no motivation save normal human curiosity.
What, then, are the perils of popularizing science? What can go wrong in trying to take something complex and nuanced, perhaps something at the very edge of the writer's own understanding, and render it easy-to-take?
In a word, everything.
The most obvious thing that can go wrong is that you just bollix things up, make dumb mistakes: It's not adenine that's one of the four nucleotide bases in DNA, you write, but, uh, aspirin. Some magazines have fact checkers to help catch bloopers like that, but it should never get that far; it's the writer's responsibility.
A more serious problem is that in distilling out the essence, in sieving out needless or distracting detail, you leave out crucial details and so give a fundamentally wrong impression. If a scientist's conclusion is surrounded by half a dozen ifs, ands, and buts, to omit them all is to distort.
Another problem is that in simplifying, you make it simplistic. DNA, the reader "learns," is a sort of curlicue molecule that makes your eyes blue or green. While not in every way wrong, what you've written is weightless. You leave the reader to grumble "So what?" Or "Oh, I knew all that." Or "There's nothing new here."
Or else you can go the other way, failing to sufficiently boil down difficult material. You're all over the place with hydrogen bonds and ionization constants and calcium channels. Maybe the scientists you interviewed give you points for not making a mess of things. But those points don't count, because your reader, who hasn't been near a classroom in 20 years, comes away hopelessly lost. Make no mistake, you've let your reader down.
Finally, you neither get the facts wrong, nor leave a seriously wrong impression, nor go too far, nor fail to go far enough, but somehow, oh-so-subtly, skew it wrong: Does the new, much-vaunted AIDS treatment have serious problems but nonetheless give cause for hope? Or does the new AIDS treatment, looked to with much hope, have serious problems? What's the right shading to give it? Is there a "right" shading?
These are real perils with serious consequences: Readers misinformed. Excitement about a breakthrough drug that isn't. Thousands cruelly disappointed. People going about their daily lives feeling stupid and incompetent. A scientist hurt because his contribution wasn't fully recognized. Your reputation, as a writer, soiled.
On the eve of the American Revolution a man named Simon Winship lived in Lexington, in the old Massachusetts Bay Colony. When questioned afterward, he swore the redcoats fired first. But William Sutherland, a British officer, said the rebels opened fire first: that it was their attack that provoked the British response that killed eight colonial militiamen.
Who really fired "the shot heard round the world," sparking the American Revolution? No one knows. These statements are among the more than a dozen first-hand accounts of that April morning that have come down to us. Together they've been described as "scanty, conflicting, contradictory, and incomplete." Which is why, some years ago, a Vassar College class studied them.
And the course in which they studied them? American History 101? A graduate course in historiography? No, it was a course called "Introduction to Scientific Inquiry," part of an Alfred P. Sloan Foundation-sponsored program to bring science and technology to liberal arts classrooms. Students analyzed seemingly contradictory reports in a medical journal on the risks of estrogen. They read popular accounts of scientific research, comparing them with the journal articles in which they'd first appeared.
And they studied what happened that day on Lexington Green. Because, while removed from science, this case study showed how even evidence as seemingly straightforward as eyewitness testimony can wallow in ambiguity. Just as scientific evidence can. What many see as science and technology's black-and-white world of hard data and established fact, the lesson went, is as mired in murky gray as any other.
Who among the eyewitnesses on Lexington Green "got it right"? To this day, we don't know.
Who among the many accounts of cold fusion got it right? It now seems clear that the researchers themselves didn't.
Recently, I wrote an article about the development of a vaccine against Lyme disease. But my piece was also about the risks the scientists took in pursuing their ideas, the mistakes they made, their doubts, how they came to understand how their vaccine worked. An important part of my story was the gentlemanly competition between two groups of researchers, from Harvard and from Yale.
I like to think there wasn't a single factual error in my piece. But did I truly "get it right"? How large a role did that friendly rivalry play? A little? A lot? How give it its proper weight?
The fact is, you always get it a little wrong. How many who have read Samuel Beckett's Waiting for Godot, or watched it performed, take away from it the same insights? How many remember the same bits of dialogue?
Ambiguity intrudes whenever you translate the real world onto the printed page, and intrudes all the more as that page plays out in the mind of the reader. Even when the printed page is a peer-reviewed article in a distinguished scientific journal, something will be lost. Scientists themselves, I think, realize that a printed journal article only loosely represents the experimental work, leaving out as much as it includes.
So those who worry about the work of science popularizers have reason to worry. Representing the intellectually daunting work of science is easy to mess up. Dare we leave the job, one might reasonably ask, to a mere writer?
Every writer, then, and every editor, and every educator of science writers and editors, owes his or her best efforts to surmounting these difficulties. Yes, learn the science well. Listen carefully. Scrupulously check your facts. The perils are great and the standards set by stern critics of science popularizing justify all the intelligence and care you can bring to the job.
And yet, merely to satisfy such concerns is to set the bar too low; a higher standard properly applies, one beyond issues of accuracy and balance.
It is never enough, I would argue, that the reader should, by hard work, be able to understand what you have written. He's got to feel moved to understand it.
Put another way, there must always be an emotional component in science popularization; and the ultimate peril is leaving it out. Any work of science writing, however clear, accurate, and balanced, that fails to engage the reader emotionally is, at some level, science writing that has failed.
It's natural, I suppose, to assume that popularizing science means taking something away from it. But I wish to suggest that the science popularizer adds something, too: something not residing in equations, or charts, or the arcane terminology of the journal article, or the formal conference presentation.
I once wrote a biography of the Indian mathematician Srinivasa Ramanujan. But The Man Who Knew Infinity is really a dual biography, its second subject being G. H. Hardy, the English mathematician who discovered Ramanujan, brought him to England on the eve of World War I, and fruitfully collaborated with him for five years.
Now Hardy, as it happens, was a gifted writer. Speculating about what career he might have chosen other than mathematics, he wrote that "journalism is the only profession, outside academic life, in which I should have felt really confident of my chances." Of their joint papers, recalled one of his collaborators, "He supplied the gas."
The "gas," as Hardy once defined it, was the "rhetorical flourishes," the equivalent of "pictures on the board in the lecture, devices to stimulate the imagination of pupils." A reviewer said of one of his mathematical texts (one of his advanced texts, mind you, aimed solely at mathematics students) that Hardy had "shown in this book and elsewhere a power of being interesting which is to my mind unequaled."
It is this "gas," this species of emptiness, that Hardy supplied. And so does a good science popularizer. But it is not empty. And it does not water science down. It juices science up. It adds sensuous detail, human context, feeling.
Hardy was a number theorist, a pure mathematician. He'd get annoyed when people said how mathematics was important because, after all, it was useful. "I have never done anything 'useful,'" wrote Hardy. "No discovery of mine has made or is likely to make, directly or indirectly, for good or ill, the least difference to the amenity of the world." That mathematics might aid in the design of bridges or enhance the material comfort of millions, he wrote, was to say nothing in its defense. Pure mathematics "must be justified as art," Hardy wrote, "if it can be justified at all."
I confess some sympathy for his view. Deep down, I care little if my reader comes away better able to function in the world of science and technology about which I write. Rather, I write articles and books which, if they cannot be dignified as art, at least confer on my reader an intellectual, emotional, or sensual experience. I try to leave my reader with a heightened few moments which might have something in common with those produced by viewing art or listening to music. I fail more often than I succeed, but I try.
But while I don't write in order to teach, this is not to say that readers of my articles and books do not learn; I think they probably do. I don't set out to teach my readers about Lyme disease, or machine tools, or prime numbers. But some of them will, I suspect, learn a little about them along the way.
And they will learn more, my deepest prejudice holds, by virtue of that emotional component: the sense of movement, drama, excitement, and surprise that flows over them as they read.
I learn more when feeling quickens thought. I think most of us do.
Yes, sometimes we must memorize the organelles of a cell or calculus integration formulas. But don't we learn more, and better, when sustained by the waters of some deep emotional spring?
All understanding not in some way infused with feeling is, I'd go so far as to say, incomplete.
My first book, Apprentice to Genius: The Making of a Scientific Dynasty, is about mentor relationships among elite scientists. This dynasty had its roots among a handful of scientists in the 1930s and 1940s, and its impact is felt today in pharmacological research labs around the world.
While researching this book I interviewed dozens of scientists. I learned that, far from cool and heady, their relationships with their mentors were deeply personal ones that burned with intensity. I had but to broach the name of the person who had molded him or her as a scientist, and any cold recital of facts would cease. The voice would soften, or quicken, or rise to anger, or otherwise fill with feeling. One's mentor, I found, was rarely a neutral subject.
Do the works of Sinclair Lewis and Paul de Kruif give a truly accurate sense of the work of science? "Two hundred and fifty years ago," de Kruif writes in Microbe Hunters, "an obscure man named Leeuwenhoek looked for the first time into a mysterious new world peopled with a thousand different kinds of tiny beings, some ferocious and deadly, others friendly and useful, many of them more important to mankind than any continent or archipelago."
Such stage-lit prose grew out of the author's wish to illuminate dramatically a terrain that might otherwise be mistaken for being stark, alien, and gray. He knew it was not. But when he writes that way is he conveying the kind of scientific substance that critics of popularization want us to impart?
By the stern standards established by a just-the-facts-ma'am paper delivered at a scientific conference, this falls far, far short. It is idealized, semi-fictionalized, romanticized, bastardized, insufficiently substantive, hopelessly imprecise. This is science popularization, one might cringe, way over at the extreme. Some might say, at its worst.
But it seduced, and inspired, the budding scientists of its day.
We need more of it, not less.
Edited from "The Perils of Popularizing Science," the Seventh Alfred and Julia Hill Lecture on Science, Society and the Mass Media, delivered Apr. 7, 1999, University of Tennessee, Knoxville.
Robert Kanigel is a professor of science writing at the Massachusetts Institute of Technology.