The Crimson's strident coverage of the "radiation" experiments performed in the '50s and '60s by investigators here and elsewhere strikes me as carping, judgmental and in places sensationalist. I would say the same for the reporting in The New York Times and The Boston Globe.
I write from the vantage point of long experience both as a hematologist--who, while doing routine clinical work, often administers radioactive isotopes to patients for diagnostic purposes--and as a biochemist, who has been using radioactive tracers for in vitro experiments since 1950, when I began as a young investigator in the regional Atomic Energy Commission (AEC) laboratory at the new UCLA Medical School.
The director of that unit, known at UCLA as the Atomic Energy Project, was UCLA's new Dean of Medicine, Stafford L. Warren, a former radiology professor at the University of Rochester--and before that at Harvard. He had also been chief medical officer of the Manhattan Project at Oak Ridge National Laboratory and was deeply involved with medical aspects of the first atomic bomb tests at Los Alamos and Eniwetok Atoll.
Staff Warren, by the way, should not be confused with Harvard's late and distinguished Shields Warren, so prominently mentioned in recent Crimson accounts.
The Project had two missions during the five years I worked there. One was basic radiobiological research into the nature of the effects of radiation on living tissue.
The other concerned operations: project personnel (including me) were active participants in the famous series of atomic bomb tests held at the Nevada Test Site's Camp Mercury over several years in the 1950s.
Those were the years in which most of the experiments we have been reading about were performed here and elsewhere. I would like here to make several points that in my opinion badly need to be made.
The first is simply this: no one in those years really knew very much about the hazards of radioactivity, especially at low doses. That contentious subject still preoccupies researchers.
It was in the early 1950s that our Atomic Bomb Casualty Commission, working in Hiroshima, observed an increased incidence of leukemia among individuals exposed to the atomic bomb--a puzzling observation because these cases did not emerge until several years after the detonations. There was also almost no understanding of the phenomenon called fallout.
Indeed, it was the early bomb tests that brought fallout to scientific attention (Project Nutmeg, 1949). Later it was the Camp Mercury tests and their melancholy consequences that brought it to public attention.
So, point one, we were really quite uninformed about the risks of radiation in those years.
Second: no one today would defend the administration of radioactive material to other human beings, sick or healthy, without their knowledge and consent. But the work in question was done 40 years ago--in another era in which the ethics of medical experimentation were perceived rather differently.
At the least, journalists reporting this story today should balance the picture by recalling the occasional horrors of medical experimentation--many far worse--that had nothing to do with "radiation" (that word again).
A litany of the acts we now deem offensive was long ago detailed in books and articles, beginning in the late '50s with the insistent writings of our late Harvard colleague, the Dorr Professor of Research in Anesthesia, Henry K. Beecher.
Statements like his raised our consciousness, albeit tardily, and led almost immediately to the establishment in medical research institutions of the human studies committees that now regulate ethical aspects of human experimentation.