NUREMBERG'S LEGACY: SOME ETHICAL REFLECTIONS
Perspectives in Biology and Medicine 43.3 (Spring 2000): 347.

The Nuremberg Code: Its Context and Content

The Nuremberg Code was promulgated by four American judges at the Doctors' Trial at Nuremberg, in the case of the United States of America v. Karl Brandt et al. (1946-47). Twenty-three defendants, all but three of them medical doctors, were tried for truly horrendous "crimes alleged to have been committed in the name of medical or scientific research" (both war crimes and crimes against humanity), including the horrific high altitude, freezing, malaria, and mustard gas experiments, among others, which resulted in countless severe injuries and deaths [1]. Fifteen were found guilty; seven received the death sentence. The court's judgment included the "Nuremberg Code," consisting of 10 "basic principles" to govern permissible medical experiments.

This code, which received substantial input from two American physicians--Leo Alexander and Andrew Ivy--was an attempt to close the gap created by the relative absence of formal statements of ethics in human experimentation by a more or less authoritative body. Several times in this essay I will use the metaphor of gaps to indicate how various efforts to formulate ethical principles, rules, and procedures seek to fill or close or cover holes in the protections of research subjects' rights and welfare. This metaphor certainly fits the promulgation of the Nuremberg Code. The judges probably adopted this Nuremberg Code, as Leonard Glantz suggests, because of "their shock in finding that there were essentially no written standards for human experimentation that had been adopted by an authoritative institution" [2]. Hence the 10 rules, a modern-day decalogue for human experimentation, so that no one could plead ignorance of ethical obligations.

The 10 Nuremberg principles were not created ex nihilo. There were few earlier codes, but, ironically, two German regulations earlier in this century (a 1900 Prussian directive and a 1931 Reich Circular) offered strong statements of ethical obligations and rights in research (see [3, pp. 127-32]). Unfortunately, codes do not guarantee ethical conduct. And several physicians had already articulated ethical standards to guide human experimentation [3, pp. 121-26]. There is debate, however, about whether they represented only "occasional voices" (Jay Katz) or actually constituted "a powerful tradition in ethical thought" (David Rothman) [4, 5]. At any rate, the two physicians who served as expert medical witnesses for the prosecution drew on earlier formulations including, somewhat inappropriately, the Hippocratic tradition, which doesn't really address the issues raised by research with human subjects [3, pp. 132-37].

WHY THE NUREMBERG CODE?

Glantz is probably right about the judges' shock at not finding a written, authoritative code and about their felt need to close that gap. They clearly did not need the Nuremberg Code in order to state principles by which they could condemn the doctors' actions: they had sufficient grounds to do so in existing laws, even German laws, that prohibited murder, mayhem, and maiming, which the physicians (and others) had not extended to Jews and to others they viewed as less than human (and then subjected to actions that were even prohibited by a 1933 German law for protecting animals) [3, p. 132]. The guilty verdicts were, in short, overdetermined, and they did not presuppose a code for human experimentation--the verdicts were adequately justified by other rules against war crimes and crimes against humanity.

The court and others, such as Telford Taylor, the chief counsel for the prosecution, viewed the Nuremberg Code as a summary of common medical morality, at least as affirmed even if not always practiced. It expressed putatively universal standards with "many antecedents" [6, p. 150]. The Code's prefatory statement notes that human experimentation can "yield results for the good of society that are unprocurable by other methods or means of study." "All agree, however," the prefatory statement continues, "that certain basic principles must be observed in order to satisfy moral, ethical and legal concepts" [7, p. 102]. The 10 principles then followed.

CONTENT OF THE NUREMBERG CODE

1. "The voluntary consent of the human subject is absolutely essential." The statement then specifies the meaning of this first principle to include the requirement that the "person involved should have the legal capacity to give consent; should be so situated as to be able to exercise free power of choice, without the intervention of any element of force, fraud, deceit, duress, overreaching, or other ulterior form of constraint or coercion; and should have sufficient knowledge and comprehension of the elements of the subject matter involved as to enable him to make an understanding and enlightened decision." This last requirement is itself further specified, along with the insistence that the one who initiates, directs or engages in the experiment has the "personal duty and responsibility" to ascertain the quality of the subject's consent.

2. "The experiment should be such as to yield fruitful results for the good of society, unprocurable by other methods or means of study, and not random and unnecessary in nature."

3. "The experiment should be so designed and based on the result of animal experimentation and a knowledge of the natural history of the disease or other problem under study that the anticipated results will justify the performance of the experience."

4. The fourth principle requires the avoidance of "all unnecessary physical and mental suffering and injury."

5. The fifth principle rules out experiments in which there is "an a priori reason to believe that death or disabling injury will occur." It recognizes a possible exception in experiments in which "experimental physicians also serve as subjects."

6. This principle requires proportionality between the risks and probable benefits of the research: "The degree of risk to be taken should never exceed that determined by the humanitarian importance of the problem to be solved by the experiment."

7. "Proper preparation should be made and adequate facilities provided to protect the experimental subject against even remote possibilities of injury, disability, or death."

8. This principle limits research to "scientifically qualified persons" who should be required to exercise the "highest degree of skill and care" throughout.

9. "During the course of the experiment the human subject should be at liberty to bring the experiment to an end if he has reached the physical or mental state where continuation of the experiment seems to him to be impossible."

10. This last principle indicates that the "scientist in charge must be prepared to terminate the experiment at any stage" if there is probable cause to believe that its continuation "is likely to result in injury, disability, or death to the experimental subject."

Two of these 10 principles (numbers 1 and 9) deal with the potential or actual subject's right to consent or refuse--voluntarily and with adequate information--to participate in research, and to withdraw from it, while the remaining eight deal with the subject's welfare in the context of the protocol, its design, its necessity, and its balance of risks and benefits and with ways to protect that welfare.

RECEPTION IN THE UNITED STATES

For many years the Nuremberg Code played virtually no role in ethical discussions, public policies, and legal decisions in the United States. It was effectively circumscribed and even marginalized in various ways. As formulated in the context of a criminal trial, it could be considered a code for barbarians, the Nazis, who were guilty of brutal excesses, not a code for civilized researchers. Another line of dismissal considered codes with sanctions unnecessary and insufficient because research subjects are truly protected only by virtuous professionals [8]. And yet adopting the Nuremberg Code's principles would have challenged several practices in the United States, including the use of prisoners in certain experiments. Ruth Faden and colleagues conclude: "the Code, at the time it was promulgated, had little effect on mainstream medical researchers engaged in human subjects research" [9].

In short, both bad and good reasons led to the Code's relative neglect. An indefensible reason was the failure by many physicians and investigators to view voluntary, informed consent as very important, especially in therapeutic research, in part because medical paternalism still reigned in therapeutic contexts. A more defensible (but still inadequate) reason is that the Code itself is imperfect and incomplete. While designed to close a gap--to articulate a formal, authoritative code of common medical morality in experimentation--it also left some gaps and filled others with rigid and unyielding principles. A few examples will illustrate the Code's deficiencies.

First, the Court conceded that it was mainly concerned with "those requirements which are purely legal in nature--or which at least are so clearly related to matters legal that they assist us in determining criminal culpability and punishment" [10]. Such a focus necessarily omits or at least downplays concerns that are more ethical in nature and have little to do with criminal culpability and punishment.

Second, the Nuremberg Code considered only non-therapeutic research, and it ruled out all research involving incompetent subjects because of its absolute rule of subject consent--only the subject could consent to his or her participation in research. (Incidentally, Dr. Alexander's memorandum for the court included proxy consent for incompetent subjects, but this was omitted by the judges [3, pp. 135-36].) The Code's omissions are certainly understandable in light of the terrible non-therapeutic experiments the court had to address, but these gaps almost certainly contributed to the Code's "marginalization in modern medicine" [11, p. 308].

The Declaration of Helsinki

"Most discussions begin with Nuremberg," George Annas and Michael Grodin observe, but "almost none end there, and there has been a consistent and insistent movement away from the directness of the Code toward more flexible forms of judging the conduct of human experimentation" [11, p. 307]. I want to test their judgment with respect to the Declaration of Helsinki adopted by the World Medical Association in 1964.

FEATURES OF THE DECLARATION OF HELSINKI

Most interpreters of medical research ethics concur that the Declaration of Helsinki was "greatly influenced by the Nuremberg Code," despite its significant departures from the earlier code [6, p. 158]. The Declaration has been revised and updated several times since its adoption in 1964 (1975, 1983, and 1989). Perhaps one major reason for these revisions is that Helsinki I (as the first version is now called) failed to include informed consent as one of its "basic principles," even though it did incorporate informed consent into its requirements for both non-therapeutic and therapeutic research. And, in contrast to the Nuremberg Code, it stressed (in 3c) that "consent should as a rule be obtained in writing."(1)

The Declaration of Helsinki represents at least two major developments in ethics in research involving human subjects. First, it offers a less stringent requirement of the research subject's own voluntary, informed consent: it allows some incompetent subjects to be enrolled in some research protocols on the basis of a legal guardian's consent or permission. Second, the Declaration distinguishes therapeutic research (what it calls "clinical research combined with professional care") from "non-therapeutic clinical research." And it clearly extended its basic principles along with the requirement of voluntary, informed consent to therapeutic research.

Helsinki II in 1975 further strengthened informed consent by making it a basic principle. It also mandated independent ethical review committees, and insisted that reports of experimentation violating its ethical principles not be accepted for publication. Hence, it recommended procedures for protecting human subjects, not simply substantive standards.

ORDER OF PRINCIPLES

Debate continues about whether these codes, singly or together, adequately covered, or filled, important gaps in the protection of human research subjects. Jay Katz vigorously defends the Nuremberg Code over the Declaration of Helsinki, even in its later versions. For him Helsinki is partially responsible for "the unfulfilled legacy of the Nuremberg Judges" because it stresses the advancement of science rather than voluntary, informed consent (the Nuremberg Code's first principle) [4, p. 1665]. Katz also criticizes the later, revised versions of the Declaration that do include voluntary, informed consent among the "Basic Principles," because they list consent ninth rather than first--that is, consent follows principles concerning research protocols, qualified researchers, and balancing risks and potential benefits.

Katz's complaint forces us to consider the moral rhetoric of codes and declarations. In doing so, we need to distinguish three possible ways to arrange principles: order of presentation; order of practical consideration; and rank order. The order of presentation within a code--i.e., which principles are listed first, second, etc.--must be distinguished from the order of practical consideration--i.e., which factors the code indicates should be considered first, second, etc., in practice. And both must be distinguished from the rank order--i.e., the respective weights of different principles. What is critically important is how much weight or strength codes assign to their principles, even to principles listed later. It is possible, for instance, to list a principle--such as voluntary, informed consent--ninth rather than first but also to make it indispensable and even absolute.

A code's order of presentation may reflect the order of practical consideration, which is important when investigators propose or review boards examine particular protocols. For instance, I believe we should in practice consider, and ethically evaluate, the research protocol's design, the research's risks, its probable benefits and their balance, as well as the fair selection of research subjects, before we consider and ethically evaluate the protocol's form and process of voluntary, informed consent. Why? If the research protocol is ethically unsatisfactory on these other grounds, we don't even need to inquire into proposed subject consent. And potential subjects should not be asked to participate in research that fails to satisfy these principles; indeed, it would be unethical to do so. But even if a protocol is ethically satisfactory on all the other grounds, it shouldn't go forward without voluntary, informed consent. In short, it shouldn't matter if a code lists the requirement of voluntary, informed consent in some position other than first place, as long as consent has an essential role in the consideration and evaluation of a protocol.

In contrast to what Katz and others sometimes seem to suggest, voluntary, informed consent is not "a sufficient safeguard" for human subjects [4, p. 1666]. Even if it is the most important safeguard, it is only one such safeguard, and its function has less to do with protecting subjects from harm and more to do with respecting their autonomy. Other safeguards are also very important and even necessary--only when taken together do they provide sufficient protection for research subjects' rights and welfare. The egregious violation of the principle of consent/refusal in the Nazi experiments provided a reason for the judges who formulated the Nuremberg Code to place subject consent first and to make it absolute.

PROFESSIONAL FOUNDATIONS AND PUBLIC PARTICIPATION

Some commentators also note the different professional foundations of the two codes: the Nuremberg Code was written by lawyers, by judges, in the context of a criminal trial, while the Declaration of Helsinki was written by physicians for physicians [12]. These differences should not be overstated, however, because physicians provided important materials for the judges in the Nuremberg Doctors' Trial and because the judges insisted that the 10 principles reflected medical ethics. Nevertheless, the court, as I previously noted, concentrated on legal requirements and the determination of criminal culpability and punishment. Furthermore, many physicians found the Nuremberg Code problematic because of its rigid set of "legalistic demands," in contrast to the Declaration of Helsinki's "set of guides." Whereas the Nuremberg Code represented a "legalistic document," the Declaration of Helsinki represented an "ethical" one that was "more broadly useful" [13].

Neither Nuremberg nor Helsinki adequately reflects what, in my judgment, has become increasingly important: the role of the public. Of course, researchers and others often assume a public role, but it is also important to recognize the role of the public in deliberation about principles of research ethics and in the functioning of review boards, such as those the Declaration of Helsinki included. The public should have such a role in part because research subjects are drawn from the public. In the United States public participation has been increasingly emphasized in the continuing development and application of federal regulations and guidelines for investigators and IRBs. I will now turn to selected developments in the United States, with particular attention to issues currently before the National Bioethics Advisory Commission (NBAC).

Selected Developments in the United States(2)

Human subjects research in the United States has not always followed either the Nuremberg Code or the Declaration of Helsinki. Revelations in the late 1960s and early 1970s about several unethical experiments, including the notorious U.S. Public Health Service's "Tuskegee Study of Untreated Syphilis in the Negro Male," led to the formation of the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research (hereafter the National Commission), which made several substantive and procedural recommendations for protecting research subjects. Procedurally, the National Commission relied on institutional review boards (IRBs), which had already emerged as important mechanisms to protect human subjects; substantively, it formulated several principles and guidelines, many of which became formal regulations in the Department of Health, Education, and Welfare (later the Department of Health and Human Services) and were later incorporated into what became the "Common Rule." (Below I will discuss the National Commission's principles of beneficence, respect for persons, and justice, as expressed in its Belmont Report.) With these developments, along with the work of the President's Commission for the Study of Ethical Problems in Medicine and Biomedical and Behavioral Research and other structures and mechanisms, research with human subjects seemed both ethically settled and secure.

Indeed, in the 1980s, discussants commonly observed that the major controversies in research concerned the use of animals rather than the use of humans. However, in a fine prophetic statement in 1989, Alexander M. Capron observed: "today the subject [of research involving human subjects] is often naively viewed as one of settled ethical principles, detailed statutory and regulatory requirements, and multifaceted procedures. History suggests that such claims must be viewed skeptically: the principles may be less conclusive and the guidelines less protective than they appear" [15].

The Advisory Committee on Human Radiation Experiments (ACHRE), chaired by Ruth Faden of the Johns Hopkins University School of Public Health, was established by President Clinton in response to stories that put a human face on information that had circulated for years. For instance, a series of reports in the Albuquerque Tribune disclosed the names of Americans who had been injected with plutonium, the manmade material that was a key ingredient of the atom bomb. The ACHRE itself examined the records of several thousand experiments funded and conducted by different branches of the federal government, mostly in secret, as part of the Cold War. These experiments included feeding cereal with minute amounts of radioactive material to the science club at the Fernald School for the Retarded, total body irradiation of cancer patients, and testicular irradiation of inmates in Oregon and Washington prisons; many of the experiments clearly violated the Nuremberg Code and the Declaration of Helsinki. Following its review, the ACHRE recommended how the federal government should respond to its past actions and also indicated how it could learn from the legacy of these cold war experiments [16].

THE NBAC

The National Bioethics Advisory Commission (NBAC) was one result of the ACHRE's report. It was designed not only to respond to some specific issues the ACHRE had raised but also, more generally, to provide a national public forum for dialogue on ethical issues in research involving human subjects. Established by Presidential Executive Order in 1995 (though it did not meet until October 1996), the NBAC is required to "provide advice and make recommendations to the National Science and Technology Council [in the White House], other appropriate entities and the public, on bioethical issues arising from research on human biology and behavior, and the applications, including the clinical applications, of that research" [17]. According to its charge, NBAC must give first priority to two main areas: (1) protection of the rights and welfare of human research subjects; and (2) issues in the management and use of genetic information, including but not limited to human gene patenting.

Extensive and substantial governmental regulations and guidelines already exist for research involving human subjects, in contrast to genetics, along with relatively settled professional standards. Here the task is to examine what appears in law, regulations, guidelines, and practices to determine where there are important gaps. Hence, in research involving human subjects, the NBAC has to identify and plug gaps, often by modifying or adding to what already exists. The NBAC's progress has been slower than desired, in part because of two scientific breakthroughs that led to presidential requests for specific reports. The first was the 1997 announcement of Dolly's birth several months earlier; following this announcement, President Clinton requested a report on and recommendations about cloning within 90 days [18]. The second was the report by scientists in late 1998 that they had isolated pluripotent stem cells from fetal tissue after deliberate abortions and from embryos left over after in vitro fertilization; President Clinton again requested a report and recommendations, the preparation of which consumed much of a year [19]. As a member of the NBAC, I will briefly consider some possible gaps in human subject protections, some of which the NBAC hopes to plug.

The Protection of Research Subjects with Mental Disorders. One gap, many agree, appears in the protection of vulnerable or special populations. Guidelines already exist for some vulnerable populations: prisoners, children, and pregnant women. From the very first meeting of the NBAC's subcommittee on human subject research, concerns were registered about another possibly vulnerable population in need of additional protections: those with decisional or cognitive impairments.

The National Commission had proposed guidelines for those institutionalized as mentally infirm, but those guidelines were never adopted for various reasons, including additional and, to many, burdensome mechanisms, such as the use of consent auditors and possibly a subject advocate, which the DHEW had added in proposing regulations. Although the lack of specific regulations and guidelines may not have caused major harms and wrongs to research subjects, the NBAC believed that it needed to address the uncertainties and confusion surrounding research with subjects with mental disorders that may affect decision-making capacity. Its substantive and procedural recommendations departed from the Nuremberg Code's absolute insistence on the research subject's own voluntary, informed consent, at least for research with low levels of risk and for research that offers a possibility of direct therapeutic benefit. Nevertheless, the NBAC recommended stringent and controversial requirements for research that involves persons who lack the capacity to decide for themselves about participation, that offers no prospect of direct medical benefit to research subjects, and that involves more than minimal risk for those subjects [20, 21].

Shifting Paradigms of Research. Another gap has emerged in the protection of human subjects because of a shift in paradigms of research and, consequently, in perceptions of injustice in research. The earlier paradigm, prominent from Nuremberg on, focused on the risks and burdens of research and on the need to protect potential and actual research subjects from harm, abuse, exploitation, and the like. Ethical guidelines for this paradigm emphasize voluntary, informed consent--that's where the Nuremberg Code begins.

Our basic approach in the United States, according to Carol Levine, "was born in scandal and reared in protectionism" [22]. The dominant model in protectionist policies is non-therapeutic research, i.e., research that doesn't offer the possibility of therapeutic benefit to the subject. In the paradigm shift, however, attention turns from non-therapeutic to therapeutic research (e.g., clinical trials of promising new therapeutic agents), from protection to access, and from risks and burdens to possible benefits. This shift resulted particularly (but not only) from the epidemic of HIV infection and AIDS, as, for example, activists pressured the FDA to expand access to new treatments.

This inclusionist paradigm is important--it continues the shift from Nuremberg to Helsinki and beyond. However, we should not totally abandon the protectionist paradigm. The hard ethical task is to combine what is valuable in both in order to protect subjects' rights and welfare in light of a principle of justice that now rejects exclusion as well as exploitation. Other gaps, beyond the overly protectionist construal of justice, may also appear in the Belmont principles and their traditions of interpretation.

The Belmont Principles. The NBAC is not supposed to review and approve or disapprove particular research projects but rather to examine the "broad, overarching principles to govern the ethical conduct of research" [17]. One big question is whether major gaps exist in our heritage of ethical principles for research.

Three broad principles, articulated by the National Commission in the 1970s, particularly in its Belmont Report, still govern research involving human subjects [23]. Various guidelines and regulations specify these principles, and, where those guidelines and regulations are incomplete or unclear, IRBs further interpret the principles to determine whether to approve or reject particular research protocols. Those principles are:

1. Respect for persons. This principle requires that researchers respect the autonomous choices of those who are autonomous, and protect those with diminished autonomy. Rules of consent/refusal specify this principle.

2. Beneficence. This principle requires benefiting and not harming; but since both parts often cannot be fully realized simultaneously, research typically involves balancing benefits (to subjects and others) against harms (to subjects). The rules that specify this principle require not harming, maximizing possible benefits, and minimizing possible harms.

3. Justice. This principle entails fairness in distributing burdens and benefits, especially in protecting from exploitation those who might be selected because of "easy availability ... compromised position, or ... manipulability, rather than for reasons directly related to the problem being studied." (I have already noted how an inclusionist model also focuses on justice in terms of access to research.)

Versions, or at least aspects, of these principles also appeared earlier in the Nuremberg Code and the Declaration of Helsinki, usually stated as more specific rules, such as informed consent.(3) However, the NBAC can't simply repeat these principles and rules in a fundamentalist way. Instead, it--and other groups and individuals--must continue to probe them, to see whether they need to be modified or supplemented.

At the NBAC's first meeting, Ezekiel Emanuel, then a commissioner, contended that these three principles and related guidelines do not adequately address community. Attending to community could mean, among other possibilities, that we should add community as a fourth principle, or that we should interpret all these principles in a communitarian rather than a merely individualistic manner. This second approach would involve reexamining the Belmont principles and other guidelines through the lens of community. A good case can be made that the NBAC should rethink these principles to make sure that community is sufficiently included. Following are a few illustrations of what this might entail.

Beneficence already includes attention to the society's welfare, as part of the benefits to be balanced against the risks to subjects. However, attention to community might also require, as has become more widespread in practice, attention to potential harms to particular communities, such as Indian or Jewish communities, rather than only harms to individuals. An example is the possible harm to a group identified with particular genes that are considered deleterious, such as cancer genes.

Reinterpreted through the lens of community, the principle of respect for persons would consider persons not merely as isolated individuals, who consent or refuse to consent to participate in research, but also as members of communities. However, we need to be cautious in such a move because it is not possible or justifiable to determine an individual's wishes and choices by reading them off communal traditions, beliefs, and values, or merely to subordinate the individual's autonomy to the community's will. And there is vigorous debate about how we should interpret the principle of respect for persons in cultures that are less individualistic than our own; this debate recently erupted in controversies about international research [25, 26].(4)

Finally, justice concerns more than fairly selecting research subjects and fairly distributing the benefits and burdens of research participation. It may include the participation by various communities in the design and evaluation of research. It could also include compensation for research-related injuries, as an expression of the community's solidarity with those who suffer injuries in research after assuming a position of risk on behalf of the community. From this standpoint, it is not enough to disclose on the consent form whether there will be any compensation for research-related injuries that are non-negligently caused; instead, compensation should be provided. In meeting with representatives of bioethics commissions around the world, the NBAC learned that most other countries, with a commitment to universal access to health care, do not view compensation for research-related injuries as a problem--such injuries would be routinely covered, at least for medical expenses. (The NBAC has not to this point endorsed a proposal by some commissioners to address compensation for research-related injuries, in part because some other commissioners thought that this was a solution in search of a problem.)

In short, it is important to revisit the Belmont principles in light of concerns about community, as well as other concerns, but the NBAC needs to do so in a way that does not neglect or distort what was important in earlier, more individualistic interpretations. It is not clear yet what might emerge if the NBAC takes this route. Even though the NBAC sponsored a conference, with other groups, in April 1999--the 20th anniversary of the Belmont Report's publication in the Federal Register--to revisit the Belmont principles, some commissioners would like to see the NBAC issue a new, succinct report on basic principles.

Institutional Structures and Mechanisms. Obviously principles, rules, and guidelines are not self-implementing--they require various structures, mechanisms, and agents for their implementation. And the NBAC is looking into the adequacy of some of these. Its one specific mandated task in the area of research involving human subjects is to examine the adequacy of the policies and procedures for protecting human subjects in each executive branch department and agency conducting, supporting, or regulating research involving human subjects.

The NBAC has also affirmed the ideal of protecting all research subjects, including those in privately funded and conducted research, through the twin mechanisms of institutional review and informed consent, but it hasn't defended a particular way to extend protections to such subjects. And serious questions have emerged about whether IRBs, which currently constitute the frontline protection for research subjects, can and do perform this task adequately.

Over the last 20 years in the United States, research involving human subjects has greatly expanded, and the number of protocols has increased dramatically. In addition, many are now multi-site protocols, which involve several teams of investigators and large numbers of research subjects. Regulations have also increased; institutional support is often minimal; individual members feel overworked and underappreciated; conflicts of interest are not uncommon; and so forth. As a result, fears abound that IRBs may not be able adequately to protect research subjects. In light of such concerns, the NBAC is preparing a report on the U.S. system for protecting the rights and welfare of research subjects, but it is too early to anticipate the report's precise recommendations about ways to strengthen or supplement IRBs.

Some commentators suggest that efforts to protect human research subjects through the IRB system may fail because substantive and procedural guidelines are too complete and thus too burdensome, rather than because of gaps in those guidelines. As a result, many believe that investigators and IRBs spend too much time, energy, and resources on what is not so important and too little on what is really important, such as risky research. Hence, some propose that if the NBAC tries to bridge some gaps in substantive and procedural guidelines, it should also try to eliminate or reduce what is less important or indicate that it should receive less attention or lower priority.

Conclusions

Because of various gaps and other deficiencies, the Nuremberg Code does not provide a timeless and sufficient ethical guide to research involving human subjects--hence, Nuremberg "fundamentalists" are mistaken (as fundamentalists often are). The Code's real legacy, at a higher level of generality, is its vision of a fundamental and absolute commitment to the rights and welfare of research subjects, whatever the prospect of scientific advancement.

Appropriating and extending this legacy requires that we continue to reflect sensitively, imaginatively, and rigorously on ethical standards for research with human subjects, just as Helsinki did, just as the National Commission did, just as CIOMS did, and just as other bodies have done and continue to do. This ethical reflection takes place in a continuing societal conversation about the foundations, meaning, weights, and implications of various ethical principles, rules, and procedures in research, in light of various changes in research and its context. This conversation must be broad based and open; it must include the public as well as professionals; and it must involve various segments of the public, including those who view themselves as socially marginal. It also must continue into the future as we seek ways to protect subjects' rights and interests and to generate valuable scientific knowledge. Only by doing so can we continue Nuremberg's remarkable legacy.

(1.) For different versions of the Declaration of Helsinki, see Appendix 3 in [7].

(2.) Some ideas and paragraphs in this third section are drawn from [14].

(3.) The distinction between principles and other normative formulations, such as rules, often hinges on their level of generality, with principles often being viewed as more general than rules and other more specific formulations (and terms such as guidelines referring to both of them). Hence, in current discourse, the requirement to obtain voluntary, informed consent from potential research subjects might be considered a rule, while respect for personal autonomy might be considered a principle. Nevertheless, the Nuremberg Code's specific formulations are considered principles. Hence, in this paper, I have not operated with a precise distinction between principles and rules. See, for instance, [24].

(4.) I will not deal here with the important issues raised by international and cross-cultural research, particularly issues of universalism and pluralism. For a very helpful discussion, see [27, 28].

REFERENCES

[1.] Taylor, T. Opening statement of the prosecution. In The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation, edited by G. J. Annas and M. A. Grodin. New York: Oxford Univ. Press, 1992. 70.

[2.] Glantz, L. H. The influence of the Nuremberg Code on U.S. statutes and regulations. In The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation, edited by G. J. Annas and M. A. Grodin. New York: Oxford Univ. Press, 1992. 197.

[3.] Grodin, M. Historical origins of the Nuremberg Code. In The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation, edited by G. J. Annas and M. A. Grodin. New York: Oxford Univ. Press, 1992.

[4.] Katz, J. The Nuremberg Code and the Nuremberg Trial: A reappraisal. JAMA 276 (20): 1662-66, 1996.

[5.] Rothman, D. J. Letter to the editor. JAMA 277 (9): 709, 1997; Katz, J. In reply. JAMA 277 (9): 709-10, 1997.

[6.] Perley, S., et al. The Nuremberg Code: An international overview. In The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation, edited by G. J. Annas and M. A. Grodin. New York: Oxford Univ. Press, 1992.

[7.] See prefatory statement to Nuremberg Code. In The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation, edited by G. J. Annas and M. A. Grodin. New York: Oxford Univ. Press, 1992.

[8.] Beecher, H. Ethics and clinical research. New Engl. J. Med. 274:1354-60, 1966.

[9.] Faden, R., S. E. Lederer, and J. D. Moreno. U.S. Medical Researchers, the Nuremberg Doctors Trial, and the Nuremberg Code. JAMA 276 (20): 1667, 1996.

[10.] See the tribunal's judgment in The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation, edited by G. J. Annas and M. A. Grodin. New York: Oxford Univ. Press, 1992. 103.

[11.] Annas, G. J., and M. A. Grodin. Where do we go from here? In The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation, edited by G. J. Annas and M. A. Grodin. New York: Oxford Univ. Press, 1992.

[12.] Annas, G. J. The Nuremberg Code in U.S. courts: Ethics versus expediency. In The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation, edited by G. J. Annas and M. A. Grodin. New York: Oxford Univ. Press, 1992. 205.

[13.] Beecher, H., as quoted by Refshauge, W. The place for international standards in conducting research for humans. Bull. WHO 55 (supp.): 133-35, 1977; which in turn is quoted in [12].

[14.] Childress, J. F. The National Bioethics Advisory Commission: Current challenges and future directions. J. Health Care Law Policy (forthcoming).

[15.] Capron, A.M. Human experimentation. In Medical Ethics, edited by R. M. Veatch. Boston: Jones and Bartlett, 1989. 128.

[16.] Advisory Committee on Human Radiation Experiments. Final Report. Washington, DC: U.S. GPO, 1995. Ch. 18.

[17.] National Bioethics Advisory Commission. Charter. 26 July 1996.

[18.] NBAC. Cloning Human Beings: Report and Recommendations of the National Bioethics Advisory Commission. Rockville, MD: 1997.

[19.] NBAC. Ethical Issues in Human Stem Cell Research. Vol. 1: Report and Recommendations of the National Bioethics Advisory Commission. Rockville, MD: 1999.

[20.] NBAC. Research Involving Persons with Mental Disorders that May Affect Decisionmaking Capacity. Rockville, MD: 1998.

[21.] Childress, J. F. An introduction to NBAC's Report on Research Involving Persons with Mental Disorders that May Affect Decisionmaking Capacity. Accountability in Research 7:101-15, 1999.

[22.] Levine, C. Changing views of justice after Belmont: AIDS and the inclusion of "vulnerable" subjects. In The Ethics of Research Involving Human Subjects: Facing the 21st Century, edited by H. Y. Vanderpool. Frederick, MD: University Publishing Group, 1996. 106.

[23.] National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report: Ethical Principles and Guidelines for the Protection of Human Subjects of Research. DHEW publication no. (OS) 78-0012. Washington, DC: U.S. GPO, 1978.

[24.] Beauchamp, T. L., and J. F. Childress. Principles of Biomedical Ethics, 4th ed. New York: Oxford Univ. Press, 1994.

[25.] Council for International Organizations of Medical Sciences (CIOMS), in Collaboration with the World Health Organization (WHO). International Ethical Guidelines for Biomedical Research Involving Human Subjects. 1993.

[26.] Vanderpool, H. Y., ed. The Ethics of Research Involving Human Subjects: Facing the 21st Century. Frederick, MD: University Publishing Group, 1996. 501-10.

[27.] Levine, R. J. International codes and guidelines for research ethics: A critical appraisal. In The Ethics of Research Involving Human Subjects: Facing the 21st Century, edited by H. Y. Vanderpool. Frederick, MD: University Publishing Group, 1996. 235-59.

[28.] Macklin, R. Universality of the Nuremberg Code. In The Nazi Doctors and the Nuremberg Code: Human Rights in Human Experimentation, edited by G. J. Annas and M. A. Grodin. New York: Oxford Univ. Press, 1992. Ch. 13.

JAMES F. CHILDRESS, University of Virginia, Cocke Hall, Room 101, Charlottesville, VA 22903. Email: childress@virginia.edu.

JAMES F. CHILDRESS is the Kyle Professor of Religious Studies and Professor of Medical Education at the University of Virginia, where he also directs the Institute for Practical Ethics. He is the author of numerous articles and several books in bioethics, including (with Tom L. Beauchamp) Principles of Biomedical Ethics (1994). He is a member of the National Bioethics Advisory Commission.
