A Medical History of the Contraceptive Pill

This article was first presented on November 7, 2008 at the conference of the Center for Ethics & Culture, University of Notre Dame. A version also appeared on the MercatorNet website on January 15, 2010. A short time later, the well-known Australian feminist Anne Summers quoted from it in her column in the Sydney Morning Herald.

The acceptance of the hormonal contraceptive pill has been cited as one of the most important historical events of the 20th century because of its effects on marriage and family life. In this paper, I would like to discuss the medical history of the development of the pill, presenting historical events primarily from the perspective of science and scientists, but also – and necessarily – from the perspective of other personalities outside of science whose contributions to the cause were just as important.

I will focus on the 50-year period from 1910 – the year of the birth of reproductive endocrinology as a scientific discipline – to 1960 – the year in which the first orally active hormonal contraceptive was approved for sale to the general public in the US.

At the turn of the 20th century, there was growing confidence in the power of the medical sciences to finally understand human physiology and the pathophysiology of diseases. This confidence was due in no small part to advances in the field of endocrinology: the study of hormones and the glands that produce them.


Ernest Starling (1866-1927) was the British physiologist who in 1905, along with William Bayliss, established the existence of “hormones”, chemicals which were produced in one tissue of the body, delivered into the bloodstream and elicited a response from another distant tissue.


The term "hormone" was coined in 1905 by the British physiologist Ernest Starling, after the Greek word meaning "to incite to activity". In the early 20th century, a variety of chemicals were found to have "hormonal" effects in humans: they were produced in one tissue, entered the bloodstream and incited a specific effect on another distant and unrelated tissue. Insulin, thyroxine, testosterone and cortisone were discovered at this time and were found to have remarkable restorative properties when given to patients with a number of common diseases. This enthusiasm for the therapeutic potential of "hormones" also extended to the area of human reproduction In the late 19th century, Professor Brown-Sequard, at the age of 72, injected himself with the extracts of guinea pig testicles and reported their rejuvenating effects.
Between 1910 and 1930, the hormones estrogen and progesterone were found to play important roles in the physiology of female mammalian reproduction, and so by around 1940 it became apparent that in human females, fertility depended upon the complex interactions of a hierarchy of hormones which affected the ovaries first, and then the uterus. The ovaries were found to be the source of a "female factor" in human development (the ovum or egg): complementary to, and just as necessary for, human reproduction as the "male factor". This picture -- which seems so clear today -- was in fact a dramatic insight, considering that even in the second half of the 19th century many scientists still believed that conception occurred when the male factor – a seed – was sown into the female womb – the soil. A woman’s contribution to conception was thought to be that of a passive receptacle offering a favorable environment for the germination of the seed.
The picture that now emerged meant that fertility in women was, in the normal case, orderly and cyclical, and therefore predictable. This new understanding of human reproduction seemed to lift the veil on an awesome event that, until then, had been shrouded in mystery. Science began to expose the mystery, to reveal the mechanics of how human beings came about, and in so doing made that event accessible and open to manipulation.
Controlling fertility
The notion that fertility in mammals could be controlled by the manipulation of reproductive hormones was first proposed by the Austrian physiologist Ludwig Haberlandt, who based his hypothesis on his own work with laboratory animals. He and others observed that the ovarian follicle would not mature and ovulation would not occur during normal pregnancy, and that this suppressive effect was mediated by progesterone. Progesterone is produced by the corpus luteum during the second half of the normal menstrual cycle, and its blood levels remain elevated only if pregnancy occurs. As long as certain levels of progesterone persist in the circulation, the hormonal signals favoring the shedding of the endometrium, the ripening of additional ovarian follicles and the release of ova do not occur.

Ludwig Haberlandt (1885-1932) was the Austrian physiologist who first suggested that the female reproductive cycle could be controlled by the administration of hormones.
If pregnancy does not occur during a normal menstrual cycle, the corpus luteum will stop producing progesterone, the endometrium will stop growing and menstrual bleeding will occur. If pregnancy occurs, the zygote produces human chorionic gonadotropin (HCG), which prevents the involution of the corpus luteum. Persistent progesterone production by the corpus luteum sustains the viability of the endometrium so that development of the embryo can continue.
So scientists like Haberlandt, who would seek to develop a birth control pill, focused their efforts on finding a chemical that would mimic the normal effects of progesterone. In effect, they sought a chemical that would induce a pseudo-pregnant state. Haberlandt was probably the first to propose that "hormonal sterilization" -- which he had successfully induced in several animal models -- could and should be applied to humans. He even developed a progestin-based oral contraceptive which – a few decades after his death in 1932 – was studied and distributed as the contraceptive "Infecundin" in Eastern Europe by Marxist governments. His original observations on the contraceptive potential of progesterone in animals proved to be of use to others years later in planning strategies for the control of human fertility.

Overcoming cultural resistance
Before moving on with the story, a brief digression: it is worth noting that while scientists were unraveling the mysteries of human reproduction, cultural attitudes during the first half of the 20th century strongly discouraged public discussion of sexual matters, and many -- if not most -- people agreed that the use of contraceptive devices was somehow wrong. In many states, dispensing contraceptives or information about them was a felony. So advocates of the birth control pill had to overcome not only important scientific hurdles but also widely held cultural and religious objections. There was a clear perception among early advocates of birth control that acceptance of a contraceptive pill would be difficult to achieve. Margaret Sanger led the campaign in the US that would gradually -- over decades -- desensitize the general public on matters of sex. A brilliant and remarkably tenacious woman, she wrote pamphlets, published newspapers and books, smuggled birth control devices, founded birth control clinics and got arrested -- all to raise the issue of birth control from the perspective of women’s rights, while publicly downplaying her own anarchist and eugenicist leanings. She succeeded in her efforts, and she and her friends were pleasantly surprised when, after the pill’s release in 1960, popular opposition to birth control rapidly diminished.

Margaret Sanger (1879-1966) was an early and ardent supporter of birth control, founding the American Birth Control League in 1921. In 1942, the group was renamed the Planned Parenthood Federation of America, today a major provider of contraception and abortion throughout the world.
While Sanger was primarily a political activist using the language and methods of class warfare to foster grassroots support for her movement, others sought to justify the need for birth control through geopolitical arguments. Generously funded by important private foundations, eugenicist academics like Frank Notestein and Kingsley Davis developed demographic models that were appealing by virtue of their simplicity and logic. These models were also founded on a vision of the human person in which self-interest was the driving force of every human action and a human being’s capacity for transcendence was a priori disallowed.


The demographic transition theory,[1] first proposed in 1929 and developed further by Notestein, Davis and Coale, attempts to explain how economic growth can determine changes in human population. These early advocates of birth control argued that aggressive use of family planning could accelerate the achievement of low birth rates and low death rates (phase 4) in developing countries, a situation they believed to be optimal for social stability and human fulfillment.

Aided by their academic prestige and an abundance of financial resources, they argued the need for birth control as the most effective way of avoiding the otherwise inevitable depletion of the earth’s resources by a growing population of consumers. Both of these groups skillfully influenced public opinion in the post-war period to soften opposition to the pill.


Scientific obstacles


The scientific obstacles to development of the pill were substantial as well. Most experts had settled on progesterone as the preferred agent, but progesterone -- like estrogen and most other steroid hormones -- was broken down in the digestive tract rather than absorbed when taken by mouth. Chemists sought to alter the structure of the naturally occurring hormone so that it could be absorbed orally and still retain its natural effects. Raw progesterone was also very expensive because it was very hard to come by. Until the 1940s, European pharmaceutical firms held a virtual monopoly on steroid hormone production, but their sources were limited to crude biological material from animal products: glands obtained from slaughterhouses, and hormone derivatives obtained from the urine of pregnant animals. For example, the drug "Premarin", commonly used as an estrogen supplement, takes its name from "pregnant mares' urine". These inefficient, low-yield processes were inadequate to meet the high demand for hormones.
The challenge was to discover alternate sources of steroid molecules. This was possible because the hormones involved in human reproduction belong to a large group of natural chemical compounds with very similar structures. All steroids share the same four basic rings, and their different effects in the body depend upon the addition or deletion of side chains to the rings.
Among those taking up the challenge was Russell Marker, a chemist who sought and found an important source of steroid hormones in plants. Between 1939 and 1943, Marker and his team demonstrated that plant compounds called "sapogenins" could be used as precursors of steroid synthesis. He devoted himself to identifying plants with high concentrations of sapogenins. In 1941, while on a search for plant sources in New Mexico, he stumbled on a reference book describing a tuberous plant of the Dioscorea family: a common yam plant called "barbasco" by the natives of eastern Mexico (Veracruz), where it grew abundantly. He decided to analyze Dioscorea plants, obtained samples and confirmed very high levels of a sapogenin called diosgenin. The yam species Dioscorea villosa was particularly suited to his work.


The Mexican connection


Marker developed a simple, five-step method of converting diosgenin to progesterone, a process called the "Marker degradation". He approached Parke-Davis, an American pharmaceutical firm that had supported his early research at Penn State, encouraging them to set up a production center in Mexico, but they declined. Being a pragmatic person, he sought local support by checking the Mexico City telephone book for chemistry labs that might be willing and able to work with steroid hormones. He found "Laboratorios Hormona", a company founded by two Eastern European émigrés, Emeric Somlo and Federico Lehmann. When they first met, Marker showed them a sample of his work: 80 grams of pure progesterone, representing almost one third of the world’s supply of the chemical. They agreed to form a new company, naming it Syntex Laboratories, from "synthesis" and "Mexico". After a year of successful work, Marker and his partners disagreed over finances, and he left the company, taking with him the instructions for synthesizing progesterone from diosgenin.



Russell Marker (1902-1995) developed a procedure to produce large quantities of progesterone from “diosgenin”, a chemical extracted from the root of a yam plant which grows abundantly in eastern Mexico. Unable to interest U.S. pharmaceutical firms in his method, he co-founded Syntex Laboratories in Mexico with two Eastern European businessmen, but he was no longer associated with the company when Syntex scientists succeeded in producing the first oral contraceptive.


Syntex executives then hired George Rosenkranz, a Swiss-trained Hungarian chemist who had been stranded in Havana since 1942; the Pearl Harbor attack had occurred while he was on his way to Quito, Ecuador, to found a university chemistry department there. Rosenkranz was eventually able to piece together Marker’s process, and within a year Syntex was once again producing progesterone from diosgenin. By the 1970s, Syntex had become a billion-dollar corporation and one of the world’s largest producers of steroid hormones. It should be noted that Syntex sought first to produce industrial amounts of corticosteroids; progesterone derivatives were a secondary priority. But when Upjohn and Pfizer in the US beat the Eastern Europeans at Syntex in developing efficient techniques for mass production of cortisone, Syntex’s priorities shifted. The Mexican chemists and technicians who had been assigned to the lower-priority research -- synthesizing an orally absorbed version of progesterone -- now held the key to the company’s future.


Mexican biochemist Luis Miramontes (1925-2004) -- working under Carl Djerassi -- succeeded in producing a chemical with progesterone-like effects that was absorbed into the body after oral ingestion. This chemical -- called norethindrone -- eventually became the first birth control pill.

It was a young Mexican chemist, Luis Miramontes, who succeeded, under the supervision of the Austrian-born Carl Djerassi. They called the orally active progestational substance norethindrone. Djerassi later claimed that they did not at the time intend to produce a hormonal contraceptive specifically, but simply a compound with potentially marketable uses. Regardless, he has been identified -- and identifies himself -- as the father of the birth control pill, and he has expressed his sense of fatherhood artistically, in the form of irreverent theatrical works exploring contemporary attitudes toward human reproduction.


Expansion to the United States

Syntex proved to be a very efficient hormone factory, but as a Mexican company it lacked access to the US market and so sought to develop partnerships with established US firms. These efforts eventually led the company to Shrewsbury, Massachusetts, and the Worcester Foundation for Experimental Biology.


Gregory Pincus was a zoologist, an authority on the reproduction of mammals and an eccentric. He was the first to succeed at in vitro fertilization in mammals, using rabbits. Later, in the 1930s, he produced a rabbit by parthenogenesis, and he allowed his work to be profiled in Collier’s magazine. In that article -- the cover story -- he was misquoted as admitting his intention to attempt IVF in humans. As a result, according to pill biographer Bernard Asbell, the public came to view him as a kind of Dr Frankenstein. He was denied tenure at Harvard, and so became an independent consultant, founding the Worcester Foundation for Experimental Biology in 1944.


Gregory Pincus (1903-1967) performed studies in animals to confirm the contraceptive effects of norethinodrel. His data were used to justify human research using the same chemical. He collaborated closely with the obstetrician John Rock, and was supported financially and politically by Katherine Dexter McCormick, Margaret Sanger and other birth control activists.


The Worcester Foundation was intended to be a place to which drug companies could send promising chemicals to have their pharmacological effects tested in animals. Its initial attempts, working primarily for G.D. Searle, were disasters, and the Foundation almost went out of business several times. The change in Pincus’s fortunes began in 1951, when he met Margaret Sanger at a dinner hosted by Abraham Stone, an executive of the Planned Parenthood Federation.
Sanger had been looking for a scientist with his skills, who would also be willing to take on a controversial project. Pincus’ collaboration with Sanger and Planned Parenthood started him down the path to the Pill, but their financial support was sporadic and limited. Real progress in identifying an orally active birth control pill did not occur until Katherine Dexter McCormick became involved, lending abundant financial support to the project. McCormick was the heiress by marriage to the International Harvester fortune. Born into a distinguished family of progressive Chicago attorneys, she was the second woman ever to graduate from MIT, the first with a degree in the sciences. She gave up plans to study medicine and reluctantly married Stanley McCormick, who within 18 months had to be institutionalized for schizophrenia. They had no children, and lived apart for most of their lives. She was an ardent supporter of women’s suffrage and birth control, having first crossed paths with Margaret Sanger in 1917.

Katherine Dexter McCormick (1875-1967) used her immense fortune to support the clinical development of the oral contraceptive pill. Historians acknowledge that her active assistance was critical to the success of the scientific work.

After Mr. McCormick’s death in 1947, she immediately stopped funding schizophrenia research, and shifted her attention to other projects, in particular birth control. Historians acknowledge the critical importance of her financial backing in accelerating the development of the pill. Because of McCormick and a few other private benefactors, the pill was produced using not a single dollar of public money.

The road to the Pill

With McCormick’s close involvement and funding, Pincus was able to ratchet up his efforts. He and his colleague Min Chueh Chang screened hundreds of hormonal products in animal models, and in the end concluded that two – norethindrone, discovered by Miramontes and Djerassi at Syntex in 1951, and norethinodrel, an almost identical compound produced by Frank Colton at G.D. Searle two years later – were the best candidates for human trials. Pincus found that both of these compounds retained potent progestational effects when given orally and were effective contraceptives in a variety of mammals.
The next step along the path to approval of the Pill would require studying these compounds in women, and so they turned to John Rock, an obstetrician-gynecologist. Harvard-trained, a pioneer in the treatment of human infertility and Irish-Catholic, Rock -- from the perspective of politically astute birth control activists -- was the perfect man for the job. When he was approached by advocates of birth control, Rock had already devoted decades to the study of human reproduction and embryology, and was renowned for being the first -- along with his Harvard colleague Arthur Hertig -- to successfully fertilize a human egg in vitro.


Harvard gynecologist John Rock (1890-1984) was among the first modern students of human embryology. He performed hysterectomy procedures on his patients very early in pregnancy to obtain his study samples. In 1944, he and his assistant Miriam Menkin published the first report of a successful human in vitro fertilization procedure. He also directed the first human clinical trials of the hormonal contraceptive pill. This research led in 1960 to the formal approval of the Pill for human use.


Pincus and Rock had known each other in the 1930s. Pincus had followed Rock’s attempts to develop methods to detect ovulation and had provided him with chemicals for clinical testing. They had lost touch in the 1940s, but renewed their contact during a chance meeting at a scientific convention in 1952. At the meeting, they learned that both were using the same compounds – estrogen and progesterone – to achieve opposite ends: Pincus, contraception in fertile rabbits, and Rock, conception in infertile women. Within a short time, they agreed to collaborate to develop an oral contraceptive pill.

Clinical trials begin

In 1954, Rock began the first clinical trials under the guise of another fertility study. This was the first-ever human trial of an oral contraceptive. The drug Rock used was the one developed by Colton at Searle, named norethinodrel. Rock and Pincus selected norethinodrel because Djerassi’s version -- norethindrone -- caused the enlargement of male rats’ testicles, and so it was feared that it might have "masculinizing effects" that would hamper its acceptance as an oral contraceptive. What they did not know was that the Searle product they selected had been inadvertently contaminated with a small amount of estrogen, which likely masked the androgenic effect of norethinodrel.
The first human trials of norethinodrel were designed to measure ovulation rates and other effects on the reproductive system. It was first tested in Brookline, Massachusetts, on 50 women who were patients of Rock’s infertility clinic. None of them showed evidence of ovulation while taking the pill. The second group tested included 23 female medical students from the University of Puerto Rico. Within a few months of the trial, half of the women withdrew from the study -- despite veiled threats of adverse academic repercussions by some of the investigators -- citing side effects and cumbersome data collection requirements. A third group -- patients at the local psychiatric hospital in Worcester -- was also given norethinodrel and provided data that eventually led to the approval of large-scale trials to monitor its contraceptive effects specifically.
The first large-scale study of norethinodrel’s contraceptive effect was conducted in Puerto Rico, a site not chosen at random. Comstock laws were never in force there; it was an island with very high population density; and the local government was very cooperative, having established a network of family planning clinics with the assistance of Planned Parenthood years before. In addition, Pincus noted that the US press would be less likely to interfere if studies were conducted away from the mainland. The study began in April 1956 under the supervision of Edris Rice-Wray, an American-trained physician with ties to the medical school and the public health department of Puerto Rico. She had run a family planning clinic there, and had trained health care workers in contraceptive use. Norethinodrel proved to be a highly effective contraceptive, despite the side effects that began to emerge.
Pincus, Rock and Searle executives were soon satisfied that the contraceptive efficacy of norethinodrel had been adequately demonstrated. In 1957, G.D. Searle marketed norethinodrel as Enovid -- not for contraception but for "menstrual problems". The package insert prominently noted as a "warning" that the drug could induce temporary infertility as a side effect. This was in effect Searle’s trial balloon. They quickly learned that prescriptions for the Pill far exceeded the number of women who had previously complained of menstrual problems.

The 50th anniversary

Three years later -- on May 9, 1960 -- the FDA approved the Pill to be used for birth control. This was the first product to be approved by the FDA that was not designed to treat an illness but rather to modify a normal physical process.
The half century defined by the emergence of the science of human reproduction around 1910 and the approval of the first hormonal contraceptive pill in 1960 was a period of dramatic scientific and social change. Science demystified human reproduction, breaking down the process into its basic components and making it accessible to manipulation.
These new scientific insights offered some people – a relatively small group of influential individuals persuaded by a materialist worldview and eugenicist principles – an opportunity to advance their ideology of social engineering on a global scale. They were radical political activists working together with brilliant, ambitious scientists and academics, and wealthy agnostics, who eventually succeeded in presenting birth control as a moral imperative for modern societies. An analysis of the effects of the hormonal contraceptive on individuals and societies over the ensuing 50 years should provide important opportunities to value the mystery of the generation of new human life more fully.

For further reading
+ The Fertility Doctor by Margaret Marsh and Wanda Ronner is a recent account of the tragic life of John Rock written by two sisters, one a historian, the other a gynecologist. They were given access to Dr. Rock’s personal papers by his family.
+ Birth Control in America: The Career of Margaret Sanger by David Kennedy steers a course between hagiography and hostility.
+ Katherine Dexter McCormick: Pioneer for Women's Rights takes a benevolent view of its subject, but does not skate over her troubled life.
+ The scholarly book Sexual Chemistry by Lara Marks is an excellent detailed account of the history of the Pill. A scholarly article by Marks and Suzanne White may be found online here.
+ For a less detailed history see “On the Pill: A social history of oral contraceptives, 1950-1970” by Elizabeth Siegel Watkins.
+ Carl Djerassi – self-proclaimed father of the birth control pill -- has written two entertaining books about his own role, “This Man's Pill” and “The Pill, Pygmy Chimps and Degas' Horse”.

Footnotes

[1] Image from http://auphumangeo.wikispaces.com/Renae. For a detailed discussion of the demographic transition model see this Rand Corporation publication.

Popular Attitudes toward Voluntary Death: Suicide and Euthanasia from Antiquity to the Post-Modern

This is a draft of a paper given at the "Dialogue of Cultures" Conference sponsored by the Center for Ethics & Culture at the University of Notre Dame from November 30 to December 2, 2007. The conference was intended to consider the difficulties and opportunities of dialogue in a time of conflict. It was inspired by the Regensburg address of Benedict XVI, a lecture given on September 12, 2006 at a university in Germany where he had been a professor of theology from 1969 to 1971. In his talk, the Pope quoted a Byzantine emperor who had been in dialogue with his Islamic opponent during a war between Christians and followers of Muhammad in the 14th century. The point of the lecture was to illustrate the irrationality of using violence to impose religious faith, and to trace the philosophical roots of how such violence could be justified. Following this lecture, Islamic activists rioted in different parts of the world in response to their apparent misreading of the text and the purpose of the address.

While the majority of human deaths are “natural” – the result of aging, injury or disease – the choice of “voluntary” death -- suicide, assisted suicide and euthanasia (SASE) – has been a human response to the problems of life since the beginning of recorded history. In the West, social acceptance of these actions has varied over time. Two periods of transition in popular attitudes toward voluntary death can be identified. The first shift – from approval to disapproval -- may be defined by the transition from the pagan cultures of antiquity toward a new Judeo-Christian civilization built on Greco-Roman foundations. The second shift – which we are currently living through – is characterized by the emergence of cultural attitudes more tolerant of voluntary death driven by a gradual loss of the sense of the transcendence of human existence.

In this talk, I hope to point out some of the historical factors which surrounded and to a certain degree caused these shifts in attitude. These factors include conflicting views of the nature of human life; how diseases were understood and medicine was practiced; and how scientific advances might have influenced the transitions.

For the Greek and Roman societies that preceded the appearance of Christianity, a “good death” could be either natural or voluntary. The practice of voluntary death was widespread, and depending upon the circumstances it was considered to be a reasonable act. To relieve the pain or distress of an incurable illness, to avoid a humiliation or indignity, to end an unhappy or tiresome life, or to express a sense of triumph over Fate by ending one’s life voluntarily in old age were felt to be justifiable or even honorable reasons to end one’s own life. In some cases, the governing regime of a Greek or Roman city would reserve doses of the appropriate poisons to give to those to whom voluntary death was permitted. In certain areas of the Greco-Roman world, suicide was a privilege reserved for the social elites and not permitted to soldiers, slaves or criminals. When a person committed suicide without apparent justification, the corpse was sometimes mutilated and buried shamefully in an unmarked grave.

Pagan antiquity was characterized by a pessimistic attitude toward human existence, and individual human life lacked special significance or value. Within this context, the various philosophical schools shaping Greco-Roman culture reflected an eclectic and tolerant attitude toward voluntary death. In general terms, materialist schools, including the Stoics and Epicureans, adopted permissive or supportive attitudes toward SASE. Their view of the human individual included no sense of personal immortality. Death was annihilation, a natural, personal dissolution. With death, individuality and personality ceased to exist. In contrast, Greek philosophical schools admitting the existence of transcendent, spiritual realities in the Cosmos tended to limit or caution against voluntary death. These groups -- including the Platonic, Aristotelian and Pythagorean schools -- acknowledged the possibility of a personal, individual existence after death.

From the Pythagorean school, a unique group of philosopher-physicians emerged who challenged the attitudes toward SASE that prevailed at the time. Best known among them was Hippocrates. These physicians distinguished themselves for several reasons.

First, they attempted to work according to a set of well-defined professional standards, entered into by means of a solemn oath to the gods. Often the healers of antiquity were merchants of tonics and cure-alls, “root cutters”, slaves or soldiers, and so they did not always enjoy the confidence of the general public. The oath taken by these physicians, recognized today as the Hippocratic Oath, included the first known prohibition of physician assisted suicide in Western history. This prohibition represented a minority opinion at the time.

Second, they tended to reject the widespread, popular notion of disease as divine punishment, and sought natural rather than supernatural explanations for it. They developed a rational process of clinical problem solving – patient interview and examination, diagnosis, prognosis and therapy – which continues to be standard medical practice today.

Third, they viewed human existence as bound up with the whole of nature in an orderly, not chaotic, way. They recognized a tension and balance of opposing qualities throughout the Cosmos, and so they developed an understanding of human physiology built on four humors (blood, phlegm, yellow bile, black bile) located in four organs (heart, brain, gall bladder and spleen). These elements were correlated with human personality traits or temperaments: sanguine (cheerful), phlegmatic (sluggish), choleric (irritable) and melancholic (sad). Similar patterns could be identified in nature, including the four elements of the cosmos (fire, earth, water, air), the four seasons of the year (summer, autumn, winter, spring), the four natural sensations (hot, cold, dry, wet) and the four geographical points (east, north, south, west). Diseases resulted from disequilibrium between man’s humors and nature’s elements, and the task of medicine was to restore the equilibrium. For over 1500 years -- through the Middle Ages and into the 18th century -- medical practice was based on these principles.

With the progressive collapse of the Roman political order and the gradual emergence of a Christian culture, popular acceptance of voluntary death declined. According to some scholars (for example, Rodney Stark), the clear contrast between pagan and Christian approaches toward the sick was an important factor contributing to this change in attitude toward SASE. In the Greco-Roman world, basic forms of welfare and philanthropy were based on principles of reciprocity and self-interest. There was no public duty toward the sick, and sympathy for strangers was considered irrational.

Below you will read eye-witness accounts of two plagues: one by the historian Thucydides, describing what he saw during the plague of Athens in 430 BC; the other by Pontius the Deacon and Cyprian, bishop of Carthage, describing the plague that struck that city in 251 AD. Thucydides captures the despair and lawlessness which overcame the Athenians when confronting a devastating epidemic:

“People were afraid to visit one another, and so they died with no one to look after them, and many houses were emptied because there was no one to provide care. … The doctors were incapable of treating the disease because of their ignorance of the right methods…. Equally useless were the prayers made in the temples, consultation of the oracles, and so forth. In the end people were so overcome by their sufferings that they paid no further attention to such things. The great lawlessness that grew everywhere in the city began with this disease, for as the rich suddenly died and the poor took over their estates, people saw before their eyes such quick reversals that they dared to do freely things they would have hidden before, things they would never have admitted they did for pleasure. And so, because they thought their lives and their property were equally ephemeral, they justified seeking quick satisfaction in easy pleasures. As for doing what had been considered noble, no one was eager to take any further pains, because they thought it uncertain whether they should die or not before they achieved it. But the pleasure of the moment and whatever contributed to that were set up as standards of nobility and usefulness. No one was held back in awe either by fear of the gods or by the laws of men: not by the gods because men concluded that it was the same whether they worshipped or not, seeing that they all perished alike; and not by the laws, because no one expected to live till he was tried and punished for his crimes. But they thought that a far greater sentence hung over their heads now, and that before this fell they had a reason to get some pleasure in life. Such was the misery that weighed on the Athenians.”[1]

Contrast this account with the letters written by the Christians of Carthage. They had just (barely) survived a persecution by the Emperor Decius and now faced a devastating epidemic similar in every respect to the plague of Athens. Pontius describes the reaction of the pagan population, and Cyprian describes the Christian response.

"There broke out a dreadful plague, and excessive destruction of a hateful disease invaded every house in succession of the trembling populace, carrying off day by day with abrupt attack numberless people, every one from his own house. All were shuddering, fleeing, shunning the contagion, impiously exposing their own friends, as if with the exclusion of the person who was sure to die of the plague, one could exclude death itself also. There lay about the meanwhile, over the whole city, no longer bodies, but the carcases of many, and, by the contemplation of a lot which in their turn would be theirs, demanded the pity of the passers-by for themselves."[2]

From Cyprian of Carthage we read:

"This trial -- that now the bowels, relaxed into a constant flux, discharge the bodily strength; that a fire originated in the marrow ferments into wounds of the fauces; that the intestines are shaken with a continual vomiting; that the eyes are on fire with the injected blood; that in some cases the feet or some parts of the limbs are taken off by the contagion of diseased putrefaction; that from the weakness arising by the maiming and loss of the body, either the gait is enfeebled, or the hearing is obstructed, or the sight darkened -- is profitable as a proof of faith. What grandeur of spirit it is to struggle with all the powers of an unshaken mind against so many onsets of devastation and death! What sublimity, to stand erect amid the desolation of the human race, and not to lie prostrate with those who have no hope in God; but rather to rejoice, and to embrace the benefit of the occasion; that in thus bravely showing forth our faith, and by suffering endured, going forward to Christ by the narrow way that Christ trod, we may receive the reward of His life and faith according to His own judgment!"[3]

Clearly, this way of looking at the brutal world in which they lived was different from that of the pagans: the Christians had discovered new values that influenced the way they faced the dilemma of human suffering. There are well-documented accounts of Christians during this plague caring for and not abandoning the dying, including those who had lapsed under the recent persecution and those who had been their persecutors.

And so this first transition from approval to disapproval of voluntary death -- a change that might accurately be characterized as revolutionary rather than evolutionary -- seems to have been initiated by the heroic attitude of service of ordinary people toward their suffering peers, actions which clashed with the prevailing standards of socially acceptable behavior. In these first steps of transition, we can discern a merging of those rich elements of Greco-Roman culture compatible with Christian anthropology, an anthropology that was first lived, and then explained.

In the Regensburg lecture, Benedict XVI alludes to the “inner rapprochement between Biblical faith and Greek philosophical inquiry” as an event of decisive historical importance. The first systematic argument against suicide in Christian thought appears in the early 5th century, when Saint Augustine argued in “The City of God” against the justification of suicide by Christian women who had been violated by barbarian soldiers. He made his case based on both logical argumentation and reference to Scripture. Elsewhere, he also admonished against assisted suicide, in the following terms:

"…it is never licit to kill another: even if he should wish it, indeed if he request it because, hanging between life and death, he begs for help in freeing the soul struggling against the bonds of the body and longing to be released; nor is it licit even when a sick person is no longer able to live".[4]

Later the philosophy of Thomas Aquinas marked a high point in the philosophical development of Christian culture. It may be considered to be the fruit of centuries of practical human experience, and of the engagement of human reason with the sources of divine revelation. The Thomistic argument against voluntary death was unequivocal and grounded on the analogy between the Creator and his creature.

Over 1500 years -- from the 4th to the 19th centuries -- notions of reverence for individual human life, of the dignity of self-sacrifice on behalf of others, and of the redemptive value of suffering took root in the popular imagination. In time pagan tolerance toward voluntary death came to be considered profoundly objectionable. This objection was expressed throughout society in popular customs, literary works, legal systems and medical practices which formally and enthusiastically prohibited SASE.

Over the same period of time, while the intellectual foundations of a Catholic culture were being laid, devastating epidemics swept through Europe and other parts of the world over and over again. Despite profound ignorance of their causes, despite the lack of effective treatments to cure or prevent them, and despite terrible suffering, there was universal opposition to suicide, assisted suicide and mercy killing during the historical period in which Judeo-Christian attitudes prevailed among the general population. During this time, the institutions through which Christians cared for the sick grew in efficiency and organization, becoming the direct precursors of today’s hospitals.

Weakening of the collective social disapproval of SASE began with a decline in the moral authority of the Roman Catholic Church. This decline culminated in the Protestant Reformation of the 16th century, and eventually led to the institutionalization of “secular” opposition toward the Catholic Church. In Regensburg, Benedict XVI traced this decline to an idea of God as a hyper-transcendent, unapproachable being who can contradict himself in his creation. As God receded from the human orbit, the social customs, legal principles, academic and political institutions, and economic practices that reflected the principle of reverence for individual human life slowly weakened. Eventually intellectuals, scientists and clergy would begin to seek justifications for SASE.

The first break probably occurred among English intellectuals who began to debate a variety of rationalizations of suicide and voluntary death in the 18th and 19th centuries, and with particular enthusiasm following the French Revolution. The debates were motivated in part by an apparent epidemic of suicides throughout England in the 18th century, a phenomenon which led to suicide being designated “the English malady”. These debates were limited to elite intellectual circles, with very little sympathy gained among the general population. By the mid 19th century, several additional factors contributed to growing interest in and impetus for justifications of SASE.

First, social and political upheavals (the Reformation, the Thirty Years War, the Enlightenment, the Reign of Terror, the Napoleonic Wars) led to a generalized pessimism and moral relativism throughout Europe.

Second, materialist philosophical projects developed in direct opposition to the basic principles of Christian anthropology. Among these projects was the theory of “transmutation” or “evolution”, which held that matter possessed the intrinsic capacity to randomly develop all known forms of life over very long stretches of time. By offering evidence for the “natural selection” of traits that conferred survival advantages on animals, Charles Darwin provided scientific support for the theory of evolution. His findings were used by others, including his cousin Francis Galton and the philosopher Herbert Spencer, to advocate a social philosophy promoting the improvement of human hereditary traits through selective breeding, birth control and euthanasia, in order to create healthier, more intelligent people, to save society's resources and to lessen human suffering. Darwin and others reasoned that charitable efforts to treat the sick and support the mentally or physically disabled could adversely affect the human race, leading to a “degeneration” of the human condition by favoring the survival of “defectives”. Their approach became known as “eugenics”, or “the self-direction of human evolution”, a view that found support among intellectual circles in Anglo-Saxon countries first and later -- with particularly terrible consequences -- in Germany.

Third, developments in science led to a surge in popular confidence in the ability of physicians to uncover the basic causes of human diseases and to treat them effectively. Two early scientific advances are especially relevant to our topic.

The first is the verification of the “germ theory” of Koch and Pasteur. It could now be said that the mysterious diseases that repeatedly killed large fractions of European and other peoples over the centuries were caused by invisible living organisms. Soon after, vaccines proved effective in preventing these diseases, and by the first half of the 20th century, antibiotics began to cure many of those infected.

A second event affecting the SASE debates was the discovery of analgesic and anesthetic chemicals. In preceding centuries, physicians had little to offer patients in pain, but toward the middle of the 19th century, chemicals that could reversibly alter human consciousness and pain perception -- including chloroform, ether and morphine – could be given to the sick and dying to ease their distress. Some intellectuals argued that these chemicals should be used to cause the deaths of persons who were suffering excessively. However, no physicians are known to have publicly recommended this practice for their patients until the “euthanasia movement” began, first in Great Britain and soon after -- by the early 20th century -- in the United States.

Historical evidence suggests that the euthanasia movement of the late 19th and early 20th century and the contemporary right-to-die movement are campaigns organized by social elites – including intellectuals, Protestant and Unitarian clergy and wealthy agnostics – to overcome deep-seated popular opposition to voluntary death in its various forms.

In summary, over the course of Western history, changes in social acceptance of SASE have been driven primarily by re-examinations of the dominant cultural views regarding the nature and meaning of human existence in a particular historical period, and to a lesser extent by developments in medical care. One could argue that the first shift -- from approval to disapproval of voluntary death -- was driven by a “grass roots” movement, where ordinary people, inspired by a new understanding of who they were, acted with extraordinary heroism and encouraged others to do the same. The second shift on the other hand presents itself as a “top-down” imposition of ideology on a popular culture which struggles to retain its Judeo-Christian identity. The first shift involved the assimilation of Greco-Roman values of rationality into a lifestyle characterized by the radical gift of self. The second involves the withdrawal of man into himself – according to Benedict, into the “realm of the subjective” -- and his alienation from God, the source of rationality.

------------------------------------------------------
[1] Thucydides, The Peloponnesian War, 50-54.

[2] Pontius, The Life and Passion of Cyprian; the full text may be found at http://www.users.drew.edu/ddoughty/Christianorigins/persecutions/cyprian.html

[3] Saint Cyprian of Carthage, De mortalitate; the full text may be found at http://www.ewtn.com/library/PATRISTC/ANF5-15.TXT

[4] St. Augustine, Epistola 204, 5: Corpus Scriptorum Ecclesiasticorum Latinorum 57, 320. Quoted in Pope John Paul II, Evangelium vitae, 66.


Human Organ Donation: Gift or Graft?

Published in the Opinion page of the Chicago Tribune, May 12, 2000; of The Saint Louis Post-Dispatch, July 18, 2000; and of The Baltimore Sun July 27, 2000. It was also reprinted in a textbook by Lee Odell and Susan M. Katz entitled Writing in a Visual Age, Bedford/Saint Martin's, 2006.

A few days ago, while making rounds at a local hospital, I chanced upon the pictures of two children tacked to a bulletin board in the ICU. One photo showed a despondent, jaundiced infant with a huge, fluid-filled belly and limp, unsettled limbs stretched out in every direction. The other showed a pink, healthy baby beaming as he grasped his foot with his right hand and a toy with his left, as if trying to decide which to chew on first. Only after reading the note next to the photos did I realize that both were images of the same child. His mother sent the pictures with a moving letter thanking the hospital staff for their role in obtaining a suitable donor liver for her son, born with a fatal deformity of the bile ducts. “Thank you for giving our son back to us”, she wrote. A liver transplant was their only hope, and in this case, the benefit was nothing short of spectacular, clearly a gift of life.

Tissue obtained from humans – both living and dead – has proved to be life-saving and life-enhancing for persons with a broad range of medical problems. Despite campaigns to improve public and professional awareness of the benefits of donation, the sharp upswing in demand for human organs for transplantation has not been accompanied by a corresponding increase in supply. While solutions – of both the carrot and the stick variety – have been offered to reverse the donor dearth, none has proven particularly successful, and the unmet demand for organs has fostered commercialization and, occasionally, exploitation of donors and their families. Pennsylvania, for example, recently became the first state to offer relatives of deceased, prospective donors financial incentives to give up the bodies of their loved ones. Documentation of paid or coerced donations from the poor of developing countries and citizens of totalitarian regimes is on the rise. Sordid reports abound of abortion clinics feeding a lucrative black market in fetal tissue.

In the affluent West, “market-based” organ procurement ranks among the more creative approaches that have been proposed, but it is little more than a euphemism for the buying and selling of body parts. Prospective “organ vendors” could “opt in” to an organ or tissue “futures market”. After their deaths, their estates would share in the substantial financial benefits currently enjoyed by a few for-profit tissue brokers. The concept of "rewarded gifting" -- an oblique, contradictory term concocted to blunt the funny aftertaste that comes from knowing you’ve just put your innards up for sale -- is unlikely to gain wide acceptance.

Before organ transplantation was possible, laws regarding disposition of a dead person’s body made it clear that executors of an estate could not make it an item of commerce. The reason? The body was not considered “property” in the legal sense, and therefore not a part of the person’s estate. As transplantation developed into a therapeutic medical procedure, living organs from dead donors acquired value, and laws were adapted to fit the new reality. Under the Uniform Anatomical Gift Act, which regulates procurement and distribution of human organs for medical purposes, it is still illegal to sell the dead body or its parts, but not to give them away. So some non-profit procurement groups pitch donation to surviving family members, appealing to their altruism, then charge hefty processing fees to for-profit distributors. At times, the same people run both operations. The problem is that these organizations deliberately blur the distinction between giving and selling.

In his insightful book, “The Gift: Imagination and the Erotic Life of Property”, Lewis Hyde points to the existence of two “economies” operating in human relations. One is the “market economy”, where commodities are valued, bought and sold in a calculated exchange. The other is the “gift economy”, where goods are given freely, without calculation. Gifts cannot be purchased or traded, only bestowed on one person by another. Gifts are not capital. Rather than foster competitive self-interest and social fragmentation, they create and strengthen bonds between individuals. We all like to give and receive gifts because they express values that money can’t buy: the esteem of one person or group for another; the desire that others may experience good fortune and happiness, and share in the prosperity of the giver. Gifts are a material expression of good will, of love. They unite rather than divide. In a sense, the practice of organ transplantation today occupies the intersection of these two economies: one man’s gift becomes another man’s capital.

On the one hand, the costs incurred by harvesting, processing and transporting organs should in justice be covered, and some would argue that a reasonable profit may legitimately be made in the process. On the other hand, the organ itself, the generosity of the donor’s family and the life that is prolonged are values that resist an economic paradigm. According to Hyde, “Transactions that involve life itself seem to constitute the primary case in which we feel called upon to distinguish the right of bestowal from the right of sale”. The mother of the child in the photos came to understand that life is the ultimate gift. It is bestowed, not taken, made or earned, and it is a gift of incalculable value.

The industrial age we are quickly leaving behind was characterized by the conflict of capital and labor. As the new era of biotechnology begins, the new dialectic may well be defined by the dichotomy of commodity and gift. Advances in biotechnology that allow us to manipulate human life force us to continue to put price tags on the priceless, to confuse gift and commodity, persons and property. The tension created by these contrasting values – already evident in the debates on the patenting of human genes, the use of human stem cells and fetal tissue for the treatment of disease – may introduce original and unexpected notions of value and wealth. In time, we may well discover that we have much to gain by giving away what never really belonged to us.

Life after death in our brave new world

Published in the Opinion Page of the Chicago Tribune on May 1, 1999
It was bound to happen sooner or later. Doctors have been squirreling away sperm from dead men for some time, but no surviving relatives had ever been interested in putting these frozen cells to use. At least not until July of last year when the young, childless widow of a thirty-something California man changed that. With the aid of a sophisticated in vitro fertilization technique, she conceived and -- just last month -- delivered a healthy, fatherless infant girl she named Brandalyn. Her husband, Brandalyn’s father, had died suddenly 15 months before conception, and his sperm was retrieved about 30 hours after his death, in a hospital morgue.

Now, another woman in London is waiting her turn to conceive a child from her deceased spouse, after waging a highly publicized court battle to allow doctors to harvest her husband’s sperm cells while he was on life support, comatose and dying. The ensuing hullabaloo among doctors on both sides of the Atlantic has focused on whether a woman’s “right to reproductive freedom” outweighs a man’s right to consent to or decline sperm donation; whether the man’s consent could be implied or should be explicit; whether a child should be “made” without a father in her future. While these issues are passionately debated, they miss the point. There’s more at stake here than legalistic arguments about who’s got a right to what. The real issue is deeper, broader and much more important.

A cynic, it is said, knows the price of everything and the value of nothing. We live in a cynical age, where human sexuality is perceived as “breeding” and, to some in the medical field, nothing more than animal husbandry. Ad campaigns, offering big bucks for the sperm and eggs of bright, good-looking men and women, have hit major college campuses across the country. A supply of “buff” gametes bought and sold to gratify someone’s desire for “superior” children or, worse still, some scientists’ demand for research material.

But that’s not all. Because the techniques for storage of sperm and eggs have, until recently, been unreliable for preserving the viability of these cells, in vitro fertilization is usually performed first. Then hardier embryos, rather than separate sex cells, are packaged and dipped in liquid nitrogen. By conservative estimates, there are now over 100,000 embryos stowed away in cold storage across the country, waiting for someone to decide what to do with them. They are the silent prisoners of an aging, de-humanized “high technology” bent on methodically undermining the value of human life.

Human beings are not called to breed, but to parent: to give life through love. By “love” we should understand neither mushy sentimentalism nor physical gratification, but the selfless gift of one person to another. Love has many forms: it is the sacrifice of parents on behalf of their children; the dedication of a good professional to his or her work as a service to others; the devotion of friends. In each case, one person makes a free and responsible effort to serve another, often at the expense of personal comfort or pleasure. The most sublime form of human love is the love between a husband and wife.

In marriage, a man and woman promise to give all of themselves to one another -- their bodies, hearts, talents, material possessions and personal ambitions -- as a mutual gift. Sex means giving physical expression to the complete, mutual gift of two persons in a stable, life-long relationship. And it is no coincidence that the physical actions which serve to forge the deepest possible bond between two persons are the same actions by which new human life is generated. This dual meaning of human sexuality – sex for bonding and for babies – reflects the physical and spiritual dimensions of the human person. The generation of new human life is the spiritual translation of the physical action, the deepest fulfillment of the sexual union. It makes perfect sense that a human being should begin life bound together by the love of a man and woman, not thawed out by the dull proficiency of a lab technician.

Human sexuality is about babies and about bonding. They are like two sides of the same coin. Each and every sexual act must respect these two inseparable aspects if sex is to retain its true meaning, and lead to the health and happiness of the persons involved. To separate these two meanings of the sexual act is to adulterate, to cheapen human sexuality. Techniques of assisted reproduction introduce a “third party” into the most sublime, intimate events of human existence. They violently separate babies and bonding, and in the process sever the deep, natural link that needs to exist between love-making and life-giving.

Developed countries, with the US in the vanguard, have rushed to embrace sperm and egg donation -- from the living, the dying or the dead -- embryo cold-storage, and a host of other in vitro reproductive technologies as an acceptable, if not entirely palatable, part of mainstream medical care. Why should we be surprised if the cells which serve as the biological substrate of human life are handled as a commodity to be bought and sold to the highest bidder? Sex is for sale, and a test tube has now become the cold, hard cradle we offer to welcome new human life into our brave new world.

A short history of voluntary death

By following this link, you can read a published version of the paper below. A Portuguese translation of the article may be found here. A version in Italian may be found here.

Differences over embryonic stem cells are part of broader debate

This article appeared in the opinion pages of the Detroit Free Press on July 27, 2001. The subtitle of the piece was "Get rid of mind-body dualism and recognize embryos as human beings". It was reprinted on a multiple sclerosis website together with other articles on this topic.

The debate raging over the use of human embryos and their stem cells is important for many reasons. Perhaps the least important of these relates to the physical ailments that might, in the future, be cured using these techniques. The debate is more important because it is part of a broader conflict whose origins can be traced back for centuries: a clash determined by vastly different ways of understanding what it means to be human.

About 400 years ago, a genius named Rene Descartes -- a man who contributed important and durable insights to science -- goofed when he dabbled in philosophy. He succeeded in driving a deep wedge between the once well-integrated spiritual and physical dimensions of the human person. Over time and almost imperceptibly, the "split" that Descartes imagined between the mind and the body took root, first among intellectuals. Today, it is flourishing in attitudes and behavior at all levels of society.

Thanks to Descartes, Deepak Chopra can claim, "Your body is just the place your memories call home," and people will buy his books. There are many other examples in popular culture that express the theoretical mind-body split in practical ways. But perhaps none is as disturbing as the prevailing attitude of science and society toward human embryos.

Scientists have become exceptionally skilled at growing them in a petri dish. They can tell us what they might look like if they are allowed to develop and how they might suffer and die from diseases inscribed in their genes. They tell us the great good that someday may come from using parts of them to treat the diseases of other people. They may feel quite satisfied when they discard some embryos in favor of others. They are simply selecting those less likely to burden society and more likely to produce. They are strengthening the gene pool and giving another human the best chance for a "good life." But they refuse to grant the human embryos they use the most vital gift of all: respect.

And the reason scientists are incapable of respecting the humanity of their embryos is that "your body is just the place your memories call home." They view the particular human body they are working on as a shell, a car in need of a driver. It can be produced, discarded or recalled for a factory defect. It has no memory, no past and, if they decide so, no future.

It's clear that Descartes had no idea what he was getting into when he split himself and us in two. But ideas do have consequences, and unfortunately, bad ideas may have very bad consequences. After four centuries of mind-body dualism, it is time to put things back in their proper place. Honest science must accept nature on its own terms. Bias defeats its purpose, and it is the voice of antihuman bias in science today that claims that an organism shown to be genetically human, biologically human and anatomically human is just a body and not a human.

To be human is to be both body and spirit. Each completes the other, and together both confer wholeness and integrity to each and every human person.

To arbitrarily deny the full humanity of a human embryo is symptomatic of the disintegration that some human beings have forced on others. Today, the weak, the dying, the disabled and those who are unable to speak for themselves can easily be defined, either in theory or in practice, as less than human. They become living, breathing shells devoid of spirit, of value and therefore of humanity.

Some today might ask, not without an edge of cynicism, "So when exactly does the embryo become a human being?" This critical question is one that science is now fully equipped to answer unambiguously: "It has always been a human being." Where there is a physically distinct human organism, alive and growing, there is a complete human organism. From the moment of conception, there exists an autonomous human life. All that he or she needs to develop is proper nourishment. Nothing else needs to be added.

The ethical dilemmas posed by stem cell research, pre-implantation genetic testing and other assisted reproductive technologies are caused by the dualist view of the human person that has dominated scientific theory and practice for four centuries. This view turns the embryonic human body into the disposable part of a potential person, rather than an integral part of a whole person. We should recognize that treating the human body in this way has implications far beyond science and academia.

Could it be that the many real and imagined conflicts that polarize families and society today are an expression of the same split that inclines us to deny scientific reality and imagine the human embryo to be anything other than human?

"Therapeutic" cloning: A human misconception

The term "human cloning" refers to the creation of a genetically identical copy of an existing -- or previously existing -- human being, or to the production of genetically identical tissue from that individual. In 2001, scientists at Advanced Cell Technology in Worcester, Massachusetts announced that they had successfully cloned a human embryo. It was the first time human cells had been grown through "somatic cell nuclear transfer", the same technique used to clone Dolly the sheep about 5 years before. The egg used to coax the human DNA to divide was taken from a paid human donor, and the embryo survived for only a few cell divisions. This article comments on that event. It was published in the Philadelphia Inquirer on December 13, 2001.

The human cloning effort took its first baby-steps recently when Dr. Michael West and colleagues at Advanced Cell Technology announced the successful regeneration of human embryonic cells from a sample of adult DNA. After removing the normal female half-dose of DNA from an egg given up by a paid volunteer donor, they inserted a full dose of adult DNA into it. Then, using subtle electrical and chemical prodding, the newly formed entity was induced to divide.

And divide it did, for two or three generations, before succumbing to the inhospitable climate of the petri dish. The well-orchestrated media blitz that followed was remarkable for its hubris. ACT investigators had barely summoned up six primitive cells from the human clone they manufactured, yet references to the therapeutic benefits of the work figured prominently in their publicity campaign. In an allusion lost on no one, they identified the donor of the cloned DNA as a paraplegic Texas physician and father of two.

Tugging at our heartstrings to gain sympathy for an experiment fraught with ethical and scientific problems hardly seems appropriate, particularly when the weight of current scientific data overwhelmingly favors the therapeutic potential of adult stem cells over those derived from embryos or from cloning. Moreover, a careful look at cloning suggests that a certain “deception” of nature is inherent to the process.

A normal egg will divide only if fertilized. In cloning, the egg is fooled into thinking that conception has occurred. Exposure to the unique natural environment found in the egg and the artificial environment contrived in the test tube causes partially dormant adult DNA to revert back to an activated, embryonic state. DNA normally exists in a fully activated state only once: when the DNA of sperm and egg come together at the moment of conception. So human cloning is simply human misconception.

And misconceptions abound in the understanding that the ACT group seems to have of their own labors. Ron Green, Dartmouth religion professor and ACT’s chief bioethicist, justified the experiments by claiming that the clone was not a human embryo, but rather “an activated egg”, “a new type of biological entity never before seen in nature”. Yet the scientific paper and their article in Scientific American referred to a “human cloned embryo” as the result of the experiment. Semantics aside, the ACT group overlooks a simple fact: each and every human life is a new and unrepeatable force of nature. Like any work of art, each human person draws value from singularity. Breaking the mold, ensuring that there can be no two alike, affirms and enhances the value of the original. Cloning humans serves only to promote high-tech narcissism.

Scientists know next to nothing about what happens to human DNA when subjected to the unnatural prodding required for cloning. There is no doubt that the changes to the cloned genome are explosive, as complex as they are subtle. If observations in cloned animals are any indication, DNA cloning frequently causes grave and unpredictable adverse effects to the resulting organism. This simple observation makes blind faith in the therapeutic potential of human cloning difficult to understand.

The technique used by ACT has already resulted in the generation of fully developed animals of a variety of species. Dolly the sheep is the prototype. If this technique is perfected for human DNA and the resulting cells are allowed to develop, the manufacture of a fully developed human organism would be the foreseeable outcome. West and his colleagues at ACT have assured us that their intent is purely “therapeutic” and never “reproductive”. They desire only the parts and not the whole, and since West affirms that he works “to rid mankind of suffering and death”, we might conclude that somatic immortality is his ultimate goal.

Cooking up “new types of biological entities” is hardly a task fit for science. The push to clone advocated by ACT and others is rooted in, and will inevitably foster, deep misconceptions about what it means to be human, and worse abuses are sure to follow. Remember that, to justify his own utopian agenda, Josef Stalin is said to have boasted, “You’ve got to break a few eggs to make an omelet”. By failing to recognize that human life is untouchable, as sacred as it is frail, ACT risks an unsavory association with the totalitarians of the last century. And we know that is a recipe for disaster.

Surgeon General's report on sexual health: The public effects of private behavior

On June 28, 2001, then Surgeon General David Satcher issued a report entitled "A Call to Action to Promote Sexual Health and Responsible Sexual Behavior". An editor at the Philadelphia Inquirer editorial page asked for my impressions, and they appeared in their opinion page on July 8, 2001 next to those of Susan Wilson, a sex educator at Rutgers University. Soon after, it also appeared alongside "counterpoint" articles by Bill O'Reilly of Fox News in the Miami Herald (July 11, 2001) and by Pastor Madison Shockley of the Religious Coalition for Reproductive Choice in the Houston Chronicle (July 16, 2001). The Sacramento Bee (July 13, 2001), the News-Gazette of Champaign, Illinois (July 15, 2001) and the Holland (Michigan) Sentinel (July 12, 2001) ran the piece as well. The terms "self-possession" and "self-donation" do not appear in Satcher's report. They are my way of classifying categories of social pathology that Satcher describes. The references to freedom of indifference and freedom for excellence are taken from the thought of Servais Pinckaers as published in his book, The Sources of Christian Ethics. A helpful review of this approach to understanding human freedom was published by George Weigel in a paper entitled "A better concept of freedom" in March, 2002.
The “Call to Action”, recently issued by Surgeon General David Satcher, is an effort to identify sound strategies for promoting responsible sexual behavior in the United States. Meant to serve as a starting point for a national dialogue on this important issue, the document offers in broad brush strokes a clear portrait of sexual health -- or perhaps more accurately, sexual pathology -- in America today. At first blush, one might be tempted to ask why more discussion about sex is necessary. We already talk about it quite a bit. Ours is a culture saturated with sexual information and imagery. If Dr. Satcher feels even more dialogue is needed, perhaps it is because the conversation thus far has been neither helpful nor healthful.
Three broad themes run through this document. The first points to what may be called a “crisis of self-possession”. The epidemics of sexually transmitted diseases, including AIDS, and of teen pregnancy lamented in the report are often the result of sexual promiscuity. If some persons choose to be promiscuous -- and by Satcher’s account, many do -- it is usually not because they enjoy what they are doing, but because they want to be truly loved by another person, and believe that sex is a way to achieve this. They seek to placate their loneliness by giving themselves away, but more often than not are sorely disappointed, frequently infected and inevitably embittered by their experiences. Promiscuity occurs because, all too often, persons have not been helped to discern their authentic dignity and the true value of their sexuality. They choose intimacy with anyone and end up trusting no one.
A second trend within the document suggests a “crisis of self-donation”. Satcher suggests that abortion and unwanted pregnancies are bad for society, and that their incidence should be reduced. But fixing the numbers will never fix the cause of these tragedies. When a man and woman conclude that their pregnancy is unwanted, they are saying that the life they have helped to bring about is not worthy of being. It’s too much trouble. They fail to understand that the demands that children impose upon parents are actually a great blessing. Nothing can shake us out of our complacency and selfishness faster than the arrival of a child. Children offer us countless opportunities to forget about ourselves. Rising to the occasion, rather than shrugging it off, can be among the most salutary experiences known.

The third theme implicit in “A Call to Action” is simply this: private behavior has public consequences. For years now, a new right to privacy -- conceived in the so-called “penumbra” of the U.S. Constitution in Griswold v. Connecticut and extended by Roe v. Wade -- has served to protect the personal reproductive choices of Americans from external interference. Yet the unrelenting emphasis on “privacy” and “freedom of choice” advocated for three decades by many of our fellow citizens has not led to demonstrable improvements in sexual health. Why not? Perhaps it is because the brand of freedom they advocate leads only to indifference: it’s OK to abort and it’s OK to keep your baby; it’s OK to have sex before marriage and it’s OK to wait; it’s OK to be promiscuous and OK to be monogamous, and for that matter it’s OK not to play the game at all. So what’s the difference? Who cares! The only thing that matters is the “freedom to choose”. And we all watch dumbfounded as America’s families -- husbands, wives and children -- despair of trusting one another, victims of the apathetic chill brought on by freedom misused.
Dr. Satcher would like his “Call to Action” to serve as a point of departure, “common ground upon which the nation could work to promote sexual health and responsible sexual behavior”. This might be a good thing, but only if we acknowledge that true freedom has limits. Those limits are defined not by arbitrary rules and individual choices, but by a law inscribed in human nature. This law binds all of us equally, and guides the behavior that leads to health and happiness. Freedom may then be understood as the struggle for excellence, the effort to achieve a desirable ideal despite obstacles and adversity. With this view of freedom, surrender to indifference is failure. What is needed to restore health to American family life is not more sex education, but education in the proper use of personal freedom.
The use of freedom leads to personal fulfillment only if we give ourselves to others in ways that are appropriate to our nature: self-possession for self-donation. Normally, we enjoy the company of colleagues and neighbors. We relish the companionship of friends and family. We desire intimacy with those closest to us, our spouse and our children. Each relationship fulfills in its own way a basic human need for others. When freedom is used to justify the gratification of our personal whims, we end up avoiding our spouse and children, caring little about family and friends, and having sex with our colleagues and neighbors: self-surrender for self-indulgence. Which life would you choose for yourself?

Insurance coverage for contraceptives?

This article appeared May 1, 2002 in the opinion page of the Philadelphia Inquirer. Leslie Anastasio, the executive director of the Pennsylvania chapter of the National Abortion Rights Action League, responded a few days later. In most states, the American Civil Liberties Union is leading the legal offensive to force Catholic institutions to pay for the contraceptives of their employees. This campaign is an important part of their Reproductive Freedom Project. See the ACLU's report entitled "Religious refusals and reproductive rights", which outlines their position on the subject. They later issued another document specifically addressing the issue of pharmacists' refusal to dispense contraceptives.
Legislation has been introduced in a number of State Houses across America to force insurers to pay for the contraceptives of the workers they cover. Politicians argue that failure to cover these services constitutes a needless burden on women. But don’t be fooled. Concern about providing more comprehensive health insurance is hardly the issue. The disturbing thread running through these bills is the intent to purge from existing law any possible recourse to conscientious objection in matters of human reproduction. And since today the Catholic Church stands alone in her opposition to contraception, one could argue that Catholic health care professionals and Catholic institutions are being singled out for harassment.
The primacy of individual conscience over the authority of the state is a foundational principle of Western culture. In the United States, this principle was challenged by the legalization of abortion in 1973. Within months of the Roe v. Wade decision, the late Senator Frank Church of Idaho sponsored an amendment to Federal health care legislation recognizing the right of doctors and hospitals to refrain from performing procedures they felt to be morally reprehensible. The so-called “Church amendment” served as a template for the state “conscience” laws that were subsequently enacted, and that currently stand in most States. These laws were hardly a legal novelty. They simply made explicit an uncontested principle implicit in legal practice for centuries.
Restriction of existing “conscience clause” laws has been attempted with varying success in the District of Columbia, California, New York, Louisiana, Illinois, Massachusetts, Maryland, Texas and other states. These efforts are simply attempts to chip away at the integrity of individual conscience as a legal principle. Under a new, highly restrictive California law, for example, the only entities allowed to object in conscience to contraceptive coverage are organizations of explicitly religious character that hire only persons of the same faith, and that devote themselves only to religious training or indoctrination. This law effectively forces Catholic hospitals, educational and charitable institutions to provide contraception to their employees, and would force them to violate state and federal anti-discrimination laws in order to qualify for an exemption. A bill with similar restrictions may soon pass in New York.
The ACLU has been a leading force in the push to restrict conscience laws around the country. Their views are clearly expressed in the document “Religious Refusals and Reproductive Rights”, published this past January as part of their ongoing “Reproductive Freedom Project”. They ask, not without a stiff dose of hubris, whether or not an individual or an institution “should be allowed to refuse to provide or cover a health service based on a religious objection,” and argue that to do so in many cases can pose a health risk to the population. What they fail to recognize is that trampling on conscience is itself a cause of serious pathology. It breeds a kind of schizophrenia: the split personality that results from severing the natural connection that should always exist between moral character and professional competence.
By forcing someone to perform actions that are morally repugnant to them, an unnatural split is introduced between “competence” – the technical ability to perform actions well – and “character” – the moral disposition of the person performing the action. If the legal recognition of conscientious objection is abolished, then the disintegration of the human person is the inevitable result, because conscience is the glue that holds moral character and professional competence together. To respect personal conscience is the only way to promote personal integrity. Only if a government fosters this unity of life can its citizens prosper.
The ACLU also argues that the “right to refuse” prescription of contraceptives -- or for that matter, to refuse sterilizations or abortions -- implies a pre-existing duty to provide these interventions. So according to the ACLU, “conscientious objection” is merely the benevolent state granting some religiously minded citizens permission to be negligent. To them, “morality” is an arbitrary and malleable legal construct determined by the ruling majority, and a minority opinion may be tolerated as long as it does not inconvenience others.
While the Catholic Church is bearing the brunt of these not-so-subtle assaults on conscience, we need to recognize that this is not “just a religious issue”. It is a universal, human issue with broad implications for all of us. The state does not confer on its citizens the “right” to follow the dictates of conscience. To follow one’s conscience is a duty founded on the inherent moral nature of the human person. When our conscience makes a judgment regarding the morality of an action, we are bound by the verdict. In allowing for conscientious objection, the state merely acknowledges that the moral duty imposed on us by our conscience cannot be violated. This is not to say that private conscience is infallible; it is not. Conscience must always refer to an objective truth that transcends the individual, applies to all, and points the way to personal fulfillment. The well-formed conscience, not “majority rules”, is the best moral guide we have. Respect for conscientious objection is simply the acknowledgement of the inalienable right of citizens to be free from external constraint by political authority in religious matters. Safeguarding conscientious objection, especially in matters of faith and morals, is never an obstacle to civil liberty.