Thursday, November 17, 2005

Aging, logic and bioethics


It’s been said before, but it is worth repeating: the bioethical priorities and concerns of developed and developing nations are sometimes poles apart, revealing striking differences in mentality, values and power. A case in point is a recent article in The Journal of Medical Ethics, entitled “Life extension, human rights, and the rational refinement of repugnance”. In it, the author -- a self-described biogerontologist -- claims that aging is “humanity’s foremost remaining scourge”, that we are “within striking distance” of curing aging, and that those of us who are wary of the idea of radical life-extension via biotechnology are simply backward. All of this may sound strange to those living in countries with scourges more pressing than aging, or rather, where many people die of preventable diseases before getting much of a chance to age.

Then again, the notion of ‘curing aging’ probably sounds strange no matter where you are. Aging to most people is as natural as being born; it is what people do, or what at least their bodies do, between being born and being dead. Most people don’t like it much, but it is accepted as part of the picture of what life is like, and what we have in common with other creatures as well as what we share with environmental cycles of growth, decay, death and regeneration. This is probably why it is unsettling to hear aging turned from a feature of the human condition into an affliction – like gonorrhea – that we might get rid of with injections and pills.

The author of the piece (A.D.N.J. de Grey) contends that we should all embrace radical life extension insofar as we embrace logic and human rights. We all -- except perhaps death penalty advocates -- embrace the right of a healthy human being to go on living, and it is simply a matter of logical consistency to embrace the related right to cure aging through the use of biotechnology. Our moral intuitions about aging as part of the human condition will be overcome by rational argument, or at least that is the way things work in Cambridge (England), where de Grey is based.

According to the author, bioethicists have a special role to play in this debate. Being especially logical folks, and being independent of the politics that pervades science (at least in Cambridge), bioethicists are in a position to persuade the masses to give up their irrational attachment to aging. They can demonstrate through rational argument that, despite vulgar appearances, ‘going on living’ indefinitely by means of sophisticated drugs actually accords with our deepest moral values. In other words, bioethics can do humanity a service by doing PR work for the anti-aging enhancement industry.

But who knows, maybe bioethicists can play another, less slavish role. They could, for example, critically reflect on what living 150 years might mean for ordinary persons -- its impact on core spheres of existence such as interpersonal relationships, family life, and work -- and offer a nuanced opinion that neither glorifies nor condemns the possibility of extended lifespans. All this, while affirming something that every poor Malawian villager knows: that death, even in Cambridge, waits for us all.

Wednesday, November 09, 2005

The great brain robbery

In the October 27th issue of the New England Journal of Medicine, Fitzhugh Mullan’s article “The metrics of the physician brain drain” offers some fresh measurements of the extent of physician migration from less developed to more developed nations. One of the more striking figures concerns ‘international medical graduates’ from low-income countries in the United States medical workforce: 25% of working American physicians graduated from medical school abroad, and 60% of those come from low-income countries. The United States is not alone in luring foreign physicians to its shores: 28.3% of physicians in the United Kingdom graduated from medical school ‘elsewhere’, and 75.2% of those come from resource-poor nations. Canada and Australia are also significant importers. In absolute numbers, India is the greatest exporter of medical professionals, while sub-Saharan Africa sees the largest percentage of its medical workforce leave for greener (or at least other) pastures.

This development has its benefits: overseas physicians wire much-needed remittances to extended families in their home countries, and the move abroad opens opportunities for them as individuals. But the social burdens outweigh these benefits: the exodus of physicians has significantly weakened the ability of low-income countries to face the great health challenges posed by HIV/AIDS, malaria, tuberculosis and other serious conditions. And the exodus is not slowing down: Canada is adding residency positions to make room for more medical professionals from overseas and is busy streamlining immigration and training requirements to bring them into medical practice. Australia and the United States have their own plans to make medical work in their countries more attractive to foreign-trained practitioners.

The British Medical Association has at least recognized that the brain drain phenomenon is largely the responsibility of developed countries, and that they therefore have an obligation to do something about it. In May of this year, the BMA issued a call to action on the healthcare skills drain that elaborated four key points:

1) All countries must strive to attain self-sufficiency in their healthcare workforce without generating adverse consequences for other countries.

2) Developed countries must assist developing countries to expand their capacity to train and retain physicians and nurses, to enable them to become self-sufficient.

3) All countries must ensure that their healthcare workers are educated, funded and supported to meet the healthcare needs of their populations.

4) Action to combat the skills drain in this area must balance the right to health of populations and other individual human rights.

Fine sentiments, but it remains to be seen whether the individuals and nations who profit from the skills drain are willing to do the ethically right thing. And what is the ethically right thing anyway? If it would be wrong to forbid physician immigration, what other means would be appropriate to stop the brain drain?

Monday, November 07, 2005

Burying the hatchets to help the developing world?

In today's New York Times, Nicholas Kristof argues that "if only left and right can hold their noses and work together, we can confront some of the scourges of our time - sex trafficking, genocide, religious oppression, prison brutality - on which there is surprising agreement about what needs to be done."

But is that true? While there is wide agreement about what the problems are, it is unlikely that there is similar agreement about possible solutions. Decisions are never based solely on the best available evidence. Our values and/or ideology also help determine whether a particular plan of action is considered (un)necessary or (un)warranted.

Kristof uses the example of maternal health: "For all the battles over abortion and condoms, both sides can agree that half a million women shouldn't be dying unnecessarily in childbirth each year around the world, when modest investments can save their lives." Not to be flippant, but I'd bet that most people would consider that a bad thing.

The struggle comes in defining "modest investments." Conservative political and religious influence on U.S. policymaking means that aid for global family planning efforts cannot be used for abortion counseling or services, and global HIV/AIDS funds are restricted to programs that are on record as opposing prostitution. These values and ideologies also helped convince President Bush to commit $15 billion to global HIV/AIDS efforts through PEPFAR, the President's Emergency Plan for AIDS Relief, as well as to make controversial choices about how the program would be run. An administration with a more liberal perspective would likely come to different conclusions.

In the current polarized political atmosphere in America, bipartisan consensus on identifying seemingly unarguable problems may be worth celebrating. But the devil is always in the details.

--Angela Thrasher, guest contributor

Thursday, November 03, 2005

Male circumcision and HIV infection: the ethics of a landmark study

In an earlier post, I mentioned some of the preliminary and partial results of a randomized controlled trial of the use of male circumcision to reduce the risk of HIV infection in South Africa (the ANRS 1265 trial). Those results were presented at the International AIDS Society conference in Rio last July, and an article further detailing the study methods and findings has just appeared in the November issue of PLoS Medicine. In an unusual move, the editors of PLoS Medicine have seen fit to write an editorial explaining why they chose to publish the trial results -- after having them combed over by six reviewers -- and the issue also includes a commentary by the head of one of the IRBs that originally approved the protocol. It is safe to say that the study is regarded as ethically hot, and its publication is unlikely to cool it down.

ANRS 1265 was a randomized controlled trial with one intervention arm (1640 men who received circumcisions after recruitment) and one control arm (1654 uncircumcised men). During the course of the study, initially planned to run 21 months, 20 men in the intervention arm became HIV positive, as opposed to 49 in the control arm. This translates into a protection rate of roughly 60% -- a rate comparable to an efficacious vaccine, and achieved by means of a relatively simple surgical intervention. For decades, observational studies and meta-analyses have suggested the protective qualities of circumcision against female-to-male HIV transmission, but this study seems to finally offer experimental evidence. The importance of the findings is unquestionable.
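For readers curious where the ~60% figure comes from, here is a rough back-of-the-envelope check (my own calculation from the arm sizes and seroconversion counts quoted above, not a reproduction of the article's adjusted analysis):

```python
# Rough check of the reported ~60% protection rate from the raw counts.
intervention_n, intervention_hiv = 1640, 20   # circumcised arm
control_n, control_hiv = 1654, 49             # uncircumcised arm

risk_intervention = intervention_hiv / intervention_n   # incidence, circumcised
risk_control = control_hiv / control_n                  # incidence, uncircumcised

relative_risk = risk_intervention / risk_control
protection = 1 - relative_risk   # relative risk reduction

print(f"Protection rate: {protection:.0%}")  # roughly 59%, i.e. ~60%
```

The published figure is slightly different because the investigators adjusted for factors such as behavioral differences and crossover between arms, but the crude counts land in the same neighborhood.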

But where are the ethical flashpoints? There seem to be at least two related ones. First, while it would seem logical to limit participants to HIV-negative men, the researchers chose not to use a positive HIV test as an exclusion criterion, reasoning that such a criterion would risk subjecting prospective participants to stigmatization. Thus the presence of some HIV-positive men in the trial was virtually guaranteed. Second, while the researchers drew blood to test for syphilis and HIV, interviewed participants in detail about their sexual activities, offered HIV counseling, and encouraged the men to be tested at local clinics, they did not inform participants of their HIV status if the participants did not wish to be informed. The study design, in fact, called for researchers to be blinded to the HIV status of the participants. So on the one hand, the trial presupposed that some of these at-risk men (both circumcised and uncircumcised) would be or become HIV positive during the trial, but on the other hand the trial left no room for a ‘duty to warn’ those who were or became HIV positive, even if a participant revealed in interviews that he was engaging in unsafe sex. As the article puts it:

They [the investigators] considered it unethical to inform participants of their HIV status without their permission, even if they thought that participants should be aware of their HIV status.

Some might say that in the name of avoiding stigmatization and respecting autonomy, the researchers were studying (at least in some cases) HIV positive persons engaging in harmful practices without protecting third parties. And why did the investigators need to be blinded in regard to the HIV status of the men in the first place? Couldn’t the same results be generated by an ethically less controversial trial design?

Readers are invited to read the PLoS article and draw their own ethical conclusions.

Wednesday, November 02, 2005

Calling Africa and beyond: guest contributors wanted

We are looking for guest contributors to the Global Bioethics blog from the developing world, preferably (but not exclusively) from sub-Saharan Africa. Do you have news or commentary about ethical issues arising from health policies, public health interventions, medical practices or biomedical research taking place in your part of the world?

We cannot promise financial compensation, but we can promise exposure: your post may be read in Japan, Spain, Australia and the USA, all in the same day.

Interested? Write a message to the editor at stuart_rennie@unc.edu.

Guest contributor Angela Thrasher

Guest contributor Angela Thrasher is a doctoral candidate in Health Behavior & Health Education at the UNC School of Public Health. In her past professional life, she provided technical assistance and training to HIV/AIDS prevention and care programs around the U.S. Her primary research interests are racial/ethnic health disparities and HIV/AIDS policies and politics. She describes herself as having no background in global health or bioethics, but finds the topics fascinating.

We are happy to have Angela contribute to the Global Bioethics blog when the spirit moves her.

Medical diplomacy: a short history

Guest contributor Angela Thrasher blogs:

Regarding your post on national security and global health, I happened to be a fly-on-the-wall when decisions were made to publicly promote American interest in global HIV/AIDS as an issue of national security. HIV/AIDS was the first health issue to be considered a national security threat by the US administration.

The Presidential Advisory Council on HIV/AIDS (PACHA) was established in 1995 by Executive Order to advise the President, the Secretary of Health and Human Services, and the Administration regarding HIV/AIDS policies and programs. Its final report to President Clinton in September 2000 (AIDS: No Time to Spare, available through this site) explicitly acknowledged a "shift to a 'global' perspective" (p. 11) on HIV/AIDS and the necessity of coordinated, sustained action. My job was to facilitate the process that led to the report.

Four of the six underlying themes of the report underscored the global nature of HIV/AIDS, including this one: "HIV threatens national and global security. The United Nations Security Council has recently characterized the global pandemic as a threat to security and stability, because HIV/AIDS has undermined the economic and political systems of many countries. Vice President Gore, speaking before the first-ever session of that body devoted to a health issue, stated, 'No one can doubt the havoc wreaked and the toll exacted by HIV/AIDS do threaten our security. The heart of the security agenda is protecting lives -- and we now know that the number of people who will die of AIDS in the first decade of the 21st Century will rival the number that died in all the wars in all the decades of the 20th Century.' Echoing that concern, U.S. Ambassador to the United Nations Richard Holbrooke stated, '[AIDS] is the toughest and biggest of all issues, not just in Africa. Africa is just the epicenter... if you ask what is the number one problem in the world today, I would say it is AIDS.'" (p. 12)

During the deliberations for this report, then-AIDS Czar Sandy Thurman discussed the impetus behind Administration thinking around HIV/AIDS and national security. This part of the summary particularly struck me: "The Department of Defense has allocated $10 million for military-to-military training, which has been hard won, although the military has been disproportionately affected by HIV/AIDS in Africa and elsewhere. In Congolese armies, infection rates run from approximately 40 percent among Angolans to more than 80 percent in Zimbabwe's army, for all ground personnel. This is particularly sobering when considering that the military constitutes the backbone of burgeoning democracies throughout Africa and maintains stability in the region. The best-educated professionals often advance through the military into positions of political power and are disproportionately infected. This has implications in terms of military engagement across national boundaries, in which, like other migrant activity, sexual conduct and the spread of the disease results."

On the May 2, 2000 broadcast of The NewsHour with Jim Lehrer, then-National Security Advisor Sandy Berger provided an example of how HIV/AIDS destabilizes governments: "Well, the countries, for example, that have had enjoyed solid growth in Africa now are beginning to see that growth erode because its work force is not able anymore to function in the economy and the cost of the disease to the government is becoming overwhelming. What happens? Those countries begin to unravel. They are unstable. They're more likely to engage in conflict with their neighbors. And before long, we have something, a situation which is really tumultuous. And so the most important thing we can do is first of all, work with other countries to try to apply collective resources; two, work with the leaders of these countries to try to change the dynamic."

I also remember newspaper accounts of former Joint Chiefs of Staff Chairman Colin Powell making similar remarks around this time. It is interesting that HIV/AIDS, and not poverty (an underlying cause), is seen as the impediment to "stability" -- though that may be changing. I think that's one reason why framing global health as a national security issue is disconcerting to me. The real consequences of HIV/AIDS and other health issues for other countries' spending priorities, GDP, and military readiness are important, but they should not be the primary reason to become involved. The alleviation of human suffering should be paramount. (A bit naive and idealistic, yes.)

On the other hand, I do think that a prioritization process for government spending and intervention should include consideration of broader non-health impacts. All of this is especially interesting given President Bush's call for $7 billion to prepare for the avian flu pandemic and previous talk of the military as first responder in such a scenario. But that's a post for another day.