Thursday, January 27, 2011

On flogging concepts that refuse to die

Cultural reproduction is a strange thing. It is normal and often desirable to transfer cultural artifacts to future generations. We want to instill certain values in our own children. We want to ensure that our mother tongue survives and retains its integrity. We want to keep scientific progress going, and we want to perpetuate valuable institutions like the law and modern medicine. There is much good in this. However, it is easy to get the mistaken impression that cultural reproduction is under our control. Culture also has a life and momentum of its own, and some aspects are reproduced even if we think they have worn out their welcome. Racism, homophobia and sexism live on, and a whole host of other dubious inheritances besides. And efforts to discredit these inheritances from a rational point of view seem to have less effect than we sometimes suppose. An artifact can exist, even thrive, despite being pretty much debunked.

These were my reflections while reading a recent article in the journal Bioethics by Heather Widdows, entitled "Localized Past, Globalized Future: Towards an Effective Bioethical Framework Using Examples from Population Genetics and Medical Tourism." Widdows takes dead aim at the concept of individual autonomy, claiming that (a) individual autonomy is conceived as a crucial or foundational value in bioethics and (b) the concept is inadequate to make sense of (or help resolve) bioethical issues. It is like turning up to a gunfight with a plastic spoon. She uses the issues of population genetics and medical tourism to make her point, but the point has far wider applicability. The thing is, though, this has all been said before, in a variety of ways: it would not be hard to draw up a long list of bioethics articles devoted to debunking the oversold status of individual autonomy.

I am not criticizing Widdows for flogging a dead horse. Yes, debunking autonomy is practically a cottage industry, and I am guilty of contributing to it as well. What I am saying is that the horse has been flogged for ages, it is still not dead, and the current zombie-like state of the concept requires an explanation. Why is it, for example, that people still find it attractive to say that organ trade between the rich and the poor could be reasonably conceived as a fair and unproblematic trade if conducted between consenting adults? Or that exploitation in international health research is morally acceptable if the 'exploited' party in the transaction adequately consented and might be worse off if he or she did not join a certain study? When a notion seems to be debunked but somehow survives, it is tempting to look past its content and look at the social function it may continue to serve. What does the use of the concept of autonomy 'do' for (some) people when deployed in bioethics arguments? Who gains and who loses when these issues are viewed and defended within frameworks that see individual choice as paramount?


Sunday, January 23, 2011

Shocking discovery: poverty messes up your head

According to legend (and it's only a legend), the novelists F. Scott Fitzgerald and Ernest Hemingway were in a bar when the former remarked to the latter: "The rich are different from you and me." To which Hemingway dryly answered: "Yes, they have more money." It is safe to say that the poor have less money, by definition, but the way it makes a difference ... er, differs. According to researchers at the University of Texas at Austin, relatively impoverished socio-economic circumstances have a negative effect on the rate at which small children start to realize their genetic potential, as this is manifest in their cognitive development. The environment created by wealth unlocks the genetic contribution to mental capacities; poverty seems to suppress it. As the news items about the study are quick to point out, one in five children in the US lives in poverty, and the country is undergoing a massive economic crisis.

It might be better to pull out a few other wider implications of the research, even if it is only one study. The study is basically saying that poverty strikes human beings at their core: if your mind is not at your core, it is hard to say what is. Poverty appears to literally incapacitate, making us relatively less capable of developing the features that make us persons; the idea that poverty dehumanizes us seems as frightening as the disintegration of personhood involved in late-stage Alzheimer's disease. The problem is, while one in five American children lives in poverty (and this is troubling), the poverty of the majority of children in many countries is far worse. Except for a lucky few, their genetic inheritance gets pretty much laid to waste. How much it is laid to waste has yet to be fully studied. But if the study in Texas is any indication, the impact starts much earlier than previously thought.


Wednesday, January 12, 2011

Global health research ethics in Vanity Fair

Vanity Fair, normally associated with glossy celebrity photo shoots next to swimming pools, is running something this month on ethics and the globalization of clinical trials. It really is. The basic thrust of the dramatically entitled article ('Deadly Medicine') is that pharmaceutical companies are guilty of a whole range of shady practices, from suppression of negative testing results to knowingly promoting products with serious side-effects or unknown efficacy, and there is little effective regulation to prevent or punish their irregularities and abuses. Furthermore, when pharmaceutical research takes place abroad in low- and middle-income countries, as it increasingly does to cut costs, what goes on becomes even more obscure; when particular wrongdoings emerge from the global shadows, you can only guess how much exploitation, manipulation and harm is taking place on a regular basis.

What struck me reading the Vanity Fair article was a strong sense of déjà vu. How essentially different is its content from those seminal Washington Post articles, the 'Body Hunters' series, written a decade ago? Those were the exposés that blew the lid off the practices of global pharma and kick-started all manner of initiatives to raise consciousness about the ethics of global health research. So what happened in the meantime? A lot of activity in the public sector: the NIH consolidated its clinical ethics center, other bioethics centers popped up at universities around the country, new ethics journals were established, research ethics committees were set up in developing countries, grants for research ethics projects were created, and so on. And yet, what did all this do in regard to the practices of for-profit multinational pharmaceutical companies as they scour the world for sites and populations favorable to their own economic interests? Did all this have any sort of impact?

From the looks of the Vanity Fair article, not much. The same sorts of 'irregularities' go on; what has changed through globalization is the quantity of institutions, investigators and researchers involved. Paradoxically, the bigger global health research becomes, the less visible its operations and effects seem to get.


Sunday, January 02, 2011

The God Committee, Africa-style

Who knows when bioethics started? Like other annoying questions (When did philosophy start? When did romantic love start? When did rock and roll start?) there is probably no way to definitively answer it, but that does not stop people from trying. For some, bioethics started in the United States with the so-called 'Seattle God committee', the body of health care professionals and laypersons that was formed to decide who among patients with kidney failure should receive (then new, and very scarce) dialysis treatment. The situation in Seattle seemed to open a new field of inquiry: while the question 'who should get dialysis?' was partly a medical question, it went beyond that. It was not enough for doctors to invoke medical criteria alone, since many patients were medically needy -- the fundamental question was how to choose among them, if there were not enough machines for everyone. If not medical criteria, what other criteria should be used, and how should we best come to reach such decisions? And so, the story goes, the idea was born of non-physicians assisting in resolving ethical problems within medicine.

This is an old tale. To read about it, you have to go back to a yellowed Life Magazine article from 1962. I was reminded of this piece of history when reading a recent BBC News article about the rationing of dialysis machines in South Africa. It is striking that Tygerberg Hospital in Cape Town is wrestling with the same problems faced fifty years ago in Seattle, and from the reports, is not faring much better. More specifically: just as the 'God committee' brought in controversial subjective criteria to decide who should gain access to dialysis, the committee in Tygerberg apparently cannot avoid doing the same. Among the criteria are having 'good home circumstances' (i.e. running water, electricity, toilet, etc.), the motivation of the patient to adhere to the treatment plan and improve his/her health, and having a good social support network. It is fairly obvious that these criteria do not favor the poor. All other things being equal among needy dialysis patients, those who are better off socially and economically are more likely to gain access to dialysis. One could argue that these non-medical criteria have a medical justification: if a patient does not have good social circumstances, dialysis will not produce a favorable outcome for that patient. Probably. But the upshot is brutal. If you are really poor in Africa, and suffer from renal failure, you are seriously screwed.

In the United States, the federal government in 1972 made the unprecedented decision to extend Medicare provisions to enable the vast majority of patients with chronic renal failure to gain access to dialysis. Basically, faced with a very public rationing problem, it threw money at it. This is not an option in Africa. As chronic diseases become more and more prevalent in the region -- certainly spreading faster than new clinics, new technology and new medical professionals can appear -- there are going to be a lot more stories like this.
