Thursday, January 27, 2005


I'm leaving town today for a week, so no more posting until I get back late next week.

Meanwhile, I'm going to take Mixing Memory's lead and ask for requests for biology-related posts (though not because I'm fed up with politics - just because I'd like to write more about science). My main area of knowledge is neuroscience but I'm happy to write about anything that I'm reasonably qualified to talk about. So just leave a comment or email me if you want me to write about, say, stem cells, or genetic engineering, or how neurons work, or anything else, and I'll get on it when I get back.

Wednesday, January 26, 2005

Climate change and extinctions

This is interesting:
Like a chaotic pendulum, earth's climate swings, at uneven intervals, between warm and chilly ages lasting from thousands to millions of years. New research suggests that about 251 million years ago, one of those swings jolted the world so violently that oxygen became scarce, the planet's thermostat went awry and nearly all life fell into oblivion in the greatest of mass extinctions. ...

Dr. Ward proposes that the climatic changes were wrought by geological ones. As the supercontinent Pangea pulled apart, the rearrangement of land lowered sea levels. That exposed decaying plants in the sediments to air, producing chemical reactions that dropped oxygen levels from 21 percent to 16 percent or lower. Trying to breathe at sea level then was as hard as breathing at 14,000 feet today.

The breakup of Pangea may have also set off widespread volcanic eruptions that flooded what is today Siberia with hundreds of thousands of cubic miles of lava. Carbon dioxide released by the eruptions created a greenhouse effect, and the hotter temperatures killed plants. "It's simply a world that's going climatically screwy very fast," Dr. Ward said.
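That altitude comparison can be sanity-checked with the standard isothermal-atmosphere approximation (pressure falling off exponentially with an assumed scale height of about 8 km; the function below is my own illustration, not anything from the article). On those assumptions, sea-level air at 16 percent oxygen matches today's air at only about 2 km, so the 14,000-foot figure presumably corresponds to the "or lower" end of the range, closer to 12 percent oxygen:

```python
# Back-of-the-envelope check of the "breathing at altitude" comparison.
# Assumes an isothermal atmosphere with an ~8 km pressure scale height --
# a standard approximation, not a figure from the article.
import math

SCALE_HEIGHT_M = 8000.0   # assumed pressure scale height
NORMAL_O2 = 0.21          # present-day oxygen fraction

def equivalent_altitude_m(o2_fraction):
    """Altitude at which today's air delivers the same oxygen partial
    pressure as sea-level air with the given (reduced) oxygen fraction."""
    return SCALE_HEIGHT_M * math.log(NORMAL_O2 / o2_fraction)

print(round(equivalent_altitude_m(0.16)))  # ~2,175 m (~7,100 ft) for 16% oxygen
print(round(equivalent_altitude_m(0.12)))  # ~4,477 m (~14,700 ft) for 12% oxygen
```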
It's amazing to consider how variable and unstable the earth's climate is and has been. One of the most fascinating concepts in geological history is the idea of the "snowball earth," in which increasing ice cover creates a runaway albedo effect (ice reflects light, so the earth absorbs less heat from the sun, so it gets colder, so there's more ice, etc.) so severe that glaciers covered almost the whole earth. The earth escaped from the snowball state because of carbon dioxide: as ice covered the land and oceans, it would block the carbon cycle by preventing carbon dioxide from getting locked up in rock and sediment (through weathering reactions that normally convert carbon dioxide to calcium carbonate, or through photosynthesis). Meanwhile volcanoes kept pumping out carbon dioxide at the same rate they always do; with the carbon cycle blocked, carbon dioxide would continue accumulating in the atmosphere. After millions of years, enough carbon dioxide would accumulate to reverse the snowball effect, and the earth would swing suddenly to a very warm climate. Apparently this occurred several times between 580 and 750 million years ago.
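The runaway feedback and the sudden CO2-driven escape can even be captured in a toy zero-dimensional energy-balance model. All the parameter values below are illustrative assumptions of mine, not measurements: the point is just that with a fixed greenhouse strength the model has two stable climates, frozen and warm, and slowly cranking up the greenhouse term flips the frozen state straight into a hothouse.

```python
# Toy zero-dimensional energy-balance model illustrating the ice-albedo
# feedback and the CO2-driven escape from a snowball state.
# All numbers are illustrative assumptions, not fitted to the real Earth.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
S0 = 1368.0       # solar constant, W/m^2

def albedo(T):
    """An ice-covered planet reflects more sunlight than an ice-free one."""
    if T < 250:   # fully glaciated
        return 0.7
    if T > 280:   # ice-free
        return 0.3
    # partial ice cover: interpolate linearly between the two extremes
    return 0.7 - 0.4 * (T - 250) / 30

def equilibrium_T(g, T_start):
    """Fixed-point iteration on the balance
    absorbed sunlight = (1 - g) * SIGMA * T^4,
    where g is the fraction of outgoing radiation trapped by greenhouse gases."""
    T = T_start
    for _ in range(200):
        absorbed = (S0 / 4) * (1 - albedo(T))
        T = (absorbed / ((1 - g) * SIGMA)) ** 0.25
    return T

# With a modest greenhouse (g = 0.4) the model is bistable:
print(round(equilibrium_T(0.4, 300)))   # warm branch, ~290 K
print(round(equilibrium_T(0.4, 200)))   # snowball branch, ~234 K

# Starting frozen, volcanic CO2 slowly raises g; once the ice begins to melt,
# the albedo feedback flips the planet all the way to a hothouse state:
print(round(equilibrium_T(0.55, 200)))  # ~311 K despite starting frozen
```

The jump from 234 K to 311 K with no intermediate stable state is the model's version of the earth swinging "suddenly to a very warm climate."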

The snowball earth also suggests a highly speculative, but fascinating idea: Could the snowball earth explain why there was such a sudden flourishing of multicellular animals in the Cambrian explosion 525-600 million years ago, with representatives of all modern phyla appearing so quickly in the fossil record?
A series of global "freeze-fry" events would cause population "bottlenecks and flushes", observed to accelerate evolutionary rates in some species. The crash in population size accompanying a global glaciation would be followed by millions of years of comparative genetic isolation in high-stress environments. This is a favorable scenario for genomic reorganization and the evolution of new body plans.
I have no idea how plausible this idea is, but it's certainly an attractive explanation...

Tuesday, January 25, 2005

More on the cognitive unconscious

Todd Zywicki responds further to the critics of his criticism of the Implicit Association Test. When it was suggested that he dislikes the IAT because of an unconscious bias against the idea that he has unconscious biases, he replied:
And if it is the case that our views on the usefulness of Project Implicit are little more than a reflection of our subconscious, wouldn't it be pointless to have a conversation trying to persuade me to use my conscious mind to revise my supposed subconsciously-biased negative opinion of Project Implicit itself?
One suspects that political propagandists throughout the ages have come to the same conclusion.

On a less flippant note, Chris of Mixing Memory responds to Todd better (and more, ah, bluntly) than I could, at the bottom of his original post on this topic. Main points: even if most mental processes aren't accessible to consciousness, that doesn't mean they can't be influenced by conscious processes - for example, arguments using empirical evidence and reason about the nature of the cognitive unconscious. It just means that the translation from conscious perception to belief is pretty complex and is mediated by lots of mental processes that we aren't aware of, that exert their own influence on the end result. In any case, the cognitive unconscious can be pretty sensible, even rational. So the cognitive unconscious need not lead us to rush into a nihilist epistemology.

Note: This is a follow-up to this previous post about the cognitive unconscious.

Monday, January 24, 2005

Clinton v. Bush

This post says it all:
"Depends on what the meaning of the word 'is' is." --Clinton Administration

"Depends on what the meaning of the word 'torture' is." --Bush Administration
For all the conservative fury at Clinton's evasions, at least he was lying about something relatively innocuous, whereas Bush is evading his responsibility in much more serious moral crimes.

(Link via Andrew Sullivan.)

Implicit association test

Todd Zywicki should really learn a bit more about psychology before he dismisses the Implicit Association Test as "stupid academic research."
A lot of stupid academic research goes on every day. Today's Washington Post magazine features one of the dumbest I have come across in some time--Project Implicit. ... Is it really plausible that my impression of Bill and Hillary is driven more by whether I have a messy desk than my personal perception that Bill Clinton is a liar and Hillary Clinton is a megalomaniac and opportunist? ... From what I can tell, this is about as scientific and insightful as a horoscope or palm reading.
Certainly, the IAT has its problems (Mixing Memory has a post here), but these are problems of methodology, not in the general idea that unconscious processes influence our behavior, and that a large amount of information processing that goes on in the brain is not accessible to consciousness. And Todd Zywicki appears to be mocking the latter idea, not critiquing the IAT itself. (And when he addresses the IAT at all, he does so by ridiculing it, not explaining its flaws.) Incidentally, it seems to me that his distinction between unconscious bias and conscious reason is about on par with nature v. nurture in terms of unhelpful false dichotomies.

In fact, it's not as ridiculous as Todd Zywicki makes it sound that unconscious biases could influence our behavior, including our public policy preferences. He has a certain amount of privileged access to his internal mental states, but that access is not unlimited. There are things going on inside his mind that he doesn't know about but that can be detected by psychological tests. The IAT may not be the best means to do so, but you can't claim that it is altogether "the dumbest [study] I have come across in some time."

Update, 24 Jan - The limits of common sense: Todd Zywicki has responded to my and Mixing Memory's posts, saying that he does believe in unconscious mental processes after all. That's good. But he doesn't concede the point that common sense and intuitive knowledge of how the mind works can be fundamentally incorrect. He insists that we should use our "critical thinking and common sense to determine whether research makes sense." Critical thinking, yes. But common sense, not necessarily. A great deal of science is diametrically opposed to common sense - even fundamental concepts like inertia (common sense disagrees with the idea that something will keep moving in the absence of net force, but prefers the incorrect idea that force is required to keep something moving); quantum mechanics (come on! how can something be both a wave and a particle??); special relativity (time slows down when you move really fast? huh?); general relativity (gravity is the same as acceleration?); evolution (humans descended from bacteria?); Milgram's obedience experiment in social psychology (nah, I would never give a lethal electric shock to a fellow human being). Even free-market economics violates common sense: how can following self-interest actually promote the welfare of others? In fact, reliance on common sense is profoundly anti-scientific and contradicts critical thinking.

This isn't a minor point - it forms a lot of the basis for Todd's original post ("does it really seem plausible...") and explains why he doesn't feel he needs to know anything about cognitive psychology to reject the IAT. (He notes now that he was concerned with the methodological flaws of the IAT, but strangely never mentioned them in his original post that rejected the IAT out of hand.) Point of comparison: suppose I were a naive leftist who argued that it "just doesn't seem plausible" that self-interest could promote economic welfare or that markets are a form of self-generating order, so therefore a recent economics paper that once again shows that socialism doesn't really work is "stupid academic research" and "the dumbest I have come across in some time."

Finally, his analysis of using common sense to dismiss Marxist history, Social Darwinism, and astrology is flawed. You can't dismiss those theories/ideologies out of hand: you have to have some conflicting knowledge that proves them false (non-Marxist history, a correct understanding of natural selection, and high-school-level physics, respectively). You don't have to be an expert, but you do have to be somewhat informed.

Update, again: I have a new post responding to Todd's latest.

Sunday, January 23, 2005

Alternate Americas

A post triggered by reading two others in quick succession. First, Mark Kleiman speculates what might have happened if Columbus had sailed to America for England instead of for Spain - would Central and South America now be democratic and prosperous? Second, AdamSmithee notes that regions where Western colonizers couldn't settle (e.g., tropical Africa with its endemic malaria) developed highly unequal institutions because of the colonial legacy of a tiny elite ruling over a large mass of poor people, which led to post-colonial kleptocracies, whereas regions where Western colonizers could settle comfortably developed more egalitarian institutions because they wiped out the native population (e.g., USA, Australia).

The contrast between the English and Spanish colonies in America is stark, and to some extent reminiscent of the contrast that AdamSmithee draws between Australia and tropical Africa. The English formed a homogeneous/egalitarian but exclusivist colonial society: everyone was English because the colonists killed all the Native Americans, stole/bought their land, pushed them westward, and (inadvertently in most cases) wiped them out with smallpox. In contrast, the Spanish formed a mixed/inclusivist but hierarchical society: they incorporated Native Americans into the colonial society, but only as slaves, subjugated laborers on encomiendas, etc. Was this contrast determined by the circumstances the colonizers found in America, or by the colonizers' cultural prejudices?

To a certain extent, cultural prejudices played a role. The Spanish, having just wrapped up the reconquista at home, saw colonization as a natural extension of that conquest. Spain was full of soldiers and missionaries; once the Moors were defeated, the Indians were the natural next target. In contrast, the English perceived their colonization as one of settlement and plantations, of duplicating the English village from back home. (It didn't hurt that the Pilgrims were Separatists breaking away from the Anglican Church - they wanted to establish a new pure utopia, not convert the heathens.) This difference extended even to the rituals that made colonization "legitimate" in the eyes of the colonizers. As Patricia Seed has noted, the Spanish legitimated their conquest by reading out a proclamation that the land now belonged to the King of Spain, whereas the English legitimated their conquest by building fences around their land.

But cultural preferences are constrained by the possible. English colonists found a land already nearly emptied by Old World diseases (the Pilgrims found empty village upon empty village when they arrived in 1620). In Central and South America, on the other hand, populations were much higher to start with and the societies were better able to cope with the epidemics - so death rates were "only" 50%, rather than 90%. It's much easier to kill and marginalize a sparse population of semi-sedentary horticulturalists than to do the same to a dense population of sedentary agriculturalists who live in a large, bureaucratic empire (Aztec and Inca). Likewise, it's much easier to enslave and conquer a pre-existing political unit inhabited by people with useful skills like gold mining and agriculture, than to do the same to people without centralized political systems who don't farm and have no gold.

So, it seems to me that the English project of settlement wouldn't have succeeded in the face of the Aztec and Inca empires. Even in the Caribbean, the climate was more suitable for plantation agriculture than for settler agriculture. So, insofar as US/Canadian egalitarian institutions (and hence successful economic development) were due to the duplication of English society and exclusion/genocide of Native Americans, it seems that English colonists wouldn't have created a society in Central and South America conducive to prosperity and democracy.

And to AdamSmithee, I would suggest that diseases matter not just in making Europeans not want to settle in malarial regions, but also in making it (much, much) easier for them to settle in certain regions where Old World diseases kill 90% of the native population.

Thursday, January 20, 2005

Update on Summers

Larry Summers has apologized for offending people with his remarks on women in science. Here is his official statement (pdf).

Now let's brace ourselves for the debate about whether or not he should have apologized. (Correct answer: yes.)

DOMA upheld by federal court

Yesterday a federal district court upheld the federal Defense of Marriage Act from 1996, which says that same-sex marriages are not recognized by the federal government, and that one state need not recognize another state's same-sex marriages.

Maybe the Republicans will finally shut up about the "pressing need" for the Federal Marriage Amendment.

Or maybe not:
"Today we have witnessed a significant victory -- for marriage and democracy,'' said Tom Minnery of Focus on the Family. The group is pushing for an amendment to the Constitution that would ban same-sex marriages.

"Unfortunately, at any time, marriage in any jurisdiction is only one judge away from being ruled unconstitutional.''
This "democratic" anti-judicial argument for the FMA just got a lot weaker. If DOMA holds, then there's no reason for any other state to worry about same-sex marriage in Massachusetts - they can't keep using this rhetoric of emergency. (I can't see the Supreme Court overturning this district ruling, and precedent is pretty strong in judicial rulings.) And as same-sex marriage becomes more accepted in Massachusetts, they can't even complain that it's all "judicial tyranny." In fact, they want to impose the tyranny of the red states on Massachusetts. So their position is exposed (which we knew all along, of course): simple animus against gay people.

Wednesday, January 19, 2005

Some links

Apologies for light posting this week, as I'm really busy. For now, a few quick hits:

This is alarming: CPR is often done wrong. And by real doctors, not just random people who got CPR-certified by the Red Cross.

The new double-decker Airbus A380. When I was a kid, I read about this in a magazine (back when the plane was still on the drawing board) and thought it was really cool. I always had a fascination with the upper deck on a Boeing 747-400 - how cool to have an upper deck all the way along! My enthusiasm has dimmed, but it's still pretty neat.

And continuing on the theme of how really simple things can do a lot of good in developing countries, a simple water filter system may be saving tens of thousands of lives in Sri Lanka.

Tuesday, January 18, 2005

Why do democratic revolutions succeed?

One of the notable things about democratic revolutions is that despite the "overpowering will of the people," they so often fail. In the face of a determined dictatorship, peaceful demonstrators will just about never succeed in overthrowing the government. In some sense, this is a tautology; peaceful = not overthrowing, by definition. But this observation also reveals its mirror image: that peaceful demonstrators only succeed if the government is willing to let them; that is, if the government is sufficiently weakened, divided, or driven by conscience not to fire on the demonstrators.

This was the case in Eastern Europe. Demonstrations and small steps toward democracy were crushed time and time again: 1953 in East Germany, 1956 in Hungary, 1968 in Czechoslovakia, and 1981 in Poland. Suddenly in 1989 everything was different. It wasn't that the people were more discontented or bold; it was that the Soviet Union under Gorbachev let up a little, and they took advantage. The Brezhnev Doctrine was replaced by the Sinatra Doctrine. Even in October 1989, as crowds marched in the streets of Leipzig demanding democracy, the Communist Party could have quashed the movement with force. But with Gorbachev no longer backing repression by force, the authorities vacillated, and Kurt Masur helped persuade the security forces not to fire. The government of East Germany effectively threw itself out of power by inaction. In contrast, earlier that year China did not hesitate, and the Tiananmen revolution failed.

It seems that a similar story can now be told about Ukraine. Apparently, senior Ukrainian intelligence officials managed to prevent the government from firing on the protestors.
As protests here against a rigged presidential election overwhelmed the capital last fall, an alarm sounded at Interior Ministry bases outside the city. ... More than 10,000 troops scrambled toward trucks. Most had helmets, shields and clubs. Three thousand carried guns. Many wore black masks. Within 45 minutes, according to their commander, Lt. Gen. Sergei Popkov, they had distributed ammunition and tear gas and were rushing out the gates.

Kiev was tilting toward a terrible clash, a Soviet-style crackdown that could have brought civil war. And then, inside Ukraine's clandestine security apparatus, strange events began to unfold. ...senior intelligence officials were madly working their secure telephones, in one instance cooperating with an army general to persuade the Interior Ministry to turn back...
Read the rest... It's a fascinating story.

Innate differences and sexism

There's been much uproar (and some praise) over Larry Summers (president of Harvard, former Secretary of Treasury under Clinton) recently saying that innate differences between men and women could help explain why women don't do as well in science.

One of the interesting things about this debate is that whether or not men and women have innate differences in math- and science-relevant skills, the question is almost completely irrelevant to the issue that actually matters, namely discrimination against women in science and academia.

This sounds counterintuitive. After all, if men really were better at science than women, it would seem to be okay if there were more male scientists than female scientists, at least if we take a strict meritocratic view.

But, almost no one is arguing that innate differences account for all of the gender gap in the ranks of scientists overall, tenured faculty, or whatever. Meanwhile, it is quite clear to anyone who even briefly considers the matter that discrimination is presently a real problem. Not just overt discrimination (which hopefully is pretty rare these days). There is also covert discrimination, such as elementary school teachers unconsciously responding more positively if a boy expresses interest in math than if a girl does the same; unconscious biases among those who judge scientists-in-training and those who hire new scientists and professors; subtle forms of sociability like the guys in the department hanging around chatting about football. There is also a subtle institutional discrimination inherent in the career structure of science: science forces people to prove themselves in their field by working extremely hard between the ages of 25 and 35 - graduate school, postdoctoral fellowships, assistant professorship - which just so happens to be the age range when many women want to have kids and can't spend 100 hours/week in the lab. This is not something inherent to the pursuit of knowledge or about quantitative skills: this is about unjust social institutions that really do block lots of fantastic scientists from advancing.

So long as these forms of discrimination exist, it will remain irrelevant whether men or women have differing scientific abilities, because a nonzero fraction of the gender gap will be due to discrimination and not to actual ability. Even if in the perfectly meritocratic world there were 5% more male physicists than female physicists, that would still be far better than the current sad state of affairs.

Update, 19 Jan: The Gene Expression post that I linked to, which praised Summers' comments, has a reply to my post. Razib critiqued my point that even ordinary social interactions cause covert discrimination by saying, "what are you going to do, stop guys from hanging out together in the lab?" (EW in comments below makes a similar point.) As I said in comments over there, I think this criticism is somewhat misplaced. Obviously I would not be in favor of regulating workplace speech, beyond egregious things like sexual harassment. The fact of women and men talking about different things is just a symptom of the underlying cause, namely that majorities are self-perpetuating. Sometimes, just being aware of unconscious bias can help dispel it. It's also possible that groups like "University of X Women in Physics Society" or what have you can provide some social support. We should recognize forms of covert discrimination even if it's hard to address them directly - maybe there are still ways to address them indirectly (as would be fitting for covert discrimination, after all). Isn't ending both covert and overt discrimination a worthy goal in a liberal society, even if it is a difficult and complicated goal?

Update, 2: Matthew Yglesias expresses similar thoughts here.

Monday, January 17, 2005

Inverted fertility?

Geneticists in Iceland have discovered a major genomic inversion that appears to promote female fertility and longevity. In some people, a 900,000-base-pair region on chromosome 17 is flipped around, and this inversion is most common in Europeans (20% of the population has it) but rare in Africans and Asians. The inversion also shows signs of having been favored by natural selection in the recent past, suggesting that there's something in the European environment that favors it. And very weirdly, the inversion diverged from its more common counterpart 3 million years ago - suggesting that either there is some selective advantage to having both copies, or that the inversion entered the Homo sapiens population through interbreeding with a now-extinct hominid population shortly before modern humans left Africa.

How fascinating! It's got all the interesting elements of a population genetics story - signs of natural selection, clues of an unusual evolutionary history, a mysterious reproductive and survival advantage... The best part of the article is near the end:
Dr. Stefansson said his findings were empirical observations for which functional explanations have yet to emerge. It is not clear why the inversion should affect fertility or longevity, why it is favored in Europeans or how it has endured for three million years.
What an understatement! Part of the explanation is that the inversion increases recombination, which helps fertility. But why longevity? With such a large region, I'm sure there are a ton of possible explanations - a gene near the edge loses its promoter; the genes interact with the chromosomal packaging proteins differently; etc. I'm also puzzled by this: if this inversion is so great (have more offspring and live longer! a Darwinian dream), why hasn't it spread throughout the whole human population? Does it only work in Europe?

So many questions!

Update, 20 Jan: Keats' Telescope agrees that the analysis in the paper is valid and also wonders what on earth is going on with this inversion. This is the cool thing about science... every answer raises new questions.

Sunday, January 16, 2005

Genetically engineered milk

This is cool:
A cow called Pampa Mansa could be the key to cutting the high cost of human growth hormone, which is used to treat thousands of children with growth problems. The genetically modified Jersey cow produces so much of the hormone in her milk that just 15 cows like her could meet the current world demand.

Human growth hormone once had to be extracted from human cadavers but is now made in genetically modified bacteria. This form of the hormone is safer, but treatments can cost $30,000 a year.

To create an alternative source, a team led by Daniel Salamone at the University of Buenos Aires in Argentina added the human gene to cow cells growing in a dish. Pampa Mansa was created by cloning one of the modified cells.

At the age of one, Pampa Mansa was already producing 5 grams of the hormone per litre of milk, 10 per cent of the milk's protein content. That translates into at least 4 kilograms a year, over four times as much as a typical bacterial fermenter, the team will tell a meeting of the International Embryo Transfer Society in Copenhagen next week.
Someone is currently trying to do the same with spider silk, one of the strongest materials by weight known to man (5 times stronger than steel, twice as elastic as nylon, and biodegradable too). (Amusingly enough, you can't farm spiders the way you can silkworms, because the spiders eat each other.)

I wonder if anyone has tried doing this with insulin? I remember reading that the drive to produce recombinant human insulin in E. coli was the impetus behind the start of the "molecular biology revolution" in the late 1970's, and yet when they actually figured it out, it wasn't a whole lot cheaper than harvesting pig pancreases.

Saturday, January 15, 2005

Redistricting, redux

Since I last posted about Arnold's plan to end gerrymandering in California, I've come to favor it more strongly (perhaps I should have trusted my initial reaction). Here's why.

The current gerrymandering setup is designed primarily not to increase Democratic majorities but to preserve the seats of incumbents, because it was created in a bipartisan deal approved by legislators of both parties. The Democrats could have done far worse in terms of sheer partisanship. Peter Beinart says,
When it redistricted in 2001, California's state legislature drew congressional lines that virtually guaranteed reelection for every incumbent, Republican and Democrat. As a result, in 2002, only one of the state's 53 districts witnessed a contested race (and that district wouldn't have been competitive either had Representative Gary Condit not gotten embroiled in the scandal over murdered intern Chandra Levy). Two years later, in 2004, not a single California House seat changed party hands. 
This has two implications: first, that ending gerrymandering wouldn't necessarily lose the Democrats many seats; and second, that the current plan also hurts Democrats by making it impossible for us to pick up new seats. So the situation isn't necessarily the harsh "Prisoner's Dilemma" I thought it was.

Plus, as Jesse Zink points out, Democrats do need a galvanizing issue. The Republicans swept into power in 1994 partly on the idea that Congress needed a breath of fresh air. After only 10 years, they have corrupted it with their ethics shenanigans and DeLay's disgusting gerrymandering in Texas. We need another breath of fresh air, and we Democrats ought to provide some moral leadership.

On the actual merits of the redistricting plan, some have argued that "political" matters ought to stay in the hands of politicians. For example, an article via Jesse Zink,
I've always held the almost incontestable position that redistricting is the most political of acts. And my correlative, therefore, has been that the process belongs in the hands of politicians, no matter how unattractive the work often becomes.

I've even made fun of silly ol' progressive Iowa for, after years of partisan legislative wrangling over redistricting as the fortunes of Democrats and Republicans in the Hawkeye State waxed and waned, giving the job, at least in the initial stage, to, egads, bureaucrats.
But that's an absurd objection on its face: Democracy is about limiting the naked exercise of power (which, after all, is what politics is at its heart). Voting is "political" too, but it doesn't therefore follow that voting belongs in the hands of politicians. In fact, usually we want to make sure the people in charge of elections are nonpartisan. It is precisely because certain elements of democracy are so central, so vital, so sensitive to corruption, that we have to take them out of the hands of those in power. This is just an extension of the principle of "separation of powers" that frames the whole US Constitution.

If Arnold manages to hammer out a consensus plan that works as well as Iowa's does, I'll support it.

Friday, January 14, 2005

Hooray for Huygens

This is exciting!
A European spacecraft plunged through the murky atmosphere of Saturn's moon Titan today and successfully came to rest on a bizarre landscape of mystery never before explored....

The first picture from the Huygens spacecraft did nothing to undermine the reputation of Titan as a strange and otherworldly place. The picture showed terrain of what appeared to be deep channels leading to the shoreline of a dark, flat surface, possibly one of Titan's hypothesized lakes of liquid methane.
The European Space Agency has released the first-ever picture of Titan's surface. (Also on the NY Times article.) Very cool!

When I was a boy I read that Cassini would be launched in 1998 and arrive in 2004. I thought that was ages and ages away and I could hardly wait. How quickly time flies...

Hope for tropical diseases

Earlier this week I posted about the lack of research into tropical diseases that mainly affect poor developing countries, like Leishmaniasis and sleeping sickness. This week's New Scientist has some encouraging news:
But now, small teams led by Ferguson and a few other groups of like-minded academics around the world are making a determined effort to find a way round the impasse. By bringing together a variety of disciplines and adapting commercial methods, they hope to come up with a set of safe and effective modern medicines designed to save the lives of the world's poorest people. [...]

Ferguson has borrowed a technique more usually employed by multinational drug companies: high-throughput compound screening, or HTS. After buying a commercial library of compounds and a robotic screening system, Ferguson can test 100,000 compounds against a chosen target. [...]

Ferguson is relying on charities and research councils. [...] Because these projects don't fit within the traditional academic framework, getting the money together hasn't been easy. [...] "In order to obtain funding you need to get research published in high-quality journals; drug discovery doesn't work like that."

As for the drug companies, Fairlamb believes they still have a place. Most of the bigger ones already put resources, albeit limited ones, into treatments for malaria and TB. And smaller generic drug companies, particularly in Asia, are in a good position to manufacture tropical medicines. "If we can offer the industry high quality leads for development, they will be under pressure to take them on." Ferguson is more cynical: "It depends on how [the drug companies] are feeling about PR at the time."
A blend of public and private approaches, of the academy and industry. Here's hoping they succeed.

Scientific "torture," again

In my last post, I criticized the BBC's use of the word "torture" to describe some harmless neuroscience experiments. Via Majikthise, apparently the Guardian has also referred to the experiments as torture. A quick Google search reveals that this 'torture' meme is pretty common. I wonder if they all came up with it independently (which would be very disturbing) or if it was in the press release. MSNBC's story says,
British newspapers gleefully pounced on the story Wednesday. One headline read, "Believers go on rack to prove God relieves pain," alongside pictures of medieval torture.

In reality, while the university's experiment has little in common with the Inquisition's infamous torture implement, the test will not be pleasant. [...] "Sixty degrees centigrade is considerable; it’s a strong enough signal for people to respond, but not enough to cause enduring harm," the doctor said, adding that a gel of chili powder could also be used.
At least someone's sensible.

May I also note that the Times article's lead is "People are to be tortured in laboratories at Oxford University in a United States-funded experiment to determine whether belief in God is effective in relieving pain." The insinuation that the U.S. is funding torture for science goes even beyond the other stories, by referring specifically (if indirectly) to Abu Ghraib and Guantanamo. The U.S. really is sponsoring torture, and to confuse that real torture with a harmless experiment is completely unhelpful. Note also how this lead plays into the idea that the U.S. is a bunch of religious crazies, so of course we would use torture to test religious faith - and into the idea that we are 'outsourcing' torture to other countries.

So the media prints stuff like this and then wonders why the public doesn't trust scientists anymore? I wonder why...

Wednesday, January 12, 2005

Scientific "torture"??

The BBC features a headline today 'Torture' to uncover brain secret. It's not really about torture at all - some psychologists are going to study pain using brain imaging, to see if religious faith relieves pain. In fact, the pain is quite mild and subjects have volunteered for it: "Volunteers will have a gel containing chilli powder or heat-pad applied to the back of their hand to simulate pain." I also heard the story on Radio 4, and the head of the brain science institute called it "a mild discomforting stimulus." It's got to be mild if people are volunteering for it! (And the study had to pass the Institutional Review Board research ethics review, of course.)

It's not just the headline - the 'torture' motif continues throughout the article:
Some volunteers will be shown religious symbols such as crucifixes and images of the Virgin Mary during the torture. ...

The team from the newly-formed Centre for Science of the Mind also want to include people with survival techniques in the torture experiments.
Now let's review what torture actually is:
torture means any act by which severe pain or suffering, whether physical or mental, is intentionally inflicted on a person for such purposes as obtaining from him or a third person information or a confession, punishing him for an act he or a third person has committed or is suspected of having committed, or intimidating or coercing him or a third person, or for any reason based on discrimination of any kind, when such pain or suffering is inflicted by or at the instigation of or with the consent or acquiescence of a public official or other person acting in an official capacity
The misleading motif of torture in the BBC article makes me a bit uncomfortable. On the one hand, it trivializes real torture, something that is actually a big deal, by cheaply capitalizing on the world's horror at Abu Ghraib to grab the reader's attention with an eye-catching headline (especially since torture is generally used to uncover secrets, so the shocking reading of the headline has a certain plausibility). And on the other hand, it contributes to a misperception of scientists as cruel monsters. There is a danger of science falling prey to evil -- witness Josef Mengele and to a lesser extent, American military doctors in Guantanamo and Abu Ghraib -- but blurring the distinction between real torture experiments and "a gel containing chili powder or heat-pad applied to the back of their hand" is unfair, both to good scientists and to the victims of the real torture experimenters.

I'm not sure if it was a reporter or one of the investigators who used the phrase 'torture experiments' - either way it's irresponsible and disturbing.

Update, 13 Jan: As reader Sylvain notes in Comments (and as you would see if you click on the BBC story), the BBC has now changed the text of the story so that 'torture' is replaced by 'pain' or 'burnt.' Except for the captions to the picture and the inset. ("Torture is being used to help scientists understand how the brain works" and "How scientists plan to torture volunteers.") Since it's not in the text or headline anymore, the usage is less disturbing (though still not entirely neutral), so... good for the BBC. I hope they remember to change the captions. Thanks to Sylvain for pointing that out.

Scientific ignorance. Or make that ignorance, period.

I have long been alarmed by polls showing the scientific ignorance of the American people. For example, 25% of Americans believe the sun goes around the earth; 46% don't know that it takes one year for the earth to go around the sun; 52% believe that early humans lived at the same time as dinosaurs; and 35% think that radioactive milk is safe to drink as long as you boil it. I thought this might be because, at least in the case of "the sun goes around the earth," some religious beliefs conflict with scientific knowledge.

Then I found out that Americans are pretty ignorant about religion too (LA Times article - free registration required).
According to a 1997 poll, only one out of three U.S. citizens is able to name the most basic of Christian texts, the four Gospels, and 12% think Noah's wife was Joan of Arc. That paints a picture of a nation that believes God speaks in Scripture but that can't be bothered to read what he has to say.
Two out of three Americans can't name the four Gospels? Some Christian nation we are. I'm no Christian and even I know that the Gospels are Matthew, Mark, Luke, and John.

(Via Kevin Drum.)

Update, 13 Jan: Dan of Sound and Fury points out that Europeans are pretty scientifically ignorant too (pdf link). This was the most upsetting:
52% of Europeans consider astrology "rather scientific."
Astrology is total bollocks. Few things make me more mad than pseudoscience masquerading as science.

The Invention of Siestas

The New York Times has an interesting article today on the problems posed by the traditional Spanish siesta in modern Spain. The gist is that whereas Spaniards used to take a long nap in the afternoon during the siesta, nowadays many people don't have time to go home (long commutes, etc.) so instead take really, really long lunches that last until 5pm. Unfortunately, that means the workday lasts from 9 to 8, and they end up depriving themselves of sleep because they go to bed so late but don't recover the sleep during an afternoon nap. So people are talking about getting Spain in sync with the rest of Europe, and the siesta is declining under the pressure.

But for me, the most interesting part of the article is that the siesta, so much a part of Spanish identity and tradition, isn't even that old of a tradition:
Mr. Buqueras said many Spaniards mistakenly believed that a long break at midday had always been a part of the Spanish lifestyle. "As late as 1930, lunchtime was between 12 and 1, and dinnertime started at 7 or 8," he said. "If you look at the newspapers or novels from the beginning of the century, they all show it."

What is unclear, he said, is why habits changed. Some historians point to the Spanish Civil War, which was fought from 1936 to 1939. It is possible, Mr. Buqueras contended, that "the hunger that is always caused by wars forced people to work two jobs to survive," one in the morning and one at night. The midday break would have given them time to get from one job to the other. "But there are no definite causes," he said.
Reminds me of the relationship between nationalism and the invention of tradition. Granted, siestas surely weren't started as a method to bolster Spanish national identity; but I bet someone could write a neat Ph.D. thesis on how a quirky work schedule possibly induced by wartime evolved into a part of what it means to be Spanish, so that after a few decades, the prime minister says that "The work schedule is what distinguishes Spaniards, but it is also what defines us." Even the BBC says, "For centuries in Spain, heading home mid-afternoon for lunch and a snooze was regarded as something of a national right."

Tangled Bank

My post on reproductive cloning has been included in this week's Tangled Bank carnival. Thanks to coturnix for including my post. Go check it out - there's a lot of cool stuff there, such as this neat article about the evolution of whales and adaptive immune systems (favorites for the creationist "that's impossible" fallacy); this post about how boring infrastructure improvements can save lives; and this article about how tandem repeats help explain the huge variation among different dog breeds. Enjoy!

Tsunami and a solution to the problem of evil

According to geologists, it's possible that life would not exist on earth without the plate tectonics that also happen to cause earthquakes, volcanoes, tsunamis, and general death and destruction. On the most basic level: Volcanism on the early earth is probably what spewed forth the atmosphere and oceans (in the form of water vapor). No oceans and no atmosphere = no life (at least as we know it). Also, without plate tectonics, over millions of years, carbon dioxide would get locked up in rocks as carbonate, and without that greenhouse gas, the earth would freeze over. Luckily, with plate tectonics, volcanoes spew out carbon dioxide, thus completing the carbon cycle. On a secondary level: volcanoes make soil fertile by spreading ash with helpful minerals for miles around. Tectonic action concentrates mineral deposits like gold. And so on.

This could provide some support for an interesting solution to the problem of evil (i.e., how can there be a God who's both all-loving and all-powerful when there is so much suffering and evil in the world?): the "modal realism" solution. (Hat tip: Crooked Timber.)
Let’s assume the following metaphysical claims are all true.
  • There is a class of abstract possible worlds W. (I’m not going to say what abstract and concrete amount to in any of this - on this distinction see Gideon Rosen’s SEP entry.) In other words, weak modal realism is true.
  • God cannot change any of those worlds without destroying it - what happens in a world is essential to its nature.
  • What God can do is make any of them that He chooses concrete. Abstract possible worlds have no moral value, but concrete worlds do have value, or disvalue if they are bad, so this choice is morally loaded.
  • God’s creation is timeless, so He can’t create one and then tinker with it. For each world He faces a take-it-or-leave-it choice. [...]
If all this is true [a big if!], what should God do? Well, I think He should create all and only the worlds such that it is better that they exist than that they not exist. And that will include worlds, like this one, that are not perfect but that contain more goodness than suffering. So the existence of this world as concrete entity is compatible with God’s existence, and indeed His omnipotence and benevolence.
With respect to the tsunami, one could postulate that when God was deciding whether or not to create this particular universe, he knew that the laws of nature in this universe dictated that life could only evolve if Earth had active plate tectonics, but that these plate tectonics would also create lots of pain and suffering. And perhaps that pain and suffering didn't outweigh the goodness of life and humanity evolving. Given all the discussion of the problem of evil in the wake of the tsunami (and in light of the historical precedent for such discussion), this point is well worth pondering.
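The selection rule in the quoted argument - create all and only the worlds whose existence is better than their nonexistence - can be sketched in a few lines. This is my own toy framing, with invented world names and made-up goodness and suffering scores; nothing below comes from the linked post.

```python
# Toy version of the "weak modal realism" selection rule: each possible
# world has a fixed goodness and suffering that cannot be altered (no
# tinkering allowed), and all and only the net-good worlds get made
# concrete. The worlds and scores here are invented for illustration.
worlds = [
    {"name": "perfect",        "goodness": 10, "suffering": 0},
    {"name": "tectonic-earth", "goodness": 8,  "suffering": 5},  # life, but also tsunamis
    {"name": "hellish",        "goodness": 1,  "suffering": 9},
]

# Create a world only if it is better that it exist than not:
created = [w["name"] for w in worlds
           if w["goodness"] - w["suffering"] > 0]

# An imperfect world like ours makes the cut alongside the perfect one,
# while a world whose suffering outweighs its goodness does not:
assert created == ["perfect", "tectonic-earth"]
```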

It's a sort of Deist conception of God (especially the idea that universes are timeless and take-it-or-leave-it so God logically can't tinker with them). I'm not sure that it addresses the problem of evil posed by the traditional Christian God, who sometimes intercedes with miracles in the Bible, etc. And it's hard to see how AIDS, the Holocaust, etc. could possibly be beneficial in the same way as plate tectonics.

In any case, the problem of evil doesn't especially distress me since I don't believe in God. Still, this possible solution is intriguing (and at least forces one to focus on the real reason for not believing in God, namely the lack of evidence thereof).

Tuesday, January 11, 2005

Transient aid

From today's New York Times, an example of what I was talking about here and here: the victims of the tsunami will need sustained aid commitments, not just emergency relief. The headline says it all: "For Honduras and Iran, World's Aid Evaporated." When Hurricane Mitch struck Honduras, everyone promised lots of aid and sustained commitments - but it didn't happen.
But just as the first new bricks were being laid, the United States Congress set a two-year-deadline on Washington's reconstruction programs, contending that the money was for emergency relief. And when the federal money was gone, so were many private organizations, whether their projects were finished or not.

Phil Gelman, regional adviser for the United States Office of Foreign Disaster Assistance, was working in Honduras for Care International at the time. That deadline, he said, doomed American projects to fail at achieving long-term goals.
Let's not make the same mistake this time.

Let's not default on our pledges either, as happened after the Bam earthquake a year ago:
A year after an earthquake with a magnitude of 6.9 destroyed the central city of Bam, killing more than 40,000 people and leaving almost as many homeless, the streets there are still strewn with mounds of rubble. Tens of thousands of people who lost their homes remain crowded in prefabricated housing. [...] Iranian officials reported that they had received only $17 million of the $1 billion pledged by the international community to help rebuild the 2000-year-old city.

Monday, January 10, 2005

The poor man's diseases

Via AdamSmithee, an article in Reason by Ronald Bailey praising for-profit pharmaceutical companies for their efforts in combating infectious diseases, even those of the poor. He tries to deflect criticism that less than 10% of global health R&D is spent on 90% of the world's diseases, and argues that, actually, most diseases affecting the poor already have effective treatments and it is poverty per se, and hence inability to afford medicine, that causes disease, rather than the selfishness of Big Pharma.

First, a point about river blindness (onchocerciasis): Bailey points out that the drug for river blindness was developed by Merck and is now being given away for free. He neglects to mention that the drug was actually originally developed to treat parasitic infections in horses, because treating horses owned by rich people is more profitable than treating poor people. This isn't exactly a bad thing - in fact, it shows how even the profit motive can produce serendipitous results. But it's not exactly a guaranteed way of discovering future medicines to treat diseases of the poor. And Merck isn't donating the medicine merely out of good will, but rather because the poor people who need it couldn't afford to pay anything anyway. Most of the costs of drugs go into R&D, not actual drug production - since Merck already had this drug lying around that was originally designed for horses, they figured they might as well give it away and gain the good publicity. Again, not a bad thing - but also not a reliable way to generate more drugs. Certainly not reliable in the way that the profit motive will generate drugs for rich people's diseases. (Viagra, anyone?)

Meanwhile, the attack on the 10/90 claim is a strawman. It's certainly true that HIV, TB, malaria, and diarrheal diseases are generally the biggest killers of the poor (so in terms of sheer lethality, the "90%" of diseases receiving only 10% of global R&D aren't that bad). But the 10/90 claim is just a rallying cry, shorthand for the inequity of the global health system. Taking the real killers: Leishmaniasis, malaria, trypanosomiasis and tuberculosis account for 5% of global disease burden (in terms of Disability-Adjusted Life Years, or DALYs) and certainly more of the disease burden in poor countries; yet they receive only 0.5% of global health R&D. Bailey himself admits that leishmaniasis and trypanosomiasis (sleeping sickness) still lack effective treatments.
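A quick back-of-the-envelope check on the figures just quoted (5% of disease burden versus 0.5% of R&D) makes the size of the mismatch explicit:

```python
# The figures quoted above: leishmaniasis, malaria, trypanosomiasis
# and TB carry about 5% of global disease burden (in DALYs) but
# receive only about 0.5% of global health R&D spending.
burden_share_pct = 5.0
rnd_share_pct = 0.5

# These diseases are funded at one-tenth of their share of the burden:
underfunding_factor = burden_share_pct / rnd_share_pct
assert underfunding_factor == 10.0
```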

Furthermore, Bailey overstates the extent to which the big killers have "effective methods of treatment and prevention." Let's leave aside the ridiculous implication that there's no need for drugs if a disease is easily prevented (imagine telling a heart attack patient, "no need for bypass surgery! you could have prevented this with a healthier diet"). Though malaria and tuberculosis have effective treatments that just need enough resources and public health infrastructure to be put into practice (a pretty big hurdle in many places!), the microbes will always be evolving resistance. Multidrug-resistant tuberculosis is already a problem, and it's only a matter of time before malaria becomes resistant to the last "ultimate" treatment, artemisinin combination therapy. Meanwhile, HIV is completely incurable, and antiretroviral drugs are expensive because of the profit motive. Some companies have started selling antiretrovirals more cheaply in poor countries, which is great; but we should have more of that. Plus, because it mutates so rapidly, HIV exists in multiple strains all around the world. The West is mostly affected by HIV-1 strain B, but the Third World has myriad other strains - A, C, D, and CRF02 in Africa; B, C, and CRF01 in Asia. And these may not necessarily be as susceptible to treatments or vaccines being developed in the West as strain B.

In the end, Bailey is partly right in saying that the ultimate cause of all this unnecessary suffering is poverty. But diseases also cause poverty, and we can't overcome the vicious circle without a serious attempt to address the Third World's diseases. The Gates Foundation and the Global Fund to Fight AIDS, TB and Malaria are welcome funders in that fight. And we need not heavily tax drug company profits or destroy their patent rights (another strawman); we can, for example, guarantee that Fund X will buy a certain amount of drug Y if a company develops it, and thus provide an incentive to research and treat the diseases of poverty.

GM trees

The Economist reports that genetically modified trees may be on their way. The goal is faster growing trees that make cheaper and better paper.
Lignin is one of the structural elements in the walls of the cells of which wood is composed. Paper is made from another of those elements, cellulose. The lignin acts as a glue, binding the cellulose fibres together, so an enormous amount of chemical and mechanical effort has to be expended on removing it. The hope is that trees can be modified to make less lignin, and more cellulose.

In a lucky break, it looks as though it might be possible to achieve both goals simultaneously. A few years ago a group of researchers at Michigan Technological University, led by Vincent Chiang, started the ball rolling. They produced aspens, another species of poplar, that have 45% less lignin and 15% more cellulose than their wild brethren, and grow almost twice as fast, as well. The mixture the team achieved leaves the combined mass of lignin and cellulose in the trunk more or less unchanged and, contrary to the expectations of many critics, the resulting trees are as strong as unmodified ones.
Given the argument about genetically modified field-crops that has taken place in some parts of the world, genetically modified forests are likely to provoke an incandescent response. [...] In the case of trees it might not even be necessary for the gene to jump species. GM trees, with immunity to insect pests and faster growth rates than their unmodified competitors, might simply spread by the normal processes of natural selection. That really would be survival of the fittest.
I don't usually object very strongly to GM crops, but even for me this seems to be pushing it. Especially when the seeds of poplar trees have fluffy hair to spread in the wind, and aspens grow in colonies with new seedlings growing up to 40 m away from the parent tree.
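As an aside, a little arithmetic shows what the quoted figures actually imply: if 45% less lignin and 15% more cellulose leave the combined mass "more or less unchanged," the trees must start with roughly three times as much cellulose as lignin. The starting masses below are illustrative, not real measurements:

```python
# Consistency check on the quoted numbers: unchanged combined mass
# requires the lignin lost to equal the cellulose gained, i.e.
#   0.45 * lignin == 0.15 * cellulose  =>  cellulose == 3 * lignin
lignin, cellulose = 1.0, 3.0      # illustrative starting masses (3:1 ratio)

new_lignin = 0.55 * lignin        # 45% less lignin
new_cellulose = 1.15 * cellulose  # 15% more cellulose

before = lignin + cellulose
after = new_lignin + new_cellulose
assert abs(after - before) < 1e-9  # combined mass unchanged
```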

Friday, January 07, 2005

Brain drain

The UN has a new report out on the need for more scientific input into plans to end global poverty.
Prof Juma said: "We have seen with the challenges which southeast Asia has faced in eliminating poverty and hunger that scientific and technical capabilities determine the ability to provide clean water, good healthcare, adequate infrastructure and safe food. [...]

Prof Juma's report will identify information and communications technology, biotechnology, nanotechnology, and new materials as "platform technologies" that will have "profound implications for long-term economic transformation" in developing countries.

"Universities also have a vital role to play in economic development, particularly in training new generations of skilled scientists and engineers. But we need to fully utilise the talents of developing country scientists for development, irrespective of where they are located. It is ironic that some developing countries are putting their scarce resources into education and training that benefits the developed world," he added.
You'll get little argument from me on the importance of science and technology in alleviating poverty, especially on things like health care and clean water. Still, the last point highlights a certain tension in the concerns expressed by this post I wrote a couple weeks ago. On the one hand, I like science and want it to do well, and as an American I want U.S. science to do well. On the other hand, U.S. dominance in science relies, in a certain way, on suppressing the scientific infrastructure of other countries - for example, by the brain drain whereby the most talented people from developing countries are sucked away into the scientific machine in the U.S. (Much as U.S. military dominance relies indirectly on the relative poverty of most other countries, especially India and China.)

The decline in U.S. dominance in science doesn't really solve the problem because the "rising powers" in science are either other developed countries (Europe and Japan) or developing countries that are developing pretty quickly (China and India). The really poor countries are still stuck without the scientific infrastructure that could help a "long-term economic transformation." And it's not necessarily enough just to have "scientific advisers" from rich countries go into poor countries to provide the new technologies, because foreigners often don't understand the local cultural context.

I'm not sure what to do about this, other than encouraging scientists to go back home to help their home country out of poverty. Maybe the forthcoming report has some ideas...

The political economy of underdevelopment

Russell Roberts points out one of the major barriers to development.
...the richest people in the poorest countries live as well as the richest people in the richest countries. The real problem is that in most so-called poor countries, the powerful people, the government and their friends, live at the expense of the rest of society.
This is related to an argument that Andrew Janos has made. Historically, underdeveloped countries were stuck in a vicious cycle that kept them underdeveloped. Janos spoke about 19th century Eastern Europe. The problem was that Britain and the rest of northwest Europe managed to industrialize first at least partly because of the success of the agricultural and commercial revolutions at improving agricultural productivity (i.e., grain yields): it allowed the commercial class to accumulate capital and invest it. Unfortunately, these advances spread out of the northwest "core" at a glacial pace, such that Eastern Europe was always about 200 years behind northwest Europe in terms of agricultural productivity.

After Britain industrialized, a perverse mechanism kicked in: the industrial nations started exporting nifty new goods around the world: cotton clothes, cool gadgets, etc. Elites in other countries noticed the new goods, and because industrial countries exported tastes and values as well as goods, elites in underdeveloped countries wanted these new goods. Unfortunately, they couldn't afford them. Domestic populations in industrial countries could afford to buy these new goods because they were caught up in the cycle of increasing prosperity. But the underdeveloped economies of Eastern Europe couldn't support that kind of consumption. So Eastern European elites kept spending to "keep up with the Joneses," and spent themselves into debt and bankruptcy. Not exactly a good foundation for domestic capital accumulation and investment.

So there is a sense in which Russell Roberts is right - there is a vicious cycle of underdevelopment in many poor countries. But it's not quite the way he says, which is that corrupt governments suck away aid money. Obviously, corrupt governments do suck away aid money, but why not still try harder to invest more wisely? Especially with programs that have been shown to work, like giving free meals to schoolchildren so starving kids will want to go to school. Maybe we can still break the vicious cycle after all.

Industrialization and the state

Don Boudreaux's right to say that the state didn't consciously plan Britain's industrialization, but in quoting the economic historian Mathias, he happily skips over a very important passage:
All later industrializations have been much more involved with public initiative and imported capital...Even in the United States...American railways often prospered, like many of their colleges, on land grants from state governments.
Before jumping to the conclusion that Britain's example disproves statism, we should consider why no one else followed Britain's example.

The fundamental uniqueness about Britain is that Britain industrialized first. It did so when technology was comparatively primitive so that the takeoff industry was textiles, and not steel and railroads. Textile factories are cheap, so they could be set up with the capital that Britain's gentry class had accumulated after the increase in agricultural productivity that resulted from the agricultural and commercial revolutions. Later countries industrialized after technology had advanced to railroads and steel, which are "high-capital" industries. These industries needed things like land grants for railroads, big infusions of money beyond what a single entrepreneur could provide.

Perhaps more importantly, the very fact that Britain was unintentionally first highlights the favorable factors that made industrialization in Britain (but nowhere else) so easy. Things like - easy water transport (Britain is an island, after all); landless laborers impoverished by the Enclosure Acts and desperate for work; a population of technically proficient entrepreneurs; a Parliament made up mainly of the commercial class; a huge export market for textiles in India; etc.

Indeed, Eric Hobsbawm has argued that the export market - i.e., colonies created by a commercially minded Parliament - provided the spark necessary to kick start industrialization, while the domestic market provided the fuel for slow and steady growth. If you are a textile manufacturer in pre-industrial England, what motive could you possibly have for investing in new, industrial forms of production to drive your production exponentially upward? Why not just keep your old pre-industrial forms and expand them to meet domestic demand? It was only the sudden and drastic expansion of demand created by colonialism that sparked manufacturers to industrialize.

Actually, if you think about it, the unique case of Britain proves nothing about the general need for a state in industrialization. Before 1750, no one had ever conceived of industrialization (that is, the radical re-orienting of society toward a market economy, inanimate sources of energy, mechanization, exponential increases in production, etc.). Such a thing had never existed in the history of mankind. So obviously - tautologically - it could only come about spontaneously. Meanwhile, it most noticeably did not come about spontaneously in any other country in the world.

Obviously, a too-strong state is bad for industrialization - you do need a free market after all. But the fact that the British state didn't consciously push Britain toward industrialization doesn't contradict the idea that the state was necessary in many ways, or that conscious state intervention can successfully promote industrialization.

Thursday, January 06, 2005

Redistricting in California

I was excited about Schwarzenegger's plan to end gerrymandering in California until I read Kevin Drum's post on why he can't support the plan. Gerrymandering is awful and the plan is pretty good, but it would only increase Republican dominance of Congress, and DeLay would laugh his head off that he managed to get away with his naked power grab while the idealist liberals in California surrendered some of their own power in the name of democracy.

It's like the Prisoner's Dilemma. If all 50 states end gerrymandering together, everyone wins (less extreme partisanship because legislative candidates actually would cater to the center; less bias in favor of incumbents; a more truly democratic system of representation). But if blue states end gerrymandering first, Democrats lose but Republicans win; if red states end gerrymandering first, Republicans lose but Democrats win; and if no one ends gerrymandering, we all lose. Unfortunately, rational actors will always choose to screw the other guy, so it looks like gerrymandering is here to stay. (Or, perhaps the solution is for a swing state to lead the way and end gerrymandering?)
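The Prisoner's Dilemma structure here can be made concrete with a toy payoff matrix. The numbers below are purely illustrative, chosen only to reproduce the ordering described above (mutual reform beats the status quo, but unilateral reform is worst of all for the reformer):

```python
# Toy payoff matrix for the gerrymandering game described above.
# Only the ordering of the payoffs matters, not the numbers themselves.
PAYOFFS = {
    # (blue_choice, red_choice): (blue_payoff, red_payoff)
    ("end", "end"):   (3, 3),  # everyone wins: fairer, less partisan maps
    ("end", "keep"):  (0, 4),  # blue unilaterally disarms; red gains seats
    ("keep", "end"):  (4, 0),  # the reverse
    ("keep", "keep"): (1, 1),  # status quo: we all lose
}

def best_response(their_choice, player):
    """The choice maximizing this player's payoff, holding the other
    player's choice fixed (player 0 = blue states, 1 = red states)."""
    def payoff(mine):
        key = (mine, their_choice) if player == 0 else (their_choice, mine)
        return PAYOFFS[key][player]
    return max(["end", "keep"], key=payoff)

# Whatever the other side does, "keep" is each side's best response,
# even though mutual "end" would leave both sides better off:
for theirs in ["end", "keep"]:
    assert best_response(theirs, 0) == "keep"
    assert best_response(theirs, 1) == "keep"
```

So "keep gerrymandering" is a dominant strategy for both sides, which is exactly why the mutually preferable outcome never happens without coordination (or a swing state going first).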

It also reminds me of Robert Dahl's Polyarchy, in which he argued that democracy would have the best chances of emerging/surviving if the costs of losing were relatively low (i.e., you wouldn't be executed, have your whole estate confiscated). Could it be that the stakes are just too high these days for democratic reform? (Obviously gerrymandering is not like political executions, but I'm speaking relatively here...)

Well, maybe Schwarzenegger will win the day and California will stand as a city upon a hill of representative democracy. It could be good in the long run, I guess...

By the way, I am also a bit bothered by the somewhat Bonapartist strain in the way Schwarzenegger keeps threatening to "go to the people" with a popular referendum if the legislature won't go along with his demands. Something troubling about superseding normal checks and balances with easily manipulated mass democracy centered on a personality cult...

Update, 11 Jan: Jesse Zink expresses similar thoughts re: referenda at Doubly Sure, a thoughtful and well-written blog that I just discovered (amazing - the "Next Blog" link on the Blogger bar actually turned up something good!)

Is U.S. foreign aid adequate?

I left some comments yesterday at Sound and Fury in response to Dan's posts about U.S. generosity and Dan was kind enough to respond with a whole post, so let me do the same.

Dan's right that I conflated overall foreign aid with official development assistance in my comments to his posts. Compared to most other countries, Americans actually give quite a lot of money in overall foreign aid, even though our official development assistance is not great in terms of direct aid. But, after all, this was the point I was trying to make yesterday: that even though it's great that we provide a lot of emergency relief, that's not enough - we have to provide the kind of aid that helps developing countries stand on their own two feet so that they can modernize and prosper enough not to need so much emergency relief in the future. And we have to give aid to less "glamorous" problems like education, clean water, and infectious diseases.

But the question of whether America is stingy relative to other developed countries is really beside the point. Dan was right to say that it is not my main concern. And I don't think it really ought to be the main concern - even if America were more generous than all other countries, you could still say (and I would say) that even the commendable generosity that we do show is not enough to meet the objective needs of developing countries.

After all, the comment by that UN official that triggered this whole debate about U.S. stinginess referred to all Western countries, not just the U.S. Are we not, all of us in the developed world, stingy?

A final note - Dan remarked that constantly berating Americans for being stingy would make generous Americans feel unappreciated, and thus donate less. Certainly, point taken about using carrots and sticks, attracting more flies with honey, etc. Still, it seems to me that Americans donate money to foreign aid not because this will make pundits praise American generosity, but because donating money to foreign aid is a Good Thing. (Aid by democratic governments sensitive to public opinion is another matter. Thus, we should praise Bush for pledging $15 billion to fight AIDS, but also criticize him for not fully following through on the pledge.)

Update: Thanks to Dan for linking here. He's got an excellent and well-written blog.

American meritocracy on the rocks?

I read this article in The Economist this morning, which seemed to go along with what I wrote near the end of this post.
A growing body of evidence suggests that the meritocratic ideal is in trouble in America. Income inequality is growing to levels not seen since the Gilded Age, around the 1880s. But social mobility is not increasing at anything like the same pace: would-be Horatio Algers are finding it no easier to climb from rags to riches, while the children of the privileged have a greater chance of staying at the top of the social heap. The United States risks calcifying into a European-style class-based society. [...]

Most Americans see nothing wrong with inequality of income so long as it comes with plenty of social mobility: it is simply the price paid for a dynamic economy. But the new rise in inequality does not seem to have come with a commensurate rise in mobility. There may even have been a fall.
When even an economically right-leaning journal like The Economist says that social mobility is not living up to the free market ideal, you know there's something wrong.

I was especially struck by this passage:
So far there are few signs of a reform movement. Why not?

The main reason may be a paradoxical one: because the meritocratic revolution of the first half of the 20th century has been at least half successful. Members of the American elite live in an intensely competitive universe. As children, they are ferried from piano lessons to ballet lessons to early-reading classes. As adolescents, they cram in as much after-school coaching as possible. As students, they compete to get into the best graduate schools. As young professionals, they burn the midnight oil for their employers. And, as parents, they agonise about getting their children into the best universities. It is hard for such people to imagine that America is anything but a meritocracy: their lives are a perpetual competition. Yet it is a competition among people very much like themselves—the offspring of a tiny slither of society—rather than among the full range of talents that the country has to offer.
That sounds very true, and perhaps explains why college-age kids are becoming more conservative these days: the American dream is still alive but only within the narrow world of the organization kid.

(I'll leave aside for now the idea that even perfectly "fair" meritocracy is unfair in the sense that successful people don't really "deserve" their success because successful talents and work ethic are determined by genetics and upbringing. I kind of like Rawls's Difference Principle, that inequality is justified to the extent that it benefits the least well-off. That is, meritocracy is instrumentally good in encouraging economic prosperity and thus helping the poor, rather than inherently good in that the winners deserve their success. At least we can hopefully all agree that meritocracy is at least better than entrenched class-, gender-, and race-based prejudice.)

Wednesday, January 05, 2005

More on international aid

Nicholas Kristof has a column today related to what I wrote about yesterday. Basically he says that disaster aid is great, but less visible needs like malaria, AIDS, clean water, and education are at least as important (if not more so). We could do a lot of good with very little money, with things like $5 insecticide-treated mosquito nets, to prevent malaria transmission; drugs that can cure TB (currently infecting one-third of the world's population) and cost about $1/day; free meals in schools to make sure kids show up. And we ought to be giving a lot more money to those issues than we are.

These issues are all tied up with the kind of prevention that I was talking about yesterday. Disease and illiteracy are huge barriers to ending global poverty and creating prosperity, and prosperity is ultimately the only way to get the kind of safe buildings and good infrastructure that could bring earthquake death tolls down from 1 million to 50,000. There's a psychological block - an attentional deficit - that prevents us from giving these problems the money they deserve, even as we freely give after-the-fact disaster aid.

It's not just that giving money to these simple and cost-effective programs will save lives immediately (which it undoubtedly will, and which is a great thing). It's also that these basic problems are holding developing countries' economies back in ways that people often don't think about. Malaria, for example, is a significant burden on many developing countries: it makes children miss or drop out of school or learn more slowly; adults lose productivity; tourists don't want to visit the area; etc. So reducing the burden of malaria not only saves lives right now, but it will save lives in the future by helping developing countries become prosperous (and thus build working health care systems, end hunger and starvation, etc.). The same goes for TB, HIV/AIDS, diarrheal diseases, and others. A similar argument could be made about education and women's rights.

Meanwhile, Daniel Drezner points out that though the U.S. is very stingy in direct aid, as Kristof said, we aren't so bad on other things like immigration (many immigrants send part of their wages back home) or smart investment (encouraging development). These other policies do help alleviate poverty by helping development. Still, they can't do as much to save lives as directly attacking infectious diseases. So, we really ought to give more money to charities like Doctors without Borders and The Global Fund to Fight AIDS, Tuberculosis and Malaria.

Tuesday, January 04, 2005

On prevention

The New York Times had an article on Sunday on the old theme, "earthquakes don't kill people; falling buildings kill people." This saying is somewhat less applicable to the horrible tsunami disaster last week, because really, huge ocean waves actually do kill people, but it's still true that things like better house construction, public health infrastructure, etc. can prevent a lot of the damage that natural disasters cause, and could have saved a lot of lives last week (and in the coming weeks as the threat of infectious disease among the survivors looms large). I saw a picture in the Times today of a village in Aceh, where the entire village was gone except for a single mosque that was made of concrete. Would people have been safer if the whole village had been made of concrete? (Maybe; maybe not...)

This is a theme that keeps cropping up. I remember after the floods in Haiti this past summer, people lamented that Haiti's poverty meant that the temptation would be to rebuild in a way that would leave the next inevitable disaster just as deadly: reconstruction on the cheap leaves residents in flimsy shacks that are okay as temporary makeshifts, but no good once the disaster aid money dries up and residents are still stuck in flimsy shacks. Same thing after the earthquake in Bam about a year ago. (I get the sense that the sudden inflows and outflows of money are actually destructive, in some ways worse than doing nothing at all, kind of like yo-yo dieting.) And on a more global scale, the temptation is always to provide money for HIV/AIDS treatment, but not prevention, when in fact the cost-effectiveness of prevention in terms of lives saved is on the order of 25 times better than the cost-effectiveness of treatment. Of course, the point isn't to replace post-disaster aid with pre-disaster prevention, but to add prevention to emergency aid (with the expectation that in the future, less emergency aid would be necessary).

There is a psychological barrier, of course, to prevention: you'll never know if it worked. People feel the pressure to donate money after the emergency, but not to invest money in infrastructure that would prevent or mitigate future emergencies. I was struck by the following absurd statistic from the New York Times article:
Mozambique, anticipating major flooding in 2002, asked for $2.7 million to make basic emergency preparations. It received only half that amount from international donor organizations. After the flood, those same organizations ended up committing $550 million in emergency assistance, rehabilitation and reconstruction financing.
This is a real problem, perhaps too deeply rooted in human psychology to get around. Perhaps all we can do is to try to remember to continue giving money even after the immediate emergency has passed (both by giving money privately and by pressuring our governments to give more money). World poverty requires sustained aid and policy commitments, not passing waves of sympathy.

Comments, part 2

I switched the comments system yet again after discovering this site so now all comments should appear together and you can sign your name...