Monday, April 27, 2009

Today's Universities are as Outmoded as Detroit

New York Times op-ed contributor, Mark C. Taylor, pens a column on the state of the University.

GRADUATE education is the Detroit of higher learning. Most graduate programs in American universities produce a product for which there is no market (candidates for teaching positions that do not exist) and develop skills for which there is diminishing demand (research in subfields within subfields and publication in journals read by no one other than a few like-minded colleagues), all at a rapidly rising cost (sometimes well over $100,000 in student loans).

In other words, young people enroll in graduate programs, work hard for subsistence pay and assume huge debt burdens, all because of the illusory promise of faculty appointments. But their economical presence, coupled with the intransigence of tenure, ensures that there will always be too many candidates for too few openings.

He gives one example in particular:
The division-of-labor model of separate departments is obsolete and must be replaced with a curriculum structured like a web or complex adaptive network. Responsible teaching and scholarship must become cross-disciplinary and cross-cultural.

Just a few weeks ago, I attended a meeting of political scientists who had gathered to discuss why international relations theory had never considered the role of religion in society. Given the state of the world today, this is a significant oversight.

Read the whole column here.

Link to Executive Summary of the Levin Report

The Levin report opens with this epigraph:
What sets us apart from our enemies in this fight... is how we behave. In everything we do, we must observe the standards and values that dictate that we treat noncombatants and detainees with dignity and respect. While we are warriors, we are also human beings. - General David Petraeus

Here's the link: Executive Summary of the Senate Armed Services Committee report on Detainee Treatment, also known as the Levin Report.
The abuse of detainees in U.S. custody cannot simply be attributed to the actions of "a few bad apples" acting on their own. The fact is that senior officials in the United States government solicited information on how to use aggressive techniques, redefined the law to create the appearance of their legality, and authorized their use against detainees.

Sunday, April 26, 2009

Reckoning with America's Practice of Torture

In Frank Rich's column this weekend, he suggests that the recently declassified "Bybee memos" fill in holes in the timeline of events after 9/11 such that denial is no longer possible.
Still, it’s not Bybee’s perverted lawyering and pornographic amorality that make his memo worthy of special attention. It merits a closer look because it actually does add something new — and, even after all we’ve heard, something shocking — to the five-year-old torture narrative. When placed in full context, it’s the kind of smoking gun that might free us from the myths and denial that prevent us from reckoning with this ugly chapter in our history.

[Abu Zubaydah's] most valuable contribution was to finger Khalid Shaikh Mohammed as the 9/11 mastermind. But, as Jane Mayer wrote in her book “The Dark Side,” even that contribution may have been old news: according to the 9/11 commission, the C.I.A. had already learned about Mohammed during the summer of 2001.

As soon as Bybee gave the green light, torture followed: Zubaydah was waterboarded at least 83 times in August 2002, according to another of the newly released memos.

What motivated the enhanced interrogations?
Maj. Paul Burney, a United States Army psychiatrist assigned to interrogations in Guantánamo Bay that summer of 2002, told Army investigators of another White House imperative: “A large part of the time we were focused on trying to establish a link between Al Qaeda and Iraq and we were not being successful.” As higher-ups got more “frustrated” at the inability to prove this connection, the major said, “there was more and more pressure to resort to measures” that might produce that intelligence.

In other words, the ticking time bomb was not another potential Qaeda attack on America but the Bush administration’s ticking timetable for selling a war in Iraq; it wanted to pressure Congress to pass a war resolution before the 2002 midterm elections.

Indeed, it has been said by people who would know that torture is not useful for gathering intelligence; it is useful for creating false confessions, which, of course, have their own uses.
Five years after the Abu Ghraib revelations, we must acknowledge that our government methodically authorized torture and lied about it. But we also must contemplate the possibility that it did so not just out of a sincere, if criminally misguided, desire to “protect” us but also to promote an unnecessary and catastrophic war. Instead of saving us from “another 9/11,” torture was a tool in the campaign to falsify and exploit 9/11 so that fearful Americans would be bamboozled into a mission that had nothing to do with Al Qaeda.

Read the complete column here.

Saturday, April 25, 2009

Experimental Philosophy and Acting Voluntarily



The relevant passage from Aristotle's Nicomachean Ethics:

"Done under compulsion" means that the cause is external, the agent or patient contributing nothing towards it; as, for instance, if he were carried some-where by a whirlwind or by men whom he could not resist.

But there is some question about acts done in order to avoid a greater evil, or to obtain some noble end; e.g. if a tyrant were to order you to do something disgraceful, having your parents or children in his power, who were to live if you did it, but to die if you did not — it is a matter of dispute whether such acts are involuntary or voluntary.

Throwing a cargo overboard in a storm is a somewhat analogous case. No one voluntarily throws away his property if nothing is to come of it, but any sensible person would do so to save the life of himself and the crew.

From the translation of F.H. Peters

Wednesday, April 22, 2009

Torture and the Truth

Andrew Sullivan at the Atlantic makes an interesting connection between the Western Enlightenment's devotion to finding the truth and the prohibition on torture.
The Western anathema on torture began as a way to ensure the survival of truth. And that is the root of the West's entire legal and constitutional system. Remove a secure way to discover the truth - or create a system that can manufacture it or render it indistinguishable from lies - and the entire system unravels. That's why in the West suspects are innocent before being found guilty; and that's why in the West even those captured in wartime have long been accorded protection from forced confessions. Because it creates a world where truth is always the last priority and power is always the first.

Read his post here.

Thursday, April 16, 2009

Education and Opportunity

A very interesting column by Nicholas Kristof about I.Q. in the New York Times today.
Professor Nisbett strongly advocates intensive early childhood education because of its proven ability to raise I.Q. and improve long-term outcomes. The Milwaukee Project, for example, took African-American children considered at risk for mental retardation and assigned them randomly either to a control group that received no help or to a group that enjoyed intensive day care and education from 6 months of age until they left to enter first grade.

By age 5, the children in the program averaged an I.Q. of 110, compared with 83 for children in the control group. Even years later in adolescence, those children were still 10 points ahead in I.Q.

Tuesday, April 14, 2009

Twitter Will Not Morally Corrupt You

The blog at Discover Magazine confirms my suspicion about the article I mentioned earlier in the week, which said that Twitter could undermine your morality.
Quick! Grab the latest scientific study that may have something remotely to do with Twitter! Run it with a “Twitter Will Destroy Humanity!” headline! With a graphic by Hieronymus Bosch!

Somehow, this finding has grave implications for Twitter, since, according to Immordino-Yang:
For some kinds of thought, especially moral decision-making about other people’s social and psychological situations, we need to allow for adequate time and reflection.

Uhh, seriously? So does that mean pre-Internet humanity, with its countless hours of reflection, was also blessed with impeccable morality?

What Immordino-Yang says there may be too vague to assess, but for some kinds of moral cognition, time and reflection are not necessary, as the recent empirical moral psychology of Professor Jonathan Haidt and others has shown.

Posner on Morality and the Downturn


Here is the best one-paragraph summary of the depression/recession that I've seen.

It's by Richard Posner in an interview with Dwyer Gunn at the Freakonomics blog at the New York Times Online. I'll quote it in full.
Q: What caused the financial crisis? Was it the government’s fault?

A: The government was the facilitator of the crisis, in the following sense. Banking (broadly defined to include all financial intermediation) is inherently risky because it involves borrowing most of one’s capital and then lending it, and the only way to create a spread that will pay the bank’s expenses and provide a return to its owners is to take more risk lending than borrowing — for example, borrowing short (short-term interest rates are low, because the lender has little risk and great liquidity) and lending long (so the lender has greater risk and less liquidity). The riskiness of banking can be reduced by regulation. But as a result of a deregulation movement that began in the 1970’s, the industry was largely deregulated by 2000. Then the Federal Reserve mistakenly pushed down and kept down interest rates, which led to a housing bubble (because houses are bought with debt) and in turn to risky mortgage lending (because mortgages are long term and there is a nontrivial risk of default); and when the bubble burst, it carried the banking industry down with it. The effect on the nonfinancial economy was magnified by the fact that Americans had little in the way of precautionary savings built up. Their savings were concentrated in risky assets like houses and common stock. When the value of those savings fell steeply, people’s savings were inadequate, so they curtailed their personal consumption expenditures, precipitating a fall in production and sales, a rise in unemployment (which made the still-employed want to save even more of their income, lest they lose their jobs too), and, in short, the downward spiral we’re still in.

That's pretty good, I think: clear, concise, and it doesn't take too many shortcuts that rely on special knowledge. But then he goes on to say something pretty enlightening about the moral outrage that's been directed at the "excessively risky behavior" of some in the financial community. It's a problem of regulation, not greed. And the case for regulation is consistent with Adam Smith.

Part of maximizing profits, however, is taking a certain risk of bankruptcy; it does not pay for a firm to reduce that cost to zero. Banking occupies a strategic role in the economy because of the importance of credit to economic activity; borrowing to spend increases consumption — it is how we shift consumption from future to present....

Moreover, banking is the main instrument by which the Federal Reserve creates money, and by doing so reduces interest rates (provided inflation is not anticipated; for if it is, long-term interest rates will rise), which in turn spurs economic activity. By buying government bonds, it pours cash into banks, both directly, when it buys the bonds from banks, and indirectly, when it buys the bonds from private owners but the owners deposit the cash they receive from the purchase into their bank accounts.

When banks start to hoard cash because their solvency is impaired, the money they receive from the Federal Reserve’s purchasing activity does not spread into the rest of the economy. That is why a cascade of bank bankruptcies is far more serious than a cascade of, say, airline bankruptcies. But a rational businessman does not, indeed cannot afford to, consider the cost of bankruptcy to the economy as a whole as distinct from the cost to his firm. So the rational banker will take more risk than is optimal from an economy-wide standpoint. That is the logic of profit maximization, as explained long ago by Adam Smith: the businessman cares about his costs and his revenues, but not about the costs and revenues incurred or received elsewhere in the economy. He is not an altruist. The responsibility for preventing the collapse of the banking system is the government’s, and it has been shirked, with extremely serious consequences.


That is quite an indictment of the regulatory bodies. Does it let the bankers off too easily? Or does he make a good point about risk and the limiting of risk? For one thing, this seems to apply only to risk-taking and not to the big bonuses, which were also a source of moral outrage.
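As a purely illustrative aside on the mechanics Posner describes, here is a toy sketch in Python of the borrow-short, lend-long spread and of why the profit it generates comes bundled with risk. Every number below is invented for illustration; none of it comes from Posner or from any real bank's balance sheet.

# Toy model of a bank's spread: borrow short at a low rate, lend long at a
# higher rate. All figures are made up purely for illustration.
assets = 100_000_000        # funds borrowed short-term and lent long-term
short_rate = 0.02           # rate paid on short-term borrowing
long_rate = 0.06            # rate earned on long-term lending
default_rate = 0.01         # fraction of loans assumed to go bad in a normal year

gross_spread = assets * (long_rate - short_rate)        # $4,000,000
normal_income = gross_spread - assets * default_rate    # $3,000,000

# In a bust, defaults rise; a five-fold increase wipes out the spread entirely.
stressed_income = gross_spread - assets * (default_rate * 5)   # -$1,000,000

print(f"normal year:   ${normal_income:,.0f}")
print(f"stressed year: ${stressed_income:,.0f}")

The point of the sketch is only that the same position that produces the spread in a normal year produces the loss in a stressed one; the individual banker captures the first and, as Posner argues, has no reason to price in the economy-wide cost of the second.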

Photo from Wikipedia

Monday, April 13, 2009

Twitter and Facebook could harm moral values, scientists warn

I found the article rather unconvincing.
Using Twitter and Facebook could harm moral values, as they don't allow time for compassion or admiration, scientists have warned.

What do you think?

Provider Right of Refusal


Stanley Fish blogged at the New York Times last night about the "conscience clause," which says, roughly, that a pharmacist who has a moral objection to certain medications can refuse to sell them even though the medications are legal and consumers want them.

It's interesting, as his stuff usually is.
This sequestering of religion in a private space is a cornerstone of enlightenment liberalism which only works as a political system if everyone agrees to comport himself or herself as a citizen and not as a sectarian, at least for the purposes of public transactions.

Read Fish's post and come back here to leave a comment.

Photo culled from The New York Times

Saturday, April 11, 2009

Genealogies of Morals


The New York Times opens a review of Richard John Neuhaus's book "American Babylon" with a reference to the possibility of cloning Neanderthal man, which the Parr Center blog discussed earlier here.

But the book review is less about cloning Neanderthal man and more about religion and the public sphere or the philosophical foundations of democracy.
The fulcrum of “American Babylon” is, in effect, a simulated debate between Neuhaus and the American philosopher Richard Rorty (who died in 2007). Rorty argues precisely that we do just make up morality, and that there is no way to privilege one citizen’s first principles over any others.

Rorty holds that, as with Oakland, Calif., there is no there “out there.” The smartest people are therefore “ironists.” The ironist believes that we know nothing except our own vocabularies, that “nothing has an intrinsic nature, a real essence,” that concepts like “just” and “rational” are simply “the language games of one’s time.” An ironist may worry “that she has been . . . taught the wrong language game,” but “she cannot give a criterion of wrongness.” The cultural assumptions we share with Plato and Kant are less likely to be “a tip-off to the way the world is” than just a “mark of the discourse of people inhabiting a certain chunk of space-time.” Schools of philosophy or science are just different vocabularies. When an ironist works on developing her vocabulary, she is constructing her self, not getting in closer touch with some underlying reality — for if there is one, it isn’t knowable.

If you find the book review interesting, come back here and post a comment.

Photo of Richard John Neuhaus (Alex Wong/Getty Images for “Meet the Press”) culled from the New York Times

Tuesday, April 7, 2009

David Brooks Pens Column on Empirical Moral Psychology


Brooks' column is the 'most emailed' article today at the New York Times online and another in a growing series, as the popular press gets wind of what's been going on in philosophically informed psychology labs and psychologically informed philosophy departments. I'm going to ignore the headline, which is "The End of Philosophy." No such end is argued for in the article, and in any case it isn't supported by it. I'll chalk it up to wanting an eye-catching hook.

Brooks does, however, more than suggest that moral philosophy, which he calls 'bookish', will be surprised by these empirical results, which show that emotion plays a large part in morality. Lots of comments at the New York Times online cite David Hume as a counterexample to Brooks' notion.

I think that the Parr Center for Ethics' original reporting indicates that Brooks' interest came from his participation a month ago in activities celebrating Darwin's 200th birthday. Brooks chaired a panel discussion (see here and here) at the John Templeton Foundation that included Michael Gazzaniga (UC-Santa Barbara), Jonathan Haidt (University of Virginia), and Steven Quartz (Caltech). Those are the three people he quotes in the column. Here's a bit cut-and-pasted from the transcript of the panel discussion:

Steven Quartz: Well, certainly, philosophers are rightly, I think, accused of emphasizing the frontal part of the brain to the exclusion of all else, historically, although, there are certain important historical counter-examples to that. For example, much of what contemporary moral psychology emphasizes with the role of emotion is what David Hume emphasized in his theory of ethics as well.

So Brooks knows about Hume, we must assume. I guess here, too, Brooks has 'augmented' his content to make it catchier.

Late Update:

Brooks certainly tries to suggest that philosophy is surprised by the role of emotions that psychology has discovered. But let's look to a text many philosophers have engaged with for a long time, Hume's An Enquiry Concerning the Principles of Morals from 1751, whose third paragraph begins like this:
There has been a controversy started of late, much better worth examination, concerning the general foundation of Morals; whether they be derived from Reason, or from Sentiment; whether we attain the knowledge of them by a chain of argument and induction, or by an immediate feeling and finer internal sense; whether, like all sound judgement of truth and falsehood, they should be the same to every rational intelligent being; or whether, like the perception of beauty and deformity, they be founded entirely on the particular fabric and constitution of the human species.

Later update:
Philosophers are discussing here and here.

Still later update:
A letter to the editor of the Salt Lake Tribune.




Photo from Language Log.

Sunday, April 5, 2009

CEO Salaries

The New York Times has published its research on CEO salaries here. It's pretty shocking.

The newspaper reports that the median annual compensation is $8.4 million, with the top earner apparently making $104 million per year.

That $8.4 million figure works out to about $23,014 per day, or roughly $2,877 per hour (assuming 365 working days a year at 8 hours per day). That's great! The median CEO makes more in a day than I make in a year.
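If you want to check the arithmetic, here's a minimal back-of-the-envelope sketch in Python. The 365-day, 8-hour-day assumptions are the same ones used above; any small differences from the figures in the post are just rounding.

# Back-of-the-envelope check of the per-day and per-hour figures above.
# Assumptions mirror the post: paid every calendar day, 8 hours per day.
median_annual_pay = 8_400_000          # dollars per year (NYT figure)
days_per_year = 365
hours_per_day = 8

per_day = median_annual_pay / days_per_year
per_hour = per_day / hours_per_day

print(f"per day:  ${per_day:,.0f}")    # per day:  $23,014
print(f"per hour: ${per_hour:,.0f}")   # per hour: $2,877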

Is there an issue of morality here?

Iowa Supreme Court Rules on Same-Sex Marriage

This is how Time Magazine put it:
Deep in the rural heartland, a straightforward opinion - written by a justice appointed by a conservative Republican governor - methodically eviscerates one argument after another that for decades has been used to keep marriage the sole preserve of straight couples. "This class of people asks a simple and direct question: How can a state premised on the constitutional principle of equal protection justify exclusion of a class of Iowans from civil marriage?" Justice Mark S. Cady asked.

The answer? It can't.

Read the whole decision yourself here (pdf).

Thursday, April 2, 2009

The History of Marriage

At tonight's event, Marriage... Who's Allowed? Who Decides?, a lot of time was spent on the history of the institution (or institutions) of marriage. One panelist claimed that because marriage "came before government" and "government should just recognize what marriage is by design," state or federal law should be changed to limit marriage to one man and one woman. However, two other panelists gave many examples throughout history of different structures of "marriage." There is a history of polygamy all around the world, and also in early Christianity, for example.

How much does the position in favor of restricting marriage depend on a uniform history of the institution that needs protection or conservation? Isn't the basic idea simply that society needs to be "stable" in some sort of Rawlsian sense? Can't the advocate of restricting marriage argue that marriage between a man and a woman is the best environment for the raising of pro-social children?

But the next question, then, is: is that true?

Discussion Thread: Does Rampant Individualism Threaten the Institution of Marriage?

At tonight's event one of the panelists said that even if same-sex marriage were not in fact "a threat to marriage," marriage would still be in crisis for other reasons. In particular, it's rampant individualism that is damaging marriage. People choose to pursue careers rather than get married. People feel entitled to happiness and so divorce rather than "stick it out"; or perhaps rampant individualism also has a component of self-indulgence. In any case, our individualistic society has a social safety net and enough prosperity that it's possible to lead a successful life, whatever that might mean, without ever getting married. This is just to put these thoughts out there. Please leave a comment.

Open Thread: Marriage... Who's Allowed? Who Decides?

This is an open thread to continue online the discussion that was begun at the event tonight, Marriage... Who's Allowed? Who Decides? Just click below to comment on the panel or discuss.