Archive for the ‘Book Review’ Category
The New American Underclass, the Unraveling of America, and Bruce Springsteen: A Review of Someplace like America
According to a recent study by the National Employment Law Project, the vast majority of new hires made since the Great Recession started have been in low-wage jobs. People are losing mid- or high-wage jobs and gaining low-wage jobs. Meanwhile the wealth gap between races is the worst in 25 years. Black families have lost half their median family assets, and Hispanics have lost a shocking two-thirds of their median household assets.
From a bird’s-eye view, then, the Great Collapse of 2008, which many of us thought would be the death-knell of neoliberalism, has turned out to be its best friend. As is often the case, capitalism will come out of the crisis reorganized and restructured: with access to desperate low-wage workers who no longer even expect mid-range wages; with a larger reserve army of the unemployed to keep those with jobs in line; and with a radical, Tea Party-led (and Democratic-abetted) assault on unions, government regulation, and social services. Those who still have jobs are working more hours and seeing their responsibilities multiply, while getting paid less. The age of austerity may finally realize Milton, Barry, and Ronald’s dream of destroying the post-war social compact.
And yet, as Dale Maharidge points out in Someplace like America, a new book of photographs and stories from the Great Recession, for most of low-income America the Great Recession (or as he calls it, “The New Great Depression”) began decades ago. Since at least the 80s, Reaganite economic policy has created a growing underclass of Americans who have remained largely invisible to American society. For a brief moment in 2009, Maharidge remembers, journalists were interested in the tent cities springing up outside of cities in California and Florida. But they wanted to hear about people who had just lost their jobs. In fact, most of the residents had been struggling for years, even during the supposedly good times of the 90s. The Crash was simply the straw that broke the camel’s back.
In some ways it’s an underclass whose lives seem familiar to us: the men and women who ride railroads, sleep under bridges, squat in abandoned factories, hover outside of overcrowded breadlines, and drift from town to town drawn by rumors of work all seem reminiscent of Woody Guthrie songs and John Steinbeck novels. But it is also deeply alien to see it today, which is why many of the most jarring photographs juxtapose images of 21st-century American consumption (a Kenny Rogers ad, a faded Office Depot sign, a Wal-Mart, etc.) with images of 21st-century poverty, to familiarize us with a phenomenon we were told doesn’t exist anymore. Suburbanization and the creation of highways have, to a large degree, pushed this poverty out of sight. The poverty rate in the suburbs, where poverty tends to be much less visible, has been skyrocketing.
In many cases, these are men and women who have known better times. Much better. There is a sense of bewilderment to the unraveling of the Fordist social compact, as people can’t quite understand why they don’t have the opportunities their parents did. There is B.T., found selling his life’s possessions alongside a highway in Tennessee in order to buy food. He was an auto mechanic for years, and was laid off a year and a half ago. “I’m ashamed. I’m the kind of guy who works. I always worked… I haven’t gone to the church yet, because I’m a little embarrassed,” he says. His area of Tennessee was devastated by NAFTA, as the Oshkosh factory closed up and went to Honduras.
The authors go to Youngstown and follow the diaspora of workers, scattered after the devastating closures of the steel mills. In a shack outside of Houston, they find one. His grandfather had worked in the Youngstown mills; he himself had hauled steel. When it all crashed down, he lost his job, drank, and drifted. Now he “babbled nonsense… a creature that once was a man,” and lived on the streets. They interview Sally, an “upper-middle class” mother whose husband was a business-owner. Now she’s waiting outside a food bank in Michigan with a crowd of people. None of the 224 faces she stands with “register as being any different from those you’d see in a suburban shopping mall.”
One picture is of a family in line at a free health care clinic in rural Virginia: the baby stares directly at the camera, while the rest of the family is too embarrassed to look up. The caption is the kicker: the Remote Area Medical Volunteer Corps used to work only with the “desperately poor” in third-world countries. Now it operates largely in rural America.
True poverty, Maharidge points out, takes some time to kick in. At first, when people lost their jobs in Youngstown, they survived. Some found new jobs, others accepted lower pay. Many drifted on unemployment or disability. But eventually these supports dry up, and the longer you’re out of work, the harder it is to get new employment. You are evicted or foreclosed on, your family or friends get sick of you crashing with them, and you move on. Maybe you hear about an opportunity a couple of towns over, but can’t afford a hotel room or a deposit on a rental. So you begin sleeping in your car. Gas and maintenance become too much, so you hitch or walk. Many sell blood or begin scrounging food from waste bins, at first in moments of despair, later as an everyday activity. The embarrassment and stress gets to people, who begin abusing drugs or alcohol. Many are victims of theft, murder, or rape, and rarely do the police investigate.
As the Great Recession continues (and it is continuing for most people…), the middle class is falling into the lower class, the lower class is falling into the poor, and the poor are falling off the social map into the informal economy of scrounging, subsistence farming, petty thievery, homelessness, prostitution, and the like. There are almost certainly more Americans who live like this than, say, Americans who watch the Daily Show. Maharidge and Williamson introduce us to “Edge men,” people who have completely given up on the prospect of employment, and squat or set up tents outside of cities to live permanently outside of society. With years of high unemployment, and millions of so-called ‘99ers’ approaching the end of their benefits, we had better get used to seeing these people.
The authors of Someplace like America had previously collaborated on Journey to Nowhere, which had the distinction of inspiring a number of Bruce Springsteen songs, including Youngstown. There is a line in that song, “those big boys did what Hitler couldn’t do,” referring to the destruction of the steel mills. It comes directly from an unemployed steelworker, Joe Marshall Sr. There is an odd section of Someplace like America where the authors meet Springsteen and sneak into the abandoned Jeanette blast furnace (the “Jenny”) to contemplate what’s happening to America. I’ll give Bruce, who wrote the introduction to this book, credit. In his own way he’s been warning about this stuff for decades, while much of the left couldn’t even use the word “class.” During the heady days of Morning in America, while the Democratic Party began its now unbreakable marriage with Wall Street, he was writing about the consequences of these policies. He saw the future better than most political pundits did.
Even many of us on the left aren’t nearly as comfortable talking about poverty and class as we were in the 30s. We’re comfortable with racialized urban poverty, which makes sense in our post-68, urbanized, information-age worldview, but have pushed rural poverty (white or black) completely out of our imaginations. We live in Brooklyn or Cambridge or San Francisco or Portland, little bubbles that have done much better than the rest of the country. Our discourse is almost completely unable to talk about poverty except as a technocratic problem. The moral core of the problem—that some have so much and some have so little, and that this is a result of collective decisions we have made—cannot fit into our ideological imaginations.
Which is why the photographs by Michael Williamson are so important. I wish there were more. It’s a book that is self-consciously in the tradition of Let Us Now Praise Famous Men (the authors previously had done a follow-up, called And Their Children After Them) and Dorothea Lange. The photos are stark black-and-white images, often of destroyed mills or houses, with only slight traces of human activity. When he focuses on a human subject, the faces are often obscured, as with the child pumping water by hand from the only source of fresh water in Bayview, Virginia: a rusty iron spigot. B.T., the unemployed mechanic selling his life’s possessions from the back of his truck, looks down, ashamed of what he’s doing. Perhaps my favorite photograph is of “Edge Man Ed.” You only see him in silhouette against a gigantic, garish photograph of a smiling Kenny Rogers smoking a cigar. Ed had saved the plastic sheeting from a Rogers billboard and was using it to make a tent.
Williamson walks a fine line between drawing attention to the poverty and helplessness of his subjects and trying to find the dignity and resolve in their faces. It’s hard not to see the same tension in how Maharidge talks about the subjects, or in how we think about them. Too much pity makes them degraded victims; too little anger is inappropriate.
It’s already known that for Janet Malcolm, no profession is sacred, not even her own. Yet while remaining hyper-aware of her role as journalist in her latest book Iphigenia in Forest Hills, she also assumes the mantle and mentality (with intense psychological portraits) of lawyer, judge, and executioner, not to mention father of the dead, daughter of the accused, state-appointed law guardian, and alleged murderess. Some might call it a performative contradiction, but then again she sees all the characters in the trial as performers with deep contradictions. Perhaps she’s merely joining the gang, or perhaps her own performance is intended to highlight the inconsistencies that surround her.
Iphigenia in Forest Hills recounts the murder trial of Mazoltuv Borukhova, a physician and member of the Bukharan Jewish community in Forest Hills in Queens, accused of hiring a hitman to murder her ex-husband after a court ordered their young daughter be transferred into his custody. I recommend it wholeheartedly. About her protagonist, Malcolm writes, “she couldn’t have done it and she must have done it.” This appears on page 32 of 155 pages, and by the end the reader is left with no further conclusion than that. Either we remain satisfied with this impossibility, or we start doubting Janet Malcolm’s authority. But why doubt Malcolm’s authority rather than someone else’s? Take the judge for instance: Robert “Hang ’em” Hanophy, whom one juror (apparently hand-selected for his gray everydayness) says (on page 96) is “real and down to earth and serious about his job. And funny. He had a good sense of humor.” But nearly 90 pages before, Malcolm has already described Hanophy as “a man of seventy-four with a small head and a large body and the faux-genial manner that American petty tyrants cultivate.”
I keep noting the timeline of the book because it tells us something about what Malcolm’s doing here. Malcolm doesn’t ask the reader to reach his or her own conclusions as testimony is laid out; she doesn’t pander to expectations of objectivity. The jurors and judge are already biased toward actions and behaviors that seem legitimate to their own understandings, and Malcolm isn’t about to let them get the monopoly on prejudice. Yet while Malcolm gives her narrative precedence, the nature of the written form allows her thoughts to become interwoven with those of other characters; the reader flips back and forth to re-read a Malcolm characterization of someone an interviewee has presented in a very different light. And so Malcolm’s own narrative can be retroactively challenged. While I was initially convinced by Malcolm’s claim that Borukhova both couldn’t have and had to have killed her ex-husband, at some point I began to doubt that she couldn’t have. Despite this deep paradox, Malcolm is more convinced that she knows Borukhova’s character than I am (though in a recent Paris Review interview, Malcolm admits, “As I went along I felt I understood her less and less… [Borukhova] becomes who you imagined she is.”) Flawed legal evidence abounded, and Borukhova appeared to be a successful career woman, a devoted mother, and quite possibly an abused wife, but none of this convinced me that she couldn’t have done it. Perhaps this makes me the radical relativist to the contrarian Malcolm, characterizations that make generational sense given her birth in 1934 and mine in 1983.
In January 2009 the media midwifed a new hybrid species and dubbed it Octomom. Octomom was 33-year-old Nadya Suleman, a California woman who had an unknown number of eggs implanted using IVF and gave birth to octuplets on January 26, 2009, bringing her total brood to 14. Since then Octomom has never quite left us. Just last week she appeared on Oprah to talk financial difficulties.
Once it became known that Suleman’s octuplets, only the second set to be born alive in the United States, were no miracle but the result of an assisted reproductive technology (ART), that all her other children had also been born through IVF, and that Suleman herself was single and unemployed, a media storm blitzed its way through the nation. The public spiritedly lambasted Suleman as a selfish woman who had irresponsibly used ARTs to bring 14 children into a world in which she and 11.6 million other Americans were unemployed.
Yet just a decade earlier in December 1998 Nkem Chukwu became the first American woman to successfully give birth to octuplets. Chukwu also used IVF to achieve this feat, but the American public did not gnash its teeth at the announcement. Chukwu was portrayed as a tired woman in a wheelchair next to her husband, a woman who discussed how her faith in God had brought her through a hard pregnancy, and who explained that she had refused a selective reduction operation during her pregnancy because she “could not find such words in [her] Bible.” No one pointed out that neither could she have found “IVF” there. Chukwu sacralized the births: “I wanted to have as many babies as God would give me,” and in turn the media portrayed the pregnancy as miracle rather than monstrosity.
In contrast no mention was initially made of Suleman’s refusal to undergo the same selective reduction procedure. A bioethicist at the University of Pennsylvania called the scandal an “ethical failure” and there were invocations only of Suleman’s obsessions, not God’s gifts. Of course Suleman embodied one of the media’s favorite objects of fascination and reproach: young, female, desirous, and with a body that performed feats unknown to natural woman. Like other media favorites, Suleman even got her own hybridized nickname, Octomom, but unlike Brangelina, the hybridity was maternal rather than romantic, interspecies rather than intra-; Octomom was part-mom, part-(marine)-beast, and implicitly part-machine.
Though at first the nickname Octomom seems to reduce Suleman to the sum of her eight kids, the focus on Suleman’s desire or “obsession” instead reduced her eight newborns to herself. The scorn heaped on Suleman’s actions carried the implication that the children should never have been born in the first place, a curious stance for a society obsessed with abortion, celebrity children, and big families like the conservative Christian Duggars and Jon & Kate Plus 8. But Suleman made no attempt to explain her extraordinary pregnancy outside her own personal desires, and she lacked the trappings—husband, comfortable income, religious belief—that might have normalized it socially.
As a result, Octomom became a symbol of selfish enhancement, artificial excess, and irresponsible motherhood, and a reproductive technology that has been used to conceive over 250,000 pregnancies in the United States since the early 1980s suddenly became the focus of intense public discussion, giving bioethicists a platform to point out that while IVF is widely regulated throughout Europe, the US federal government only demands that ART clinics track their success rates.
Was the reaction to Octomom merely symptomatic of society’s anxiety about the impact of new technologies on society, or was something deeper at work concerning our contemporary understanding of maternal agency? I think Carl Elliott’s Better than Well: American Medicine Meets the American Dream is an interesting place to start thinking about the relationship between society, agency, treatment, and enhancement. Elliott theorizes that Americans’ obsession with identity and authenticity helps explain why Americans appear uneasy with enhancement technologies yet seek them in droves:
We need to understand the complex relationship between self-fulfillment and authenticity, and the paradoxical way in which a person can see an enhancement technology as a way to achieve a more authentic self, even as the technology dramatically alters his or her identity.
This authenticity often depends on the assertion of deficiency. By turning a characteristic into a deficit, such as the lack of social ease in those prescribed Paxil, an enhancement becomes a treatment.
Of course this construction of deficient or disabling conditions is an ever-evolving social process with consequences for a person’s understanding of his or her authentic self. Today social phobia is the third most common mental disorder in the US, but 15 years ago it was a rare problem. Diseases are not just culturally symptomatic, they are causal and therein lies the risk. Ian Hacking’s looping effect suggests that the identification of a disease creates the conditions for the manifestation of that disease in others. For instance the emergence of the idea of gender identity disorder gave people a means to conceptualize and reinterpret their experiences around a single idea, in this case a disorder with a surgical solution.
Elliott calls this semantic contagion, and while it is a more complex idea than the gloss I give it here, its relation to the idea of copycatting may help explain the suspicion and fear with which certain diseases or disabilities are approached.
In general, Elliott is sympathetic to those who make use of the possibilities of biomedicine like pharmaceuticals or sex-reassignment surgery to achieve self-fulfillment because he sees bodies, technology, and identity as co-constructive entities. He is even sympathetic to voluntary amputees, who want to cut off their limbs as surgical treatment for what they claim is a psychological condition, asking which is worse: to amputate your leg or to live with an obsession that controls your life. Elliott provocatively suggests that voluntary amputation is fair game in a world where you can “pay a surgeon to suck fat from your thighs, lengthen your penis, augment your breasts, redesign your labia, implant silicone horns in your forehead or split your tongue like a lizard’s.” Thirty years after the first test-tube baby, is Octomom just what society should come to expect?
In an America that takes its individual responsibility seriously and its babies very seriously, how a gestating mother behaves and what she ingests has become increasingly socially and medically monitored. Authors who have explored the construction of fetal alcohol syndrome or tracked the impact of obstetric tools like ultrasound have argued that this has resulted in the objectification and erasure of the mother, and her individual needs, as she comes to embody the potential life within her.
In The Making of the Unborn Patient: A Social Anatomy of Fetal Surgery, Monica Casper traces the implications of what in the 1990s was the relatively new medical field of fetal surgery. In fetal surgery a woman’s fetus is partially taken out of the uterus, operated on, and, if it survives, placed back into her womb for further gestation. As of 1998, fewer than 100 fetuses, all of which would otherwise have died in the womb, had been operated on. Only 35% of the fetuses survived the surgery.
Though the numbers suggest that most women will miscarry or choose to abort a fetus that is likely to die in the womb, Casper sees fetal surgery as contributing to the materialization of the fetal patient at the expense of the mother. The mother and fetus are first separated as subjects, and then one is given preference over the other. Pathologization in this case doesn’t result in the reorientation of an identity but instead in the creation of one subject and the erasure of another.
Casper obviously sees her book as a warning signal to women; they should be aware that in being made invisible, their agency risks obliteration. In becoming patients, fetuses problematically become persons. Casper surely uncovers a discursive realm, with very material consequences, that represents a serious threat to maternal agency. But does she overstate the extent to which the creation of a fetal patient necessarily erases the pregnant woman, or the extent to which such erasure necessarily threatens the woman’s agency?
If we take Carl Elliott’s biomedical world as our own, then bodies are frequently objectified and technologized for one’s own interests. Does the materialization of another subject through this technologization necessarily threaten those interests? I am not doing full justice to Casper, who recognizes that both fetal and maternal interests could be valued in fetal surgery and argues that the field is a ripe area for a women’s health intervention. But I do want to challenge the idea that invisibility, in the face of social and bioethical surveillance, is necessarily a handicap to a pregnant woman’s agency. In a world where increased biomedical capabilities have engendered a field of bioethicists, some of whom warn the public to value mystery in the face of mastery, do efforts to regulate maternal behavior in fact intensify when a pregnant woman’s own subjective desires and agency become visible? In other words, are pregnant women in fact more free because the gestating woman is absent from the sonogram?
The case of Octomom would seem to confirm the idea that unmediated maternal agency provokes surveillance and can even reverse a typically pro-life discourse (though not necessarily its anti-choice iteration). Obviously the media’s issue with Octomom was partly the abnormality of giving birth to eight children at once, combined with the perceived social disadvantage of the children as members of a 14-child family led by an unemployed, unmarried mother. I want to argue, however, that the intense media circus surrounding Octomom suggests that we, beneficiaries of the biomedical era, owners of our own bodies, who need merely pop a pill each day to prevent pregnancy and who can pull out a fetus and put it back in, whose obsession with identity grants us leave to do most of what we want with our bodies, have centered many of our anxieties surrounding the blurry divide between perfection and freakishness, human mastery and mystery, on the bugaboo of maternal hubris.
Yet how to explain the fact that most women who undergo IVF are not seen as hubristic cyborgs? In Making Parents: The Ontological Choreography of Reproductive Technologies, Charis Thompson details the small everyday negotiations that are made to normalize ARTs in fertility clinics. She argues that not just babies but parenthood is constructed in the reproductive clinic, and that we exist in a new biomedical era which requires us to reconceptualize objectification, agency, and naturalness. Rather than seeing a sharp division between personhood and non-personhood, for either the fetus or mother, Thompson sees many forms of fetal personhood that operate in direct relation to the mother’s own expressions of agency:
The clinics deal on a daily basis with human gametes and embryos, which function in this clinical setting as questionable persons, potential persons, or elements in the creation of persons. Embryos, for example, can go from being a potential person (when they are part of the treatment process), to being in suspended animation (when they are frozen), to not being a potential person (when it has been decided that they will be discarded or donated to research), and even back again to being a potential person (when a couple has a change of heart and frozen embryos are defrosted for their own use or for embryo donation).
As in Casper’s narrative, the pregnant woman still comes to embody the potential life attributed to the embryos, but Thompson asserts that this ontological change contains not just objectification but agency and subject-formation—a dense choreography on whose merits Thompson makes no explicit judgments, though she herself used IVF to give birth to her daughter. Just as Elliott’s voluntary amputees objectify their own bodies to achieve a more authentic conception of themselves, women using ARTs allow the medical objectification of their bodies in order to assume the identity of motherhood.
Thompson traces how the work done in reproductive clinics naturalizes kinship and procreative intent, smokescreening patients’ exceptional agency in selecting gametes with certain characteristics or constructing non-normative, and previously impossible, bio-social family structures. Significantly, of course, IVF actually has a high failure rate, and many women require three or four rounds before an embryo implants, though this fact doesn’t necessarily obscure the appearance of extraordinary control, as the ambivalent reactions to the NY Times story of the Twiblings recently demonstrated.
Because of the costs involved, many of those who use ARTs embody a certain socially desirable profile: white, heterosexual, partnered, middle to upper class. Toward the end of the 1990s, however, there was a shift in focus from children’s to parents’ reproductive rights, corresponding to a legal trend protecting privacy in the bedroom. Infertility has become pathologized, so that some states now mandate that insurance companies cover a certain number of treatment cycles. What were once called artificial reproductive technologies, denoting enhancement, are now called assisted reproductive technologies, denoting aid and treatment. Finally, the language of genes has helped reconstruct kinship ties whose traditional linearity can sometimes be disrupted by ARTs. A mother who uses a daughter’s egg to give birth to her daughter’s sister can focus on genetic kinship rather than processual kinship. An Italian-American woman can invoke the idea that genes code for race and ethnicity to seek gametes that appear to represent a specific group identity.
All these factors contribute to the strategic naturalization of ARTs. When the biological facts of parenthood are underdetermined—for instance when a woman gestates a different woman’s egg—legal, medical, and familial conventions step in to naturalize kinship. In turn, biological entities, like genetics, are used to substantiate the social. In this biomedical process neither the natural nor the social is essentialized—elements of each work together and contribute to a recognizable process of “family building.”
The irony, then, is that a woman can achieve a great deal of agency by putting herself at the mercy of medicine, so long as the desire and control that technology grants her to achieve an exceptional, nontraditional pregnancy, in form or substance, are mediated, normalized, and made invisible. And when other forces fail to naturalize an IVF procedure, abortion politics and its close companion, contemporary American religion, have a significant role to play in shaping public perception, as demonstrated by the Nkem Chukwu narrative.
Octomom incurred scathing public scorn and initiated a debate on the regulation of a reproductive technology that has been around for nearly three decades because Suleman made visible—literally embodied—the potential abnormality of ARTs and did nothing to mediate this abnormality through socio-naturalization or by deploying a supernatural discourse of God and miracles. Instead, Suleman’s story was told through the language of human “obsession,” “desire,” and “fixation.” As a result tabloids painted her as selfish and irresponsible, a drain on society’s resources, and the pregnancy as regrettable, the work of human hubris and misappropriated technology.
Ten years ago when Nkem Chukwu had her eight children there was no media storm; in fact the Chukwu octuplets were largely forgotten until Octomom. Nearly two years later, Octomom is still with us. An image of her very pregnant stomach photographed eight days before giving birth saturates the internet—in this photo both Suleman’s stomach and her face, which looks directly into the camera with a half-smile, are distinct and memorable. In October 2010 news sources began to report on the California trial of Suleman’s doctor for negligence. Paparazzi follow Suleman around and blogs speculate about post-pregnancy plastic surgery, the great symbol of American artificiality.
Childbearing in the US is tightly bound to narratives of self-sacrifice—whether it’s the mother who gives up drinking during pregnancy, her career to stay home, or her body to fetal surgery. And while we have reached a point where we endorse a normalized agency and right to parent that supports such sacrifices and naturalizes ARTs as treatments rather than enhancements, maternal self-interest must be mediated and muted, better off obscured than exposed.
Suleman was unusual in her use of reproductive technology to achieve an extraordinary birth, but she was also unusual because she made no effort to portray her pregnancy as natural, therapeutic, miraculous, or self-sacrificial. As a result she became an object of fascination, a much-photographed freakish symbol of hubristic enhancement. Yet the sudden public attention on the question of legal regulation of IVF thirty years after its American birth suggests that Suleman and her pathologized self-interest were also seen as potentially contagious. The border between extraordinary reproductive enhancement and typical treatment was a little too blurry. A fence had to be built, and the media have always been excellent fence-builders. They drew up plans and the easiest way to build it was to turn a woman into a cyborg.
Oh wait, actually it’s a review of Bush Junior’s Decision Points from Eliot Weinberger over at the London Review of Books. Thanks to Mircea, always on the lookout for the absurd, for sending it my way. For those of you who are not regular readers of the London Review of Books or my Facebook wall, I am providing some key moments. Consider it a holiday treat [question: does my use of the term “holiday treat” constitute a battle in the War on Christmas?]. I would provide extensive commentary except that really, at this time of year, all we want is to get to the good stuff:
I will note that the review, presumably reflecting the book, plays as a tragicomedy — the further you go in it the sicker you feel. Apply Foucault to Bush! Clever! But keep going, and the fictionality of the text and the vagueness of the author make your stomach do a few flips. “Who really spoke? Is it really he and not someone else? With what authenticity or originality?” are all fine things to ask about J. M. Coetzee, but they’re not ones you want to have to ask constantly about the actions and words of a president who ran your country for eight years:
‘Damn it, we can do more than one thing at a time,’ I told the national security team.
As I told my advisers, ‘I didn’t take this job to play small ball.’
‘This is a good start, but it’s not enough,’ I told him. ‘Go back to the drawing board and think even bigger.’
‘We don’t have 24 hours,’ I snapped. ‘We’ve waited too long already.’
‘What the hell is going on?’ I asked Hank. ‘I thought we were going to get a deal.’
‘That’s it?’ I snapped.
As Foucault says, ‘The author’s name serves to characterise a certain mode of being of discourse.’
This is a chronicle of the Bush Era with no colour-coded Terror Alerts; no Freedom Fries; no Halliburton; no Healthy Forests Initiative (which opened up wilderness areas to logging); no Clear Skies Act (which reduced air pollution standards); no New Freedom Initiative (which proposed testing all Americans, beginning with schoolchildren, for mental illness); no pamphlets sold by the National Parks Service explaining that the Grand Canyon was created by the Flood; no research by the National Institutes of Health on whether prayer can cure cancer (‘imperative’, because poor people have limited access to healthcare); no cover-up of the death of football star Pat Tillman by ‘friendly fire’ in Afghanistan; no ‘Total Information Awareness’ from the Information Awareness Office; no Project for the New American Century; no invented heroic rescue of Private Jessica Lynch; no Fox News; no hundreds of millions spent on ‘abstinence education’. It does not deal with the Cheney theory of the ‘unitary executive’ – essentially that neither the Congress nor the courts can tell the president what to do – or Bush’s frequent use of ‘signing statements’ to indicate that he would completely ignore a bill that the Congress had just passed.
I never know whether to admire or detest Barbara Bush. I admire her brute strength and the fact that she whips George Junior into shape, but Margaret Thatcher had some of the same qualities. I like that she called her son out for fabricating or at least falsifying the fetus-in-a-jar story. But at the end of the day all one can say is that she might be the best of a very bad lot:
Mother – she’s never Mom – pops up frequently with a withering remark. As middle-aged Junior runs a marathon, Mother and Dad are, of course, coming out of church. Standing on the steps, Dad cheers ‘That’s my boy!’ and Mother shouts ‘Keep moving, George! There are some fat people ahead of you!’ When Junior decides to run for governor, Mother’s reaction is simply: ‘George, you can’t win.’ Not cited is Mother’s indelible comment on the Iraq War: ‘Why should we hear about body bags and deaths? Why should I waste my beautiful mind on something like that?’ But the single newsworthy item in this entire book is the get-this-boy-to-therapy scene where Mother has a miscarriage at home, asks teenaged Junior to drive her to the hospital, and shows him the foetus of his sibling, which for some reason she has put in a jar.
Bush claims this was the moment when he became ‘pro-life’, unalterably opposed to abortion and, later, embryonic stem-cell research. (The thought would not have occurred to Mother. At the time, patrician Republicans like the Bushes were birth-control advocates; like Margaret Sanger, they didn’t want the unwashed masses wildly reproducing. Dad was even on the board of the Texas branch of Planned Parenthood. )
Decision Points flaunts its postmodernity by blurring the distinction between fiction and non-fiction. That is to say, the parts that are not outright lies – particularly the accounts of Hurricane Katrina and the lead-up to the Iraq War – are the sunnier halves of half-truths. The legions of amateur investigative journalists on the internet – as usual, doing the job the major media no longer perform – are busily compiling lists of those lies. Gerhard Schroeder has already stated that the passage in which he appears is completely false. And even Mother has weighed in. Interviewed recently on television, she said she never showed Junior that jar, but maybe ‘Paula’ did. (It was assumed we would know that Paula was the maid.)
And finally the infamous claim that the worst moment of his presidency was Kanye West, which I’m surprised was actually let in by whatever crowd of advisers/consultants/focus groups vetted/wrote the thing:
The book states that, for him, the worst moment of his presidency was, not 9/11, or the hundreds of thousands he killed or maimed, or the millions he made homeless in Iraq and jobless in the United States, but when the rapper Kanye West said, in a fundraiser for Katrina victims, that Bush didn’t care about black people.
West was only half right. Bush is not particularly racist. He never portrayed Hispanics as hordes of scary invaders; Condi was his workout buddy and virtually his second wife; he was in awe of Colin Powell; and he was most comfortable in the two most integrated sectors of American society, the military and professional sports. It wasn’t that he didn’t care about black people. Outside of his family, he didn’t care about people, and Billy Graham taught him that ‘we cannot earn God’s love through good deeds’ – only through His grace, which Bush knew he had already received.
And that’s where the devastation really hits. Because who would want a president who lacks empathy, and why would such a man ever become president except for the most noxious of reasons?
As many of you know, George Bush’s autobiography hit the stands this week. Well, George Bush, of course, isn’t the only figure of world-historical importance to write an autobiography lately. And no, I’m not talking about Tony Blair’s stupid book. So, it’s competition time. Book vs. Book; Memoir vs. Memoir; hell, Life vs. Life. Yes, I’m talking about Keith Richards vs. George Bush.
The differences between the two men are subtle, but real. Both are famous for cocaine abuse. Both have notorious tempers. Both came of age in the 60s (though on different sides of the cultural divide). But only one tried to beat up Truman Capote while doped out in Dallas.
A bit about Bush’s biography first. Bush begins his memoirs by explaining that he had been in discussion with “more than a dozen distinguished historians.” I would love to know who these “distinguished” historians are. Why do I suspect they regularly appear on the History Channel’s UFO Hunters?
Anyways, there are rumors that George hired a ghostwriter. Nonsense. You can’t fake this writing style. Like ending paragraphs about the educational crisis with lines like this: “I had promised to take on the big issues. This was sure one of them.” It’s got that gee-shucks, I-can’t-believe-you-morons-voted-for-me attitude that we all came to love. Plus the rhetoric about freedom and liberty that would embarrass a lazy freshman. Just check out chapter 13—titled, I shit you not, “The Freedom Agenda.” A few choice quotations: “Freedom is not an American value; it is a universal value. Freedom cannot be imposed; it must be chosen. And when people are given the choice, they choose freedom.” As we later learn, evil is real. But, don’t worry, guess what beats evil?… freedom! “The answer to evil was freedom.” Whew… glad we solved that.
In terms of writing styles, Keith is much more free and loose. His description of being busted while on acid, for instance, is quite good. “There’s a knock at the door, I look through the window and there’s this whole lot of dwarves outside, but they’re all wearing the same clothes! They were policemen, but I didn’t know it. They just looked like very small people wearing dark blue with shiny bits and helmets. ‘Wonderful attire! Am I expecting you? Anyway, come on in, it’s a bit chilly out.'” On the downside, his constant reference to female groupies as bitches is a bit disconcerting. He is, after all, almost 70 years old. Nevertheless, casual misogyny aside, Richards wins on writing style. (Richards 1, Bush 0)
Bush has been clear that he is no revisionist historian. He’s no spinmeister. Just a good ol’ boy telling it like it is. Richards, on the other hand, is refreshingly honest that he’s not quite sure how much of himself is now image and how much is real. The whole dirty, dangerous image of the Stones, after all, was consciously crafted by record executives to compete with the pretty-boy Beatles. He’s also aware that his memory was, well, let’s say impaired, during certain large chunks of his life. At one point he acknowledges that “memory is fiction, and an alternative fiction is…” before giving someone else’s version of a story. Self-awareness, then, goes to Richards. (Richards 2, Bush 0)
Both, interestingly, begin their memoirs with stories of substance abuse. George first: “We had a big meal, accompanied by numerous sixty-dollar bottles of Silver Oak wine. There were lots of toasts—to our health, to our kids, to the babysitters who were watching the kids back home. We got louder and louder, telling the same stories over and over again. We shut the place down, paid a colossal bar tab, and went to bed.” The next morning he woke up with a killer hangover that almost prevented him from jogging.
Did you make it through that ok? Real Requiem for a Dream stuff. Luckily for the cause of evil-fighting, Bush gave it up. Redemption, and all that.
Keith, likewise, begins his bio with a drug story. He’s in the South and is pulled over by the cops: “I had a denim cap with all these pockets in it that were filled with dope. Everything was filled with dope. In the car doors themselves, all you had to do was pop the panels, and there were plastic bags full of coke and grass, peyote and mescaline… In the 70s I was flying high as a kite on pure, pure Merck cocaine, the fluffy pharmaceutical blow. Freddie Sessler and I went to the john, we weren’t even escorted down there. He’s got bottles full of Tuinal. And he’s so nervous about flushing them down that he loses the bottle and all the fucking turquoise and red pills are rolling everywhere and meanwhile he’s trying to flush down the coke.” This isn’t really a fair competition, but the award for drug stories goes to Richards. (Richards 3, Bush 0)
Both of them have brief moments of contrition. In one heart-wrenching story George describes his shame at his alcoholism. He’s at a family event. “As we were eating, I turned to a beautiful friend of Mother and Dad’s and asked a boozy question: “So, what is sex like after fifty?” Everyone at the table looked silently at their food—except for my parents and Laura, who glared at me in disbelief.” The next day, he faces an angry Laura, ashamed of his behavior. Interestingly, this is pretty much the only thing that Bush apologizes for in the whole autobiography. Later on Bush brags about authorizing torture (“Damn right”), defends his wiretap program, and wishes he could have privatized Social Security.
Keith also has some shameful memories.
“Some of the most outrageous nights I can only believe actually happened because of corroborating evidence. The ultimate party, if it’s any good, you can’t remember it. You get these brief vignettes of what you did. ‘Oh you don’t remember shooting the gun? Pull the carpet, look at those holes, man.’ I feel a bit of shame and embarrassment. ‘You can’t remember that? When you got your dick out, swinging from the chandelier, anybody up for grabs, wrap it in a five-pound note?’ Nope, don’t remember a thing of it.” Hmm… maybe he’s not too contrite. So in terms of heart-wrenching stories of redemption, we’re going to call a tie. Score stays at Richards 3, Bush 0.
Both, of course, suffer from the occasional lapse of judgment. In Richards’ case, it’s hard to agree with his opinions of the Rolling Stones’ 80s and 90s output. Really? Bridges to Babylon is a good album? Your solo work was good? C’mon, man. Plus his explanation for why Some Girls is not racist is, let’s say, unconvincing (it involves the fact that the Rolling Stones really have slept with lots of different types of women, so they should know what they’re talking about). Bush’s poor judgment, on the other hand, starts in, well, the introduction, continues to page 44, when he acquires the Texas Rangers, and then begins again on page 48 when he stops talking about baseball. The index seems well put together also. (Richards 4, Bush 0)
There are touching moments in each. Keith discussing Brian Jones’ death and the death of his infant child, George Bush visiting wounded soldiers. In fact, my favorite moment of Bush compassion comes when he meets a wounded soldier. This poor guy had a rocket-propelled grenade tear off part of his skull and his right hand, and put shrapnel all over his body. He had a request for Bush: he’d like to become a citizen. See this, my friends, is what compassionate conservatism is all about. After NAFTA destroys the basis of Mexican agriculture, you can come to America, pick fruit for a living, go fight in some imperialistic adventure, get your skull half blown off, and then, then, the President of the United States shakes your hand! Well, not your hand. Because you don’t have it anymore. But still! What a compassionate path to citizenship!
Compassion, though, isn’t really either of their strong suits. Bush, of course, is a complete psychopath responsible for the deaths of hundreds of thousands. And Keith Richards once almost killed a man (at his daughter’s wedding no less) because he stole some of Richards’ onions.
Both are responsible for their share of mayhem and violence. Keith, for instance, informs us that he has been carrying a knife since he was a teenager. He explains the best way to win a knife fight (hint: the knife is for distraction, the main attack should be a kick in the balls), and explains that in his occasional drug purchases that have gone bad, he always takes his chances with gunfire. It’s very difficult to hit a moving target, he reminds us.
Of course there really is no competition in terms of senseless violence. Richards may have Altamont, but Bush has Abu Ghraib and Fallujah. So we have to give the award for senseless loss of human life to Bush. (Richards 4, Bush 1)
In all seriousness, though, the most remarkable difference between the books is that Keith Richards, almost certainly, has more self-awareness and integrity than our president did. What makes Bush’s book such a shitpile is exactly what made him such a terrible president: his total lack of self-awareness and self-consciousness. David Foster Wallace once wrote a great review of a tennis star’s autobiography (“How Tracy Austin Broke My Heart” in Consider the Lobster). He made the point that great athletes excel partly because they are able to concentrate completely on the sport, unencumbered by nagging doubts in high-stakes moments. What makes a tennis player great, then, is exactly what makes an author terrible; hence there are few great sports memoirs. Bush’s book is bad for the exact same reason. There is no self-reflection, there are no moments of humility (besides the obligatory references to the troops, mandated, no doubt, by his PR people), no second-guessing, no shame about the fact that he compares himself to Abraham Lincoln at least five times (pp. 183, 195, 203, 368, and 389). At least Tracy Austin had the excuse that she was a great tennis player. Bush isn’t even interesting in his self-absorption.
With its overt instrumentalization of America’s history for its angrily vague purposes, the Tea Party has brought the question of history’s relationship with political advocacy into sharp national focus. Of course this question is implicitly, ok sometimes overtly, addressed or entertained at this blog — Wiz, in stalwart 19th-century Americanist form, has brought the Tea Party to task here and here for its historical smudging. Sadly, Jill Lepore has beaten this blog to it and recently published a book about the Tea Party’s use of the past for political ends. In my own field, Linda Greenhouse and Reva Siegel just came out with a book of documents illustrating the abortion debates that took place in the pre-Roe v. Wade period. In a later post I’ll talk about the political significance of these documents, which suggest that the “judicial backlash” narrative placing blame for today’s extremely polarized abortion politics at the Supreme Court’s door is misguided, but for now I just want to explore the idea of politicized historicization itself — the implicit assertion that making something historical makes it good.
To take something that is unlikely (unless this blog has an audience I’m totally unaware of) to raise too much political fervor in the comments board, a book I recently read on the psychiatric definition of depression and its policy implications seems a good place to start. With the publication of a series of memoirs in the 1990s, including Elizabeth Wurtzel’s infamous Prozac Nation, the idea that our society has been invaded by depression and its related medications became a cultural truism. The subsequent decade saw a different cultural realization: that the American healthcare system was stratified between those who had no coverage and those who were covered too much. Scholarly tracts and magazine essays highlighting the constructedness of disease and proposing to reconceptualize various medical practices in order to cut costs have since become the norm. Allan V. Horwitz and Jerome C. Wakefield’s The Loss of Sadness: How Psychiatry Transformed Normal Sorrow into Depressive Disorder seems to have risen out of this perfect storm of cultural unease and economic incentive. The authors (both academics but not trained historians, it’s important to note) at the end of the day have one purpose: to suggest that the way we think about depression today is not “natural,” and to use a long history of sadness to propose a different “natural” boundary that would remove a significant number of today’s patients from the category of the “psychologically depressed.”
The odd thing is that the authors seem to think that this historicization of what sadness and depression used to be should drive contemporary policy on how to medically define them today. The authors think that if they can show that a certain phenomenon has a long history, they can prove there is something humanly essential about it, and that this is significant for how we approach it today. In reality, their argument derives from a philosophical viewpoint (not that far from the Tea Partiers’ frequent invocation of our founding fathers) that long continuities trump recent change. Things with history are better than things without. (Where in the world would this leave women at the end of the day?)
Wakefield and Horwitz’s charge is that, far from being inevitable, and in fact having a very short history, the recent depression “epidemic” was only made possible in 1980 “by a changed psychiatric definition of depressive disorder that often allows the classification of sadness as disease, even when it is not.” The current crisis is at least in part one of classification: a group of individuals experiencing a reactive “normal” sadness due to depressing but typical life events (the loss of a job, the end of a strong romantic attachment) are being gerrymandered into the constituency of the depressed due to the rewriting of the depressive disorder entry in the 1980 volume of the American Psychiatric Association’s codification of mental disorders, the Diagnostic and Statistical Manual of Mental Disorders [DSM-III]. The revised definition elided the distinction between a sadness caused by an internal, biological dysfunction and a reactive, temporary sadness caused by the trauma of everyday life.
This is a problem since, as any good Foucauldian knows, classification leads to surveillance, surveillance to control; in this case medical and pharmaceutical control. Foucault might then talk about subject formation, but the authors here can only speculate about potential negative effects of the relatively recent elision, in part because they present no historical or sociological research into its impact on those affected. And though the tracking of changed definitions might suggest a genealogical method, the authors are more interested in “diagnosing” why the changed definition is a problem. Thus they spend two chapters delving into a long history to demonstrate that there are deep historical grounds for the divide between normal and disordered sadness; this historical evidentiary basis then spirals together with the authors’ descriptions of the evolutionary biology of sadness and notes on the anthropology of non-western populations. All of this, the authors claim, establishes the divide between normal/abnormal sadness as essential, human.
The authors draw on an array of disciplinary tools to prove that a biologically-grounded, historical divide has existed between endogenous depressive disorder and a reactive normal sadness for the last 2,500 years, jumping from literary and medical history to evolutionary biology and anthropology. The emotions of Achilles and Gilgamesh are probed to prove that both normal and disordered sadness have long been identified as separate phenomena. Of course, the authors are sure to cautiously point out that the attributes of normal versus disordered sadness are culturally conditioned, yet they insist that a structural division exists.
Once this is settled, the search begins for the culprit that erased this important divide, and in a sudden switch to a sociology of scientific knowledge analysis, Horwitz and Wakefield identify the practices, instruments, and interests of the psychiatric profession as a primary cause for a definitional shift in sadnesses. This constructivist approach to psychiatric knowledge seems paradoxical given the authors’ reliance on the claims of evolutionary biology as evidence of a naturalized normal/disordered sadness divide.
This disparity may reflect biology’s status as a “harder” science than psychiatry, but more likely it represents the authors’ willingness to sacrifice consistent method for the sake of present purpose.
From the 1920s through the 1970s, empirical psychological studies relied on new statistical methods that measured symptoms of depression present at a single moment, thus erasing the situational context of their appearance. This replaced the differential diagnoses of sadnesses “with cause” or “without cause” with diagnoses based on the (perhaps momentary) appearance of specific symptoms. At the same time, a rising outpatient population, with a far wider range of problems than inpatients, began to seek treatment; their more situational issues challenged the consistency of diagnoses and resulted in competing and overlapping diagnostic systems. In this story of standardization, the Feighner group at Washington University in St. Louis then developed new criteria for affective disorders to combat the inconsistencies. Here the authors do convince us that there was no real evidentiary or scientific basis for the inclusion of normal sadness under the umbrella of disordered sadness.
But does this mean that the shift was not useful? Horwitz and Wakefield suggest that the conflation of sadness and depression has resulted in the pathologization of sadness and costly misdiagnoses and overmedications of patients. To support their claim that “a flawed definition may be facilitating the recent surge in reported depressive disorder and may even lie at its very heart,” the authors closely read the definition of depressive disorder in DSM-IV, analyzing point by point how it has directly resulted in the increased rate of depression. The main problem, Horwitz and Wakefield conclude, is that the disorder’s criteria do not have enough “exclusions” (besides bereavement) to allow psychiatrists and other medical authorities to differentiate between endogenous depression and temporarily sad reactions to everyday life.
Because the authors presume that a statistical rise in depressive disorders is necessarily a bad thing, without exploring the positive or negative effects on patients themselves, it is hard to say how effective a policy statement this historicization of depression is. One could just as easily speculate that the broadening of a definition has been useful for patients and psychiatrists: instead of pathologizing sadness it might normalize depression; while some might be overtreated, a vaguer definition might also provide access to treatment for patients who do indeed need it and may otherwise have been denied; a vague conflation of definitions might give psychiatrists wiggle room to address specific patient needs that might have been missed under the old system. Oddly enough, I am sympathetic to their claims that we need to accept sadness as part of a cultural or human experience, and that many patients are likely overmedicated. However, their historicization of sadness hardly proves this point. It may convincingly demonstrate that depression as we now know it did not have to be this way, but it does little to prove that it should not be this way.
I just finished reading Robert Aronowitz’s excellent Unnatural History: Breast Cancer and American Society, where he addresses a paradox: for the past 200 years there’s been a flat-line mortality rate for breast cancer, even though there’s been a great whirlwind of intervention, developing technology, and knowledge accumulation surrounding it. He subversively suggests that the “risk assessments” conducted today are based on misleading, even incorrect, data, and that from this we’ve built up a culture of risk and fear that leads to a blurring of the line between “risk factor” and actually “cancerous disease.” The result is a demand for unnecessarily early screenings, which leads to early treatments for conditions that may have remained benign if let be, or would have been caught at a more reasonable point later on. In the end he thinks all this “the earlier the better” frenzy is both an emotional and material drain on our society.
The book reminded me quite a bit of an article, “Letting Go,” by Atul Gawande, published in the New Yorker this past summer, on the way we treat end-of-life care — how ever-improving technologies allow us to keep trying to extend life, leading to medicalized, emotionally cold, and often painful experiences of death. Gawande asks our society to take a shot of emotional courage and look death in the face. His point is that if we are able to be more honest about death, we will be able to speak with our families and physicians about what we want our deaths to look like, allowing many of us to seek and accept the no-road-back option of hospice care. Having done this, we would be able to focus on living our last days comfortably, preferably in our homes and surrounded by our families. Underlying both Gawande’s and Aronowitz’s arguments (no surprise in the era of health care reform) is the point that these phenomena have evolved from a culture of overtreatment dependent on increased technological capabilities, which has resulted primarily in a waste of resources rather than the betterment of our health (or lives). The question of end-of-life care in particular has the potential to take a more philosophical turn some day: if we reach the point where we are technologically able to just forever sustain life, how will we know where life ends and death begins?
What is most worrying in both these tales is the way in which there is no one culprit — everything works within institutional and discursive systems: the doctor has protocol learned in med school, mandated by his hospital, structured by his relationships with biomedical companies. The patient lives in a world with targeted pharma and genetic testing ads, asking him or her to “determine your risk” and “speak with your doctor.” In both cases there is a doctor-patient relationship shaped by these other relationships, but also by the limits of articulation, and by constructions of risk and responsibility, hope and health.
Getting to the end of Aronowitz’s book, one feels that something is desperately wrong, but one doesn’t quite know where to look. Should medical schools tell doctors to target limits on treatment plans? But at what point does this turn to medical paternalism? Should there be public service announcements asking people to think twice before going to the doctor? But what of those who need to seek treatment and don’t? (I should note that a problem I had with Aronowitz’s book was his lack of a sexuality or body perspective, thus bypassing the dilemma women face in weighing a loss of their understood (sexual) bodily integrity against a (possible) extension of life, which might actually lead them to non-treatment.) How should we regulate direct-to-consumer genetic testing and pharma advertising, and how should we talk about data and numbers that, when making the move from clinical lab to society, from epidemiological study to personalized health, can result in false assumptions and misleading diagnoses?
The point being that all these things need to be changed, yet they all work together in such a tight system that one doesn’t know where or how to begin. Which is why I was surprised by the conclusion to a recent book review in the NYTimes of Carl Elliott’s White Coat, Black Hat. I don’t know Elliott’s work but it seems to be in line with Aronowitz’s — the everyday happenings and ordinary individuals that constitute our problematic, wasteful, and sometimes brutal medical system. After a brief overview of Elliott’s book where she acknowledges all these things, the reviewer, Abigail Zuger, ends:
What a world, what a world, as the melting witch said in “The Wizard of Oz.” But there is one small consolation: at least Dr. Elliott didn’t have to call his book “White Coat, Black Heart.” Now that would have been depressing. The bottom line is that much of what he describes is simply the big business of medicine as we have allowed it to take shape. His bad actors are mostly just that: actors caught up in a script not of their own devising. They all come home in the evening, take off their black hats and hang up their white coats, just regular working stiffs out to make a buck.
Perhaps she was being ironic and I’m missing it, but wouldn’t it be much more heartening if this weren’t “simply the big business of medicine as we have allowed it to take shape”? If there were some obviously evil, black-hearted medicos whom we could identify, prosecute, and remove? Having just talked about “illegal immigrants, packed into shabby, overcrowded rooms with minimal supervision” who act as guinea pigs for new drugs, is now really the time to exculpate all of us because, well, all of us are at fault?