Monday, May 26, 2014
Two of the most interesting, engaging, and informative science books I've ever read were published in the last five years and written by the same person: Sam Kean. The first of these books, The Disappearing Spoon, is a history of the varied elements that make up the Periodic Table, which hangs in every American science classroom and is almost Borgesian in its functionality as both a serious emblem of scientific discovery and a series of 118 doorways that open to reveal 118 separate stories. Kean's second book, The Violinist's Thumb, was a similarly anthological collection of "lost tales," each demonstrating the ways in which our genes have made us into the people we are today. What makes both books so successful, not just as narrative pieces but also as works of enlightenment, is Kean's unyielding belief in the people behind these stories. Rather than numb his readers with facts, figures, dates, and academic jargon, Kean distills the most important discoveries of our lifetime--not to mention the last few centuries--into stories of love, death, obsession, resilience, success, failure, and redemption. In some instances, his subjects are unlikely heroes; in others, their genius is tempered by arrogance, jealousy, or even bigotry. But they are human, and the very same men and women who discovered the microscopic bits that make up our universe, our world, and ourselves--the billions of tiny puzzle-pieces that fit together with such impossible precision to make the Everything around and inside us--also allow us to discover them through their work. And in Kean's mind, these two otherwise isolated bodies--the scientist and their science--are inextricably linked and, without one another, incomplete.
With The Tale of the Dueling Neurosurgeons, Kean has taken on another difficult and long-mysterious subject, the human brain...or, as he himself writes, the "electrified tapioca" nestled so precariously in the thick lockbox of bone atop our necks. It is the most important organ we possess--the Everest in the atlas of our bodies--and it is unique among the brains of all other creatures in that it is aware of itself, its functions, and its limitations. The human mind questions the universe and our place in it, ponders the existence of a Higher Power (or lack thereof), debates existential quandaries that are forever unsolvable, and struggles with emotions that even its millions of firing neurons cannot understand, though it expends quite a lot of its energy in pursuit of an answer all the same. And yet, for all its powers, we know so little about it that conflicts and disagreements among the most eminent of experts rage to this day, despite centuries of study. Even with the advent of advanced technology, those three gelatinous pounds remain mystifying, and ironically so: the very organ we use to decode the world around us is incapable of decoding itself.
It's in this rich, frustrating, and seemingly fruitless pursuit that Kean finds his stories. Much like his previous books, an outwardly simple scientific task--a cataloging of the world's elements, an analysis of human genetics, and now a study of the human brain--becomes a monumental exercise in patience, dedication, endurance and, frequently, pure dumb luck. The two most unforgettable stories--and for completely different reasons--involve scientists who found themselves in tropical locations thousands of miles from home. The first is the story of Carleton Gajdusek, a bombastic and headstrong man who took up residence in the South Pacific to study kuru, a degenerative neurological disorder that was devastating an isolated tribe in New Guinea. Earning the trust of the locals, he was able to gather brain and tissue samples from their dead, which he then had to ship back to laboratories throughout the world without a reliable postal service or the assistance of refrigerated transport; by the time he returned stateside, he had gathered enough raw data and materials to diagnose the cause of the tribe's problems. Unfortunately, his return--with quite a few of the island's boys over a number of years--also marked the end of his career, as his sexual predilection for those same boys became known. Gajdusek would die in exile after spending a year in prison, his legacy ruined--a Nobel Prize forgotten--and his otherwise monumental research forever tarnished by his actions. As for the disease he had dedicated much of his life to unraveling and even curing--known as the "laughing disease"--it ended on its own when the locals stopped consuming the brains of their recently deceased tribespeople.
The second story concerns two British soldiers during World War II who also happened to be doctors. Captured by Japanese soldiers and mercilessly starved, they watched as their fellow POWs fell victim to an epidemic of beriberi, which they documented in great detail for months on end but were unable to stop. When it became clear that their research would be confiscated and likely destroyed by their captors, the two men sealed their papers in a tin and buried everything, unsure if they would even survive to see their hypotheses tested. Luckily, both of them did, and their research was retrieved with literally minutes to spare.*
In both instances, the scientists involved found themselves in extreme conditions, noticed a devastating health problem, and used whatever they had on hand--makeshift surgical instruments, a cooler, scraps of paper, a tin, and their knowledge of the human body--to work towards a solution, not just to save lives, but to advance science itself. Many of the other stories featured in Kean's book work the same way. In one chapter, a famous neurosurgeon bribes a priest so that his assistant can cut out the glands of a dead man hours before he is to be buried; described as "an illiterate wagon-driver," the man--John Turner--suffered from gigantism, and the neurosurgeon is certain the cause is located in his glands. But as the assistant finishes removing the pituitary, the deceased's family breaks down the funeral-parlor door, forcing the young assistant to flee into a waiting taxicab. In another, an epileptic known only as H.M. has pieces of his brain sucked out with a tube; the procedure cures him of almost all seizures but leaves him unable to form new memories. In fact, his brain becomes such a mysterious and important part of neuroscience that, after he passes away at age 82, it is removed from his skull, frozen, sliced into more than two thousand micro-thin sections, and scanned at extreme magnification for digital study.
I write of these men and their patients in present tense, not only because they are the subjects of narrative nonfiction, but because their work--or, conversely, their ailments--are with us today. They inform modern science in ways that theories, anecdotal tales, and small-animal experimentation never could. Which is the tragedy that underlies much of Kean's book: in order for us to understand the brain as much as we do today, many people--men, women, and children equally--had to suffer. Some of them were unfazed by their ordeals, or they learned to live with what had happened to them, but most experienced pain and misery, if not total loss of life, and because of their own bodies no less. When neurosurgeons today speak of the advancements that have been made and the knowledge that has been gained, they speak of countless patients--dozens, maybe even hundreds of ordinary people--whose lives were unexpectedly interrupted, their bodies and minds forever unfixable. The pursuit of knowledge often claims its fair share of victims--Marie Curie is perhaps the most recognized example of this--but very few areas of science have claimed more than neurosurgery. And still, despite all this, we know very little. Those who taught us through their suffering did so in the beach-waters of an ocean that, even today, seems unimaginable in its breadth and depth. The horizon, unfortunately, is so very far away.
*One of the men--Hugh de Wardener--lived long enough to be interviewed by Kean himself for this book. De Wardener passed away in late 2013 at the age of 97.
Sunday, May 25, 2014
On an unassuming Wednesday afternoon in 1881, nine men--three sets of brothers, an unarmed ranch-hand from Mississippi, and a dentist--walked out into the streets of Tombstone, Arizona, and engaged in a shootout that lasted 30 seconds. When the firing ended, three of the men--Billy Clanton, Tom McLaury, and Frank McLaury--were dead, all of them outlaws. Two more men--the marshal and his assistant--were wounded, though not fatally. And two others--Wyatt Earp, brother to the marshal and his deputy, and Doc Holliday, one of Wyatt Earp's only friends--would become famous because of it. In fact, this one moment would come to characterize the unsettled West as a place of common violence, lawlessness, and untamed brutish antagonism, defining an entire era in American history as something befitting the Hollywood depictions that were only decades away. By 1890--less than ten years after the shootout--all but two of the survivors would themselves be dead, either at the hands of other men or, in the case of Doc Holliday, tuberculosis. Virgil Earp, disabled by a gunshot to the arm almost exactly two months after the shootout, would die in 1905, and his brother Wyatt would pass away in 1929 at the age of 80, a senile old man living in a Los Angeles apartment with his third wife. It was there, in the heart of the burgeoning American cinema, that Earp would consult with the very same industry that would one day transform him into a caricature of himself: an ironic end to a man redefined in the blink of an eye, his legacy forged from shadows into reality.
Larry McMurtry, perhaps the most eminent Western writer working in America today, has written his most recent book about Wyatt and Doc. And of the nearly 200 pages that make up The Last Kind Words Saloon, the infamous occurrence at the OK Corral occupies just nine sentences at the end of the final chapter. On its surface, this event--the sole moment for which both men are today remembered--should be the focus of any study of Earp and Holliday; in McMurtry's hands, it is little more than an afterthought, and rightly so. Enough has been written about the mythologized men of Tombstone, he seems to be saying. Instead, he wants to imagine the men as they really were--not historically, mind you, but symbolically. In McMurtry's sparse, almost barren prose, Earp and Holliday--both of whom were in their thirties at the time--are exorcised from what they've since become and inhabited by the souls of aged porch-sitters, as though they are two old men who'd seen enough of the West to know more about the world around them than almost anyone else.
Except that, when the two men are not engaged in seemingly pointless conversations, they are behaving like immature teenagers aspiring to be much more: they join Buffalo Bill Cody's traveling show and fail miserably, unable to handle or shoot their weapons and boring the crowd; they line up empty bottles behind a bar to practice their skills, at which they are embarrassingly bad; they express befuddlement over the local whorehouse's anatomical discount; they have trouble talking with women, and Earp not only punches his wife with frequency but cries afterward, ashamed; and so on. Both men are pulled in opposing directions, first by the images they have of themselves as tough gunfighters, and second by their total incompetence. When they sit on their various porches throughout McMurtry's novel, they are like children at play, their audience little more than each other; when they are forced to act, their performances come crashing down around them, and they are helpless, so much so that when the infamous shootout finally arrives, it is rendered in such unadorned prose--nine short, adjective-free sentences, as though drawn from a procedural report--that it's a shocking moment of awareness. The children have been jarred from their reverie, this time for good--this time with blood.
Which is McMurtry's point. His novel is not a celebration of the West or an appreciation of the men who inhabited it. Instead, this is an anti-Western, an attempt to depict the West not as it was but as what we've made it into--namely, a fantasy of raging machismo, loose women, alcohol flowing like rivers, and gunfights abounding. Strip away the romantic adrenaline from the one remembered event in the lives of both Holliday and Earp, and we are left with two irresponsible young boys in a world of other young boys, all of them parading around as elders of the West.* And when we strip away the fictions of the West from our modern depictions of it--the high-noon duels, the desperadoes and bar-maids, the Indian attacks, guns that are ever-reliable and always shooting straight--we see just why this one simple embellishment has become such an important part of our country's narrative: it gives us permission to see ourselves as the gruff young country that has still not completely reformed itself, the rebellious young thing that can still outgun a world of elders. We can think of ourselves as tough, confident, and ruthless people. We can anoint ourselves sheriffs over any situation, claim any place or person as our own, kill or pardon as we see fit, based on our own unspoken conscience. Others will abide by us, will revere and even fear us for what they know of us. We are the wild, untamed millions; when other nations relate their histories across the millennia, relate stories of endless wars and gruesome revolutions, we can think back to our uncivilized early years and count ourselves among them. And yet, this image is based on little more than a thirty-second scene, which has somehow been transformed by time and human intervention into a full-fledged play bearing little resemblance to the source materials.
The title of McMurtry's novel comes from a saloon sign that Wyatt Earp's brother Warren hauls around throughout much of the book; it is the name of his establishment in Long Grass, Texas, and when the Earps move across the undefined borders of the new American territories, Warren takes it with him, constantly in search of a new place to call his own, to adorn with his sign. He never does find a permanent saloon, and in the novel's epilogue, a reporter visits the ailing and elderly Wyatt Earp in his Los Angeles apartment. His wife tells the similarly aged reporter to not bother with questions about the gunfight, saying, "Wyatt don't remember much--there's days when he barely remembers me." On his front lawn, topping off a stack of old tires, is his brother's sign, beaten down by time but still intact. The reporter offers to buy it, but Earp's wife refuses payment, giving her the sign for free. "Warren Earp drug it around all over the place," she says, adding, "We never did know what he meant by it." Warren's sign was, simply said, a promise that somewhere in the strange and uncharted West there would be a place for him. As the novel closes, the sign and Wyatt Earp are one and the same--two small, forgotten things stuck out of time, still searching for a home, for a reality in which they can simply be themselves and nothing more.
*The only true "old man" in the novel--the head of the Clanton outlaws--is actually referred to as Old Man Clanton in his various chapters, though he is quickly shot and killed by unknown assailants.
Monday, May 19, 2014
For all the adulation heaped on Orville and Wilbur Wright--the two Midwestern bike-shop owners who flew the first working airplane more than a century ago--history forgets that, beyond their scientific and mechanical skills, not to mention a fearless desire to succeed, the two men were also stubborn, selfish assholes. While other aviators of the day were determined to see manned flight realized for the sake of progress--of moving humanity towards horizons both literal and figurative--the Wrights balked at such altruistic ideals and, in patenting aspects of their design, made it almost impossible for others to perfect motorized flight and move the technology forward. In fact, once their achievements at Kitty Hawk were publicized and their patent was certified, their story became one of litigation, greed, obsession, and the failed promise of two otherwise indispensable minds. It's this history--of the Wrights' battle for supremacy, especially against fellow inventor and aviator Glenn Curtiss--that dominates most of Birdmen, Lawrence Goldstone's account of how two of the most idolized Americans did more than anyone else to undermine not only worldwide progress but also their own legacies.
Despite their humble beginnings, Orville and Wilbur Wright did not want for wealth. Their Ohio bike shop was not only a prescient idea--in the era before airplanes and the Model T, the bike was serious transportation--but also quite successful, and after their triumph at Kitty Hawk, they could have easily and comfortably lived off the fortune and prestige that came with fame. Public appearances and demonstrations alone would have sufficed, and eventually they could have competed against others for monetary prizes--which, as Goldstone shows in exhausting detail, were not just sizable but plentiful. (Goldstone also shows that the Wright brothers were skilled pilots and could easily have bested their competition.) Had they never manufactured a single plane of their own--had they simply drawn up their designs and passed them around to other aviators and businessmen--they could have lived out the remainder of their days without concern, forever revered by a world that would have forever been in debt to them.
Instead, the Wrights enlisted an attorney who understood patent laws, and in just three years they had built a virtual monopoly on one lone patent, which would require anyone who designed, purchased, or flew airplanes to pay their company a hefty sum, if not become part of their monolith altogether.* It was a ruthless decision on their part, the patent written in such a way as to give the appearance of small claims--on flaps, rudders, and so on, all of them seemingly insignificant parts of the overall design--that, in total, handed over complete control of the entire industry to both men. (After all, without those small bits and pieces, a plane would have been useless.) Unfortunately for the Wrights--but fortunately for everyone else--the long wait allowed others to design, build, and fly their own planes, often improving on the Wrights' own work. By the time the patent--number 821,393--became official, their design was already slipping into obsolescence. The Wrights could easily have returned to their workshops--to the beaches of Kitty Hawk, even--and made their own changes, drawing on their skills and insights before their competition could do the same; had they done so, they would have remained important players in the "battle to control the skies." It would have been a beneficial decision for everyone, not just the brothers and their fellow enthusiasts, but once again Orville and Wilbur Wright chose to march in the opposite direction. For the rest of their lives--Wilbur would die in 1912 at the age of 45, a tragedy Orville attributed almost solely to the constant pressures of litigation, and Orville himself would pass away a bitter recluse in 1948, at the age of 76--they would haunt courtrooms, make spurious demands for payment, elicit antagonism from the general public without apology or concern for their business' public relations, and travel across Europe fighting foreign manufacturers and governments.
And with one small exception--a simple design that Orville did not nurture beyond its conception--neither brother would ever invent again.
More than a century later, we live in an age of streamlined, industrialized innovation, when new ideas do not spring from North Carolina beaches or the workshops of bike repairmen, from suburban garages or the kitchen laboratories of curious teenagers or housewives, but from well-funded and organized movements. These are often funded by millionaires and billionaires, each claiming to be in search of revolutionary ideas to make the world "a better place" and fix the seemingly unfixable, but more often than not their endeavors are tinged with the stink of profit--of capitalism masked as innovation. And while this is far from detrimental--after all, money accelerates any process, and those who actualize the next Big Idea deserve to be justly rewarded--it removes human independence and ingenuity from the process, both of which are vital to progress. The Wright brothers were able to create and refine their designs because they had funding, yes, but that assurance allowed them to work on an idea that already existed and to do so independent of any outside influence. Had they not been two bike-shop owners with an idea and had, instead, been two employees in windowless cubicles or on the floor of a multi-acre plant, there could have been no guarantee of success. Would they have survived above the noise of the bureaucracy? Would their designs have passed quality control, or would they have faced instant rejections for their flaws? (Their original designs had many.) Would they have even been given credit, or would it have been the "invention" of their superiors and CEO, in much the same way Edison claimed the designs of others as his own?
Right now, there are thousands--possibly even millions--of innovators pushing to realize a dream of their own, most of them much more attuned to their responsibility than the Wrights were. The true question is, are we doing enough for them--giving them the space, the funding, and the freedom to pioneer--or are attempts at fostering their ideas done selfishly and only for ourselves? Because, as the story of the Wright brothers makes clear, those who create for the sake of the world and those who create for the sake of themselves are often hard to distinguish, and it may just be true that they are--at one point in time--one and the same, liable to tip in either direction, dependent on little more than the winds of the day.
*The Wrights were also assisted in this process by the era's patent laws, which were opaque and favored business over innovation, and a judge who was unapologetically biased towards the two brothers.
Thursday, May 15, 2014
The first recorded use of the initialism "LGBT"--Lesbian, Gay, Bisexual, Transgender--is difficult to isolate. There are educational articles from the mid- to late 1980s in which the term is used openly and without explanation, suggesting an established familiarity: a denotation that could exist without context or qualification. (It would be popularized in the following decade and become commonplace in the new century.) In a sense, regardless of where it came from, the term itself was a sizable victory in the struggle for equal rights, and for the communities represented in that four-letter term: it endowed a small, virtually powerless segment of the population with a classification based on respect and appreciation for variance rather than on misunderstanding, bad science, and bigotry. The accepted terms that were used before "LGBT," including "third gender" and "queer," had their own separate benefits, but in the end they were far from adequate; each connoted a sense of otherness, that those with an orientation other than heterosexual were not "normal," that they were quite literally strange and different, worthy of skepticism, isolation, and prejudice.* Finally, with "LGBT," there was a way in which those fighting for equal rights could speak of themselves without denigrating their very existence at the exact same time.**
That is, unless you identified with the "T."
Over the last decade, as marriage equality and gay rights have become not only socially acceptable but also the law in many states--and, in due time, the nation itself--the faults of "LGBT" have become apparent, at least where it concerns those who are transgender. For even though gays, lesbians, and bisexuals are seeing their relationships--and, in essence, their entire existences--validated through law, with hundreds of rights and responsibilities now open to them, trans individuals have been left behind...or at least found themselves in situations that are both strange and unthinkable. As the Human Rights Campaign points out on their website, a trans individual--say, a trans man who was born a woman--should be able to enter into either a same-sex or opposite-sex union, depending on the state in which they live and how they choose to identify themselves.*** However, laws in this specific instance are vague, trapped in thinking that is not only outdated but oftentimes offensive. (The HRC cites a 1999 court case from Texas in which a woman's marriage was invalidated after the death of her spouse, simply because she had been born a man; because of this, she lost the ability to sue for wrongful death, inherit his estate and Social Security benefits, and so on.)
Unlike the gay, lesbian, and bisexual communities, transgender individuals do not have an army of trans crusaders on their side fighting for laws that specifically address their needs. There have been trans mayors and council members, yes, but there has yet to be a Barney Frank or a Tammy Baldwin of the transgender community--an openly trans or gender-nonconforming individual elected to a prominent national office. And while there are public figures who have spoken out in support of transgender rights and introduced legislation to further the cause, their numbers pale in comparison to those who support the gay, lesbian, and bisexual community. (For example, there are currently six openly gay or bisexual members of Congress, the largest number in history; there are no openly trans members.) When marriage equality and gay rights finally become solidified in law, the "LGB" community will have crossed a wide and tempestuous ocean; the "T" community, unfortunately, will still be miles from shore, much of their lives still caught in a legislative limbo.
In photographing and writing about trans individuals, Susan Kuklin highlights important similarities between the gay and trans communities--feelings of isolation, confusion over identity, a rejection of standard gender norms--but allows her subjects the chance to avoid overt comparisons by discussing their own struggles at length. Most of their stories are heartbreaking, and unavoidably so: they are raised by parents who don't understand the struggles of their children, much less what "transgender" even means; they attend schools where gender expectations are enforced with rigor and occasional cruelty; they face medical professionals whose expertise does not extend to the space beyond "male" and "female"; and they transition in a world that rejects variance and aberration. And yet they persist, not just because they want to, but because they have to. It's a testament to the character of Kuklin's subjects that they can so openly express their identities--one teenager, Cameron, discusses gender with the confidence and insight of a professor--and so passionately fight against the prejudice inherent in our world while also leading lives that, to any observer, personify what we consider "normal." They date, they fight, they laugh and cry, they express themselves through art and speech and fashion, they go to school, they graduate, they drop out, they question and they demand...and they do so as outliers in a society that often claims to appreciate those who stand out, that does not guarantee health care or insurance for their needs, does not make room for them in its two-genders-only structure, does not consider their relationships legitimate, does not protect them in the workplace, and still insists on scapegoating them as the cause of transphobic abuse.
Those affected by gay rights and same-sex marriage face many tumultuous issues in their daily lives--visitation rights, adoption, spousal insurance, legal documentation--all of which are so thoroughly entrenched in our culture that it takes laws to undo them. And yet these problems seem almost otherworldly when compared to the prejudice faced by transgender individuals, which is so ingrained that even the most basic aspects of everyday life--public bathrooms, drivers licenses, clothing--become constant reminders of just how unaccepted they are. None of which will be solved by gay marriage, as important a step as that will be, and none of which will be solved by many of the gay-rights laws currently being proposed. Transgender rights are an important part of the LGBT movement, but at some point their crusades will part ways, and that is where the problems will begin...because how can a movement celebrate progress when 25% of its identity focuses on those who still suffer under inequality?
Every generation faces its own moral crusade, and it's their responsibility to overpower the prejudices and correct the errors of those who came before them, including their parents and grandparents. For those born during and after World War II, it was civil rights for African-Americans, which younger Americans favored at much greater rates than those who were older and had lived with Jim Crow for much longer; their children fought for women's equality, as personified by the second- and third-wave feminist movements of the 70s, 80s, and 90s; and now, as the "millennials" become an increasingly significant part of the voting population, they lead the fight for gay rights, having grown up surrounded by openly gay friends, family, classmates, and public figures. Those who are 25 years old and younger do not believe that sexual orientation matters, though they recognize that bigotry towards those who are gay or bisexual does, and it is because of this generation--my generation--that gay rights is inevitable. The question is whether, once gay rights are codified in law, there will still be fight left for the transgender community, too, or whether they will be ignored until the next generation--a ship of millions left to sit lonely along the horizon, ignored by the righteous many who've already landed on shore?
*Today, "queer" has been resurrected as the most all-encompassing term, incorporating not only gays, lesbians, and bisexuals but also the transgender community, pansexuals, asexuals, and every shade of non-heterosexual in between.
**As an initialism, "LGBT" is fluid. Over the years, and especially in the last decade, it has been revised to include Q (questioning/queer), C (confused), I (intersex), P (pansexual), A (asexual), O (other), and so on, to the point of it being a sort of in-flux alphabet soup. As there's no one consistent, universally accepted initialism, I use "LGBT" throughout my review, as it fits with the purpose of Kuklin's book (and embodies the greater problems raised by such a wide-ranging categorization).
***The specific information can be found here, where it is explained much more clearly.
Sunday, May 4, 2014
No president can be perfect. It's a difficult but inevitable aspect of a government that is built, run, and refreshed by its people: we are flawed, and therefore our system--not to mention the men and women we elect to control it--is flawed. Even those whom we lionize for their bravery and steadfastness, their roles in molding our nation into something more perfect and more unified, made decisions while in office that, even by the standards of their day, would be considered illegal, thoughtless, or inhumane. For instance, in the course of the Civil War, Lincoln suspended habeas corpus, which gave him the power to deny spies and political prisoners all of the judicial guarantees provided by the Constitution; it also allowed soldiers to search homes and seize property without a warrant, gave the president the authority to establish martial law (which Lincoln did in Kentucky), and allowed him to invalidate any lawsuits brought against government agents for otherwise punishable crimes, such as trespassing and false imprisonment. Lincoln did this while waging a war against the Confederacy for seceding and, in doing so, committing treason against the very same Constitution Lincoln himself was ignoring.
Eight decades later, Franklin Roosevelt gave in to racial fear-mongering and forced more than 100,000 Japanese and Japanese-Americans from their homes and into internment camps throughout the country. (In other parts of the country, the relocated populations were of German heritage rather than Japanese heritage.) Rendered through executive order, Roosevelt's decision came at a time when Hitler was himself overseeing a similar, though much wider and more brutal, practice against the Jewish population of Europe. At the time, Roosevelt's decision was seen as a necessary evil, one based on what can only be described as a cautious paranoia; today, we know it as an indefensible crime against humanity and a shameful, unjustifiable chapter in our history.
Every president, regardless of their political affiliation, their legacy, or the length of their service, made decisions that we today look on as misguided, if not dangerous or shameful. Even those who took brave stands for what was right--historically significant moments, the kind that define a president's legacy--were often forced by public opinion or electoral defeat to soften their attitudes, if not walk back their ideas entirely. (Teddy Roosevelt, who spoke out against lynching and dined with Booker T. Washington only to refrain from advocating for civil rights legislation or confronting Southerners directly, is the most obvious example of this.) And yet, for just as long as we've had presidents, we've suffered under the delusion of executive perfection--of the ideal candidate, the most influential statesman, the Great American President--and it colors not only our own beliefs but how we understand and learn from our own history. If we spend our entire educational careers desperately seeking out personifications of American exceptionalism, only to see those characterizations dashed when the truth is revealed, we are creating a fantasy that can never be fulfilled--a dream that will inevitably become a nightmare.
This is the fate of anyone who endeavors to write an appreciation of a president and his ideas: at some point in the process of researching and writing, the author must reckon with the disappointments inherent in being Commander in Chief. Legislative failures, military entanglements, economic downturns, domestic failings, social unrest, electoral rebukes, indecision--it is part of their history and therefore must become part of the narrative; otherwise, you are rewriting history through intentional ignorance or spin. Thankfully, most of the historical works we see today respect this balance--between the successes and failures, between what the president sought and what they actually accomplished, between their words and their actions--and those that do not are quickly and derisively dispatched to the dusty attics of history, as they should be.
But what of those who write of their president's ideals, write of the president himself, and also include the inevitable failings without ever reconciling the two to create a unified history? That is to say, what of the author who writes of Lincoln's grand defense of the American Constitution and his grave subordination of the very same document without ever addressing this discrepancy? What of the author who praises Theodore Roosevelt for his progressive stand against racism, derides him for his own personal prejudice and weak support for civil rights, and allows both truths to coexist without seeing this as a problem worth addressing?
This dangerous possibility both haunts and vindicates Harvey J. Kaye's The Fight for the Four Freedoms, a look at how one of Franklin Delano Roosevelt's greatest legacies persists in American discourse despite repeated attempts to undermine it and a series of presidents who were unable to bring it to life.* Believing that all people require the same four freedoms in order to live a truly purposeful and enjoyable life--freedom of speech, freedom of worship, freedom from want, freedom from fear--Roosevelt spent his final years as president failing to see adequate legislation enshrining these four freedoms in law. Kaye specifically focuses on the third, "freedom from want," as a way to weave Roosevelt's story--and the story of the subsequent eighty years--into the current state of our country, in which millions of Americans find themselves unable to pay their bills, support their families on full-time salaries, send their children to college, or ensure a comfortable future for themselves and their loved ones, even as chief executives and businessmen reap huge profits. As far as Kaye is concerned, it is this pillar more than any other that is the most relevant to our world and, in failing to be realized legislatively, the one that could do the most good for the greatest number of people.
Kaye has an obvious affection for Roosevelt's four freedoms, and his disappointment in Roosevelt for not achieving these four goals--because of the war, because of Republican intransigence, because of racism, because of conservatives and capitalists--colors much of the book, and palpably so. He is even more critical of those who occupied the White House after Roosevelt, regardless of whether they embraced his legacy or attempted to dismantle it; this includes the current president, who has tried and failed on many occasions to create laws that lessen the wealth gap and income disparity that has so damaged our country over the last quarter-century. Kaye still believes that Roosevelt's four ideals can be realized--in fact, he seems to believe the future of our country and the health of its Constitution depend on it--but much of his 200-odd pages are a dire history of lofty speeches, progressive ideas, and a willing population, all ending in bitter disappointment time after time.
Yes, no president is perfect, but there are those whose attempts at success were more effective than others'. Lyndon Johnson came closer than almost any other elected official to turning FDR's dreams into reality; through the Voting Rights Act, the creation of Medicaid and Medicare, and significant new immigration and education acts, Johnson created a path for millions of Americans to gain greater levels of equality, security, and pride. However, even Johnson's legacy is tainted--in this case, by a war so controversial and devastating that it will certainly be the single most important aspect of his presidency for the next fifty years. And here is the paradox in Kaye's book: the difficulties of being president do not prevent Roosevelt's four freedoms from ever being realized; rather, the imprecision of those freedoms--how we define want and fear, how we gauge speech and worship--means that success in fighting for any one of the four may go unnoticed, unappreciated, or derided as not good enough. How do we measure a president's success when it comes to democratizing speech, guaranteeing open worship, eradicating want, dispelling fear? If a president were to accomplish all four, with little compromise, would that be good enough? Or would we pore over the unavoidable imperfections of their career and say, with a sigh and a nod towards history, that it wasn't enough, that it could've been better? Or would we look past their limitations and say, confidently, that they did the best they could?
We as a nation will always suffer from prejudice, will always face inequality in our neighborhoods, will always have a reason to be afraid. There will always be a great need for healing and improvement because we are a land of people: human beings, young and old, who are just as flawed as the people to our right and left, just as imperfect as those who came before us and who will come after. And as time moves forward and more of history is written, we see the small steps of progress that each successive generation makes--not enough to quench our troubled national conscience, no, but enough to know that we are not the animals we used to be. If we are fated never to realize the four freedoms that Roosevelt proposed eight decades ago, we can at least be comforted by the knowledge that we are a better, freer people than we were yesterday, and we will be freer still when we wake up tomorrow. Free from our bigotries, from our unrealizable desires, from meaningless fears. We won't realize this until much later, as changes like this are incremental, but so is the passage of time itself.
*In the interest of full disclosure, Kaye is a professor at my alma mater, where he lectures on many of the same topics addressed in his book. However, I was never one of his students.
Friday, May 2, 2014
When John Paul Stevens retired from the Supreme Court in 2010, he had served as a justice for close to thirty-five years, making his term the third longest in American history. He was also 90 years old, the Court's last veteran of World War II, and--if his critics were to be believed--a relic of a bygone era whose departure was two decades too late. Often described as "wise" and "soft-spoken," his customary bow-ties giving him the appearance of a sweet and elderly grandfather out of touch with current trends, it was easy for those beyond the Court's walls to view Stevens with an undisguised mix of amusement and derision: a snicker and a wink about the Republican appointee who had gone soft, switched sides, and grown old. What the pundits and stone-throwers didn't understand--what those who worked at the Supreme Court knew from personal experience and study--was that Stevens' age and appearance belied an agitator of the highest democratic ideal, a man who knew the law better than almost anyone and wasted little time on the ignorant. In his last years, when the Court's decisions ran counter to his own understandings of the law--that is, when the conservative justices began overriding precedent and undermining Constitutional law--Stevens' dissents were brutal in their honesty and intelligence, especially when focused on the honesty and intelligence of the majority. When outsiders criticized Stevens as a turncoat who abandoned the Republican cause, they didn't understand that it had abandoned him first.
Now 94, Stevens still possesses a mind as strong and clear as it was during his term. Only now, the Supreme Court four years behind him, his judicial ideas have become much more political in nature. He does not look kindly on the justices he left behind, especially those who have worked so hard--and worked during the last years of his term--to undo much of the progress of the last 75 years by reinterpreting and redefining some of the most fundamental American rights--free speech, life, liberty, equality--to create a system in which those with money have power, those without have even less, and those who reek of "otherness" are prevented from becoming part of the great American "us." It is this anger--and despite his measured and professional tone, Stevens is clearly angry--that compelled him to write Six Amendments, in which he proposes small changes to preexisting Constitutional amendments with the goal of making our country a more perfect union.
As with most cases that involve the Supreme Court, however, there are problems. The most obvious, and therefore the most damaging, is not one of purpose or authority--he has both, and as someone who has lived with the Constitution longer and more intimately than almost anyone else alive today, he should be more than forgiven for taking up the cause of preserving America's greatest ideals through monumental revisions...after all, he would know better than anyone else where the document's shortcomings can be found. No, his problem is not one of ethos but of interest. The dry, academic style that served Stevens so well as a sitting justice makes much of his book difficult to read. (His longest chapter, on sovereign immunity, is downright impenetrable.) While on the Court, Stevens wrote primarily for those involved in law--after all, they would be the ones most likely to reread and reference his opinions in the decades to come; a book, on the other hand, is intended for wider audiences, and the language and style inherent in judicial literature do not translate well to the bookshelves and devices of the average reader. The only exception is Stevens' chapter on the death penalty, which he bases almost exclusively on personal stories--his own, those of the defendants in two death-penalty cases, and so on. In doing this, Stevens removes the issue from the lofty heights of cold legalese and makes it real, palpable, and relatable, insofar as a death-penalty case can be. It's also the only amendment that Stevens stakes in ethics rather than Constitutional precedent: by continuing to use the death penalty today, long after the rest of the industrial world has abandoned it, and in the shadow of so many wrongful convictions and botched executions--which, as it happens, meet the Constitutional definition of cruel and unusual--we are staking our national and democratic reputations on an act that is hypocritical, immoral, and ineffective.
It's the sixth and final chapter, however, that is the entire impetus for Stevens' one-man Constitutional convention, as Stevens himself reveals on the very last page: below a photograph of the Sandy Hook Elementary School in Newtown, Connecticut, where 26 students and educators were killed by a lone gunman, Stevens writes, in part, "That incident provided the catalyst for writing this book." And yet, for all the passion that must have propelled Stevens to write a book in response to such a tragic event, his final chapter is by far the book's weakest. At only nine pages, it breaks down the rulings in two recent gun-rights cases and decries both as misguided, as neither followed the Second Amendment as it was originally intended. For Stevens, the poorly worded amendment was designed only to ensure unrestricted gun possession among members of a militia, not private citizens; however, he does not provide a single source to support this reading--nothing from the writings of the Founders or Constitutional drafts, or even the analysis of scholars, historians, linguists, or grammarians. Instead, he pronounces his reading as fact while dismissing every other reading as political, then concludes the chapter with a photographic nod to the source of his writing--a memoriam that is, unfortunately, underserved by the author himself.