Garry Wills’ Qur’an (Part One)

Is Islam as American as apple pie?  Both are early imports from Asia – Islam from the Arabian Peninsula by way of Africa and Iberia, apples from southern Kazakhstan by way of Europe – that have grown deep roots in the New World.  Islam has directly affected the New World in ways that have been obscured for generations but deserve better understanding today.

The history of Islam in the Western Hemisphere has long been debated in the Near East.  There are some interesting, if apocryphal, suggestions that early Muslim navies traveled to North America from the Mediterranean before Columbus, but evidence is scarce.  Islam definitively arrived in the Americas with the Spanish conquista.  The Spaniards brought with them tens of thousands of African slaves, a large plurality of whom were likely Muslims, beginning as early as 1501.

The conquista was profoundly affected by the Spanish experience of both Moorish rule and the reconquista that expelled Muslims from the Iberian peninsula in the late 15th century.  The pursuit of gold in the New World was motivated in part by the financial burden of that war, and the sheer fact of reconquest in Spain drove a self-fulfilling narrative for the brigands and ne’er-do-wells who led the pillage.  In their minds the conquest of the New World was an extension of the liberation of the old.

What the Qur’an Meant

But the Spanish could not purge the cultural influence of Muslim rule as easily as they could the population that carried it.  Just to start, the entire Spanish language was heavily influenced by Arabic, including hundreds of adopted words.  You may never view Arnold Schwarzenegger the same when you consider that his characteristic line, “Hasta la vista, baby,” is a direct Arabic import, from hatta meaning “until”.  Likewise, Spanish speakers from Argentina to Canada still use the expression ojalá, invoking God, meaning the same thing as the Arabic inshallah:  God willing.

Consequently, the Spanish left an Islamic-inspired legacy across the hemisphere.  The geometric tile mosaics of Seville, Spain, were inspired by Islamic art whose legacy can still be found as far away as Mexico and California.  The famously beautiful enclosed balconies of Lima, Peru, are a direct import from North African moucharaby latticed windows.  Place names influenced by Arabic terms proliferate.  Guadalajara, Mexico’s second largest city, means “Valley of the Stones” in Arabic.  The Catholic patron saint of Mexico, the Virgin of Guadalupe, has Arabic roots: Guadalupe is an Arabic-Latin mash-up meaning “Valley of the Wolves”.   The historical influence doesn’t stop there.  Matamoros, a Mexican border town opposite Brownsville, Texas, means “Moor-Slayer,” the epithet applied to St. James (Santiago Matamoros), the legendary patron of the reconquista, for whom Santiago de Chile is named.  (San Diego, California, by contrast, honors the later Spanish saint Diego de Alcalá.)  The war’s most celebrated flesh-and-blood champion, Rodrigo Díaz de Vivar, was known to Muslims as El Cid, from the Arabic al-sayyid, “the lord.”

This influence persists even in the United States.  There is strong if not conclusive evidence that California’s etymology is rooted in the term “caliph,” the title of a successor to the prophet Muhammad as leader of the Muslim community.  Similarly, it is possible that Albuquerque stems from the Arabic term Abu al-Qurq, meaning “father of the oak”.  Andalusia, Alabama, may have adopted a residual place name from the Spanish colonialists who explored the South during the 16th century.  Al Andalus was the name of Islamic Spain.

African slaves poured into the hemisphere shortly after the conquista.  At least 10 percent of the 400,000 Africans kidnapped and brought to the United States were Muslims, a fair if conservative estimate for the rest of the Americas as well.  The most notable slave uprising in Brazil, to which the Portuguese brought three million Africans, was led by a Muslim community known as the Malê.  While most Africans were converted to Christianity, it is well documented that many of these men and women retained names indicating their Islamic roots: Muhammad, Fatima, Ayisha.

Two enslaved African Americans, Ibrahim Abd Al-Rahman and Omar ibn Said, achieved modest fame in the 19th century when they demonstrated literacy in Arabic.  Through a dramatic political intervention by the Sultan of Morocco, Al-Rahman was manumitted and returned to Africa with his wife.  (Sadly, not their nine children.)  Ibn Said remained property in the United States and died two years before the 13th Amendment would have freed him.

The Moor’s Account

The Moor’s Account, a recent novel by Laila Lalami, retells the true story of Estevanico, a Moroccan slave who accompanied the Pánfilo de Narváez expedition to Florida in 1527.  Estevanico, whose real name was probably Mustafa Zemmouri, was one of only four survivors of the expedition, whose numbers were decimated by shipwreck, disease, exhaustion, and raids by the native population on the invaders.  Before he died, probably in 1539 in what is now New Mexico, he traveled from Florida along the Gulf Coast, across what is now Texas and northern Mexico, all the way to Mexico City.  He was among the first non-natives to see what we now call the American Southwest.

Muslims did not exist in a vacuum in the United States: there were communities of Muslim believers, including one led by Bilali Muhammad in Georgia.  Muhammad was literate in Arabic and wrote a short treatise on Islamic law.  He also commanded 80 men during the War of 1812.  Indeed, Muslim soldiers served in the Continental Army during the Revolutionary War and the Union Army during the Civil War.

Thomas Jefferson’s Qur’an

Separate from the faith of the African population, which did not interest their owners, Islam conceptually and politically affected the founders of the American republic.  In Thomas Jefferson’s Qur’an, her comprehensive survey of the influence of Islam on the Founders’ debate over religious freedom, tolerance, and political participation, Denise Spellberg reveals a radical, if wholly theoretical, acceptance of plural belief in the early United States.  In contrast to Great Britain, whose monarch is also head of the Church of England, and most European countries with their own state churches, the Americans imagined their new state purged of church influence and religious society protected from government action.  At that time, the country was utterly dominated by Protestant sects.  Catholics were a distinct Christian minority, except in Maryland (which they founded), and Jews were considered so rare as to be exotic.  The belief systems of the indigenous people of the Americas were barely acknowledged and the Islamic beliefs of the enslaved population virtually unknown.

The drafters, in sum, made an extraordinary concession to a future they could only imagine when they wrote the Constitution to forbid religious discrimination explicitly.  In that leap of faith, the founders embraced the distinct possibility that future U.S. officeholders, including the president, might not be Christian.  At the time, in a country dominated by Protestants, Muslims were routinely lumped together with the other religious and cultural minorities of the age, including Catholics, Jews, pagans, Hindus, Indians and “infidels”.  The political principle of religious inclusion is a cornerstone of revolutionary American democracy.  The vision of religious freedom appears, in retrospect, astonishingly clairvoyant – an almost science-fiction vision of their country 200 years in the future that actually came to pass.  Today, in that envisioned future, Christians still predominate in the United States but Protestants do not.  Catholic justices now hold a majority on the Supreme Court.  Jewish members of Congress serve at three times their share of the population.  And Islam is the fastest-growing religion in the United States.

While clearly none of the American founders was an Islamic scholar, they appear to have been better acquainted with Islam and the great Islamic civilizations than the contemporary generation is.  The early Americans, in exalting “foundation,” placed the experimental United States alongside the world’s great civilizations, which included Rome and Athens but also the contemporaneous Ottoman Empire as well as ancient Egypt and Persia.  The founders knew their history and drew on that historical experience in crafting their government.

This homage is found in the physical structures that symbolize the republic.  A relief of Suleiman the Magnificent graces the chamber of the House of Representatives.  Islam is depicted as an allegory for physics on the ceiling of the Jefferson Building of the Library of Congress.  And the Prophet Muhammad himself is depicted in relief in the U.S. Supreme Court as a great lawgiver.

Unfortunately, an intellectual caesura has opened up between the revolutionary generation and today’s leaders and thinkers.  Indeed, a concerted collective attempt by the Christian majority to understand Islam only occurred after September 11, 2001.  The gap in knowledge unfortunately remains evident.

But it was not universal.  Today about half of the U.S. Muslim population consists of American-born converts, the largest share of them African Americans.  This American Islamic tradition dates back more than a century to the founding of the Moorish Science Temple of America in 1913.  A follower known as Wallace Fard Muhammad broke from the temple to establish the Nation of Islam in 1930.  Both organizations were syncretic religious and political movements with roots firmly sunk in African American history and experience.  Nevertheless, following the death of Elijah Muhammad the Nation of Islam reformed itself into an orthodox Sunni Muslim organization, still dominated by African American converts.  No American who knows the names Malcolm X and Muhammad Ali – the Nation of Islam’s two most famous converts – can claim absolute ignorance of Islam.  But these movements have always been considered fringe, both politically and theologically.

How can we account for this collective loss of knowledge?  One way may be examining the vaunted Western Canon, that corpus of literature stretching back to ancient Greece.  The definition of the canon varies, which is what makes Harold Bloom’s definitive list so important.  In The Western Canon, Bloom specifically extols the Qur’an as a source of law, ethics, and poetry within the Western tradition.  (Strangely, this is virtually his only mention of Islam in the book.  The Qur’an isn’t even noted in the index.)  He includes the Arabian Nights, The Poem of the Cid, the apocryphal Rubaiyat of Omar Khayyam, William Shakespeare’s Othello, as well as Miguel de Cervantes’ Don Quixote, whose framing story involves finding the manuscript written in Arabic by an “Arab Historian”.  Edward Gibbon’s Decline and Fall of the Roman Empire, while not hospitable to Islam, nonetheless represents its core tenets accurately.  Goethe’s late cycle, the West-Eastern Divan, was inspired by the Muslim Persian poet Hafez of Shiraz (and inspired Muhammad Iqbal to write an homage to Goethe in return).  Herman Melville’s narrator Ishmael in Moby-Dick (his namesake is the Biblical progenitor of the Arabs) describes the fasting and prayer of his harpooner bunkmate Queequeg as a kind of “Ramadan”.  Mark Twain’s Innocents Abroad, though not included in Bloom’s list, was widely read in its day and described a Grand Tour that included the Holy Land and Egypt.  Clearly, literate Americans were familiar with the Islamic world as late as the 19th century.

But all of that prologue is forgotten in the contemporary era.  Why Islam’s cultural and philosophical influence in the United States has since fallen away can be explained in part by examining Bloom’s modern canon.  Not a single great 20th-century American writer wrote on these themes, which suggests a deterioration of collective knowledge and experience in American letters.  The Arabic writers Bloom cites, including the Nobel Prize winner Naguib Mahfouz, were largely secular in outlook (a sin for which Mahfouz was stabbed in the street by an Islamic extremist).  Other writers on Bloom’s list address these themes to a lesser extent: Albert Camus (The Stranger), Ivo Andric (The Bridge on the Drina), Amos Oz (A Perfect Peace), and Lawrence Durrell (The Alexandria Quartet).  Still other writers aren’t included in the list but probably should be:  Rebecca West (Black Lamb and Grey Falcon), T.E. Lawrence (Seven Pillars of Wisdom), Gertrude Bell (The Desert and the Sown).

In the 20th century, American writers were grappling with modernity and affluence, war and peace, the immigrant experience and the African American struggle for justice.  After the collapse of the Ottoman Empire in 1918 learned Americans had little reason to include the Islamic world in their thinking until that fateful second Tuesday in September 2001.  That is where the reckoning with our intellectual history began.

###

 

 

UPDATE: The Price of Promotion – Diversity and Retention

This is a follow-up to my earlier article, “The Price of Promotion,” which criticized the U.S. Foreign Service promotion system.  The following analysis is based on the Department of State Bureau of Human Resources “Foreign Service Promotion Statistics by Cone, Ethnicity, Race and Gender” for FY 2017 and 2018.  I have focused primarily on gender and on the racial classifications of white, African American, and Hispanic.  I do not claim this to be definitive; I am not a statistician and I am happy to engage in conversation about my conclusions.

Promotion rates for women in the aggregate are higher than for men, but this masks the fact that there are far fewer women in the Foreign Service.  The female population is small enough that the higher promotion rate does not narrow the gender gap at senior ranks.  The Foreign Service is 35 percent female, compared to 43 percent across the federal workforce and 46 percent of the national workforce.

There are so few racial minorities in the Foreign Service that their distribution across cones, where they compete and are promoted, can be numbered in single or low double digits.  As with female officers, this dramatically limits the number of minorities competing at the senior level.  In fact, the numbers are so small that they undermine statistical validity.  While friends have noted that certain reasonable assumptions can still be made from a limited sample, readers should not draw hard conclusions in the absence of a larger one.  This probably explains the wide variation in numbers and promotion ratios: individual variations can have dramatic statistical consequences.  (Data from before 2017-2018 are not currently available while the State Department updates its web site.)
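To see why, consider a minimal sketch in Python.  The cohort sizes here are entirely hypothetical, not drawn from the State Department report: the point is only that the same promotion rate computed over nine officers carries a confidence interval too wide to support any conclusion.

```python
# Hypothetical illustration of small-sample instability: identical promotion
# rates, wildly different uncertainty. A crude Wald interval is enough to
# make the point.
from statistics import NormalDist

def promotion_rate_ci(promoted, competed, confidence=0.95):
    """Return the promotion rate and a Wald confidence interval around it."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    rate = promoted / competed
    margin = z * (rate * (1 - rate) / competed) ** 0.5
    return rate, max(0.0, rate - margin), min(1.0, rate + margin)

# Two hypothetical cohorts with the same one-in-three promotion rate:
for label, promoted, competed in [("cohort of 9", 3, 9), ("cohort of 180", 60, 180)]:
    rate, low, high = promotion_rate_ci(promoted, competed)
    print(f"{label}: {rate:.0%} promoted, 95% CI {low:.0%}-{high:.0%}")
# cohort of 9:   33% promoted, 95% CI 3%-64%
# cohort of 180: 33% promoted, 95% CI 26%-40%
```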

The gender distribution varies dramatically by cone, and that directly affects promotion opportunities, especially at the senior ranks.  Public Diplomacy is 53 percent women, for example, while women represent 32 percent of the Political cone.  Public Diplomacy has more women and minorities but fewer officers overall than Political.  Because promotion rates and opportunities vary by cone, this narrows the number of women and minorities promoted each year.

Specifically, Political has more officers and promotes more of them to the senior ranks than other cones.  The most dramatic evidence of this is the ratio of women competing at the FS-01 to FE-OC level and above by cone and real numbers of competitors: Consular (35 percent of 102 total competing), Economic (38 percent of 162), Management (35 percent of 105), Public Diplomacy (47 percent of 130), and Political (28 percent of 205). This hurts women as they advance primarily because Political promotes the most to the senior ranks (45 slots to Consular’s 23, for example).
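A short sketch makes the interaction concrete.  The shares, competitor counts, and the two promotion-slot figures come from the statistics quoted above; the assumption that promotions mirror the competitor pool exactly is mine, a best case with perfectly unbiased boards.

```python
# How cone size and gender share interact at the senior threshold, using the
# FY 2017-2018 figures quoted in the text. Only Political and Consular slot
# counts are given, so only those two cones are modeled.
senior_threshold = {
    # cone: (women's share of FS-01+ competitors, competitors, senior slots)
    "Political": (0.28, 205, 45),
    "Consular":  (0.35, 102, 23),
}

for cone, (share, competing, slots) in senior_threshold.items():
    women_competing = round(share * competing)
    women_promoted = round(share * slots)  # if promotions mirrored the pool
    print(f"{cone}: {women_competing}/{competing} competitors are women; "
          f"even unbiased boards would promote only ~{women_promoted} of {slots}")
```

Even in this best case, fewer than a third of the new senior officers coming out of the largest pipeline would be women.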

Similarly, promotion rates for white officers in the Political and Management cones are more than double those of their African American colleagues, and higher in Public Diplomacy as well.  But African American officers exceed the promotion rates of both their Hispanic and white colleagues in the Consular and Economic cones.  Again, keep in mind this is based on two years’ data and very small real numbers of minority officers that can fluctuate dramatically depending on the data set.

Most importantly, female and minority officers at the FS-02 to FS-01 threshold appear to be abandoning the Foreign Service at higher rates than white and male officers.  Whether this is voluntary withdrawal or the result of time-in-service limits is a question worth looking at in more detail.  But the result in both cases is the same: it reduces the share and the real number of women and minorities competing at the senior ranks, numbers that were already low to begin with.  The Foreign Service has a minority population of 21 percent (2016), compared to 36.4 percent of the federal workforce and 35.3 percent of the national workforce.

The already-small number of women and minorities in the Foreign Service, as compared to the federal and national workforce, is an embarrassment.  But it can be remedied.  If we intend to deploy a diplomatic corps that represents the United States in its broadest and deepest sense, we need to improve the recruitment and retention of our most talented citizens.  This clearly demonstrates the need for improved gender- and minority-inclusive recruiting, creative retention strategies and incentives, early leadership training and mentorship, and aggressive efforts to combat conscious and unconscious bias.

###

 

The Price of Promotion

The State Department’s employee evaluation process is worse than terrible.  It is no better than a gamble.

Department of State seal

Consider this career choice: you are a second-tour FS-04 consular officer serving your country in an American embassy abroad.  You have tenure, which means that you’re more or less guaranteed employment with the State Department for the next 15 years.  You are now being considered for promotion to FS-03.  In a bid to speed up the process, the Management Bureau Office of Performance Evaluation approaches you with this offer: you may proceed with the promotion board selection process or, more simply, flip a coin.  Heads, you will be promoted.  Tails, you will not.

Even odds are not great, but they’re better than what the 2018 promotion statistics suggest you actually have: closer to one in three.  Still, you’re told promotion is a merit-based process decided after rational, impartial evaluation of your accomplishments.  You’re a hard worker and have made no mistakes.  Your peers and supervisors like you and your Employee Evaluation Reports (EERs) have all been stellar.  So you decide to go with the promotion panel.

Everybody in the department believes this or, at the very least, cynically behaves as if it were true.  And it may well be true.  But here’s the problem: social scientific research argues strongly that the State Department’s promotion process produces decisions little better than random chance.  The department has the data available to determine whether panel decisions are correlated with future performance rather than blind luck, but it has not used that information to improve the process.

This should alarm everyone, not just those being considered for promotion.  If the department cannot accurately select future high performers, then we are rewarding people who have not earned it and passing over those who have.  And it means, when it comes to the future leadership of the premier executive department, we are doing no better than throwing dice and hoping for the best.

The EER has a long history.  The Rogers Act of 1924 set up the modern Foreign Service, replacing the prodigal heirs who had populated the diplomatic corps with a professional, meritocratic promotion system.  The 1980 Foreign Service Act instituted new reforms, most significantly eliminating the evaluation of a diplomat’s spouse (then almost entirely wives) as part of the employee’s rating.  The Foreign Service now uses a narrative system primarily devised in 2002 and partially amended in 2015, resulting in the goals-oriented, short-narrative EER in use today.

Each year the Performance Evaluation office convenes around 20 boards made up of about six panelists, including an outside civilian.  For six to eight weeks each panelist reviews 40 candidate folders per day, which include the past five years’ EERs.  That accounts for some 30,000 EERs reviewed or more than 200,000 pages in total for each board, a daunting task by any measure.
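A back-of-the-envelope check of that workload, sketched below.  The per-panelist pace and five-year folder contents come from the description above; the six-week season, five-day week, single-reader assumption, and seven-page EER are mine, so the totals are illustrative only.

```python
# Rough sanity check of the board workload described above. Pace and folder
# contents are from the text; the season length, work week, division of
# folders among panelists, and EER page count are assumptions.
panelists       = 6        # per board, including the outside civilian
folders_per_day = 40       # reviewed by each panelist
workdays        = 6 * 5    # six weeks of five-day weeks (low end of 6-8)
eers_per_folder = 5        # the past five years' EERs
pages_per_eer   = 7        # assumed average length

folders = panelists * folders_per_day * workdays  # each folder read once
eers = folders * eers_per_folder
pages = eers * pages_per_eer
print(f"{folders:,} folders -> {eers:,} EERs -> {pages:,} pages per board")
# 7,200 folders -> 36,000 EERs -> 252,000 pages: the same order of magnitude
# as the figures of some 30,000 EERs and 200,000-plus pages cited above.
```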

Perhaps the most damning thing to be said about this promotion process is that it closely resembles a system the Israeli army discredited and overhauled in 1955.  Daniel Kahneman, then a 22-year-old psychology graduate, was asked to assess how the nascent Israeli army selected its officers.  The army used a process adapted from the British following World War II.  It involved evaluating a group of officer candidates working together to physically bridge an obstacle.  Here, Kahneman writes in his Nobel Prize autobiographical essay,

[w]e were looking for manifestations of the candidates’ characters, and we saw plenty: true leaders, loyal followers, empty boasters, wimps – there were all kinds. Under the stress of the event, we felt, the soldiers’ true nature would reveal itself, and we would be able to tell who would be a good leader and who would not.

But the trouble was that, in fact, we could not tell. Every month or so we had a “statistics day,” during which we would get feedback from the officer-training school, indicating the accuracy of our ratings of candidates’ potential. The story was always the same: our ability to predict performance at the school was negligible. … I was so impressed by the complete lack of connection between the statistical information and the compelling experience of insight that I coined a term for it: “the illusion of validity.”

Kahneman, along with his partner Amos Tversky, documented another cognitive fallacy in this exercise: the human tendency to make extreme forecasts based on very little data.  Overconfidence in predicting future performance has been documented in fields as diverse as football, economics, weather forecasting, and the stock market.  It should come as no surprise, then, that predictions of diplomats’ future performance would be shot through with overconfidence as well.

He continues:

Closely related to the illusion of validity was another feature of our discussions about the candidates we observed: our willingness to make extreme predictions about their future performance on the basis of a small sample of behavior. … As I understood clearly only when I taught statistics some years later, the idea that predictions should be less extreme than the information on which they are based is deeply counterintuitive.

Kahneman subsequently drew up a more objective measure of performance using a numerical scale to rank candidates based on certain attributes correlated with future success in officer training.  It was not perfect but it has remained, more or less unaltered, with the Israeli army ever since.

The similarities between the two evaluation systems are striking.  Like the Israeli army of 1955, the State Department panels rank candidates into three tiers.  Like the Israeli army’s evaluators, the Foreign Service Officers serving as judge and jury over lower-ranking strangers are not human resources professionals or executives with hiring and firing experience.  They are beneficiaries of the same system that selected them.  They have no reason to doubt the process that promoted them and in fact have quite a psychological incentive to perpetuate it.  Kahneman and his disciples were skeptical of expert judgment.  Kahneman cites as an influence the work of psychologist Paul Meehl, who famously determined the superiority of actuarial prediction over clinical judgment in medical prognosis.  Kahneman eventually came to understand the value of training and experience in improving professional judgment.  But the one-time panel members receive only minimal training for their task and rarely serve on a panel again.

The only fundamental difference between the two selection processes is that the Israeli army actually observed the individuals being ranked.  The department, working from a paper record, bases its decisions on hearsay.  There is no interaction with the candidates themselves.  The selection process is done by consensus, which means the entire panel must agree on every individual file.  Indeed, officers I have talked to say there is rarely any dissent, that they all see the same things the same way, and that what they are seeing is blindingly obvious on its face.  Kahneman saw the same thing.  As he told the author Michael Lewis, “[t]he impression we had of each candidate’s character was as direct and compelling as the color of the sky.”  But he could not correlate these judgments with their outcomes.

The Foreign Service promotion system is explicitly designed to predict future performance.  It is a projection: EERs are supposed to tell us who will succeed at the next level and who will fail.  The Procedural Precepts for the 2018 Foreign Service Selection Boards specifically state, “[p]romotion is recognition that an employee has demonstrated readiness to successfully perform at the next highest level.”  The document continues:

A recommendation for promotion is not a reward for prior service.  Boards should recommend for immediate advancement only those employees whose records indicate superior long-range potential and a present ability to perform at a higher level.

But as Kahneman demonstrated, and as economists like Richard Thaler and forecasting researchers like Scott Armstrong have amply documented, people are very poor predictors of future human behavior.

In addition to overconfidence, Armstrong outlines two other ways people often get predictions wrong.  First, he notes that agreement or consensus is not the same thing as accuracy.  The board process requires each member to rank the top promotable candidates but then, strangely, to come to a group agreement about the order of those candidates (as well as those mid-ranked and low-ranked).  Second, Armstrong notes that increasing complexity makes prediction correspondingly more difficult.  Variables and uncertainty, which perfectly describe the Foreign Service career assignment system, make predictions of future performance much less tenable.

All this perpetuates a system governed by subjectivity and intuition, both of which Kahneman and his followers have exposed as predictors no better than chance.  There is very little objective measurement of performance or potential.  Not even the straightforward U.S. military requirement to tick a box recommending promotion or not is available in the EER.  State Department panel members, already unable to read the entirety of each individual EER because of sheer volume, must parse the hundred different ways raters recommend promotion, looking for evidence of faint praise or lack of enthusiasm.

Performance Evaluation considered these issues, including objectivity and bias, in an unclassified 2013 cable.  The cable argued that the Defense Department’s system of numerical ranking was more appropriate to the task of evaluating hundreds of thousands of members of the armed forces.  But if it were simply a matter of scale, shouldn’t there be enough military officers reviewing their peers in ratios similar to the Foreign Service boards’?  More importantly, it is difficult to argue that the Pentagon is more concerned with efficiency than efficacy when lives are literally at risk if the system fails.

In another unclassified cable in 2015, accompanying the modestly reformed EER, the department attempted to address rank-and-file requests for a more objective performance evaluation system.  It responded that it had conducted external consultations and concluded that metric evaluation fed bias and grade inflation.  But these are precisely the problems with the current system, as evidenced by the department’s own concern with unconscious bias on the one hand and evaluation inflation on the other.  A 2017 unclassified cable noted specifically how difficult board members found it to rank candidates who were all, to borrow a phrase, above average.  There is no question that bias and grade inflation are problems.  The question is which system – narrative and subjective, or quantitative and objective – best reduces these biases and correlates most strongly with employees’ future performance.

The consequence is a collective, department-wide effort to game the promotion system.  Conventional wisdom among the department rank and file holds that there is a proper way to write or couch the EER, that deviating from this unwritten rule spells career doom, and that there is absolutely no room to document poor performance, individual struggle, or personal failing.  The logical outcome of this conventional wisdom should be no surprise.  It produces highly inflated and hyperbolic EERs crammed with superlative performance and exceptional personal achievement, not any rational or reasonable measure of performance or readiness for promotion.  It is absurd, especially at the lower ranks, to expect that all officers will be equally excellent at all things, yet that is the picture the EERs in aggregate paint and about which promotion panels perennially complain.

Given this, the lack of quantitative measurement of performance is especially glaring.  This may not seem important given the depth and rigor of a professional, narrative evaluation.  But quantitative data is the standard measure of performance we get in high school, university, and graduate and professional schools.  High school grades are more strongly correlated with university success than Scholastic Aptitude Test scores are.  Performance in individual subjects can be aggregated into a grade point average, and that average, along with standardized test scores, can be correlated with future performance as measured in future grades and earnings.

The 2013 cable notes eight systemic biases without addressing explicitly how the EER and promotion board system are designed to minimize bias.  It is worth quoting in full:

Boards can help to cut through overt or subtle biases at the line manager level.  In addition to invidious general biases (age, gender, race, ethnic origin, sexual orientation, religion), more nuanced types of bias are also workplace hazards.  The [Foreign Service] Board system, by having an impartial objective group, mitigate possible bias.  The Foreign Service grievance system adds additional safeguards.  Bias is simply a personality-based tendency, either toward or against something.

In other words, the boards are impartial and objective because the bureau says they are impartial and objective.

The department has recently required unconscious bias training for selection panel members and has long screened EER drafts for inadmissible information.  This is an important start.  But the panels themselves are structured in a way to protect, not remove, bias.  Panel members are not required to recuse themselves unless they are reviewing a family member.  The panel decision itself cannot be appealed; employees can only grieve language they have already consented to send to the boards.

While the department expresses concern about bias and forbids most explicit (race, sexual orientation, age) and implicit (“staffing” to replace “manpower,” for example) bias, quite a lot of information is still not scrubbed from standard EERs.  Importantly, gender – a legally suspect class specifically mentioned in Equal Employment Opportunity laws and regulations – is permitted.  Names are also allowed, and in a country as diverse as the United States a name can support reasonable guesses about a candidate’s religion, national origin, ethnic group, mother tongue, and race.

The department has explicitly dismissed calls for a more quantitative or objective evaluation system.  The 2015 cable rejecting a proposal for quantitative performance evaluation specifically endorses a narrative approach, and the 2017 cable reinforces it.  Unfortunately, social science research has repeatedly identified the “narrative fallacy” as a flawed heuristic people apply to interpret the past and anticipate the future.  The department, in other words, has built a third cognitive fallacy right into the structure of the evaluation.

The lack of objective measurement lends itself to all sorts of abuses, including watered-down and “coded” language that experienced raters and reviewers use to communicate sub rosa to the panel, while lying to the candidate, that they are not promotable.  I am stunned that the use of coded language in EERs is not only an acknowledged practice but in fact accepted as the price of promotion in the department.  We would not tolerate secret communications from our children’s teachers or guidance counselors in letters of recommendation for jobs or university admission.  If such a practice were discovered, it would garner the same kind of legal and press attention the university admission-bribing scandal recently did.  Raters and reviewers who communicate in coded language are committing fraud.  This subterfuge would not survive an Inspector General investigation, Congressional scrutiny, or class-action lawsuit.

The EER, as a narrative instrument, lacks objective utility at a minimum and otherwise eschews any quantitative measurement of performance.  But the data are not entirely missing.  The department has the statistical tools to correlate prior evaluations with future performance and promotion rates, but it has not used them.  Promotion panels routinely rank EERs into three tiers and then rank-order every EER in the top tier.  These rankings could easily be correlated with prior and future evaluations and promotion rates, but they are not permanently assigned to individual officers or their EER files.  This is a mistake, the equivalent of grading classwork all year without issuing a transcript at the end of high school.

The panel’s purview is guided by two massive documents: the “Procedural Precepts” for Foreign Service Selection Boards (31 pages) and the “Decision Criteria for Tenure and Promotion in the Foreign Service,” confusingly also known as the Core Precepts (15 pages).  The Procedural Precepts outline the duties and procedures the panel must follow while the Core Precepts outline the skills and objectives an officer must demonstrate.  The Core Precepts refer to these as “guidelines by which Tenure and Selection Boards determine the tenure and promotability” of officers.  But nowhere in either document are instructions that explain how the board should apply the Core Precepts.  That’s the equivalent of applying the A-F grading system without explaining what distinguishes an A from an F.

There are six Core Precepts containing 31 subsections.  This is simply too much for any one narrative evaluation – the line count is just above 100 – to consider in any depth.  Even if several narratives over the past five years are evaluated, that leaves little more than 500 lines in total, as the arithmetic below shows.  So employees, raters and reviewers must select their most important accomplishments and skimp on covering all facets of the Core Precepts.
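The arithmetic, sketched below, is stark.  The line counts come from the text; spreading coverage evenly across subsections is my simplification.

```python
# How little space one EER, or even five years of them, leaves per subsection.
subsections   = 31    # across the six Core Precepts
lines_per_eer = 100   # a narrative EER runs just above 100 lines
eers_in_file  = 5     # a promotion folder holds the past five years' EERs

print(f"{lines_per_eer / subsections:.1f} lines per subsection in one EER")
print(f"{lines_per_eer * eers_in_file / subsections:.1f} lines per subsection "
      f"across a five-year folder")
# 3.2 lines per subsection in one EER
# 16.1 lines per subsection across a five-year folder
```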

Panelists are left to parse nuances of language, which is simply more noise atop the randomness of assignment, location, experience, work product, and outcome, not to mention unconscious bias and writing ability.  How can a panel rationally evaluate a 25-year-old consular officer with no prior job experience working in a nonimmigrant visa mill in Mexico against a 50-year-old second-career officer handling American Citizen Services cases following an earthquake in Japan?  How can panels distinguish between competent performance and sheer circumstance?

Fortunately, the Core Precepts actually provide the solution to this otherwise subjective mind game.  Instead of the laborious negotiation over the narrative evaluation, the employee, rater, reviewer, and a peer or local staff member would grade the employee against their peers in a given post on each of the Core Precepts subsections.  The employee would have no control over the grading but would be allowed to see it.  The individual grades and aggregate would be assigned to the employee’s file for future panels to view.

A numerical evaluation has more utility than its narrative counterpart.  It is more objective because the reviewer rates the employee directly against their immediate peers, whereas under the current system the panel measures performance against an unseen global population.  The global view introduces much more noise, subjectivity, and randomness into the process.  A graded system can be averaged over a year, a grade, or a precept; it can be correlated with future performance; it can track improvement; and it eliminates sub rosa sabotage (using regression analysis, we could determine whether a particular low rank was within the standard deviation of the other reviewers or prior evaluations).
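As a minimal sketch of that sub rosa check (with hypothetical grades, and a simple z-score standing in for the full regression analysis mentioned above), flagging a reviewer whose grade falls far outside the spread of the others might look like this:

```python
# Flag any single reviewer whose grade sits far outside the spread of the
# other reviewers' grades. Grades and names here are purely hypothetical.
from statistics import mean, stdev

def flag_outliers(scores: dict[str, float], threshold: float = 2.0) -> list[str]:
    """Return reviewers more than `threshold` standard deviations from the
    mean of the *other* reviewers' scores."""
    flagged = []
    for reviewer, score in scores.items():
        others = [s for r, s in scores.items() if r != reviewer]
        if len(others) >= 2 and stdev(others) > 0:
            z = abs(score - mean(others)) / stdev(others)
            if z > threshold:
                flagged.append(reviewer)
    return flagged

# Three evaluators agree; one grades dramatically lower.
grades = {"rater": 4.6, "reviewer": 4.4, "peer": 4.5, "second_rater": 2.1}
print(flag_outliers(grades))  # ['second_rater']
```

With graded records on file, the same comparison could also be run against an officer’s own prior evaluations, as the parenthetical above suggests.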

Any kind of human evaluation system is prone to bias, error, and luck.  The best we can do is find the tools that limit our bias, reduce error, and account for luck.  The current State Department promotion system makes no systematic attempt to do that.  The result is the opposite of meritocratic: it is pure fortune.

###

The opinions and characterizations in this piece are those of the author and do not necessarily represent those of the U.S. government.

 

The Cost of Lies

On April 26, 1986, Reactor #4 at a Soviet nuclear power station in northern Ukraine exploded.  As with almost everything in Soviet history, that is about all anyone can agree on, but it was enough.  The worst nuclear accident on record, Chernobyl released radioactive material equivalent to 350 atomic bombs of the kind dropped on Japan.  The Soviet state rushed to deal with the crisis, sacrificing untold numbers of men and materiel to contain the catastrophe while hiding it from local and international scrutiny.  But the damage was already done.  The accident exposed the rot, corruption, and deception at the foundation of the Soviet state.  Some believe the accident precipitated the collapse of communism and the dissolution of the Soviet Union, even if precious little evidence supports that.  But few doubt that Chernobyl is a devastating real-world metaphor for all that was wrong with 20th-century communism.

A recent HBO miniseries, “Chernobyl,” based largely on the work of the Nobel Prize winner Svetlana Alexievich and the newly published “Midnight in Chernobyl” by Adam Higginbotham, visually documents the disaster.  Higginbotham’s is the first comprehensive narrative of the catastrophe in English.  Alexievich’s 1997 work documents the individual toll of the catastrophe in intimate, human terms.  Together, they provide a rich account of an extreme event in human history about which, they amply demonstrate, we still know very little.

Consisting of four RBMK-1000 reactors, the Chernobyl power station fed an expanding electrical grid supplying the Soviet economy with much-needed electricity.  The RBMK-1000 was a second-generation power reactor.  The Soviets were the first to field commercial nuclear reactors, and the RBMK-1000 was only a generation beyond the earlier “piles” that enriched uranium or produced plutonium for the superpowers’ nuclear arsenals after World War II.

The Soviet Union leapt ahead, in part, by cutting corners on design, safety, and quality.  Reactor engineering and construction were recklessly shoddy.  Soviet scientists knew this after the construction of the first few plants but issued no modifications or warnings.  Not surprisingly, Reactor #1 at Leningrad (now St. Petersburg) careened toward its own accident in 1975, a decade before the infamous explosion of April 1986.  That was just one of several nuclear accidents across the Soviet Union that have only recently come to light and that Higginbotham reports as a prelude to the Chernobyl catastrophe.

Soviet reactors were a physical manifestation of all that was wrong with the Soviet system of government.  The RBMK-1000 was enormous.  The core itself was larger than entire Western reactor systems, redundant containment units included.  It was so large that the reactivity in one part of the core often seemed to operate independently of the rest.  Technicians often couldn’t even measure what was happening deep in the fuel rods.  The reason for the size was not simply the Soviet mania for gigantism; the uranium fuel was not as refined or enriched as Western fuel.  Besides requiring a much larger core, this systemic flaw made the reactor dangerously unstable at low power.

Voices from Chernobyl

The Chernobyl nuclear power station had four reactors, each separated from the environment only by a steel-reinforced concrete pad and a huge, 2,000-ton cap that threaded the fuel rods, moderators, and cooling water into the core.  By contrast, most Western reactors had at least three containment structures to protect against explosion or meltdown.  Chernobyl’s obsolete design was water-cooled and graphite-moderated, meaning that the hot fuel rods were nestled in columns of highly flammable graphite, essentially pure carbon.  Rather than flooding the core with a pool of water, the RBMK circulated coolant through channels, in effect a giant radiator.  Each flaw contributed to the accident.

All industrial accidents, from airplane crashes to oil refinery fires, result from a cascade of equipment failures and human error.  No single glitch or misapprehension is the direct cause, but taken in sequence they lead to disaster.  Chernobyl was no different, although here, unlike elsewhere, the system of government and its paranoid ruling culture were also contributing factors.

The miniseries treats this disaster cascade as the story’s central mystery: when the reactor careened out of control during a safety test, the shutdown order acted not as a brake but as a detonator.  Soviet scientists and engineers had observed this phenomenon in other reactors of the same type but did nothing to remedy the flaw.  A shutdown, or SCRAM, is supposed to kill the nuclear reaction by flooding the core with neutron absorbers like boron.  In the RBMK-1000, the AZ-5 button lowered graphite-tipped, boron-clad control rods whose tips displaced cooling water (itself a neutron absorber), briefly speeding up instead of slowing down the reaction.

Midnight in Chernobyl

At Chernobyl, the initial heat spike instantly vaporized the cooling water.  This released an explosive force that destroyed the fuel rods and moderator channels and ruptured the reactor container.  Fresh air then poured into the breach, combining hydrogen and oxygen with super-heated graphite, which caused a second explosion that destroyed the reactor and its building.  The force of the explosion was so great that it knocked the 2,000-ton lid off the reactor; it fell back at a 15-degree angle, exposing the core to the open air while shielding it from direct attack from above.  Tons of uranium fuel and radioactive graphite were blown free of the building.  Some of this was instantly vaporized and floated away on the wind.  Meanwhile, hundreds of tons of fuel and graphite caught fire inside the reactor, spewing more contamination across eastern and northern Europe for nearly two weeks until the fire burned itself out.

The miniseries follows a handful of true-to-life characters, including the nuclear scientist Valery Legasov, played by Jared Harris; the apparatchik Boris Shcherbina, played by Stellan Skarsgård; and Lyudmila Ignatenko, played by Jessie Buckley, the newlywed wife of a fireman who was among the first at the disaster site.  Ulana Khomyuk, played by Emily Watson (likely based on Vasily Borisovich Nesterenko, as interviewed by Alexievich), is an awkward composite character designed to represent the many Soviet scientists who aided the cause.  It is true the narrative lacks female protagonists, but they were not entirely absent from the real events: the architect of Pripyat, the model city built to house Chernobyl workers, was a young Russian of Chinese descent named Maria Protsenko, who is still alive and working in Kyiv.  Similarly, the top expert in radiation medicine at Moscow Hospital #6 was Dr. Angelina Guskova.  Protsenko helped organize the evacuation of Pripyat, and Dr. Guskova treated dozens if not hundreds of acute radiation casualties flown in from Chernobyl.

Neither Legasov nor Shcherbina shows a distinctive character arc.  It is true that Legasov hanged himself two years to the day after the accident.  It is not clear why.  While he decries the damage done by a culture of secrecy, suspicion, and deceit, he does not appear wracked with guilt over toeing the party line while reporting on the accident to the International Atomic Energy Agency in Vienna.  Shcherbina, for his part, appears to be the stereotype of a blustery commissar.  There is little human interest here.  Considering all the compelling perspectives documented by Alexievich and Higginbotham, it is a shame that each episode doesn’t focus on a single individual, as a series like “Band of Brothers” did.

“You are dealing with something that never occurred on this planet before,” Legasov tells Shcherbina in the miniseries.  The reactor was exposed.  It was on fire.  The core fuel, without coolant or dampener, was melting down.  Every surface surrounding the accident site was covered in lethally radioactive debris.  The position immediately above the reactor was a crossfire of deadly gamma, alpha and beta radiation.  The fire was too intense to douse with water, which simply flashed to steam and carried more radionuclides into the atmosphere.  Liquidators had to stop the fire to get at the core, which threatened to melt down completely, burn through the containment unit and concrete pad, and enter the water table of the Dnieper river basin, which serves millions of people, including Ukraine’s capital and largest city, Kyiv, and drains into the Black Sea.

The Soviet commission convened to remedy the disaster determined that the first, immediate step was to smother the fire by bombarding the exposed reactor with heavy materials like lead, sand, and boron.  This required a heroic effort by hundreds of aviators who had to position their helicopters directly above the fire, absorbing huge doses of radiation while they dumped their suppressive loads, blindly, into the smoke.  They flew hundreds of sorties.  In one horrifying event faithfully recreated by the miniseries (though it actually occurred months later than depicted), a helicopter collided with a crane and fell from the sky like a rock.

But this was heroic futility, one of many depressing stories from the “Battle of Chernobyl”.  In the miniseries, Shcherbina cheerily announces to the Politburo that the fire is out.  The reality was more prosaic and also more terrifying: the high-risk aerial attack had no effect on the fire whatsoever.  After the accident site was covered by its first, faulty containment, called the sarcophagus, teams entering the reactor found almost none of the thousands of tons of material dropped from the air into the reactor compartment.  The graphite fire had simply burned itself out.  Most of the airdropped cargo had instead joined the melting fuel in a toxic, radioactive slurry known as corium, eventually found in the plant basement, cooled, coagulated and hard as glass.

If one truly effective operation actually averted a worse catastrophe, it was the work of a handful of men who entered the dark, flooded suppression pool below the reactor.  The water there, it was feared, could fuel an even larger steam explosion than the one that ruptured the reactor in the first place.  The miniseries needlessly exaggerates the threat, claiming that the explosive yield could be as large as a hydrogen bomb’s.  A more conservative but realistic fear was that a steam explosion could destroy the rest of the complex, including the three other fully fueled reactors at Chernobyl.  The men entered with inadequate equipment and faulty dosimeters – their horrifying, claustrophobic work is well documented by the miniseries – and managed to drain the suppression pools.  They received 1,000 rubles for their heroism and, contrary to the legend of their swift deaths, survived the operation.

Within days of the accident, the vast resources of the Soviet state began to mobilize.  This mostly meant people.  Between 600,000 and 750,000 men were activated to clean up, or “liquidate,” the accident site by the end of 1986.  For perspective, that is equivalent to the entire Soviet force deployed to Afghanistan over the ten-year war there.  Thousands of vehicles were sent and abandoned; thousands of tons of lead, boron, liquid nitrogen, sand and water were poured into the reactor.  The dispersal of the fallout and its disparate impact on individuals made the human effects of radiation difficult to predict.  Consequently, both blithe complacency and hysterical alarm ran rampant in the population.

It is shocking, even in retrospect, how ill-prepared the Soviet Union was for this accident.  The country’s defense policy explicitly prepared to fight and win a nuclear war.  But basic equipment such as personal protective gear, hardened vehicles, ventilators, and working dosimeters was in short supply for this one incident.  The government was entirely unprepared for an accident of this scale.  Soviet citizens appear to have swallowed their own propaganda about the “peaceful atom”: not even the plant workers living in Pripyat thought a reactor fire much cause for concern.  Only dedicated nuclear scientists raised the alarm.

The commission tried to use robotic vehicles to clean up the accident site and scout it.  These included off-the-shelf Soviet rovers designed for the moon and equipment bought on the sly from the West.  All proved inadequate as radiation quickly fried their delicate electronics.  But in order to build the containment sarcophagus and isolate the damage, the surfaces adjoining Reactor #4 had to be cleared of radioactive fuel and graphite.  With the robots failing, the job fell again to men – “biorobots” – who worked the roof in 90-second shifts, their groins and heads protected by strips of lead.  This was truly heroic and important work, as containing the accident site required access to these surfaces.

The sarcophagus itself was a paragon of Soviet engineering, built quickly and inadequately.  No sooner was it constructed than it began to fail.  One of its support beams rested on the damaged wall of the reactor chamber; the steel rebar began to rust immediately and the roof leaked water.  Only some 30 years later was the largest movable structure ever built, the New Safe Confinement, erected over the sarcophagus to start the deconstruction and cleanup that will end no sooner than 2065.

Higginbotham poignantly notes that the sarcophagus and the New Safe Confinement are effectively the tomb of one of the few documented immediate deaths of the accident.  It is widely assumed that Valery Khodemchuk, a coolant pump operator, was killed the moment the reactor exploded.  His body remains inside, somewhere – perhaps to be recovered more than 80 years after he died.

Narrative liberties aside, the attention to detail and verisimilitude in the miniseries are extraordinary, from the Mi-8 workhorse helicopters to the ineffective “petal” cloth respirators, black overalls and lead strips lashed to men’s bodies.  The film shows you the reactor explosion and takes you onto the roof with the biorobots in a mad, claustrophobic scramble among deadly materials on a surface so alien it might as well be the moon.  The entire miniseries is an extraordinary replication of a time and place that no longer exist.  (Lithuania stands in for much of the Ukrainian steppe, including the model city of Pripyat.)

The miniseries’ producers have obviously watched the work of the great Soviet director Andrei Tarkovsky.  His lingering camera focus on organic shapes in the original Solaris and his ominous pans across sinister landscapes in Stalker are echoed in the miniseries (along with Tarkovsky’s most recent avatar, Annihilation).  The disaster site, and in particular the relentless smoke rising from the reactor, combined with a similarly eerie soundtrack, conveys something dark and otherworldly, a mystery we will never fully comprehend.

Indeed, aspects of the disaster have now entered legend, even myth.  Observers have long noted that the English translation of Chernobyl, wormwood, is a biblical allusion to the end of days.  The exact time of the accident, at 01:23:45, has a chilling, almost planned precision to it.  Several witnesses, as documented in Svetlana Alexievich’s narrative account, report seeing a bright blue beam of light ascending from the reactor, presumably the ionized atmosphere exposed to the reactor core.  People taste metal near the accident, feel the pin-pricks of radiation on their face, smell the sweet scent of ozone.  It is conventional wisdom among the liquidators that alcohol blocks radiation, and the more the better.  Alexievich documents gothic rumors swirling around the accident: summary burial of radiation victims, massive evacuations, the accident caused by “cosmophysical forces,” river pikes without heads or tails, radioactive wolves and foxes playing with village children, children born with yellow liquid in their veins instead of blood.

Real questions remain.  We will never know how many people died following the accident.  Alexievich’s anecdotal approach documents ample death, sickness, deformations, and disease.  But without a public health baseline, which the Soviet Union and the succeeding states of Belarus, Russia, and Ukraine all neglected to document, we can only make educated guesses about the final casualty count.

It may well be true, as Mikhail Gorbachev stated long after the Soviet Union collapsed, and as a furious tirade against Gorbachev documented by Alexievich suggests, that Chernobyl was the beginning of the fall.  As the apotheosis of the Soviet experiment, it seems to have destroyed any lasting faith in the system among the Soviet people.  But that is as impossible to prove as the final death toll of the accident.  In any case, Chernobyl remains the greatest object lesson of the 20th century.

###

The Secret History of Small Mercies

The most devastating scene in The Lives of Others, Florian Henckel von Donnersmarck’s 2007 film about the East German Stasi, comes near the end, when the playwright, played by Sebastian Koch, pulls the wiretaps out of the walls of his apartment.  He reveals the hidden microphones that have recorded his every waking moment for the security state.  The playwright is suddenly confronted with the intimacy of the surveillance.  The walls were listening.

No Live Files Remain

The Lives of Others complements a growing genre of nonfiction documenting the files kept by the secret police during the reign of communism across Central and Eastern Europe through the late 1980s.  The first was probably Timothy Garton Ash’s The File (1997), which uncovers his own monitoring while he was a British doctoral student in East Berlin during the 1970s.  Following the publication of his dissertation, which documented the effect of the Stasi on his friends, he was barred from returning to the country.  In The File he recounts both how he had been surveilled and by whom.

While Garton Ash’s stories may shock, it is the sheer scope of the state spy apparatus that truly astonishes.  My father-in-law, a California chemistry professor, appears in Stasi files associated with an East German counterpart whom he hosted in the 1970s.  Likewise, my former NATO colleague Oana Lungescu documented her own Romanian Securitate case file in a 2009 BBC documentary series.  She discovered that even after she emigrated, secret agents continued to keep tabs on her abroad.

All of this surveillance occurred at a distance.  The Stasi agent in The Lives of Others, played by Ulrich Mühe, never physically encounters his target.  Garton Ash only met his monitors after searching them out decades later.  Lungescu never meets hers at all.  For all the pervasiveness of the surveillance, most accounts have the strange flavor of bureaucratic prose, their banality and even humor set against the terrifying knowledge the state had of daily life.

It is only with Herta Müller’s Cristina and Her Double (2013) that we get a closer sense of individual betrayal in a police state.  While the Romanian-German Nobel laureate has extensively documented the terror of the Romanian police state in fiction, this autobiographical essay captures its braided, complex intimacy.  When Securitate agents approached her to inform on her colleagues at the factory where she worked as a translator, her refusal led to harassment, reprisals, and eventually her firing.  She emigrated to Germany, where the intimidation followed her.  The Romanian embassy worked through state agents and “useful idiots” to slander her, ironically, as a Securitate plant.

This was her “double,” she discovered later after lobbying for years to access her file.  There was Cristina, the code name given her as a possible recruit, and her double, a projection of what the Securitate would have made of her had it had its way: Cristina the spy, Cristina the agent, Cristina the plant.  It was bizarre and Kafkaesque but contained its own logic: if she wouldn’t spy for the government, then the government would make her one in the minds of her friends and colleagues.  “Everywhere I went, I had to live with this doppelgänger,” she writes.  “It has taken on a life of its own.”

Throughout this entire ordeal Müller was comforted by a friend named Jenny.  After Müller emigrated, Jenny came to visit.  At this point Müller grew suspicious, searched her friend’s luggage, and confronted Jenny with evidence that the Securitate had sent her.  Her friend broke down and confessed, saying it was the only way she could see Müller one more time.  Cancer would soon kill Jenny.  Devastated by this betrayal, Müller was convinced that Jenny had spied on her from the start.  Only after a close reading of her file did she conclude the friendship had been genuine and that the Securitate had compromised her friend only in her final days.  “You become grateful for small mercies,” Müller writes, “trawling through all the poison for a part that isn’t contaminated, however small.”

András Forgách (Simon and Schuster)

András Forgách searches for these same small mercies here, in the most intimate of relationships: with his mother.  A friend working in the Hungarian state security archives came across his mother’s file, recognized his name, and contacted him.  What he learned upended his entire understanding of himself, his mother, and his family, about which he had written as recently as 2007.  “Now,” he told The Guardian, “I realized nothing of what I had written was true.”

Forgách is a well-known writer, poet and dramatist in Hungary, where this book was a best-seller.  It is his first book translated into English (confusingly co-released by Penguin in some markets as The Acts of My Mother).  The book’s structure and method are unorthodox.  The first part is a fictionalized narrative pocked with footnotes referencing his mother’s secret files, ending with the verbatim transcript of the last file made about her as she “ends her activities” late in life.  Forgách annotates this transcript with outraged comments, defending his mother and attacking the stinginess of her handlers in her final days.  The second part is a series of poems about his parents.  The final section is a more conventional essay discussing his mother’s activities as a “secret colleague.”

Forgách’s motivation appears similar to that of The Lives of Others: reconciliation.  Garton Ash hit closest when he noted in The New York Review of Books that the film’s primary feature was its exportability.  In the German context, it is easy to see how the collision of characters—the liberal but loyal Marxist intellectual, the principled and redeemable Stasi agent who protects him, and the corrupt Communist bureaucracy—is designed to help Germans reconcile their history while giving foreign audiences a dramatic structure for understanding that country’s recent past.

And in fact how these countries reconcile their own past, and the actions of friends, family, and neighbors, is a fundamental question.  Some countries have made access to the secret archives a priority.  Others turn to courts for punitive judgment, or to truth commissions with the power of subpoena, to bring the dark past into the light, the better to inoculate the present against future reincarnation.  This is not limited to Europe.  In Brazil, a small army of activists surreptitiously published three million documents held by the post-military government.  Peru and Chile built state museums to research and remember their own recent tumultuous histories.

But Forgách is reconciling with his own mother.  Bruria Avi Shaul was born in Jerusalem to a famously literate Jewish family during the British mandate in Palestine.  She joined the Israeli Communist Party and abandoned Israel out of principle.  She opposed the Jewish state and found a welcome ally in the communist bloc.  She was exotic, beautiful, and charismatic.  She married Forgách’s father Marcell, a Hungarian Jew then serving with the British army in Palestine, and they relocated to Hungary.  As a result, she lived almost her entire life as a foreigner and outsider.  Both his parents, he writes, “were the inhabitants of nowhere – neither Hungarians nor Jews, nor foreigners, nor comrades, nor compatriots.  Among comrades they were Jews, among Jews they were communists, among communists they were Hungarians, among Hungarians they were foreigners.”

But both parents were committed communists and both, Forgách reveals, were willing secret agents of the Hungarian state, sharing the code name PAPAÍ.  His father was the less successful of the two, working under cover at the Hungarian news agency in London, where he clashed with the bureaucracy.  His colleagues turned against him and his career never recovered.  Forgách documents his father’s descent into madness and eventual suicide – his mother found him hanging by the neck in the bathroom when he was 53 – as a result of state harassment and professional failure.  But it appears likely he was actually an undiagnosed syphilitic: he shared stories of his conquests of Egyptian prostitutes with his young son during long strolls in London.

His mother, likewise, was in love with an English soldier with whom she had an affair during the mandate and whom she never saw again.  Despite this mutual discord, they remained married until his father’s death, mother and son nursing him in the most pathetic circumstances.  His parents shared, too, a secret language, Hebrew, which they used to communicate in confidence around their children.  As a result, Forgách admits, there are depths and dimensions of his parents’ relationship he may never know.

But he does know a lot about his mother.  Forgách finds answers to questions he forgot he had growing up, such as a strange visit to Greece with his mother in 1976.  Forgách remembers his mother being particularly tense during this “vacation.”  The files explain that she was there to meet several contacts and pump them for information.  Bruria likewise made regular trips to Israel to see family, missions financed by the secret police to make contacts and provide political intelligence.  Like Jenny’s visit to see her friend Herta, these trips were impossible without the permission of the state and enabled by the prospect of secret intelligence.

All of this was relatively superficial and inconsequential.  Bruria sent nobody to prison and hardly affected the balance of power in the Middle East.  Her children never noticed, and despite her ardent communism they began the generational turn away from Lenin and toward liberalism, stoking the revolutions that would sweep Central and Eastern Europe and truly upend the continent’s balance of power.  She complained regularly to her handlers that she had not raised good communists.

But that changed when the state learned that her son was harboring one of its enemies.  György Petri was a writer, poet and virulent opponent of the regime.  As a result, he was virtually unemployable and survived mainly by crashing with friends.  He was staying at Forgách’s apartment when Bruria’s handlers suggested she get access so they could install surveillance equipment and monitor the poet.  She dutifully did so – in Forgách’s hybrid recollection she appears on edge, as if she finally understands the true depths of her commission – by insisting, uncharacteristically, on cleaning her son’s apartment.

The structure of Forgách’s narrative drives toward this Lives of Others moment, laying a series of less serious accounts and anecdotes like the foundation of a pyramid of betrayal.  But while this may be the apex of his mother’s personal betrayal, it is not the book’s zenith.  Forgách drives further, to what he believes is Bruria’s ultimate betrayal of herself.  He finds her review of a Hungarian author’s history of Jerusalem, which she guts as Zionist propaganda.  Her fury overwhelmed her handlers, who snuffed out any idea of publishing the review.  It languished in her file until her son found it a quarter century later.

Forgách portrays this as a tragedy: the state she served would not allow her the ability to express the one deeply held conviction she had.  This is a less convincing argument but it in fact completes the circuit for his family and for his country.  It argues that in a police state even collaborating out of conviction does not allow the expression of that conviction.  It is the ultimate expression of raison d’état.

For himself, Forgách’s affection for his mother remains.  He doesn’t so much defend her as attack her handlers, disgusted by how little the state valued her betrayal.  His final memory of her, intervening in a childhood dust-up, is an archetype of the godlike wisdom and power our parents wield when we are small.  He clearly would prefer to remember her this way rather than as a compromised collaborator in the grubby marketplace for covert information who made no real difference in the end.  And in that way he can reconcile his mother’s actions and find those few small mercies.

###

No Live Files Remain
by András Forgách
Translated by Paul Olchváry
Scribner
£14.99

Tool of the trade

This is an L.C. Smith and Corona Company Standard Typewriter in the collection of the National Museum of American History in Washington, D.C.  It is a training model for children: instead of letters printed on the keys, illustrations of animals matched rings the children wore on their fingers, indicating which row, or deck, of keys each finger should strike.

Except for the key rings and the animals, this is the exact model of typewriter I have owned since I bought it in the early 1990s.  The Standard was built in the 1930s, so it was at least 50 years old when I acquired it.  It took me some time to realize what terrific shape it was in (and continues to be).  But it was an antique, a museum piece, even then.

I bought this typewriter to write during a summer traveling overseas.  It was not the most practical purchase, since a notebook would have been far better and lighter.  The typewriter weighed at least 10 pounds and didn’t come with a case.  But buying it was quite possibly the most romantic thing I have ever done.  As I recall, I was smitten with an image of Ernest Hemingway hammering away at his desk.  (I learned only later, reading A Moveable Feast during that trip, that he wrote his first drafts with a pencil.)

It cost $50 when I found it at the Vallejo Typewriter Company on Tennessee Street in my home town.  As you might imagine, the company no longer exists.  In 1993, though, it still sold and repaired machines of all kinds.  It was locally famous for displaying in its window the typewriter used by Burt Lancaster as he played Robert Stroud in the 1962 film Birdman of Alcatraz.  I have no idea how the owners came by that typewriter nor, unfortunately, what became of it after the store shut down.

My typewriter was manual (and, as a result, lighter than an electric typewriter), so it didn’t require power as I crossed borders with varying voltages and electrical outlets.  Its all-metal construction was painted and well-oiled, so it was virtually waterproof.  It was almost indestructible, even what I initially thought were its delicate parts.  It had an evocative odor — the good, clean smell of a well-maintained machine.  It is hard to come by that sensation today even in a museum of industry.

I learned to type on a typewriter in a junior high school classroom in the mid-1980s.  Although they were electric, I can’t remember the model.  For a time as a child I enjoyed banging out short stories and scripts on my grandmother’s IBM Selectric.  The Selectric used a special innovation called a typeball instead of an individual letter hammer for each keystroke.  The typeball rotated rapidly with each key press to imprint the letter on paper, and it could be swapped out for different typefaces.  For my proud Italian grandmother, it somehow fit that she used an italic typeface for all her letters.

None of that prepared me for a manual typewriter.  By 1993 I had been working on computer keyboards for at least five years.  It took a week just to build the hand and wrist strength necessary to hit the keys hard enough and with enough follow-through to type.  Even after weeks on the road I would find the tips of my fingers insensible if I typed too long.

But carpal tunnel or repetitive stress was never a problem.  Unlike the flat modern computer keyboard, the keyboard on the Standard was large and steeply terraced, with well-spaced keys.  Hitting them all required constant motion, hands raised and ranging over the keyboard.  There was no resting position but, as a result, no particular load or stress on the hands or wrists.

I immediately learned that, unlike on my Apple computer of the time or even the Selectric, which had a correcting ribbon, everything I typed on the Standard was permanent and irremediable.  This fact dramatically, if not instantly, affected my writing.  Typing on the Standard required more forethought and planning as I wrote.  I had to choose words and compose sentences, even whole paragraphs, in my head before I started typing.  Prone as I was at the time to random digressions, the Standard actually helped me maintain intent and focus.  It improved my spelling and composition.

It was also insanely loud.  Once I got going, the physical, even violent nature of typing became apparent.  My friends could hear me working outside the building and down the street.  Nobody particularly complained as I recall, and I encountered only mild curiosity.  More eccentric things, I’m sure, have been seen on the streets and in the hostels across Europe.  Nonetheless, I do remember somebody calling to me on the street in Milan as I hoofed the typewriter on my ruck: “Hey!  Whatta you-a doin’ with that a-writin’ machine?”

Things we love only become romantic when they are obsolete.  Although I still have that typewriter, I haven’t used it since that trip in 1993.  It was outmoded 30 years before I bought it.  But it made me a better writer.  Even as a mass-produced item, its old-world engineering made it a work of art — maybe something worth seeing in a museum.

###

Belief from the inside out

Carla Power’s Pulitzer Prize-shortlisted If the Oceans Were Ink, an outsider’s meditation on The Holy Qur’an undertaken with the help of a learned Islamic scholar, signals a subtle but seismic shift in our intellectual world.  It joins other unmistakable indications that mostly secular Western thinkers now realize they have allowed the belief of a billion people to be defined by a clique, and that the popular understanding of Islam has been warped and inverted to the point that the exception has replaced the rule.

I imagine that for Muslims especially it is as if everyone else claimed to be a doctor because a friend once had a rash, or a physicist because they’d seen a car accident.  Muslims understand their belief from the inside out; everyone else seems to be just peering in from the outside.

I was reminded of this when listening to an interview on San Francisco public radio recently.  The host of Forum on KQED, Michael Krasny, was interviewing Qamar Adamjee, curator of a new exhibition of Islamic art at the Asian Art Museum.  (The relevant portion begins at about the 13-minute mark.)  Krasny does not so much ask a question as recite the cultural and human destruction wrought by the Taliban and the Islamic State.  Adamjee’s struggle to respond is telling.  Those who attack art are doing so for political, not religious, reasons, she says.  “It’s easy to pick on religion, it’s easy to pick on the other,” which of course cuts in two directions.  She changes the subject: “[The exhibit] allows us to see Islamic culture as a much broader thing than the undifferentiated monolithic mass that comes across to us today.”  What she is trying to say is: I want to talk about art and Islamic culture.  This art has nothing to do with violence.

The larger point, perhaps missed in a discussion of art, is that the art and culture and belief of Muslims are what really matter.  That is a difficult thing to say while a coalition of nations is trying to destroy the Islamic State.  But as this recent NPR story by Tom Gjelten argues, understanding that larger point is also essential to defeating our enemies and making friends.

Carla Power’s honor may be a landmark of that dawning realization, but it is not the only example.  Another can be found in Garry Wills’ recent essay, “My Koran Problem,” in The New York Review of Books, in which he admits that only very recently had he read The Holy Qur’an.  This is an extraordinary confession.  How could a public intellectual and powerful liberal polemicist of such range, virtuosity and experience go so long without understanding one of human civilization’s great texts?  “It was ridiculous that I would remain completely ignorant of what a quarter of the world’s people not only believe in but live by (in different ways),” he writes.  Beginning sometime after 2003, he has struggled with the text “unaided”.  Surely Wills could find somebody willing to help him?

On a smaller scale but in a more sympathetic vein, Washington Post columnist Courtland Milloy recently wrote about a visit to the Masjid Muhammad, “The Nation’s Mosque,” located in northeast Washington, D.C.  “If you see nothing suspicious, maybe that’s normal,” his article was headlined.  At the mosque he met the imam, a retired U.S. Air Force Master Sergeant.  A member of the mosque is a retired U.S. Army Command Sergeant Major.  “We should be America’s allies in the fight against extremism,” another member of the mosque told Milloy.  Muslims are by far the greatest victims of terrorism around the world.  “Instead, we’re on the defensive, always being asked to respond to somebody’s claim that Islam promotes violence.”  Again, in Milloy we hear somebody trying to change the subject, to focus on what’s important, which is what is normal.

How did so many overlook this pacific ordinariness, this everydayness, this normality that we all can recognize?  Wills writes that he has spent most of his career studying Christian and Jewish theology.  Herein lies the heart of the problem.  I discovered for myself how self-limiting one’s own provincial interests can be.  Even well-intentioned attempts to learn more lead to a contained circle of works, all cross-referencing each other, each delimiting any knowledge beyond the circle.  It takes an extraordinary mind or experience to force oneself out and beyond.  I am the grateful beneficiary of such an extraordinary experience and extraordinary minds when it comes to Islam.

Wills suffers from this insulating defect, unfortunately comparing the Qur’an to The Communist Manifesto and Mein Kampf, as if the holy text were an operational manual for our enemies.  This is exactly wrong.  Studying and understanding The Holy Qur’an and Islamic thought is how we understand and know our friends.  Western secularists don’t understand what Muslims really believe and how their belief animates their lives.  What is normal is important because it is what we have in common to defend against intolerance and barbarism.

But like Wills, we have to start at the beginning.  At the beginning is the realization Wills alludes to: that understanding Islam on its own terms is more important than its present political context.  When a billion people believe something, we have a duty to understand it from the inside out.

If Carla Power’s book suffers a flaw, like any other similar book written by a secular Westerner, it is that she addresses the belief from the outside.  But she is studying the Qur’an, which as any Muslim understands is the place to start to understand Islam.   There are several excellent guides (in English) to the Qur’an, including Introduction to the Qur’an by M.A. Draz and The Story of the Qur’an by Ingrid Mattson.  These both benefit from the authors being Muslim.  Additionally, several translations of The Holy Qur’an (also in English) can be found online.  I am less familiar with the Sunnah and the Hadith, the actions and sayings of the Prophet Muhammad (peace be upon him) and a major source of Islamic theology and moral philosophy, but translations are also available online.

Like Wills, I admit that these ancient texts are indeed challenging to read unaided, and barring a community college or divinity school course most of us must avail ourselves of what we can find in the public domain.  To understand what Muslims really believe we have to break out of the confining circle of Western scholarship and read what Muslims write about themselves.  Fortunately several books do this and don’t require the assistance of a scholar.  The journey is rewarding from the first step.

The gift of a friend, Muhammad Asad’s The Road to Mecca (1954) is a good place to start.  The book is at once a philosophic meditation, spiritual quest, and ripping adventure yarn in the old Islamic tradition.  Asad was an Austrian convert from Judaism who began his career as a journalist in the Near East.  His adventures, which included advising King Saud and the nascent government of Pakistan, rival or exceed those of T.E. Lawrence, Richard Burton and Gertrude Bell.  Asad very nearly died of thirst while lost in the desert and was interned as an enemy alien by British authorities even though his entire family perished in the Holocaust.  His greatest contribution was a defining contemporary English translation, The Message of the Qur’an (1980).

The story of Asad’s conversion is moving.  He has returned to interwar Berlin from his latest journalistic exploits in the Near East and is riding the U-Bahn with his wife.  They note the devastated expressions of their fellow citizens, the deep unhappiness of their lives etched on their faces.  There they decide to convert to a system of belief that seems so much more humane and logical than the one they were raised in.

Who Speaks for Islam (2008) is a misleading title since this book, produced by Gallup and written by Dahlia Mogahed and John L. Esposito, is a very literal survey of what Muslims around the world think about belief, politics, and culture.  It is a study of a complex and plural community, but many clear common threads show through: the central importance of family, the rejection of political violence, the concerns about the erosion of traditional cultural norms, the necessity of belief guiding political choices and personal behavior.  These findings are not particularly dramatic and indeed could be mistaken for similar surveys in Europe and the United States.  But they are critical to understanding the community on its own terms rather than those forced on it by barbarians and xenophobes.

Memories of Muhammad (2008) by Omid Safi, is a kaleidoscopic examination of the legacy of the founder of Islam.  Safi argues it is impossible to understand the belief without understanding the man who promulgated it – much as Protestant Christians closely examine the life of Jesus Christ, he notes – in addition to how Muslims remember and honor the Prophet around the world.  In the clearest way I have read, Safi illuminates the history of Islam, the Sunni-Shia schism, Sufi mysticism, and even contemporary politics.  Born to Iranian parents in Florida, he displays in his home a devotional portrait of the prophet popular in Persian-speaking countries but considered taboo elsewhere – demonstrating the plural and dynamic nature of the community.

Safi by necessity acknowledges contemporary challenges – here he writes against the conventional orthodoxies of the “clash of civilizations” as well as Muslim Occidentalism – but argues that the best way to combat religious strife is to argue for the alternative.  Like Adamjee, he wants to change the subject to what’s really important: what real people believe and what belief means to them.  That requires talking about and gaining a better understanding of Islam and what Muslims believe, which is what the rest of us are just now coming around to.

###


Why I Love Comic Strips

This comic strip, a Sunday installment of Peanuts by Charles Schulz, ranks above all others as my favorite.  I can’t explain why.  Maybe it’s the incredibly loaded conversations that don’t seem to go anywhere or resolve themselves.  Maybe it’s the idea of children speaking like Talmudic scholars.  Or maybe it’s the uncomfortable truth that, when you play competitive sports, especially as a child, you do indeed feel you were born to suffer.

[Peanuts Sunday strip by Charles Schulz]

Milan Kundera wrote that the modern novel exists because no other form of art can do what the novel does.  I keep that in mind as I think about this Peanuts strip in particular.  I can’t explain why I find it so funny and appealing, just as I can’t imagine any other art form or expression being able to do what Schulz did here.

Like the comic book, the comic strip is a peculiarly American art form.  It is related to the single-panel editorial cartoon that has been used to lampoon public life for centuries, and it has a cousin in the full-panel illustrated narratives common in Europe such as Tintin and Asterix.  A hybrid of the comic strip and the narrative cartoon gave rise to the graphic novel and graphic memoir.  With the awarding of the Pulitzer Prize to Art Spiegelman’s Maus, the latter appears to have achieved mainstream respectability.  Comic strips, by contrast, are still considered not quite serious business.

Berkeley Breathed, creator of Bloom County, explains in a footnote to one of his Sunday strips in the recently produced anthology of his work that the real advantage of the comic strip is timing.  Unlike a panel cartoon, which has the punchline right there, or a narrative cartoon, which theoretically has no ending, the strip allows the artist to tee up the joke and punch it hard while the pictures reinforce or contrast with the gag.  It’s hard to think of anything else like that, with the possible exception of certain stand-up comedy.

[Bloom County Sunday strip by Berkeley Breathed]

Breathed always denied a certain level of political engagement, but it’s not hard to see a certain point of view below his frenetic satire.  Nonetheless, he’s more an anarchist than Garry Trudeau, for all his early pot-smoking days, ever was.  I remember being amazed at Breathed’s focus on political process as a source of comedy: the inevitable futility of caucuses, political parties, platforms, and selecting candidates.  Nobody had ever done that before, that I could remember.

And even though this is a single panel, the narrative flow of the strip is left to right, just as if it had four panels, paying off with the punchline on the right.  Losing in politics feels EXACTLY like this.

[Bloom County strip, November 7, 1984]

Peanuts had a peculiar effect on me as a child.  The strip suggested a much broader intellectual world than anything I had experienced while placing children at the center of that world.  I mimicked Peanuts that way, not knowing any better that most children don’t talk or think this way, and I expected both children and adults to be as cerebral as Linus often was.  It was a strange way to grow up.

[Two Peanuts strips]

The question of whether comic strips are themselves “art” seems to have come down to a public debate during the early 1990s between Schulz, then late in his career, and the upstart Bill Watterson, creator of the beloved Calvin and Hobbes.  Watterson claimed that a comic strip could be art (and in his hands it clearly was) but that it could be polluted by commercialization.  Schulz, who practically invented licensed merchandising, demurred.  It’s as hard to imagine a world without Peanuts products as it is to imagine one filled with stuffed Hobbes tigers.

While much of the Peanuts merchandise is crass and upbeat, contrary to the initial spirit of the strip, this more expansive view of the cartoon universe also gave us the “A Charlie Brown Christmas” television special as well as the gorgeous visuals of Snoopy’s hallucinatory flight across No Man’s Land in “It’s the Great Pumpkin, Charlie Brown”.  I wonder whether, had Watterson stayed in the game longer and kept the control over his creations he ultimately won, he wouldn’t have contributed something equivalent or greater.  But he left the field too soon to tell.

I still love Calvin and Hobbes and have since I was young.  I can’t explain its enduring appeal any better than I can explain the appeal of the Peanuts theological strip.  Calvin and Hobbes remains very, very funny, and even some of Watterson’s more politically weighted strips still hold up.  But I think I recognized Calvin in myself: I had an extremely vivid imagination and creative inner life when I was young, and the dour Peanuts gang (with the exception of Snoopy, of course) simply didn’t mirror that.

I remember the day when that explosive youthful inner life – the kind that sees toys and playthings as part of their own complete world – died to make way for adult concerns.  I’ve realized since that Calvin and Hobbes is a kind of written record of that pre-adolescent land of make-believe.  The alternative reality is the reality of a child’s imagination.  Watterson started this with Hobbes always appearing as a stuffed tiger when he and Calvin weren’t alone.  But he developed the idea as the strip evolved: the higher the verisimilitude of Calvin’s inner life – that is, the higher the contrast between his “cartoon” existence and the “reality” of his imagination – the funnier the strip became.  As I watched my son stuff a small plastic velociraptor into the back of his Lego helicopter, I knew Watterson had somehow found the mainline back to that unrestricted creative universe of the child’s mind.

[Calvin and Hobbes strip: a T. rex flying a jet]

If Calvin and Hobbes captures the child’s experience, its opposite may be Garry Trudeau’s estimable Doonesbury.  Produced almost as long as Peanuts (and, aside from a few magazine covers and a spin-off musical, sharing Watterson’s aversion to merchandising), it has chronicled three generations of friends who lived together in a college commune in the 1970s.  Trudeau has always been overtly political.  Infamously, when editorial cartoonists protested his winning of the Pulitzer Prize for (panel) editorial cartooning, he joined the protest once assured the award could not be revoked.

[Doonesbury strip by Garry Trudeau]

Most recently Trudeau has won deserved plaudits for documenting the struggles of injured veterans returning from Iraq and Afghanistan.  He sent his first character, B.D., to Fallujah where he lost his leg in an ambush.  A member of B.D.’s company, a young soldier named Leo, suffered traumatic brain injury in a roadside bomb.  Later, Trudeau introduced a young female soldier who was the victim of command rape.  As I have written elsewhere, I can think of nothing else that examines these extremes of human experience with more grace and wit and humor.

But Trudeau didn’t start there.  B.D. was a Vietnam veteran and, in this famously liberal strip, the only conservative.  When Saigon fell in 1975, Trudeau captured with extraordinary poignancy the grief not just of the veteran but of the entire nation.

[Doonesbury strip, May 31, 1975]

Trudeau has done something else with the recent wars that I’ve not seen anywhere else: lampoon the self-regarding professional international calamity personalities, the latter-day incarnation of the White Man’s Burden.  Sean Penn comes to mind, but so does the disgraced Greg Mortenson, author of “Three Cups of Tea”.  Trudeau’s splenetic adventures of the Red Rascal, the fictional alter ego of a low-achieving former CIA intern, send up the noble braggadocio of an entire class of celebrity philanthropists.  I only really understood this when I saw the inside cover of Red Rascal’s War, a collection of stories about the preening hero, which features him rappelling from a Blackhawk helicopter carrying…three cups of tea.  If there’s anything that needs comedic sending up, it’s the self-important and inflated rhetoric that infests these international endeavors.

[Cover of Red Rascal’s War]

I’m ignoring some of the classics of the genre, including Pogo (which I came to appreciate as an adult) and predecessors such as Li’l Abner and Little Nemo in Slumberland.  Or, for that matter, strips like For Better or For Worse, Boondocks, and Cathy.  My point does not exclude them; I’ve simply selected the strips that speak to my particular experience.  It’s a testament to the endurance of the art that so many other strips thrive and, undoubtedly, speak to others, serving the purpose that no others do.

###

Joan Didion, Californian

Joan Didion seized my attention early, before I wrote for myself.  Assigned “Slouching Towards Bethlehem” in high school, I read with amazement her cool, detached descriptions of things I recognized growing up in California.  I graduated quickly to “The White Album”, and it was there she first suggested my life had literary merit: her description of my hometown as a place she merely passed through, between the North Bay and the East Bay, because there was no place there to return a rental car as she suffered an emotional breakdown.  This implied to me, at age 18, everything and more than I wanted to know about growing up.

Her acute sensitivity to detail connected directly with the skeptical eye of the adolescent.  I admired her method of careful observation, finding revealed truth in the everyday things we adults take for granted as unchanging and immutable.  Her method marked me when I was young: the often passive but meticulous attention to the obvious or overlooked that other people, in their haste or misdirection, miss is useful (and lacking) in adulthood.  “Didionesque” became both a description and a model for my friends and me to emulate in our writing.

Her sensibility as a Californian and Westerner also endured.  After reading the great American writers of the South and the East (which from our perspective took in everything east of the Rockies), it was always pleasurable to return and read something that reflected my own surroundings and upbringing.   (For example, only a Northern Californian can truly appreciate her revelation that Huey Newton was “a Kaiser,” that is, a member of the Kaiser Permanente health maintenance organization.  Who knew that the Black Panthers had a group medical plan?)

Only later did it occur to me that Didion’s public acclaim but lack of establishment laurels – she never won a Pulitzer Prize – suggested that her voice and regionalism could seem alien, even bizarre, to anyone not raised in my home state.  My intuition seems borne out by the fact that her late memoir about the death of her husband John Gregory Dunne and the illness of their daughter Quintana Roo, “The Year of Magical Thinking,” was her first book to win a National Book Award.  In its straightforward, literal, full-disclosure accounting of the trauma and dislocation of that year, it is her least Didionesque book.

The new biography of Didion, “The Last Love Song” by Tracy Daugherty, is haunted by death from the first pages.  We know, if we know Joan Didion, how the story ends.  But the most powerful and quietly devastating real-life manifestation of Didion’s flattening fear of catastrophe comes about halfway through the book, accounted for and tossed away.  Didion and her husband, John Gregory Dunne, hired a young woman from Central America to look after their daughter.  The nanny had a baby, who was then raised in the Dunnes’ house, which was kept obsessively clean to protect their own daughter.  When mother and baby returned home to visit relatives, the infant’s unpracticed immune system collapsed; she contracted a fever and died.

Didion feared not just the prospect of immediate disaster – the fatal illness, the heart attack, the life changed in the instant – but would have recognized the crushing, tragic irony of protecting a child so well that it kills her.  That this story is simply mentioned in passing in the first comprehensive biography of Joan Didion is just one of its many flaws but by no means its least.  (Like others, I’ve been annoyed by the author’s attempt to mimic Didion’s fictional style.)  Still, it’s important to note that we now have a fully developed narrative of Didion’s life to better understand her influences and her impact on American culture.

Death stalked Didion like the mysterious stranger, killing acquaintances, friends, and loved ones as he closed in on those closest to her: her daughter and husband.  She was surrounded by horror, which more than accounts for her desiccated dread.  Her niece was murdered, her agent died in his 50s, leukemia killed her sister-in-law, suicide claimed her brother-in-law, and she numbered some of the Manson victims among her friends.  Indeed, given how many people died in her life, it is strange to realize that her memoir of her upbringing, “Where I Was From,” was written only after her mother died around the turn of the millennium.

Joan Didion, Malibu 1976.  Photo via Nancy Ellison.

That memoir was the pinnacle of a theme she had explored since the 1960s.  “We tell ourselves stories in order to live,” Didion wrote at the beginning of “The White Album”.  This may seem like overstatement until we recognize that we understand our own experience, history and public life through a series of stories rather than through the longer, infinitely more voluminous series of details and events of our actual experience.  Storytelling saturates every aspect of adult consciousness, from the explanations we give our children to the 30-second spots on television.  It is so pervasive that we mistake it for reality, because there is no other easily graspable way to communicate our experience.  But narrative, or storytelling, is not the same thing as experience.  Narrative is not reality: it is a way of picking out the most important and relevant details of our lives and finding a common thread to string through them in a way that makes sense.  Without this organizing principle, our lives would be incoherent.

For non-writers, and even for many writers, there is something spooky and slippery about narrative.  Some stories work on us at a deep, almost subconscious level – the endurance of the gothic and Grimm fairy tales goes far to explain this, and so does the “heroic journey”.  But what makes a “good” or “compelling” story is not easily taught and takes even professionals some time to learn.  Any newspaper cub reporter can tell you what it’s like to finally come up with a “great story” in a budget meeting, but she might be hard-pressed to explain why, beyond a series of compelling elements lacking elsewhere.

Nonetheless the self-critical writer recognizes at some point that narrative can distort reality beyond recognition.  The subjects of Didion’s dry, scathing dispatches – San Francisco hippies, young couples marrying in Las Vegas, even those running the California aqueduct and the Los Angeles freeway system – would not recognize themselves in her reporting.  They tell themselves different stories.  A good story can lead to the narrative version of sample bias, where we mistake the compelling exception for the rule.  And I’ve always worried that the drive for the “good story” means we may miss the profundity in the mundane.  Didion hammers at this, most tragically, in her reporting on the Central Park Jogger case: what makes the story of a lone, white, “attractive” victim so much more compelling than any of the other 3,254 reported cases of rape in New York City in 1989?  To the tabloid journalist – indeed, to all of New York, it seemed at the time – the answer is obvious, beyond explanation.  But Didion shifted that spotlight to expose the even darker corners of New York – as well as our own bias and indifference – in one of her best essays.

Didion never goes so far as to explain explicitly what she means by story-telling or narrative.  At the beginning of “The White Album” she uses some peculiar analogies:  “The princess is caged in the consulate.  The man with the candy will lead the children into the sea.  The naked woman on the ledge outside the window on the sixteenth floor is a victim of accidie, or the naked woman is an exhibitionist, and it would be ‘interesting’ to know which”.  That can seem unintelligible to even the most sophisticated reader.

That question of narrative is the foundation for virtually all of her later reporting, from presidential races and the Central Park Jogger to her own background in “Where I Was From”.  But in the late 1960s and early 1970s, as Daugherty meticulously recounts, the narratives of public life irremediably fractured.  Didion could no longer recognize or understand events as she traditionally had – her account of the five-year-old girl found clinging to a fence on Interstate 5 is one searing example.  The commonly accepted narratives, she wrote, were replaced by the sheer insanity of Vietnam, the Symbionese Liberation Army and, almost inevitably, the Manson murders.

Narrative had a particular relevance to Didion’s writing about politics, which she turned to in the 1980s and 1990s.  As I began to work in politics I found this writing less and less compelling, but her idea that narrative drives politics remains one of the most useful and penetrating critiques of the craft as it is practiced today.  Nevertheless, I found that Didion’s flat, skeptical ear for the professional vocabulary of politics – always in quotes: “trade-offs” and “programs” and “policy” and “play” – could be turned as easily on any other profession.  (Indeed, I can evoke the same cynicism very easily with Daugherty’s writing about the Dunne-Didion health crises, which he unhelpfully leaves unexplained in layman’s terms: “hemodynamically significant lesion,” and “angioplasty,” and “congenital defect of the aortic valve” and “radio-frequency ablation of the atrial-ventricular node”.)

Instead of revealing systemic cynicism, she exposed the technical vocabulary of a committed if exotic profession.  It wouldn’t have made sense for her to explain it, since the exclusionary vocabulary was the point.  But what she found exclusive I found to be a specialist’s way of describing the work I did.  All professions are this way.  Perhaps she was yearning for a purer, amateur politics reflective of the kind of fundamental American innocence we all seek in our political life.  But that doesn’t make her insight particularly extraordinary.

But in the beginning and the end, Joan Didion is a Californian.  It’s hard to overstate, as a native Californian, how much she writes for and about California and Californians.  The state’s uniqueness – climatological, social, cultural – has been plumbed for generations.  But Didion was raised in its heart and writes about this state of mind from within.  She was born in Sacramento to fourth-generation Californians who track their lineage back to and through the Donner Party that perished in the Sierra Nevada in 1847.  Indeed, both Didion and Daugherty use this oft-told warning fable of hubris, tragedy and anthropophagy as a sort of talisman, the root of all fatal human folly.

But for the later arrivals – who include most of the state, and me – the settler narrative does not resonate as profoundly as Didion’s depiction of an Eden whose compact with the snake in the garden includes the hot winds, the fires, the droughts and earthquakes, and a culture that seems unhinged, prone to murder.  Californians understand what it means to bear the Santa Ana, to watch the incinerated oak leaves fall from the sky, to dive under school desks when the building begins to shake.  The cults and random madness seem to be less immediate concerns.

Unlike observers from elsewhere, who write about these phenomena as freakish, exotic events, Didion wrote about them for what they were: permanent features of the landscape, an inescapable part of life in the garden.

###

The Power of Babel

Tower of Babel, woodcut, M.C. Escher, 1928. Via Wikipedia.

For most of the last nine months I have had the extraordinary benefit of intensive foreign language training.  I had resources, faculty, structure and time all to my benefit: online and computer resources, a diverse faculty from many countries from whom to learn different accents and idioms, day-long small-group classroom work, and intensive one-on-one training.  That I speak a new foreign language at all I owe to my instructors.  But the undeniable fact that I am not native, or even fluent, I can blame only on myself.

Or rather, I can’t blame everything entirely on my conscious self: the blame belongs to mysterious components of myself that seem to be beyond my control.  I found that the most difficult, most unfathomable, most unpredictable aspects of my training came entirely from the cubic foot of space inside my head.

Your brain is not your friend

Perhaps the most astounding and frustrating aspect of language training was the involuntary way my brain reacted to this new input.  In short, I found myself inadvertently substituting languages I had learned before for the new language I was trying to learn.  This could be as subtle as mispronouncing homonyms or cognates or as physical as replacing a word with the rudimentary sign language I learned 15 years ago.  It seemed, then, that my brain was resisting the “overwrite” of my previous non-native languages, or confusing everything “non-native” in my head.  I was not alone.  For anyone with previous language instruction, however old, the brain had a tendency to reach back and substitute old French, say, or Italian, for the new language.

This runs quite against everything I had heard or thought about new language acquisition, at least when I was much younger.  Knowing a foreign language is supposed to help you acquire new ones.  Indeed, the friends and family who speak many languages find it considerably easier — or they at least learn more successfully — to acquire more.  And for me this is true as far as it goes: my prior languages provided a context for understanding structure and grammar, recognizing cognates, memorizing words and verb tables, and so on.

But I had never before contended with the active opposition of my own brain to absorbing a new language.

Immersion is a myth

This may be the result of being a native anglophone in a world that increasingly uses English as a common second language.  I benefited from intense, immersion-like training during which my colleagues spoke nothing but the foreign language for hours.  This helped, as far as it went.  But once we left class, we were back in our native language environment.  I feel like there is a switch in my brain that toggles between “native” and “foreign” languages, thrown one way or another depending on my environment.  When the switch is off, I’m not learning.

It’s certainly easier to learn when the switch is always on “foreign” and indeed the gold standard is simply living, learning and speaking in the country you expect to travel to.  But now that I am abroad again, I see how difficult it is to achieve a totally immersive environment.  English is used everywhere, on the radio, on billboards, in magazines, songs and movies.  Every time I recognize a new word in English, that switch in my head gets flipped back from “foreign” to “native”.

There is no substitute for long, hard work…

In the end, unless you are innately gifted, acquiring a new language takes long hours of concentrated effort.  It is a methodical and slow process.  There is nothing quick or simple about it, and those language schools that promise acquisition in six weeks strike me as fraudulent.  I could never see progress from week to week.  Day to day was worse — fall-backs and regressions far outnumbered the minor triumphs.  That’s because real progress comes over months.  For example, one day, about three months into my training, I realized I could recognize all the individual words in a foreign language broadcast.  That helped my verbal acquisition (not to mention my confidence) immeasurably, but I had to work a long time to get there.

…except using your language in a real context every day

That said, there is nothing like using your new language in real-life context every day.  Real life forces you to do things you never trained for in the classroom.  It is virtually impossible to explain the difference, particularly to those slogging through the middle part of their language training, but using your language in a real context is both liberating and more challenging than the classroom.  That is as it should be.

People are forgiving

One of my favorite stories about foreign language acquisition involves Facebook’s Mark Zuckerberg.  His wife’s family is Chinese and he made a concerted effort to learn Mandarin.  He deployed his new language before a Chinese audience in Beijing in 2014.  The reaction of the audience struck me — they were delighted that he made the effort.  More importantly, when he persisted in speaking Chinese, the interviewer and the audience adapted and eagerly helped him where he struggled.  The interviewer kept the questions simple.  The audience shouted out words to Zuckerberg when he got stuck, urging him on.  The audience was clearly deeply flattered (and entertained) that he completed the 30-minute interview in Mandarin.

I hope this story provides some solace to my colleagues who learned Mandarin.  But I’ve found that, again, real life mirrors this story.  When you learn a new language and are struggling to use it, people recognize the effort and try to help.  People are forgiving.  In the end, the real goal is not a perfect, grammatically correct, fluently pronounced sentence but understanding.  Understanding always involves at least two people and in my experience most people want to understand and will help you reach that ultimate goal.

###