Monday, May 23, 2016

War is Rude: Mrs. Miniver

Reprinted with permission of World War II Magazine
 
Mrs. Miniver tells the story of life in the U.K. from 1939 to 1941 as experienced, the opening crawl explains, by an “average English middle class family”—which just happens to be able to afford servants and a spacious residence whose lawn extends to a dock on the Thames. Shot in Hollywood starting on November 11, 1941, and released in June 1942, the MGM film became that year’s highest-grossing picture and garnered six Academy Awards, including Best Picture, Best Director (William Wyler), Best Actress (Greer Garson in the title role), and Best Supporting Actress (Teresa Wright). Its appeal has endured; the American Film Institute ranks the film 40th on its list of “America’s most inspiring movies.”
 
Based on Jan Struther’s 1940 bestseller Mrs. Miniver, a compendium of anecdotal British newspaper columns, the movie has only tenuous ties to the book, whose arc barely reaches the war’s outbreak. Working with six screenwriters, director Wyler extended Struther’s timeline to incorporate Dunkirk and the Battle of Britain.

At the outset, Kay Miniver (Garson), husband Clem (Walter Pidgeon), and eldest son Vin (Richard Ney) are—as that crawl puts it—among England’s “happy, careless people . . . in that happy, easy-going England that was so soon to be fighting desperately for her way of life and for life itself.” Even when war breaks out, the Minivers’ comfortable lives change little, though Vin does become an RAF fighter pilot. Reality intrudes only when Clem sails his motorboat to join a flotilla that—offscreen—rescues the British army from Dunkirk.

While Clem is busy across the Channel, a downed German pilot (Helmut Dantine) finds his way into the Miniver home. The screenwriters, responding to Americans’ increasing admiration for England, reworked this sequence several times. In a draft predating Pearl Harbor, Clem is present; the German, wounded and frightened, yields to Kay’s entreaty to give himself up and get medical attention. But in a post-December 7 version, Clem is absent, the German has a pistol, and Kay is clearly in peril. And the finished film portrays the flyer as a fanatic. He holds Kay at gunpoint, demanding food that he wolfs down like a wild animal before fainting from blood loss. Barely rattled, she takes away the pistol and phones the police. When the pilot revives, too weak to be dangerous, Kay sympathetically tends him. He’ll soon be “wonderfully looked after in a hospital,” she says. “The war won’t last forever.”
Mrs. Miniver’s trademark benevolence enrages the German. “We will bomb your cities,” he storms. “Rotterdam we destroy in two hours. Thirty thousand in two hours. And we will do the same here!” This is unspeakably rude (as well as inaccurate, since the Rotterdam bombing claimed only 884 lives), and Mrs. Miniver slaps the brute. No sooner have the bobbies collected their prisoner than Clem chugs up to the Miniver dock. His Dunkirk circuit has battered his boat. Nonetheless he behaves much like a man just back from a particularly grueling business trip. Clem greets his family, falls asleep, and upon waking ten hours later asks if news of the Dunkirk evacuation is in the papers. It is, Kay says.
 
“Thank heavens,” Clem replies. “I shan’t have to tell you about it.” And he doesn’t.

Neither does Mrs. Miniver mention her adventure, until the household cook interrupts to announce that she has no ham for Mr. Miniver’s breakfast, explaining to Mrs. Miniver: “You gave it all to that German pilot.”  Not a problem, Mrs. Miniver replies; Mr. Miniver likes bacon, too. Naturally Clem asks what German pilot, and in a playful scene Kay treats the episode in an elaborately coy manner that is utterly typical of the film. 

Come what may, the Minivers’ lives remain idyllic. Yes, there are air raids, but the family hunkers in its Anderson shelter and Clem talks about how much he likes Alice in Wonderland. Vin weds the lovely Carol Beldon (Teresa Wright), adding another Mrs. Miniver to the household—and, not incidentally, a status dimension to the film. Carol’s grandmother is Lady Beldon (Dame May Whitty), a local aristocrat who frequently sniffs about the “middle classes,” a successful bid by the filmmakers to make the unpretentious but well-to-do Minivers more sympathetic to egalitarian American moviegoers. In the same vein, the film generally conveys that in wartime England class distinctions do not matter. Even Lady Beldon unbends; at the annual village floral competition—which by tradition she always wins—the grande dame permits a kindly stationmaster’s entry, dubbed the “Mrs. Miniver rose,” to claim first prize.

Tragedy invades only near the conclusion. Just hours after the floral show, an air raid claims several villagers, including the stationmaster and Vin’s new bride, Carol. At a memorial service in a heavily damaged church, the Vicar (Henry Wilcoxon) delivers a powerful sermon. In the October 1941 shooting script, his homily was the 91st Psalm, which presents the Almighty as a refuge and a fortress. But after Pearl Harbor, Wyler and Wilcoxon composed a stirring peroration, inserted just after the scripture passage, only hours before the scene was filmed. In it, the Vicar laments the dead innocents, declaring that theirs is a war in which everyone must share the burden of freedom’s struggle against tyranny. “This is the people’s war,” he concludes. “It is our war. We are the fighters. Fight it, then! Fight it with all that is in us! And may God defend the right.” The camera holds on Mr. and Mrs. Miniver, resolute and dry-eyed, as the congregation sings “Onward, Christian Soldiers.” It then pans upward to a gaping hole in the church roof and dollies in, so that the hole fills the frame and the audience gets the full effect when a squadron of Spitfires passes overhead, flying toward the foe.

The film works because strong performances—especially Garson’s—redeem preposterous characters, and the Vicar’s concluding oration has enduring power. However, Mrs. Miniver presents war more as a case of dreadful manners than as something truly fearful: a sanitized way to promote American solidarity with Britain without hinting at how horrible that “people’s war” would be.

Thursday, May 19, 2016

American Iliad: The Sword of General Lee

Millions of Americans love the Civil War. Last summer I began work on “American Iliad,” a new regular column in The Civil War Monitor that examines one of the major reasons they do. Some episodes from the conflict are so compelling that readers know them by heart. These episodes transcend mere history because human beings are story-telling creatures who make sense of the world through stories. They have an outsized importance because they reveal what we want to believe, not just about the nature of the war, but also about the nature of life. Just as the ancient Greeks conveyed profound truths about the human condition through myth, so too does our national myth, the American Iliad. Each column explores a classic story from this myth and discusses why it continues to exert such a powerful grip on our imagination.

Here's the inaugural column, published in The Civil War Monitor 6/3 (Fall 2015):28-29, 72. Republished with permission.


Coined by the popular historian Otto Eisenschiml in 1947, the term “American Iliad” brilliantly captures the essence of so many Americans’ passion for the Civil War. A man once told me, in complete seriousness, that the story of the war was his religion. And while a conflict that killed over 2 percent of our country’s population may not be the best thing to construct an entire worldview around, it is probably not the worst thing, either. Abraham Lincoln’s magnanimity, Ulysses S. Grant’s perseverance in the face of all obstacles, and Robert E. Lee’s grace in defeat all provide strong life lessons. Although it is plainly exceptional to regard the Civil War as a religion, it is obvious that the Civil War routinely functions as a national myth, a way to understand ourselves as Americans. And like the classic mythologies of old, it contains timeless wisdom about what it means to be a human being. Homer’s Iliad tells us much about war, but it also tells us much about life. The American Iliad does the same thing.

Foundational to the American Iliad is the conviction that the conflict was not a struggle between darkness and light, freedom and tyranny, but rather between two sides, equally gallant, committed to different but morally equivalent visions of the American republic, and therefore caught up in a tragedy larger than themselves, “a war of brothers.”

In his Pulitzer Prize-winning novel The Killer Angels, arguably one of the most influential modern re-tellings of this Iliad, Michael Shaara has Confederate general James Longstreet muse, “The war had come as a nightmare in which you chose your nightmare side.” Robert E. Lee provides the most prominent example of a soldier forced to make this choice. With the exception of Lincoln, Lee is perhaps the foremost protagonist in the American Iliad’s pantheon. In mythic terms he is the ideal man, the perfect warrior, the flawless gentleman—“the Christlike Lee,” as historian Kenneth Stampp once put it. On the eve of the war he is a full colonel, so obviously gifted that Lincoln offers him command of the armies that must extinguish the rebellion should war break out. Lee declines. Despite a lifetime’s service to the United States—and despite telling his siblings that “I recognize no necessity for this [rebellion]”—he feels honor-bound to resign from the U.S. Army when Virginia, his homeland, secedes. “Save in defense of my native State,” he says, “I have no desire ever again to draw my sword.” But of course he must defend his native state. Thus it is the protection of hearth and home, not the abstract principle of states’ rights and certainly not the preservation of slavery, which governs his decision. “I did only what my duty demanded,” he would write in 1868. “I could have taken no other course without dishonor. And if it all were to be done over again, I should act in precisely the same manner.”  Historian Alan Nolan once questioned whether Lee’s fateful choice to draw his sword against the United States was as ethical as the great captain maintained. But the American Iliad is emphatic that it was indeed Lee’s only path.

Lee’s decision provides the opening of the American Iliad, for, like Homer’s Iliad, this Iliad begins with the war underway. It is the first iconic episode, by which I mean an episode that Civil War buffs know by heart. Typical buffs can take you step by step through the three days of Gettysburg but generally know far less about the origins of the conflict. Instead they have rather vague ideas that the South left the Union in defense of states’ rights against a government that embodied centralized political power; or that the war pitted the agrarian South against the industrial North; or even that it was a cultural clash between a supposedly Celtic South and a supposedly Anglo North. These explanations are nearly always asserted, not argued. They function simply to push the sordid political details (particularly the defense of slavery) out of the picture and to get on with the almost purely military story, for the American Iliad is all about generals, soldiers, and battles. The war, in mythic terms, is not “a continuation of politics,” to use the famous definition of war espoused by the Prussian military theorist Carl von Clausewitz. It’s not a continuation of anything. It’s just there.

And thus Lee is helplessly in the grip of something that resembles a natural disaster more than a human-created event. The war forces upon him a choice that is no choice at all, for his honor and integrity require that he serve the Confederacy. And in mythic imagination, Lee’s decision symbolizes the honor and integrity of every one of the 800,000 southern men who take up arms not from any political ideology, but rather because they must protect their families, their neighbors, their homeland.

Monday, May 16, 2016

Los Hombres Armados Redux

Almost three years ago I wrote a post on my encounter with a man who had grown up during the civil war in El Salvador (1979-1992).  I entitled it Los Hombres Armados.  I could be content just to provide you with the link, but the post is important enough to this one that I'm not taking the chance that you won't follow it.  Here's Los Hombres Armados, reprinted in full:

There’s a neighborhood bar not far from where I live.  I drop by often enough that the bartenders know me and automatically get me my beer of choice.  It’s a friendly place and easy to make conversation.

Back in mid-December the coverage of the shooting at Sandy Hook Elementary School was almost wall-to-wall.  One evening it was silently unfolding on one of the bar’s muted televisions. I noticed a Hispanic man watching the images, his eyes wet with tears.  A short time later we began talking and I found out why.

The man–I’ll call him Fernando–was thirty-eight years old and had grown up in El Salvador during its long civil war (1979-1992).  The conflict was between the right-wing government, with its death squads, and the Farabundo Martí National Liberation Front (FMLN), an umbrella organization of several left-wing guerrilla groups.  Fernando’s parents were afraid of both sides.  For them it was simply a matter of los hombres armados:  the men with guns.

When Fernando was a little boy, he told me, his parents would sometimes take him from their house and spend the night hiding in the woods, with a hand cupped over Fernando’s mouth to keep him from crying out. We think of school shootings and civil wars as worlds apart.  But for Fernando, the former was irresistibly reminiscent of the latter.

I have since talked to Fernando on several other occasions.  We never speak of the civil war but it plainly haunts him.  At some point–I have never asked how–he acquired an M-16, perhaps because he eventually joined one side or the other.  Although he left the weapon behind in El Salvador, he once told me he has never felt comfortable without it, and he alternates between thoughts of violence and thoughts of running away.  He becomes tearful easily and indeed never seems far from weeping.  Although his case is undiagnosed, he almost certainly suffers from Post-Traumatic Stress Disorder (PTSD).

As a military historian, I have never quite known what to make of Fernando’s equation of the Sandy Hook murders with his own childhood.  But it is the same equation made by others who have to live with the threat or reality of mass killing.  For military veterans present at the recent Boston Marathon bombing, the scene resembled the aftermath of an IED blast.  People residing in neighborhoods wracked by gang violence must know the same fear that Fernando’s parents did.  Fernando is a reminder, I suppose, that although we define the boundaries of our field as centrally concerned with political violence, the lived experience of people caught up in violence is essentially the same.

Last evening I had another encounter at that same neighborhood bar.  I was having a conversation with Steve, a DJ who's worked at this bar--the Crazee Mule Saloon, whose patrons call it simply the Mule--for as long as I've been going there, which at this point is going on four years now.

In addition to his job as a DJ, Steve is a very serious conservative political commentator.  You'll find his Twitter feed, called Oracle of Ohio, here.  It, in turn, will lead you to Steve's blog, The Future of the Republic.  You have to examine the blog pretty closely to discern that Steve publishes it, but you can do it, hence my decision to use his real first name rather than a pseudonym.

Steve describes himself thus:  "writer/political scientist, capitalist business owner, Reagan republican. Publisher, The Future of the Republic American."  His Twitter feed and blog exemplify the dominant form of political discourse in America today--and for that matter the past decade and beyond.  That is to say, they are intensely partisan and demonize the political Left.

I've no intention of picking on Steve or his form of political engagement.  We have contrasting views on practically everything, but Steve represents the substantial segment of Americans who are politically engaged, which is what the Founders wanted, expected, and believed the republic required if their political experiment were to succeed.  And needless to add, there are plenty of Left-leaning Twitter feeds and blogs just like his.

Wednesday, December 23, 2015

The Meeting from Hell: Conspiracy

Originally published in World War II magazine.  Reprinted with permission.

We’ve all attended this meeting, convened by the leadership to discuss some new organizational undertaking and—ostensibly—to collect and synthesize the views of all assembled. Neophytes among us believe that; the more experienced know the score. We enter the conference room resigned, wary, even disposed to revolt. But the leadership has the clout to cram its agenda down our collective throats.

Such meetings occur in all walks of life: governmental, commercial, educational, ecclesiastical. The conference held on January 20, 1942, in Wannsee, a lakeside district southwest of Berlin, was like any other of its kind—except that this meeting was organized by the Schutzstaffel, and its agenda was the destruction of 11 million European Jews.

Directed by Frank Pierson and released in 2001, Conspiracy re-creates the Wannsee Conference in nearly real time, using as a set the mansion in which the actual event took place. Writer Loring Mandel based his script on the “Wannsee Protocol”—the meeting’s top-secret minutes. The original document is deliberately vague; its language gives no hint that the subject is mass murder. Nor does the protocol paint the conference as anything less than wholly harmonious. But anyone who has watched bureaucrats war over turf knows differently, and a close reading of the minutes suggests fault lines and objections. The filmmakers have fleshed these out to deliver a riveting drama that takes place almost entirely around a large conference table.

Conspiracy opens with SS Lieutenant Colonel Adolf Eichmann (Stanley Tucci) overseeing final touches for the meeting, which include excellent wines, fine cigars, and a lavish buffet lunch for 15. As they enter, the guests, who include some of Nazi Germany’s most powerful men, introduce themselves to one another and to the viewer. Two look decidedly glum: Dr. Friedrich Kritzinger (David Threlfall), deputy head of the Reich Chancellery; and Dr. Wilhelm Stuckart (Colin Firth), chief architect of the Nuremberg Laws that have legally stripped German Jews of their civil rights, defining “Jew” using formulas of Stuckart’s devising. Kritzinger and Stuckart believe that their offices have resolved the “Jewish question.” Suspicious that the SS, the meeting’s sponsor, is about to hijack that “question” and impose a solution of its own, the two quietly grouse to one another.

Last to arrive is the host, SS Obergruppenführer Reinhard Heydrich (Kenneth Branagh), Chief of the Reich Security Main Office. Heydrich, who called the meeting, presides with the jaunty, self-satisfied air of a man who knows that he is going places. He begins by quoting a directive from Reichsmarschall Hermann Goering that assigns Heydrich to find “a complete solution of the Jewish question in the German sphere of influence in Europe” and stipulates that relevant government agencies are to “cooperate” with the security chief in this endeavor. Kritzinger instantly objects; the Chancellery, he declares, has received no directive on this subject. He fruitlessly tries to gain a hearing, but Heydrich smoothly and stubbornly plows on, leaving the sidelined Kritzinger to fume.

Sunday, June 28, 2015

The Rout of the Confederate Flag

Cross posted from Civil Warriors



As surely you already know, in the wake of the murder of nine worshipers at Emanuel African Methodist Episcopal Church in Charleston, South Carolina, a petition was circulated demanding the removal of a Confederate battle flag from the grounds of the South Carolina state house. It quickly generated a massive number of signatures, and a national groundswell of pressure arose for the state government to do precisely that. This story is already well covered on Kevin Levin's Civil War Memory blog, so I won't rehearse the details any further.


Just for the record, I fully support the removal. The battle flag represents an army that fought for the preservation of slavery and has a long, notorious association with white racism. Yet it flies on the capitol grounds of a state whose population is 30 percent African American, most of them descendants of slaves. But this post isn't about that. It's about recent decisions to eliminate sales of the Confederate flag or to forbid its presence in certain sites. Those sites do not include National Park Service battlefields. But the park service has adopted a policy of ending sales of souvenirs in which the Confederate flag is depicted on "standalone" merchandise; that is to say, merchandise devoid of historical context. Gettysburg National Military Park has reportedly urged private businesses in Gettysburg to do the same. Tragically, this would result in the elimination of merchandise such as this:


Confederate Swim Suit Gettysburg Summer 2009 (sm)


I'm still learning about this issue, so consider this post a work in progress. It first got on my radar thanks to a Facebook status update by a friend of mine. Since his privacy settings are limited to friends, I'll quote the update without attribution:
So according to the NPS page, the only Confederate flags allowed are with permitted and approved living history events. You or I couldn't have one, you can't have one on your vehicle (I assume that means stickers too). The Lutheran Seminary at Gettysburg has banned them on the outside grounds completely.

Friday, June 26, 2015

Nonsense About the Confederate Flag

From (for some bizarre reason) FoxSports, written by one Clay Travis and entitled "On the Confederate Flag."

Travis begins:
Only in modern day America could a racist psychopath kill nine people in a Southern church and the focus turn to a flag. Only in modern day America could our nation's largest retailer, Wal Mart, announce -- to substantial applause -- that it will no longer sell merchandise featuring the Confederate flag, but will continue to permit any mentally ill nut on the street to walk into its store and buy as many guns and ammunition as he can afford.

Did I miss the part of this story... where Dylann [sic] Roof stabbed nine people to death with a flag? Because every time I think we can't get dumber on social media, we get dumber.
Here's a link to the complete opinion piece.

Possibly I missed something, but in this writer's entire rant I never saw an effort to explain why it's important that the Confederate flag continue to fly over the South Carolina state house, or any acknowledgment of the arguments concerning why it is problematic to do so. Instead it suggests--actually, damn near says outright--that the arguments are focused on eliminating usage of the Confederate flag, period.

That's absurd. I would imagine that in the vastness of the Internet you can find someone advocating making the flag illegal, but no one I have seen is doing that. I think it is a matter of simple decency to remove the flag from a government facility that is supposed to represent every citizen in South Carolina, 30 percent of whom are African Americans, in most cases the descendants of slaves.

I think it is a matter of simple First Amendment rights to use the flag in non-government contexts: Civil War re-enactments, private homes, automobiles like the General Lee, even KKK rallies. Aside from re-enactments and other highly contextualized uses, anyone who displays the flag should know that it will offend a lot of people, and that some of them may say so--which is their First Amendment right. To paraphrase the author: just when I think opinion pieces on this issue can't get any dumber, someone proves me wrong.

Monday, May 25, 2015

State of the Field: Military History/History of the Military

These were my opening remarks at an Organization of American Historians Round Table Session (commissioned by the Program Committee) at the OAH Annual Meeting in Saint Louis back in mid-April. My fellow panelists were Christian Appy, a professor of history at the University of Massachusetts, Amherst and the author of several books, including American Reckoning: The Vietnam War and Our National Identity; Meredith Lair, an associate professor in the Department of History & Art History at George Mason University, who is the author of Armed with Abundance: Consumerism and Soldiering in the Vietnam War; and Tami Davis Biddle, Professor of History and National Security Strategy at the US Army War College, who is the author of numerous articles and book chapters, as well as Rhetoric and Reality in Air Warfare: The Evolution of British and American Ideas about Strategic Bombing, 1914-1945.

Snapshots of the current state of a given field can be among the most interesting and valuable sessions at a conference, so when I was asked to participate in this one I accepted the invitation with pleasure. But once I began preparing these brief opening remarks I found myself with questions, mostly centering on what it means to speak of the “state” of a field. It seems to me that the term can indicate at least three things. It might mean the intellectual state of the field—the questions currently being asked most urgently, new conceptual frameworks and methodologies, and so on. For younger fields it might also mean the state of the field in terms of its maturity: for instance, how many historians are now at work within it, and how many history departments regard it as important enough to justify the creation or maintenance of one or more faculty lines. Related to this second meaning is a third, the general acceptance of the field within the overall discipline.

For me at least, it’s impossible to think of the state of military history, in any of the above meanings of the term, without being reminded that military history in the United States is an unusual field. Although it has been an academic field—in the sense of having PhDs trained specifically as military historians—since about 1970, it has always had a powerful connection with an entity outside academe, namely the American military establishment. Indeed, our flagship organization, the Society for Military History, is a descendant of the American Military Institute, created by a group of active and retired U.S. Army officers as well as interested amateurs in the early 1930s. Over time, as civilian scholars emerged who self-identified as military historians, they more or less glommed on to the AMI until, around 1990, they acquired sufficient critical mass to turn it into a conventional academic organization. Under academic leadership the organization changed its name, began to hold an annual conference, and created a refereed journal, the Journal of Military History.

Sunday, May 03, 2015

Patton Explains Academe

It is a huge lecture hall.  An image saying “Speak truth to the powerless” dominates the screen.  Patton emerges from his grave.

Be paupers.

Now I want you to remember that few PhDs ever get the job they really wanted. They get used to taking a job at some college where they feel under-placed.

Now, all this stuff about there not being many jobs, much less tenure-track jobs, is absolute gospel. Colleges love to exploit PhDs.  Most real colleges love to make you adjuncts.

When you were undergrads you all admired the coolest lecturer, the trendy scholar, the big deal professors, the erudite intellectuals.  Honors students love to apply to grad school and cannot be dissuaded.  Applicants overestimate their chances all the time. I wouldn’t give a hoot in hell for your chances of ever getting benefits.  That’s why adjuncts have never gotten, and will never get, a living wage. Because the very thought of paying a decent wage is hateful to administrators.

Now, tenured faculty like to complain on your behalf.  They talk, fume, and pretend to sympathize with your plight. This "we’re all in this together" stuff is a bunch of crap.  The Ivy League bastards who feign indignation in the Chronicle of Higher Education don’t know any more about real job inequities than they do about teaching eight courses a year.

Now you have the most unrealistic expectations, the best intentions, and the worst career path in the world.  You know, by god I actually pity you starry-eyed saps, by god I do.  We’re not going to just crush your spirit. We’re going to remove your grip on reality and make you TAs to speed the progress of our research.  We’re going to exploit you gullible chumps by the bushel.

Now some of you innocents, I know, are thinking you’ll get a decent job.  Don’t count on it.  I can assure you that you will all get screwed to the wall.  Administrators will be your enemy. Cower before them. Take their crap.  Get ulcers in your belly.  When you get a salary cut that a fortnight before was a solemn promise you wouldn’t, you’ll know you were screwed.

Now there's another thing I want you to remember. Administrators don't want to get any messages saying you need decent benefits. They’re not giving you anything. Let your parents do that. They are proliferating constantly and they’re not interested in paying anyone -- except themselves. They’re going to grab you where it hurts, and they’re gonna blow smoke up your ass. They’re gonna exploit the hell out of you all the time, and they’re gonna tell you fairy tales like a sociopathic Mother Goose!

Now there’s one thing that you will be able to mumble to yourself in your cramped apartment, and you may thank your vague spirituality for it. Twenty years from now, with your dead-end job still crushing your soul, and the creditors at your door, when you wonder what the hell you did with your life, you won’t have to think:  “Well, at least I didn’t teach online courses for Take Their Money U.”

All right now you human beings, you know I don’t care.

Oh.  I will be proud to lead you gullible fools down the garden path any time I can get my readings course to subscribe.

That’s all.

Monday, April 27, 2015

Beyond the Academic Cage: Observations of a New Federal Government Historian

A guest post by Dr. Frank Blazich, Naval History and Heritage Command.
The views expressed in this post are his alone, not those of the NHHC, the Department of the Navy, or the Department of Defense.

Thousands of men and women across the United States and overseas are engaged in the pursuit of a doctoral degree in history. Most desire an academic position upon completion of their studies (preferably a tenure-track faculty position at a research institution), a career marked by the familiar rhythm of instruction, research, writing, and intellectual development. Unfortunately, a downward trend in tenure-track positions, budget cuts, and a growing reliance on adjunct labor have sharply reduced the odds of satisfying that desire. Yet as Daniel Drezner recently argued in the Washington Post, most graduate students have “drunk the Kool-Aid”: they get so fixated on the academic track as the only track that they will prefer an adjunct slot—and the increasingly naked exploitation that goes with it (crappy pay, few or no benefits, scant job security)—to any of the other tracks available. Indeed, their eyes may be so fixed on the academic track that they don’t even know other tracks exist. There are, however, alternatives to consider and pursue while in graduate school.

Sure, I too drank some of that metaphorical “Kool-Aid” (as Drezner observed was practically inevitable), but only enough, as it turns out, to have a temporary effect. Instead, I’ve found my way to gainful, fulfilling employment and a salary comparable to that of a starting tenure-track assistant professor.

I did it by following a road less frequently traveled. My task here, therefore, is to offer suggestions that can benefit graduate students in military history who are nearing the defense of their dissertations.

Everything started for me with the all-important question: “What do you really want to do with your life?” The answer: “To be a professional historian,” a goal I believed I could and would achieve within traditional academe. Just how to get there also seemed straightforward. A BA in history from the University of North Carolina at Chapel Hill, then an MA from North Carolina State University, and finally a PhD from Ohio State got me to my destination. I was indeed a professional historian. Now what? The inevitable guidance I received from a number of advisors, with minor variation, resembled a Philip Glass composition, a minimalist melody of “. . . and you can teach . . . and teach . . . and then teach . . . teach, teach, teach. . . .”

The only problem was that I didn’t particularly want to teach in the sense that they meant. I saw more self-fulfillment in researching, writing—and teaching in a different way, through public engagement. I thereafter resolved to pursue a federal or private industry position as a historian, and fairly quickly found a position in the federal government that allowed me to be a professional historian on the terms I truly desired. The famed scholar of mythology Joseph Campbell counseled his students, “Follow your bliss.” Well, I had done just that. So permit me to make a somewhat more than modest proposal, based on my personal journey, as an antidote to the Kool-Aid. I direct that proposal to graduate students, to suggest a different way by which to follow their own bliss, and to provide them with fodder to reexamine the doubts they almost certainly have—doubts embedded by the mantra of “teach, teach, teach” within academe—about being a historian outside the academy.

1. Find work outside of academe. Plenty of organizations can use a trained historian with skills in research, communication (oral and written), analysis, and interpretation. In my case, one such organization came to me—the Civil Air Patrol—and asked, based on my doctoral research on civil defense and emergency management, if I could help research its history. I joined the organization and began volunteering anywhere from five to twenty hours a week. I began as an unpaid intern but shortly rose to become the CAP’s Chief Historian. Another possibility is to pursue contract history; that is to say, researching and writing reports or white papers for businesses, governmental bodies, or “think tanks.” During the final stages of completing my master’s thesis, I signed a contract to write the fiftieth anniversary history of one such think tank. I will not claim that, fresh from earning my master’s, I wrote a masterpiece, and certainly not one that traditional academe would recognize. But the work provided me with the equivalent of a post-doc in research, analysis, oral history, and business principles.

2. Diversify your historical skills. Many graduate programs—though far from all—equip students with valuable, albeit somewhat rudimentary, teaching skills, either through specific courses or by the osmosis inherent in repeated years of coursework. But that skill set can be deployed in places other than academe. A professional historian in the academic sense is more than capable of preparing graphic display panels, storyboarding a museum exhibit, or engaging in archival screening. Yes, these are public history skills, but a public that seeks its own version of a liberal arts education values precisely those skills. Therefore give serious thought to “following your bliss” by a different path.

Pursue the opportunities (many yielding salary and benefits as good as or better than those of a career in academe) to use your teaching skills in another way. If you know foreign languages or gained cultural expertise from work overseas, apply the specialized knowledge reflected in your PhD within a law firm, business, government agency, or museum. Such employers value that kind of knowledge—and are usually more ready, willing, and able than academe to place a realistic price upon it. And the more you can demonstrate to potential employers—in ways not all that different from the conference presentations and refereed articles you ought to generate, if you know what’s good for you, during your years as a PhD candidate—that you can apply your education on terms other than those demanded by academe, the more attractive a candidate you become for an array of tasks and jobs.

3. Embrace the public. History is a popular field with the general public. Moviegoers spend billions annually to watch films “based on a true story.” Facebook, Flickr, Twitter, and blogs populate the Internet with historical morsels, nourishing seemingly every intellectual palate. A cartoon making the rounds among historians by email carries the caption “Those who don’t study history are doomed to repeat it. Those who do study history are doomed to stand by helplessly while everyone else repeats it.” This raises the question: why on earth would qualified historians choose to resign themselves to that fate?

Public historians have plenty of ways to achieve Campbell’s “bliss.” We build massive libraries, devote resources to rescuing and saving textual and non-textual records, and devote lifetimes to the study of the past. How much of your work is shared with the public? How much is written in a form and language accessible to the general public? Put a bit more brutally, how well are you fulfilling the social obligation that society demands in return for granting academe the freedom it long ago granted the earliest professions (law, medicine, and clergy): the freedom to run its own selection process, to establish its own criteria for the acquisition of expertise, and to provide its own mechanisms for assessing whether an aspiring professional has in fact achieved professional status?

Academe replies that it “teaches, teaches, teaches,” knowing all along that it really views teaching as secondary to the generation of scholarship overwhelmingly directed toward specialists within academe, scholarship nearly incomprehensible to anyone other than those specialists.

Actually, fulfilling that social obligation to extend education to society at large is not all that difficult, even for those whose bliss truly lies within academe. (My reservations about academe come mostly from my observations of academic administrators rather than academic practitioners.) It is easy to get angry at an inaccurate internet or media article about a historical topic. But you don’t have to seethe about it and do nothing; “speak truth to idiocy.” Do not remain silent: respond tactfully. Offer your insight, share your knowledge, stand up as the subject matter expert, and embrace the communication tools of the present day.

One obvious tool lies within the blogosphere, where you can fight blog post with blog post. Perhaps you found the misleading article via Facebook; you can fight back through Facebook. Fight dubious tweets with counter-tweets. People beyond academe appreciate the appraisal of a true expert, and a surprising number of them will beat a path to your door, contract in hand; employment options will materialize.

4. Consider your perspective. Don’t just react to assertions based on flawed, misguided, or outright bogus historical perspectives. Be proactive about engaging the larger public. Historiography is the ideal example: historians analyze and examine the differing perspectives that multiple scholars bring to a topic. Now apply this outside of academe. If you were a business executive, would you not want to know what courses of action your predecessors took? Which ones succeeded, and why? Why did others fail? What if an organization’s history exists only as institutional memory, held in the minds of a handful of long-standing employees who could retire? Your training as a historian is ideal for bringing real expertise to bear upon these and myriad other questions, and for providing the needed—and therefore valued—answers to a corporation or non-academic institution. Leveraging an organization’s heritage and creating a usable institutional memory can easily save untold resources by avoiding past mistakes, or perhaps by targeting new geographic or demographic markets.

Academe does not have a monopoly on such thinking. So why would you place your career in the hands of an institution that increasingly, and pretty remorselessly, will treat you like a commodity (in the sense, as expressed in the film Trading Places, of “gold, silver, platinum, heating oil, propane, cocoa and sugar. And, of course, frozen concentrated orange juice”)? Why would you limit your employment options to academe when academe is frequently (though not yet always) unable to remunerate you in the way you need and deserve in order to carve out the larger life: the bliss of a family for which you can adequately provide, the bliss of traveling to faraway places other than archives, the bliss of being able to give substantial funds to the charities whose causes you value?

The wider world needs the talents and capabilities of historians, be they in government, business, law, medicine, or public service. Furthermore, as someone trained to craft and defend a position with evidence, why not use that training as an asset when speaking with a recruiter or hiring authority? Consider their perspective, and make a compelling case for how hiring a historian opens up possibilities to strengthen their organization that they may never have considered.

5. Know thyself. This last point may seem mundane, but it is pertinent. As chiseled in stone at the Temple of Apollo at Delphi, “know thyself” is worth remembering. “Historian” is not a title exclusive to academe; it is an inner calling and an integral component of your mental and intellectual processes. The job market is terrifying only if you confine yourself to the restrictive market of academe, if you disregard the fact that the market is much larger than academe indoctrinates you to believe. As long as humanity exists, there will be a need to study and utilize past actions and accomplishments for the betterment of tomorrow. You can and will be hired because of your skills as a historian, and those skills will always remain the defining characteristics of your position.

Monday, March 09, 2015

Pulp Nonfiction and The Big Red One





This article was originally published in World War II magazine. Reprinted with permission.



Released in 1980, The Big Red One tells the story of a squad leader and four privates fighting in every World War II campaign involving the U.S. 1st Infantry Division, whose nickname gives the film its title. These men are not the squad’s only members; they’re just the ones who survive. The rest are simply replacements, who look upon the veteran privates—“the Four Horsemen,” they’re called—with awe: How on earth do these men repeatedly escape death? For their part, the Horsemen regard replacements as “dead men who temporarily had the use of their arms and legs.” Replacements come and go so fast that the Horsemen learn to keep them at a distance. A running joke is that even after months of training alongside these fellow soldiers, none of the quartet has bothered to learn their names.
           
Lee Marvin plays the squad leader, known simply as The Sergeant. He is a veteran of World War I, grizzled, taciturn, and utterly realistic. The Horsemen are played by Mark Hamill—yes, that Mark Hamill—Bobby Di Cicco, Kelly Ward, and Robert Carradine. In this ensemble effort Carradine’s character, Zab, ranks first among equals and has the most developed role—unsurprising, since Zab, who provides a voiceover narration, stands in for Samuel Fuller, the man who lovingly wrote and directed The Big Red One.
           
Sam Fuller (1912-1997) himself fought in The Big Red One. A precocious writer, he was covering crime for the New York Evening Graphic in his teens and as a young man scripted B movies. He said he joined the Army because “I had a helluva opportunity to cover the biggest crime story of the century, and nothing was going to stop me from being an eyewitness.” Superiors tried twice to make him an official army reporter, but he insisted on serving in the infantry. Like alter ego Zab, Fuller published a mystery novel at the height of the fighting, sold the film rights to Hollywood, and spent a thousand dollars of the proceeds on a party for his squad days before the Battle of the Bulge. Like Zab, Fuller chomped cigars and did everything possible to make himself seem larger than life. And like Zab, he got through the war in one piece.
           
Fuller went on to become a Hollywood director whose 23 films have a deserved reputation for combining gritty realism and vivid characters with unabashed melodrama. During his life Fuller won little critical acclaim, but he influenced directors who became masters, including Martin Scorsese (Taxi Driver, GoodFellas) and Quentin Tarantino (Pulp Fiction, Inglourious Basterds).
           
Fuller spent years trying to get The Big Red One financed. When he did, the studio, horrified by his film’s four-and-a-half-hour length, demanded edits that reduced the running time to 113 minutes. Fuller initially claimed to be glad the movie had been produced at all, but subsequently expressed regret that the public had not seen the movie he had imagined. His friend and admirer, film critic Richard Schickel, felt the same. After Fuller’s death Schickel supervised a “reconstruction” of The Big Red One, incorporating 50 minutes of footage the studio had cut. Upon the expanded version’s release in 2004, Pulitzer Prize-winning critic Roger Ebert promptly placed it on his list of Great Movies.
           
The Big Red One isn’t merely based on Fuller’s wartime experiences. It pretty much recounts them—something that becomes obvious if you read his memoir, A Third Face, published posthumously in 2002. Even the least likely episodes turn out to dramatize events that really occurred: the private in Sicily who has a testicle blown off and is told that that’s why the good Lord gave him two, the raid on a Belgian insane asylum, the underage civilian sniper who when captured isn’t shot but spanked, the liberation of a concentration camp. Zab’s narration is pure Fuller, as when the Horseman explains an explosive device used to clear concertina wire on Omaha Beach. “The Bangalore torpedo was 50 feet long and packed with 85 pounds of TNT and you assembled it along the way, by hand,” Zab says. “I’d love to meet the asshole who invented it.”
           
Almost none of the film’s incidents build upon one another, but instead stand as deliberately unrelated vignettes. Fuller maintained that that is the combat soldier’s life: a string of vivid episodes, full of sound and fury signifying nothing. Thus he could place a firefight in a Roman amphitheater without invoking gladiators, as many another filmmaker might. In a heavy-handed scene, a battle is under way in an insane asylum when an inmate grabs a submachine gun. “I am one of you!” the madman shouts as he indiscriminately sprays bullets. “I am sane! I am sane!” This would be the clumsiest kind of symbolism—except that Fuller was recreating a wartime incident he witnessed and in his memoir even supplies the patient’s name.
           
At the conclusion, Zab conveys the picture’s moral: “Surviving is the only glory in war.”  This makes The Big Red One sound like an anti-war film. It isn’t. Fuller plainly regarded World War II as a great adventure, and his mindset seeps into The Big Red One, lending the film a seductive appeal. The focus on Zab and his comrades tempts the viewer to identify with the Four Horsemen, to imagine participating in the adventure and coming away unscathed. It’s easy to forget that in combat, a soldier’s fate is likelier to be that of a replacement.