Wednesday, December 23, 2015

The Meeting from Hell: Conspiracy

Originally published in World War II magazine.  Reprinted with permission.

We’ve all attended this meeting, convened by the leadership to discuss some new organizational undertaking and—ostensibly—to collect and synthesize the views of all assembled. Neophytes among us believe that; the more experienced know the score. We enter the conference room resigned, wary, even disposed to revolt. But the leadership has the clout to cram its agenda down our collective throats.

Such meetings occur in all walks of life: governmental, commercial, educational, ecclesiastical. The conference held on January 20, 1942, in Wannsee, a lakeside district southwest of Berlin, was like any other of its kind—except that this meeting was organized by the Schutzstaffel, and its agenda was the destruction of 11 million European Jews.

Directed by Frank Pierson and released in 2001, Conspiracy re-creates the Wannsee Conference in nearly real time, using as a set the mansion in which the actual event took place. Writer Loring Mandel based his script on the “Wannsee Protocol”—the meeting’s top-secret minutes. The original document is deliberately vague; its language gives no hint that the subject is mass murder. Nor does the protocol paint the conference as anything less than wholly harmonious. But anyone who has watched bureaucrats war over turf knows differently, and a close reading of the minutes suggests fault lines and objections. The filmmakers have fleshed these out to deliver a riveting drama that takes place almost entirely around a large conference table.

Conspiracy opens with SS Lieutenant Colonel Adolf Eichmann (Stanley Tucci) overseeing final touches for the meeting: excellent wines, fine cigars, and a lavish buffet lunch for 15. As they enter, the guests, who represent some of Nazi Germany’s most powerful men, introduce themselves to one another and the viewer. Two look decidedly glum: Dr. Friedrich Kritzinger (David Threlfall), deputy head of the Reich Chancellery; and Dr. Wilhelm Stuckart (Colin Firth), chief architect of the Nuremberg Laws that have legally stripped German Jews of their civil rights, defining “Jew” using formulas of Stuckart’s devising. Kritzinger and Stuckart believe that their offices have resolved the “Jewish question.” Suspicious that the SS, the meeting’s sponsor, is about to hijack that “question” and impose a solution of its own, the two quietly grouse to one another.

Last to arrive is the host, SS Obergruppenführer Reinhard Heydrich (Kenneth Branagh), Chief of the Reich Security Main Office. Heydrich, who called the meeting, presides with the jaunty, self-satisfied air of a man who knows that he is going places. He begins by quoting a directive from Reichsmarschall Hermann Goering that assigns Heydrich to find “a complete solution of the Jewish question in the German sphere of influence in Europe” and stipulates that relevant government agencies are to “cooperate” with the security chief in this endeavor. Kritzinger instantly objects; the Chancellery, he declares, has received no directive on this subject. He fruitlessly tries to gain a hearing, but Heydrich smoothly and stubbornly plows on, leaving the sidelined Kritzinger to fume.

Sunday, June 28, 2015

The Rout of the Confederate Flag

Cross posted from Civil Warriors



As you surely already know, in the wake of the murder of nine worshipers at Emanuel African Methodist Episcopal Church in Charleston, South Carolina, a petition was circulated demanding the removal of a Confederate battle flag from the grounds of the South Carolina state house. It quickly generated a massive number of signatures, and a national groundswell of pressure arose for the state government to do precisely that. This story is already well covered on Kevin Levin's Civil War Memory blog, so I won't rehearse the details any further.


Just for the record, I fully support the removal. The battle flag represents an army that fought for the preservation of slavery and has a long, notorious association with white racism. Yet it flies on the capitol grounds of a state whose population is 30 percent African American, most of them descendants of slaves. But this post isn't about that. It's about recent decisions to eliminate sales of the Confederate flag or to forbid its presence at certain sites. Those sites do not include National Park Service battlefields. But the Park Service has adopted a policy of ending sales of "standalone" merchandise depicting the Confederate flag; that is to say, merchandise devoid of historical context. Gettysburg National Military Park has reportedly urged private businesses in Gettysburg to do the same. Tragically, this would result in the elimination of merchandise such as this:


[Photo: Confederate flag swimsuit, Gettysburg, summer 2009]


I'm still learning about this issue, so consider this post a work in progress. It first got on my radar thanks to a Facebook status update by a friend of mine. Since his privacy settings are limited to friends, I'll quote the update without attribution:
So according to the NPS page, the only Confederate flags allowed are with permitted and approved living history events. You or I couldn't have one, you can't have one on your vehicle (I assume that means stickers too). The Lutheran Seminary at Gettysburg has banned them on the outside grounds completely.

Friday, June 26, 2015

Nonsense About the Confederate Flag

From (for some bizarre reason) FoxSports, written by one Clay Travis and entitled, "On the Confederate Flag."

Travis begins:
Only in modern day America could a racist psychopath kill nine people in a Southern church and the focus turn to a flag. Only in modern day America could our nation's largest retailer, Wal Mart, announce -- to substantial applause -- that it will no longer sell merchandise featuring the Confederate flag, but will continue to permit any mentally ill nut on the street to walk into its store and buy as many guns and ammunition as he can afford.

Did I miss the part of this story... where Dylann [sic] Roof stabbed nine people to death with a flag? Because every time I think we can't get dumber on social media, we get dumber.
Here's a link to the complete opinion piece.

Possibly I missed something, but in this writer's entire rant I never saw an effort to explain why it's important that the Confederate flag continue to fly over the South Carolina state house, or any acknowledgment of the arguments concerning why it is problematic to do so. Instead it suggests--actually, damn near says outright--that the arguments are focused on eliminating usage of the Confederate flag, period.

That's absurd. I would imagine that in the vastness of the Internet you can find someone advocating making the flag illegal, but no one I have seen is doing that. I think it is a matter of simple decency to remove the flag from a government facility that is supposed to represent every citizen in South Carolina, 30 percent of whom are African Americans, in most cases the descendants of slaves.

I think it is a matter of simple First Amendment rights to use the flag in non-government contexts: Civil War re-enactments, private homes, automobiles like the General Lee, even KKK rallies. Outside re-enactments and other highly contextualized uses, know that you're going to offend a lot of people; some of them may tell you they're offended, and that's their First Amendment right. To paraphrase the author: just when I think opinion pieces on this issue can't get any dumber, someone proves me wrong.

Monday, May 25, 2015

State of the Field: Military History/History of the Military

These were my opening remarks at an Organization of American Historians Round Table Session (commissioned by the Program Committee) at the OAH Annual Meeting in Saint Louis back in mid-April. My fellow panelists were Christian Appy, a professor of history at the University of Massachusetts Amherst and the author of several books, including American Reckoning: The Vietnam War and Our National Identity; Meredith Lair, an associate professor in the Department of History & Art History at George Mason University, who is the author of Armed with Abundance: Consumerism and Soldiering in the Vietnam War; and Tami Davis Biddle, Professor of History and National Security Strategy at the US Army War College, who is the author of numerous articles and book chapters, as well as Rhetoric and Reality in Air Warfare: The Evolution of British and American Ideas about Strategic Bombing, 1914-1945.

Snapshots of the current state of a given field can be among the most interesting and valuable sessions at a conference, so when I was asked to participate in this one I accepted the invitation with pleasure. But once I began preparing these brief opening remarks I found myself with questions, mostly centering on what it means to speak of the “state” of a field. It seems to me that the term can indicate at least three things. It might mean the intellectual state of the field—the questions currently being asked most urgently, new conceptual frameworks and methodologies, and so on. For younger fields it might also mean the state of the field in terms of its maturity: for instance, just how many historians are now at work within it, and how many history departments regard it as important enough to justify the creation or maintenance of one or more faculty lines. Related to this second meaning is a third, the general acceptance of the field within the overall discipline.

For me at least, it’s impossible to think of the state of military history, in any of the above meanings of the term, without being reminded that military history in the United States is an unusual field. Although it has been an academic field—in the sense of having PhDs trained specifically as military historians—since about 1970, the field has always had a powerful connection with an entity outside academe, namely the American military establishment. Indeed, our flagship organization, the Society for Military History, is a descendant of the American Military Institute, created by a group of active and retired U.S. Army officers as well as interested amateurs in the early 1930s. Over time, as civilian scholars emerged who self-identified as military historians, they more or less glommed on to the AMI until, around 1990, they acquired sufficient critical mass to turn the AMI into a conventional academic organization. Under academic leadership the organization changed its name, began to hold an annual conference, and created a refereed journal, the Journal of Military History.

Sunday, May 03, 2015

Patton Explains Academe

It is a huge lecture hall.  An image saying “Speak truth to the powerless” dominates the screen.  Patton emerges from his grave.

Be paupers.

Now I want you to remember that few PhDs ever get the job they really wanted. They get used to taking a job at some college where they feel under-placed.

Now, all this stuff about there not being many jobs, much less tenure-track jobs, is absolute gospel. Colleges love to exploit PhDs.  Most real colleges love to make you adjuncts.

When you were undergrads you all admired the coolest lecturer, the trendy scholar, the big deal professors, the erudite intellectuals.  Honors students love to apply to grad school and cannot be dissuaded.  Applicants overestimate their chances all the time. I wouldn’t give a hoot in hell for your chances of ever getting benefits.  That’s why adjuncts have never gotten, and will never get a living wage. Because the very thought of paying a decent wage is hateful to administrators.

Now, tenured faculty like to complain on your behalf.  They talk, fume, and pretend to sympathize with your plight. This "we’re all in this together" stuff is a bunch of crap.  The Ivy League bastards who feign indignation in the Chronicle of Higher Education don’t know any more about real job inequities than they do about teaching eight courses a year.

Now you have the most unrealistic expectations, the best intentions, and the worst career path in the world.  You know, by God I actually pity you starry-eyed saps, by God I do.  We’re not going to just crush your spirit. We’re going to remove your grip on reality and make you TAs to speed the progress of our research.  We’re going to exploit you gullible chumps by the bushel.

Now some of you innocents, I know, are thinking you’ll get a decent job.  Don’t count on it.  I can assure you that you will all get screwed to the wall.  Administrators will be your enemy. Cower before them. Take their crap.  Get ulcers in your belly.  When you get a salary cut that a fortnight before was a solemn promise you wouldn’t, you’ll know you were screwed.

Now there's another thing I want you to remember. Administrators don't want to get any messages saying you need decent benefits. They’re not giving you anything. Let your parents do that. They are proliferating constantly and they’re not interested in paying anyone -- except themselves. They’re going to grab you where it hurts, and they’re gonna blow smoke up your ass. They’re gonna exploit the hell out of you all the time, and they’re gonna tell you fairy tales like a sociopathic Mother Goose!

Now there’s one thing that you will be able to mumble to yourself in your cramped apartment, and you may thank your vague spirituality for it. Twenty years from now, with your dead-end job still crushing your soul and the creditors at your door, when you wonder what the hell you did with your life, you won’t have to think:  “Well, at least I didn’t teach online courses for Take Their Money U.”

All right now you human beings, you know I don’t care.

Oh.  I will be proud to lead you gullible fools down the garden path any time I can get my readings course to subscribe.

That’s all.

Monday, April 27, 2015

Beyond the Academic Cage: Observations of a New Federal Government Historian

A guest post by Dr. Frank Blazich, Naval History and Heritage Command.
The views expressed in this post are his alone, not those of the NHHC, Department of the Navy, or Department of Defense.

Thousands of men and women across the United States and overseas are engaged in the pursuit of a doctoral degree in history. Most desire an academic position upon completion of their studies (preferably a tenure-track faculty position at a research institution), a career marked by the familiar rhythm of instruction, research, writing, and intellectual development. Unfortunately, a downward trend in tenure-track positions, budget cuts, and a growing reliance on adjunct positions have sharply reduced the odds of satisfying that desire. Yet as Daniel Drezner recently argued in the Washington Post, most graduate students have “drunk the Kool-Aid”: they get so fixated on the academic track as the only track that they will prefer an adjunct slot—and the increasingly naked exploitation that goes with it (crappy pay, few or no benefits, scant job security)—to any of the other tracks available. Indeed, they may be so focused on the academic track that they don’t even know that other tracks exist. There are, however, alternatives to consider and pursue while in graduate school.

Sure, I drank some of that metaphorical “Kool-Aid” too (as Drezner observed was practically inevitable), but only enough, as it turns out, to have a temporary effect. Instead, I’ve found my way to gainful, fulfilling employment and a salary comparable to that of starting tenure-track assistant professors.

I did it by following a road less frequently traveled. My task here, therefore, is to offer suggestions that can benefit graduate students in military history who are nearing the defense of their dissertation.

Everything started for me with the all-important question: “What do you really want to do with your life?” The answer: “To be a professional historian,” a goal I believed I could and would achieve within traditional academe. Just how to get there also seemed straightforward. A BA in history from the University of North Carolina at Chapel Hill, then an MA from North Carolina State University, and finally a PhD from Ohio State got me to my destination. I was indeed a professional historian. Now what? The inevitable guidance I received from a number of advisors, with minor variation, resembled a Philip Glass composition, a minimalist melody of “. . . and you can teach . . . and teach . . . and then teach . . . teach, teach, teach. . . .”

The only problem was that I didn’t particularly want to teach in the sense that they meant. I saw more self-fulfillment in researching, writing—and teaching in a different way, through public engagement. I thereafter resolved to pursue a federal or private-industry position as a historian, and fairly quickly found a position in the federal government that allowed me to be a professional historian on the terms I truly desired. The famed scholar of mythology Joseph Campbell counseled his students, “Follow your bliss.” Well, I had done just that. So permit me to make a somewhat more than modest proposal, based on my personal journey, as an antidote to the Kool-Aid. I direct that proposal to graduate students, to suggest a different way by which to follow their own bliss and to provide them with fodder to reexamine the doubts they almost certainly have—doubts embedded by the mantra of “teach, teach, teach” within academe—about being a historian outside of the academy.

1. Find work outside of academe. Plenty of organizations can use a trained historian with skills in research, communication (oral and written), analysis, and interpretation. In my case, one such organization came to me—the Civil Air Patrol—and asked, based on my doctoral research on civil defense and emergency management, if I could help research their history. I joined the organization and began volunteering anywhere from five to twenty hours a week. I began with an unpaid internship but shortly rose to become the CAP’s Chief Historian. Another possibility is to pursue contract history; that is to say, researching and writing reports or white papers for businesses, governmental bodies, or “think tanks.” During the final stages of completing my master’s thesis, I signed a contract to write the fiftieth-anniversary history of one such think tank. I will not claim that, having just earned my master’s, I wrote a masterpiece, and certainly not one that traditional academe would recognize. But the work provided me with the equivalent of a post-doc in research, analysis, oral history, and business principles.

2. Diversify your historical skills. Many graduate programs—though far from all—equip students with valuable, albeit somewhat rudimentary, teaching skills, either through specific courses or by the osmosis inherent in repeated years of coursework. But that skill set can be deployed in places other than academe. A professional historian in the academic sense is more than capable of preparing graphic display panels, storyboarding a museum exhibit, or engaging in archival screening. Yes, these are public history skills, but a public that seeks its own version of a liberal arts education values precisely those skills. Therefore give serious thought to “following your bliss” by a different path.

Pursue the opportunities (many yielding salary and benefits as good as or better than those of a career in academe) to use your teaching skills in another way. If you have foreign-language skills or cultural knowledge from work overseas, apply the specialized expertise reflected in your PhD to work within a law firm, business, government agency, or museum. Such employers value someone who has that kind of knowledge—and they are usually more ready, willing, and able than academe to place a realistic price on it. And the more you can demonstrate to potential employers—in ways not all that different from the conference presentations and refereed articles you ought to generate, if you know what’s good for you, during your years as a PhD candidate—that you can apply your education on terms other than those demanded by academe, the more attractive a candidate you become for an array of tasks and jobs.

3. Embrace the public. History is a popular field with the general public. Moviegoers spend billions annually to watch films “based on a true story.” Facebook, Flickr, Twitter, and blogs populate the Internet with historical morsels, nourishing seemingly every intellectual palate. A cartoon making the rounds among historians by email carries the caption “Those who don’t study history are doomed to repeat it. Those who do study history are doomed to stand by helplessly while everyone else repeats it.” This raises the question: why on earth would qualified historians choose to resign themselves to that fate?

Public historians have plenty of ways to achieve Campbell’s “bliss.” We build massive libraries, devote resources to rescuing and saving textual and non-textual records, and spend lifetimes studying the past. How much of your work is shared with the public? How much is written in a form and language accessible to the general public? Put a bit more brutally, how well are you fulfilling the social-responsibility obligation that society demands in return for giving academe the freedom it long ago granted the earliest professions (law, medicine, and clergy): the freedom to generate expertise by running its own selection process, establishing its own criteria for the acquisition of that expertise, and providing its own mechanisms for assessing whether an aspiring professional has in fact achieved professional status?

Academe replies that it “teaches, teaches, teaches,” knowing all along that it really views teaching as secondary to the generation of scholarship overwhelmingly directed toward specialists within academe, scholarship nearly incomprehensible to anyone other than those specialists.

Actually, fulfilling that social-responsibility obligation to extend education to society at large is not all that difficult, even for those whose bliss truly lies within academe. (My reservations about academe come mostly from my observations of academic administrators rather than academic practitioners.) It is easy to get angry at an inaccurate internet or media article about a historical topic. But you don’t have to seethe about it and do nothing; you can “speak truth to idiocy.” Do not remain silent: respond, tactfully. Offer your insight, share your knowledge, stand up as the subject-matter expert, and embrace the communication tools of the present day.

One obvious tool lies within the blogosphere, where you can fight blog post with blog post. Perhaps you found the misleading article via Facebook; you can fight back through Facebook. Fight dubious tweets with counter-tweets. People beyond academe appreciate the appraisal of a true expert. A surprising number of them will beat a path to your door, contract in hand, and employment options will materialize.

4. Consider your perspective. Don’t just react to assertions based on flawed, misguided, or outright bogus historical perspectives. Be proactive about engaging the larger public. Historiography is the ideal example: historians analyze and examine the differing perspectives that multiple scholars bring to a topic. Now apply this outside of academe. If you were a business executive, would you not want to know what courses of action your predecessors took? Which ones succeeded, and why? Why did others fail? What if an organization’s history is exclusively institutional memory, existing only in the minds of a handful of long-standing employees who could retire? Your training as a historian is ideal for bringing real expertise to bear upon these and myriad other questions, and for providing the needed—and therefore valued—answers to a corporation or non-academic institution. Leveraging an organization’s heritage and creating a usable institutional memory can easily save untold resources by helping it avoid past mistakes or target new geographic or demographic markets.

Academe does not have a monopoly on such thinking. So why would you place your career in the hands of an institution that increasingly, and pretty remorselessly, will treat you like a commodity (in the sense, as expressed in the film Trading Places, of “gold, silver, platinum, heating oil, propane, cocoa and sugar. And, of course, frozen concentrated orange juice.”)? Why would you limit your employment options to academe alone when academe is frequently (though not yet always) unable to remunerate you in the way you need and deserve in order to carve out the larger life—the bliss of a family for which you can adequately provide, the bliss of traveling to faraway places other than archives, the bliss of being able to give substantial funds to the charities whose commitments you value?

The wider world needs the talents and capabilities of historians, be they in government, business, law, medicine, or public service. Furthermore, as someone trained to craft and defend a position with evidence, why not use that training as an asset when speaking with a recruiter or hiring authority? Consider their perspective, and make a compelling case for how hiring a historian opens up possibilities to strengthen their organization that they may never have considered.

5. Know thyself. This last point may seem mundane, but it is pertinent. As chiseled in stone at the Temple of Apollo at Delphi, “know thyself” is worth remembering. “Historian” is not a title exclusive to academe; it is an inner calling and an integral component of your mental and intellectual processes. The job market is terrifying only if you confine yourself to the restrictive market of academe, if you disregard the fact that the market is much larger than academe indoctrinates you to believe. As long as humanity exists, there will be a need to study and draw upon past actions and accomplishments for the betterment of tomorrow. You can and will be hired because of your skills as a historian, and those skills will always remain the defining characteristics of your position.

Monday, March 09, 2015

Pulp Nonfiction and The Big Red One





This article was originally published in World War II magazine.  Reprinted with permission.



Released in 1980, The Big Red One tells the story of a squad leader and four privates fighting in every World War II campaign involving the U.S. 1st Infantry Division, whose nickname gives the film its title. These men are not the squad’s only members; they’re just the ones who survive. The rest are simply replacements, who look upon the veteran privates—“the Four Horsemen,” they’re called—with awe: How on earth do these men repeatedly escape death? For their part, the Horsemen regard replacements as “dead men who temporarily had the use of their arms and legs.” Replacements come and go so fast the Horsemen learn to keep them at a distance. A joke that runs through the film is that even after months of training alongside fellow soldiers none of the quartet has bothered to learn their names.
           
Lee Marvin plays the squad leader, known simply as The Sergeant. He is a veteran of World War I, grizzled, taciturn, and utterly realistic. The Horsemen are played by Mark Hamill—yes, that Mark Hamill—Bobby Di Cicco, Kelly Ward, and Robert Carradine. In this ensemble effort Carradine’s character, Zab, ranks first among equals and has the most developed role—unsurprising, since Zab, who provides a voiceover narration, stands in for Samuel Fuller, the man who lovingly wrote and directed The Big Red One.
           
Sam Fuller (1912-1997) himself fought in The Big Red One. A precocious writer, he was covering crime for the New York Evening Graphic in his teens and as a young man scripted B movies. He said he joined the Army because “I had a helluva opportunity to cover the biggest crime story of the century, and nothing was going to stop me from being an eyewitness.” Superiors tried twice to make him an official army reporter, but he insisted on serving in the infantry. Like alter ego Zab, Fuller published a mystery novel at the height of the fighting, sold the film rights to Hollywood, and spent a thousand dollars of the proceeds on a party for his squad days before the Battle of the Bulge. Like Zab, Fuller chomped cigars and did everything possible to make himself seem larger than life. And like Zab, he got through the war in one piece.
           
Fuller went on to become a Hollywood director whose 23 films have a deserved reputation for combining gritty realism and vivid characters with unabashed melodrama. During his life Fuller won little critical acclaim, but he influenced directors who became masters, including Martin Scorsese (Taxi Driver, GoodFellas) and Quentin Tarantino (Pulp Fiction, Inglourious Basterds).
           
Fuller spent years trying to get The Big Red One financed. When he did, the studio, horrified by his film’s 4-1/2-hour length, demanded edits that reduced the running time to 113 minutes. Fuller initially claimed to be glad the movie had been produced at all, but subsequently expressed regret that the public had not seen the movie he had imagined. His friend and admirer, film critic Richard Schickel, felt the same. After Fuller’s death Schickel supervised a “reconstruction” of The Big Red One, incorporating 50 minutes of footage the studio had cut. Upon the expanded version’s release in 2004, Pulitzer Prize-winning critic Roger Ebert promptly placed it on his list of Great Movies.
           
The Big Red One isn’t merely based on Fuller’s wartime experiences. It pretty much recounts them—something that becomes obvious if you read his memoir, A Third Face, published posthumously in 2002. Even the least likely episodes turn out to dramatize events that really occurred: the private in Sicily who has a testicle blown off and is told that that’s why the good Lord gave him two, the raid on a Belgian insane asylum, the underage civilian sniper who when captured isn’t shot but spanked, the liberation of a concentration camp. Zab’s narration is pure Fuller, as when the Horseman explains an explosive device used to clear concertina wire on Omaha Beach. “The Bangalore torpedo was 50 feet long and packed with 85 pounds of TNT and you assembled it along the way, by hand,” Zab says. “I’d love to meet the asshole who invented it.”
           
Almost none of the film’s incidents build upon one another, but instead stand as deliberately unrelated vignettes. Fuller maintained that that is the combat soldier’s life: a string of vivid episodes, full of sound and fury signifying nothing. Thus he could place a firefight in a Roman amphitheater without invoking gladiators, as many another filmmaker might. In a heavy-handed scene, a battle is under way in an insane asylum when an inmate grabs a submachine gun. “I am one of you!” the madman shouts as he indiscriminately sprays bullets. “I am sane! I am sane!” This would be the clumsiest kind of symbolism—except that Fuller was recreating a wartime incident he witnessed and in his memoir even supplies the patient’s name.
           
At the conclusion, Zab conveys the picture’s moral: “Surviving is the only glory in war.”  This makes The Big Red One sound like an anti-war film. It isn’t. Fuller plainly regarded World War II as a great adventure, and his mindset seeps into The Big Red One, lending the film a seductive appeal. The focus on Zab and his comrades tempts the viewer to identify with the Four Horsemen, to imagine participating in the adventure and coming away unscathed. It’s easy to forget that in combat, a soldier’s fate is likelier to be that of a replacement.

Friday, January 16, 2015

The Great Civil War Historian Freak Out

I'll introduce this post by re-printing one I published earlier this month in Civil Warriors, entitled, Civil War Military Historians Are Freaking Out? - Part 1:

Recently two “think pieces,” coincidentally dealing with pretty much the same topic, appeared in the major professional journals concerned with the American Civil War:

Gary W. Gallagher and Kathryn Shively Meier, “Coming to Terms With Civil War Military History,” Journal of the Civil War Era (Volume 4, Issue 4):487-508.

and

Earl J. Hess, “Where Do We Stand?:  A Critical Assessment of Civil War Studies in the Sesquicentennial Era,” Civil War History (Volume 60, Number 4, December 2014):371-403.

Both articles depict, to varying degrees, the increasing marginalization of traditional military history (strategy, operations, tactics, etc.) within academe.  Actually, I would place the word “seemingly” immediately before the word “increasing.” But I’ll explain that in a future post.  For now, I’d just like to call attention to the response to these two pieces by Historista, the nom de blog of Megan Kate Nelson, author of Ruin Nation:  Destruction and the American Civil War (2012), which, according to the description on the back of the soft cover edition, is “the first book to bring together environmental and cultural histories to consider the evocative power of ruination [that is to say, the destruction of cities, houses, forests and soldiers’ bodies] as an imagined state, an act of destruction, and a process of change.”  Which is to say, one of the books forming part of the phenomenon that is causing Civil War military historians to freak out.

Her post, entitled “Civil War Military Historians Are Freaking Out,” appeared on her blog on December 10, 2014.  I self-identify as a military historian, and I’m freaking out so badly that I assigned Ruin Nation as a supplemental text in an undergraduate readings course I taught last summer and as a required book in my upcoming graduate readings course (it starts next week).

For now, I simply refer you to the post, with comment to come on the articles that prompted it:

Megan begins:
Let’s imagine that you wake up one morning after many years of writing and speaking and teaching in your academic specialty. You have tenure, you have written a lot of books and articles and book reviews, and colleagues across the profession (and sometimes, complete strangers) know who you are. But you wake up one morning convinced that it has all been for nothing. Nobody cares anymore about your research topic or your methodologies or your arguments. You wake up and think, “Oh my god! My field is dying." 
So what do you do?

Find out by reading Civil War Military Historians Are Freaking Out
***

It turns out that Megan isn't the only writer in the blogosphere to comment on these two articles, and I'm not the only one to comment on her post.

Over the holiday break, the staff of Civil War History compiled a list of blog posts and online articles that relate (both directly and indirectly) to the think piece by Earl Hess. The staff has shared the list on the CWH Facebook page "in hopes that it continues to inspire a thoughtful and productive dialogue."  With that hope in mind, here's the list as they have it thus far (leaving aside the link to my own post, reprinted above):

Kevin Levin, Civil War Memory, What Do We Need to Know About Traditional Military History? (December 7, 2014)

Megan Kate Nelson, Historista, Civil War Military Historians Are Freaking Out (December 10, 2014)

Claire Potter, Tenured Radical, And the Dead (Fields of History) Shall Rise Up (December 11, 2014)

Kevin Levin, Civil War Memory, In Defense of Hess, Gallagher and Meier (December 11, 2014)

Kathleen Logothetis Thompson, Civil Discourse Blog, "Coming to Terms With Civil War Military History":  A Response (January 5, 2015)

Kevin Gannon, The Tattooed Professor,  Taking a Walk on the Civil War’s "Dark Side" (January 6, 2015)

(NB.  Actually, it's no longer accurate to refer to the "blogosphere," at least not as a self-contained entity, because when links to posts are shared on Facebook or Twitter (as they frequently are), most of the ensuing dialog takes place on those sites, especially FB.  The update on the Civil War History Journal Facebook page is itself a case in point.  The resulting dynamic is worth a post in its own right--something I'll have to place on my long list of things to blog about.  In the interim, it's time to write part 2 of  my own response.)

Monday, January 12, 2015

The Role of Military History in the Contemporary Academy

Cross-posted from the Society for Military History blog




The Society for Military History has just released a white paper entitled “The Role of Military History in the Contemporary Academy.”  In it, notes the SMH web site:
 co-authors Tami Davis Biddle of the U.S. Army War College and Robert M. Citino of the University of North Texas provide a compelling chronicle of military history’s revitalization over the past four decades and assess its current place in American higher education. In addition to the sub-field’s maturation in academic terms, its enduring popularity with the public and college students makes it an ideal lure for history departments concerned about course enrollments and the recruitment of majors and minors. Knowledge of the uses, abuses, and costs of war should also constitute a part of the education of future leaders in the world’s mightiest military power.
The SMH intends this white paper to generate a dialogue with history professors, college and university administrators, journalists, politicians, and citizens regarding the key role the study of military history can play in deepening our understanding of the world we inhabit and producing an informed citizenry.
The white paper is available here. (It can be read online or downloaded in PDF format.)

Monday, January 05, 2015

Come and See's Unblinking Eye


This article was originally published in World War II magazine.  Reprinted with permission.


In his classic 1959 book, The Warriors: Reflections on Men and Battle, philosopher and World War II veteran J. Glenn Gray observed that war is visually fascinating. Renowned director François Truffaut expanded on this thought, arguing that it was impossible to make an effective anti-war movie because war by its nature is exciting, especially to the eye. Come and See, a 1985 film directed by Elem Klimov in the Soviet Union, defies Truffaut’s dictum.
            
 The film’s title comes from the Book of Revelation: “. . . And when he had opened the fourth seal, I heard the voice of the fourth beast say, Come and see. And I looked, and behold a pale horse: and his name that sat on him was Death, and Hell followed with him. And power was given unto them over the fourth part of the earth, to kill with sword, and with hunger, and with death, and with the beasts of the earth.”

Klimov asks, in effect, that we come and see an apocalyptic vision play out in a backwater of Nazi-occupied Belorussia, now Belarus, where forests, fields, and hamlets constitute a hell as vivid as the one Dante Alighieri portrayed in his Inferno. He succeeds, paradoxically, because he assumes no moral position but rather observes, with eyes wide open, his characters and the nightmare world in which he has placed them.
              
As the film begins, an old man is calling to figures out of frame, revealed to be two boys of 14 or so. The youths are digging in loose soil to retrieve weapons; they mean to sign up with a partisan band. The old man tries to discourage them, but the boys ignore him, and a few days later enlist.
             
From this point the film follows Florya (Aleksei Kravchenko), who has joined the resistance to take action but instead is mostly acted upon. As the partisans depart on a mission, a man with tattered footwear appropriates Florya’s boots. The fighters order the boy to remain in camp, together with Glafira (Olga Mironova), a girl not much older than he. The two spend much of the film together, as companions rather than friends. They leave camp to wander a largely empty landscape, trying to avoid dangers both elusive and omnipresent. Florya suggests they go to the hovel where his mother and siblings live. No one is there, but a warm meal is in the oven. Flies are buzzing. The boy and girl begin to eat, nearly oblivious to the insects.
             
Florya decides his family has gone into hiding and races from the hut, followed by Glafira. Glancing over her shoulder, she sees the source of the flies: piled against the dwelling are dozens of naked corpses, surely including Florya’s family. Glafira keeps this information to herself as they wade through a waist-deep bog to the island where the boy imagines his family to be. Finally, drenched in mud and reeking water, the girl cannot contain herself.

“No, they aren’t here!” she screams. “They’re dead!” A furious Florya briefly tries to strangle her. Then one of the resistance fighters appears; he takes the youngsters to a spot where refugees and partisans stand intermingled. In the midst of them, lying on his back, is the man who at the film’s outset told Florya and his friend to stop digging. Third-degree burns cover the old man’s body; he is dying. The Nazi patrol that slaughtered the civilians set him on fire as a lark. “Florya,” he says, “didn’t I warn you? Didn’t I tell you not to dig?” In that moment Florya finally accepts the deaths of his mother and siblings.

The horror of that moment is oddly intensified by the obvious indifference of the partisans, one of whom has placed a skull atop a scarecrow-like torso and laughingly applies clay to it, sculpting a facsimile of Hitler’s head.
           
Indifference is, indeed, central to Come and See.  Klimov wanted to produce a film with devastating impact, and he succeeded; Come and See is widely admired as a masterpiece.  But the method by which he achieves this impact is a studied indifference. The camera impassively regards characters and situations. The editing never calls attention to agony, atrocity, or terror. The soundtrack, understated and monotonously ominous, conveys neither empathy nor antipathy.
             
A viewer may see good and evil, but every element in the film conspires to deny that good and evil exist. There is a harrowing sequence in which Germans—who with one minor exception do not appear until 90 minutes into the two-and-a-half-hour film—round up Florya and the population of a village and crowd them into a barn. An SS officer says that those without children can leave through a window, but the children must stay. No one accepts the cruel offer except Florya, although moments later a young mother attempts to leave with her toddler. The soldiers toss the toddler back through a window, drag the woman off by her hair, and set the structure on fire. As the villagers begin to scream, the soldiers applaud. For a full 10 minutes the camera holds on the barn and the sheets of flame consuming it, cutting away only to show one group of soldiers carrying off the terrified young mother to be raped, while another forces Florya to his knees and places a pistol against his temple—not to kill him, but to use him as a prop for a jovial group snapshot.
           
In its closing moments the film does at last adopt a moral position. The camera shows justice meted out. The soundtrack, hitherto so muted, swells in tragedy. The partisans seem somehow noble rather than wretched. And Florya, so ineffectual through much of the film, suddenly performs an act of stunning power—yet one that occurs only symbolically. There is no redemption from what we have seen.
           
“The spirit gone, man is garbage,” wrote Joseph Heller in Catch-22.  In war, Come and See declares, the living are garbage, too.