Ding Dong, Osama’s Dead

“Osama dead!!!” my college-age daughter texted me last night at 11:18.

“So I hear,” I texted back. “Didn’t get details.”

At 11:30 she texted me again: “waiting for Obama to speak…”

I was already in bed. Each time my cell phone dinged with an incoming text message, I had to haul myself out of bed and travel to the end of the walk-in closet where my phone was recharging. “Going to sleep,” I texted back before switching my phone to silent.

This morning I found a last word from my daughter: “LAME go watch, history in the making!!”

Call me LAME, but I’m finding myself strangely unmoved by this turn of events–and surprised that everyone else in the country seems to find it earth-shaking, especially the college-age kids like my daughter who were only children at the time of 9/11. Not to mention the cognitive dissonance, for someone my age, of hordes of college students displaying fervent patriotism. I realize the contexts are vastly different, but I find it hard to imagine ANYTHING happening in the 1970s that would have prompted college students to spontaneously gather in front of the White House cheering and chanting “USA!”

Does anyone really think that Osama bin Laden’s death will make a difference in the so-called War on Terror? The forces he unleashed have gone way beyond him now. Terrorists don’t need his orders to prompt or organize their movements. And, as reflected in the extra security measures now being taken around the world, there’s a good chance that his killing will only spark more anti-American violence.

(Astute readers will have noticed by this point that the subject of this post has nothing to do with the ostensible themes of this blog. But hey, it’s my blog and I suppose it’s my prerogative to violate my self-imposed parameters once in a while.)

Aside from that, it strikes me as unseemly to rejoice at anyone’s death, no matter how evil a monster he or she was. (And yes, to answer the inevitable question, that would include Hitler.) The only voice I’ve found in this morning’s news coverage that echoes my own feelings belongs to Harry Waizer, identified in the New York Times as a World Trade Center survivor. Asked by a Times reporter for his reaction, Waizer “paused nearly a minute before he began to speak.” Waizer was in an elevator at the World Trade Center when the plane struck the building and suffered third-degree burns.

“If this means there is one less death in the future, then I’m glad for that,” Waizer said. “But I just can’t find it in me to be glad one more person is dead, even if it is Osama bin Laden.”

If Waizer is able to extend the definition of humanity to include the man who nearly killed him, I wonder why it should be so hard for the rest of us.

Pandering to the Masses, Then and Now

Journalism, it is said, is in decline. And one proof being offered is a trend towards deciding which stories to cover based not on what editors think is important, but rather on what readers want to read–which is to be determined by what they’re searching for online.

The subject came up in a recent, and rather testy, interview with Arianna Huffington that appeared in the Sunday Times Magazine. The interviewer asked Huffington about an internal AOL memo, leaked shortly before AOL acquired the Huffington Post, saying that AOL wanted “95 percent of stories to be written based on what people are searching for.” (Huffington protested that she shouldn’t have to defend a memo that predated AOL’s acquisition of her website, and said that the document was “very, very, very far away in terms of where the company is now.”)

Obviously, journalism based on what people are searching for can leave much to be desired. We could end up with, say, a bunch of stories about Lindsay Lohan’s latest escapades instead of a searching analysis of what’s going on at Guantanamo. Or a lot of media coverage about the supposed falsity of President Obama’s birth certificate instead of a serious examination of what to do about the economy. Oh, wait a minute: that IS what we’ve ended up with.

But say what you will about the state of American journalism (and there is indeed much to bemoan), this trend isn’t exactly new. I’ve been spending a lot of time lately paging through a magazine that was published in 1807 (I’m researching a novel based on the life of the woman who edited it), and–what do you know?–the same issue, more or less, existed some 200 years ago.

For example: the magazine, which was called The Observer, began serializing a translation of a French novel called Adelaide; Or, a Lesson for Lovers. I’m not sure what the “lesson” was supposed to be, but by the standards of 1807 the novel was pretty racy. It’s about a couple of horny teenagers who can’t wait for the sanction of matrimony (or perhaps there was parental opposition to the match–I haven’t read every word). They do what comes naturally, she gets pregnant … you get the idea. Not too spicy by modern standards, perhaps, but it apparently caused a good deal of outraged comment in Baltimore in 1807.

Somewhat belatedly, the magazine’s editor–Eliza Anderson–decided to stop the serialization. Once she had seen the novel in its entirety, she said, she realized it was “too glowing, too impure, to be presented by a female, to the chaste eye of female modesty.” But lo and behold, the public–or at least the segment of it that wasn’t outraged by the novel’s publication–was outraged by its discontinuation. “Whilst some extracts we have made, from the most valuable works, are passed by,” Anderson complained, “this love-tale excites the liveliest interest, and when its publication has been suspended for a week, the office door has not stood still a moment, for the constant, the continual enquiries that were made, to know when it would be continued.”

It’s hard to tell if Anderson herself was genuinely outraged by the “glowing” and “impure” nature of Adelaide. For one thing, later in 1807 she herself translated another French novel that may have been even racier–according to a modern scholar, it contained perhaps “the first depiction of female orgasm in polite fiction.”

For another, she was pretty sensitive to what kinds of articles sold magazines. In fact, she started The Observer because its predecessor publication, for which she also wrote, was too dull. Anderson thought satire was the way to go–partly because she thought that was the best way to reform and mold people, and partly because she thought it made for a livelier publication.

She was right about the liveliness, but she ended up alienating quite a few people through her satire. On the other hand, as she recognized, the journalistic feuds that were fought out in the pages of The Observer and other publications actually had a salutary effect on sales. She noted at one point that subscriptions reached a sustainable level only after “some strokes of satire and criticism had given zest and interest to our pages.”

But Anderson didn’t just publish what she thought the public wanted to hear–not by a long shot. She never lost sight of her original goal, which was to educate and elevate the reading public of Baltimore, whether they wanted to be educated and elevated or not. So, alongside Adelaide and other fluffier offerings there were dense biographical tracts on Marmontel and Lord Mansfield and analyses of the contemporary political scene in Europe–the very “valuable works,” no doubt, that avid readers of Adelaide were passing by. Not to mention a lot of digs at the follies and foibles of the Baltimore citizenry.

Perhaps the moral here, if there is one, is that successful journalism has always been some kind of balancing act between what readers want and what editors, and writers, think they should want. The Internet has obviously made it easier to identify readers’ less-than-elevated interests and pander to them, but the basic issue remains the same. The trick, it seems, is to somehow present serious, thoughtful journalism in a guise that will appeal to the masses.

Maybe in another 200 years someone will figure out how to do that.

Rip Van Winkle at the Library of Congress

There’s nothing like a trip to the Library of Congress to lift my spirits–and to induce me to ponder the upsides and downsides of modern technology.

For those who have never experienced its delights, let me explain that the Library of Congress–and in particular the august Main Reading Room–is a shrine to that now almost obsolete format (or should I say “platform”?), The Book. The high-domed reading room is adorned with such a profusion of ornate marble and imposing allegorical figures representing all things book-related that it can sometimes be hard to concentrate on the actual, usually rather modest-looking, book in front of you.

But the Library’s collection is far from modest. It’s basically everything that’s ever been published in this country, and a lot that’s been published outside it–plus unpublished letters, diaries, maps, drawings. You name it. All brought to you on a silver platter (metaphorically speaking) a mere 30 to 90 minutes after you fill out a call slip with one of those tiny eraserless pencils otherwise reserved for keeping score in miniature golf. And all this for free–or rather, paid for by tax dollars. For my money, it’s tax revenue well spent.

But as to technology: yesterday I had an experience in the Reading Room that illustrated the ways in which old-fashioned book-related research methods can lead to serendipitous discoveries. I had requested a scholarly article on the Baltimore Almshouse, which I thought might be relevant to the novel I’m now researching (one of my characters is based on an early 19th-century Baltimore doctor who tended to the poor). As it turned out, the article dealt with the wrong time period. But in the same bound volume of the scholarly magazine I found another article–on the Baltimore yellow fever epidemic of 1800–that I eagerly realized was right up my alley. I learned that the predicament of the poor during the epidemic led to the founding of the Baltimore Dispensary, where my doctor was a key player.

What does technology have to do with this? Well, if I’d looked at the Almshouse article online–in isolation rather than in a bound volume with other articles–I never would have come across the yellow fever article.

On the other hand … after reading the yellow fever article I made my way down the hall to the Microform Reading Room, which is something of a letdown after the Main Reading Room. The last time I was there, perhaps a year ago, it looked like a forgotten broom closet that for some reason had been stocked with recalcitrant, creaky microfilm readers. It still looks like a broom closet, but the old microfilm readers have now been banished to a back room (and the back room of a broom closet is a pretty ignominious place to be banished). In their place stood sleek little black models perched next to equally sleek computer screens.

A friendly librarian, noting my confusion, explained that the new microfilm readers were actually hooked up to the computers: you viewed the images on the monitors, where you could enlarge or darken them or rotate them with the click of a mouse. Not only that, she told me, you didn’t have to copy things the old way: by pressing a button that caused the image to be temporarily sucked into the bowels of the microfilm reader, only to emerge as an often illegible hard copy at twenty-five cents a pop. Now you could simply copy the images to a flash drive, take them back home, and insert them into your own computer.

Of course, being a female version of Rip Van Winkle, I hadn’t thought to bring a flash drive. But the gift shop stocks them, apparently for hapless souls like myself. I was happy to fork over the somewhat exorbitant price of fifteen bucks–not that bad, really, when you consider the flash drive is a lovely shade of blue and doubles as a souvenir, since it’s emblazoned with the words “Library of Congress.” After the librarian gave me a crash tutorial in using the newfangled equipment, I spent a few joyful hours stalking, and saving, my microfilmed quarry: The Observer, an obscure weekly magazine published in Baltimore during the year 1807 and edited by Eliza Anderson–the first woman to edit a magazine in the United States and one of the main characters in my novel.

How happy did this make me? I can’t even begin to tell you. When I started researching Eliza, I had to transport myself to the Maryland Historical Society library in Baltimore to read the magazine in bound volume form. Oh, they had it on microfilm, but the copier function had ceased working at some undetermined time in the past, and there was no money to fix it. I couldn’t even xerox the hard copy pages of the magazine because they were too fragile. Nor could I even use a pen to take notes, because only pencils were allowed in the library. So I spent many hours taking notes on the articles with an increasingly dull pencil (the library did provide an electronic sharpener, which would periodically pierce the silence), and sometimes copying them word for word. Let’s just say it was a bit tedious.

Imagine my joy when I discovered that the microfilm was also available just a Metro ride away from my house in Washington DC at the Library of Congress–where they had an actual working microfilm copier, albeit a cranky one. What I really dreamed of, though, was a way of having access to every page of every issue of the magazine at home, so that I could draw on them at leisure in writing the novel. It was hard to predict which pages I would need and therefore which I should copy, but it would have cost a fortune–and taken untold hours–to copy them all. And the idea of buying the reel of microfilm and a cranky microfilm reader of my own crossed my mind, but I quickly dismissed it as unrealistic. Clearly, there was no way my dream would ever come true.

Until now, that is–just a year later. It will take a while, but I can copy every single page of The Observer onto my flash drive and install them on my computer. I’m amazed. But my amazement is nothing compared to what Eliza Anderson would experience if she were to be revived and told that all 52 issues of her magazine–the publication she sweated and slaved over for many hours each week, the source of so much joy and angst, the means by which she made her minor mark on history–could be easily contained within a bright blue object that’s only two inches long.

The Agony of Age

I haven’t written for a while, mostly because my 87-year-old mother, who suffers from dementia and cataracts, has broken her right leg for the second time in four months and is now recovering from major surgery. All of which has led me to think about (among other things) the indignities and unpleasantnesses of hospitalization, then and now.

By “then” I mean, of course, about 200 years ago — the late 18th and early 19th centuries, the period I researched for my first novel, A More Obedient Wife, and that I’m currently researching for my novel-in-progress. Because one of the characters in the current novel (based on a real historical figure) is a doctor, I’ve been reading up on the history of medicine.

As with so many things about the past, the more you know about medicine back then, the luckier you feel to be alive now. Sure, things go wrong in modern hospitals: no one tells you the doctor you’ve been waiting for for two hours has decided not to come after all; the nurse takes forever to answer an urgent call for help; what seems like a steady stream of people invade your room and ask you the same questions over and over again. Even my mother, who has trouble remembering what happened five minutes ago, can tell that she’s been asked these same questions many times before — and to show her annoyance she often refuses to answer, which doesn’t help matters any.

But at least she doesn’t lose all access to food the next day, which is what might have happened to an uncooperative patient in the 18th century. According to a book on 18th-century medicine, appropriately entitled The Age of Agony, patients at London’s Guy’s Hospital who engaged in cursing or swearing, or were “found guilty of any Indecency, or commit any Nuisance,” were to lose “their next Day’s Diet.”

Not that their “next Day’s Diet” was liable to be all that enticing. My mother has a menu of food choices from “room service” that she finds overwhelming — or would find overwhelming if she could see well enough to read it. Maybe the hospital kitchen wouldn’t win any stars from Michelin, but at least there’s more variety than was to be found at St. Bartholomew’s Hospital in London, where the menu was restricted to bread, boiled beef, “milk pottage,” and beef broth. On the other hand, each patient also received a daily allotment of a pint of “Ale Cawdle” at night and three pints of beer. That might have made things a little more tolerable. Nevertheless, patients at Guy’s and other hospitals routinely slipped out to nearby “Publick-Houses” and “Brandy Shops” and came back to the wards drunk, as a result of which they forfeited “their next Day’s Diet,” which was apparently an all-purpose disciplinary measure.

You can kind of understand the patients slipping away to get drunk when you consider that there wasn’t much for them to do in the hospital. No TV, certainly, which is the only thing that seems to keep my mother’s mind off her pain and relieve her of her general disorientation (she always finds comfort in the world of Turner Classic Movies). In some hospitals there was group Bible-reading on Sunday evenings, led by “some sober Person in each Ward” (assuming, I suppose, they could find one), but not much else in the way of entertainment. Patients at Guy’s who were able-bodied were kept busy taking care of the weaker patients and helping to clean the wards and fetch coals. If they shirked those duties — well, you can probably guess the punishment. But for a second offense they would actually be discharged from the hospital.

Not to mention that there was very little the hospital could do for you, other than warehouse and feed you. In fact, a hospital was likely to make you even sicker than you were when you came in, given that — before the advent of germ theory — you would be kept in close proximity to all sorts of pathogenic organisms harbored by your fellow patients. Other drawbacks included the noxious smells from the lack of sanitation and the intense discomfort due to infestations of lice. In 1765 a surgeon at Guy’s remarked that in London hospitals “bugs are frequently a greater evil to the patient than the malady for which he seeks an hospital.” No wonder hospitals were reserved for the poor. Wealthy invalids preferred to do their suffering at home.

I suppose all of this should make me feel much more appreciative of the modern comforts and conveniences at my mother’s disposal. But it’s a sad truth that no matter how much we understand, on an intellectual level, that we’re really a lot better off than many others — including those who lived in the past — it’s our present distress that looms larger in our consciousness. Should I tell my mother she should feel lucky that her hospital bed isn’t full of lice — or that she has a private toilet, even though at the moment there’s no way she can get to it? It would probably only add to her general confusion.

The truth is, two hundred years ago my mother and many of the other elderly inmates of the rehabilitation facility where she is currently housed wouldn’t have lived long enough to suffer the ailments of old age. Their lives would have ended before their bones grew so brittle and their eyes so dim. We can thank modern medicine for that. And sometimes, I suppose, we can also curse it.

Et in Stoppard’s Arcadia Ego

I recently had the thoroughly enjoyable experience of seeing the current Broadway production of Tom Stoppard’s play Arcadia, and I would encourage everyone who can to go and do likewise.

There are so many different things going on in this brilliant play that everyone who sees it is likely to latch onto something different (and I would highly recommend reading the play beforehand so that you have a better chance of following all the various strands). No doubt a physicist or a mathematician would be entranced by the scientific angles in the play. But as someone who has spent a fair amount of time parsing fragmentary 200-year-old documents and trying to reconstruct from them what actually happened in the past, the aspect of the play that really grabbed me was the historiographical one: two characters who are 20th-century historians tangling over just what happened at an English country estate in the early 1800’s. (I say “20th-century” simply because the play was written in 1993, but of course they could just as easily be 21st-century historians–the study of history hasn’t changed significantly in the last 18 years, and perhaps it never will.)

Stoppard’s conceit is that the play alternates between two time periods: 1809 and a few years later, on the one hand; and the present (more or less), when descendants of the estate’s 19th-century inhabitants still live in the house and are hosting the two historians who are doing their research there. The audience is in the privileged position of seeing both what really happened in 1809, and what the historians think happened–rightly or wrongly.

It’s a delicious position to be in, and one that reveals the human psychology at work behind historical endeavor. We all want answers, we want certainty–or as close to certainty as we can get. And so, presented with bits and pieces of information, we construct a story that makes sense to us–a story that often requires making certain assumptions.

One of the historians, Bernard, decides that the subject of his own expertise–Lord Byron–must have been a guest at the house in 1809. After all, he lived not far away, and he was a schoolfellow and (presumably) friend of the resident tutor there, Septimus Hodge. Makes sense, doesn’t it? Well, yes–and he turns out to be correct on those points.

But Bernard goes on to deduce that while staying at the house, Byron fought a duel with another guest there–a minor poet whose work he had (presumably) savaged in print and whose wife he had (presumably) seduced. And that leads him to another deduction: in the duel Byron murdered the poet, who is not heard from thereafter, and had to flee the country. This story provides a convenient explanation for Byron’s otherwise puzzling voyage to Lisbon that year, at a time when Europe was ravaged by war and travel was risky.

Makes sense, doesn’t it? Well, yes–and it also makes headlines and gets Bernard on the morning TV talk shows. The only problem is–as the audience knows and as the other historian in the play maintains–Bernard gets this part of the story all wrong.

In the play, Bernard later comes across evidence that disproves the central element of his theory, the murder, much to his dismay. But in reality–as anyone who has worked extensively with primary historical sources knows–these kinds of mistakes often get perpetuated for generations in secondary sources.

To offer a minor example, in the course of researching my novel A More Obedient Wife, I read what was then (and, I think, now) the leading biography of one of the historical figures I was writing about, James Wilson, an early Supreme Court Justice. According to this biography, shortly after Wilson married his second wife he freed a slave he owned. The author also mentioned in passing, without any citation, that this second wife was a Quaker and had undoubtedly urged her new husband to free the slave, in keeping with her abolitionist views.

A good story, I thought. And it makes sense, right? But as they used to say during the Cold War, “trust, but verify.” I managed to find the document granting the slave his freedom, dated shortly after the marriage, so that checked out. But nowhere, in any primary source, could I find any evidence that the second wife was a Quaker, or that she had anything to do with freeing the slave. And I’m pretty sure I found every primary source relating to the second wife, Hannah Gray Wilson, who was one of my two main characters.

What I did find, however, were repetitions of the assertion that she was a Quaker in at least two later secondary sources. Which is understandable. After all, we’re conditioned to believe what reputable historians say, especially if it seems to make sense. (Although I have to admit that this particular biography, written in the mid-1950’s, raised all sorts of red flags for me despite its iconic status. The author–Charles Page Smith–kept putting in details like, “As he read the letter, his glasses began to slip slowly down his nose.” Oh yeah, I wanted to say? How do you know?)

Okay, so sometimes historians get it wrong. Does that mean they should just throw up their hands and give up? Consign certain things to the dustbin of history that’s labeled “Unknowable”? Well, they should at least exercise caution–as the more skeptical historian in the play, Hannah, keeps urging (at least when it’s her competitor who’s the one jumping to conclusions). But as Hannah herself says, it’s the search for answers–not its ultimate success or failure–that’s important. “It’s wanting to know that makes us matter,” she says. “Otherwise we’re going out the way we came in.”

Of course, there’s another way to come up with an answer, of sorts–one that accepts the unknowability of the past and just keeps going. I’m talking, of course, about historical fiction, which can provide the satisfaction of a “good story” without distorting (consciously or unconsciously) the historical record.

I decided, for example, that I really liked the idea that James Wilson freed his slave because of pressure from his new wife. It made sense, and it fit in with the story I was weaving. But the idea that she was a Quaker–even aside from the absence of proof–just didn’t make any sense to me. She was from a fairly elite family in Boston, a stronghold of Congregationalism, and I even have a reference to the church her family attended. (It was called “Dr. Thatcher’s Meeting,” after the name of the pastor. Actually, this may have been where Smith got the idea that she was a Quaker–today we use “meeting” to refer to Quaker congregations. But in 18th-century New England, the term was used to refer to Congregationalist churches as well.) So I made her an abolitionist, but not a Quaker.

So historical fiction has its uses, and its satisfactions. But it’s no substitute for straight-ahead, just-the-facts-ma’am history. When I put on my historian hat, I try to rein in my imagination and retain a healthy skepticism. As Arcadia shows us so wittily, it’s not always easy–and maybe it’s not always possible. Sometimes I may be more like Bernard than I’d like to admit. But the sad truth is that there are some gaps in the historical record that only fiction–clearly labeled as such–can fill.

History and Literature

Note: This blog post was originally posted under the date February 17, 2011, which is the day I started writing it. I saved it as a draft that day and didn’t finish and publish it until some weeks later. I assumed it would go up as a new post on the day I published it, but I just realized that it didn’t — it was buried among older posts. So I’m putting it up as a new post in case anyone missed it the first time around!

There’s a certain kind of person who gets all starry-eyed when you say the name “E.P. Thompson,” or utter the title The Making of the English Working Class.

That kind of person would be me (I’ve met a few others). For those who have not been anointed into this cult, E.P. Thompson was a British historian, and his magnum opus was published in the sixties. When I came across it in 1977 or so, as a sophomore in college, I had never been so bowled over by a book before–at least, a nonfiction book. I suppose it’s fair to say the book changed my life: I decided I wanted to BE E.P. Thompson. After I graduated from college, I got a fellowship to study in England, with the intention of stalking my idol and becoming his protege.

I was informed, however, that the university where he taught–Warwick–was in a less than appealing location, and also that he had basically retired from teaching. So I ended up at the University of Sussex, charmingly situated near the seaside resort of Brighton, studying with a disciple of his (a woman who had also made the journey to the U.K. from the U.S. to study at the feet of the master, and who had never left).

What was it about the man, and the book, that I found so captivating? I suppose it was Thompson’s marriage of academics and the armchair socialism that I was then prone to: instead of writing a history of the elites, which is what so much of history is inevitably about, Thompson focused on what used to be called the lower orders of society: the mechanics, the artisans, the workers. In a deservedly much-quoted phrase, he said he wrote to rescue these people “from the enormous condescension of posterity.”

The difficulty with writing about such people as individuals–as thinking, feeling beings rather than statistics–is that they didn’t leave much behind in the way of a paper trail. They didn’t keep diaries (or if they did, the diaries generally weren’t preserved), they didn’t publish memoirs, they didn’t make headlines. But Thompson was able to unearth and mine what they did leave behind: broadsides, pamphlets, hymns. As I recall (and I haven’t read the book in many years), he used the techniques of literary criticism to penetrate the opacity of these sources, extrapolating from their choice of words and tropes to reconstruct their lives, their hopes and dreams and frustrations. For someone who was majoring in English History and Literature, it was perhaps the perfect textbook.

I didn’t end up becoming E.P. Thompson, as it happens. But I never lost my interest in trying to reconstruct the lives of ordinary people who inhabited the past. And somehow, at long last, I’m getting a chance to do it, through the medium of fiction. For the novel I’m currently working on, which is set in early 19th-century Baltimore, I’m in the process of inventing a character who is a member of the working class of that era (if it’s not an anachronism to use that phrase to describe a stratum of an essentially pre-industrial society).

My task has been made much easier by the existence of a book that, if it hadn’t existed, I would have been tempted to commission: Scraping By, by Seth Rockman. It’s an exploration, as the subtitle tells us, of “Wage Labor, Slavery, and Survival in Early Baltimore.” A veritable gold mine! And, as Rockman politely implies in his introduction, his task is even harder than Thompson’s. Thompson focused on the artisan class–the skilled workers–who were generally educated enough to leave at least some written records behind. But, as Rockman points out, this was really the cream of the working class. The majority of workers were unskilled and largely unschooled. And, generally speaking, too worn down by the arduous process of trying merely to survive–“scraping by” in Rockman’s phrase–to even attempt the efforts at self-organization and resistance that Thompson chronicled.

So Rockman has to rely on documents like almshouse rolls and jailhouse records to reconstruct the lives he’s writing about. I found the book fascinating, but inevitably, there are huge gaps in the historical record. Occasionally a few tantalizing details of an individual life pop up, but most of it remains submerged, mysterious as the underside of an iceberg. And of course, the actual voices of these long-dead ordinary people, so vital to Thompson’s approach, are almost entirely lost.

That’s where fiction comes in–or at least I hope it will. In my first novel, A More Obedient Wife, I tried to give voice, through the medium of my imagination, to two obscure women of the 1790’s whose lives, as best I could reconstruct them, were fascinating and dramatic. But of course, in that case I had letters–to, from, and about them–to serve as my guide to who these women might have been. And in the novel I’m currently working on, one of my characters (again based on a real, if obscure, historical figure) left behind not only some letters but an entire year of a magazine she edited and largely wrote. Her voice resounds quite clearly across the centuries.

It’s the other main character in this novel-in-progress whom I intend to draw from the ranks of the working class. Her experiences and her personality will be influenced by what I’ve read in Rockman and other sources–particularly Gordon Wood’s Empire of Liberty, which focuses to a large extent on the early 19th-century tension between elitism and democracy. This character, whom I’m calling Margaret, will exemplify the nascent trend towards what became Jacksonian democracy: a servant who rejects that label in favor of “the help,” thought to be less demeaning, and who for similar reasons chooses to call her employer “boss” instead of master or mistress. She’ll have had some experience of the almshouse, and she’ll have chosen domestic service because her other alternatives–either “slop work” (low-paying piecework in the garment trade) or prostitution–were so unappealing.

But her voice will be my own invention–and so far I’m having a great time inventing it. Margaret herself never existed, of course, but surely many people like her did. And maybe at last, after two centuries, some of them will be speaking through her– or rather, through me. I guess I haven’t exactly turned into E.P. Thompson, but I feel that I’m trying to do through the medium of fiction part of what he accomplished as a historian. And I like to think that he’d approve.

To Outline or Not to Outline

I’m at that juncture in writing a novel when I’d much rather write a blog post. (Some writers would probably point out that one is liable to meet with many such junctures in the course of writing a novel. They would be right.)

No doubt it would be better to resist this impulse and just force myself to stare at my computer for the time I’m now spending on this blog post, but I thought if I described where I’m at it might conceivably be of interest to others, writers and readers both. And it might even help me figure out what to do next with the novel.

Here’s where I’m at: after several years of research and two abortive drafts of perhaps 50 pages each, I’ve now started another draft, and I have about 20 pages. I feel that I’ve finally figured out my focus in terms of voice and time span and characters, and I have a general sense of what’s going to happen. The question is, do I now try to outline, in somewhat more detail, a plot?

I don’t really know what most other writers do about this, but I suspect it varies. I’ve certainly heard writers say that they just start with a situation or a character and see where the writing takes them. I would guess that this is the more “literary” approach. But I did just browse through an article about plot in an issue of Writer’s Digest that said something like, “It’s a good idea to sketch out as much of the plot as possible in advance. I always like to know where I’m going. Don’t you?”

Well, yes, actually. Generally speaking, I like to plan things ahead of time. In fact, this has been a point of contention for many years between my husband and myself. He’s a big proponent of what he terms “spontaneity,” particularly with regard to family vacations, and he thinks my insistence on having, say, hotel reservations and a general itinerary is micromanaging to the point of joylessness. Years ago I told him: fine, you can have one night of “spontaneity.” This led to a situation where we arrived at a ramshackle, haunted-looking “bed and breakfast” in some godforsaken town in upstate New York, where we appeared to be the only guests, and where our kids (then about seven and ten) flatly refused to spend the night.

But that, of course, is beside the point. The point is that I do like to plan ahead in most areas of my life. Why not when I’m writing a novel?

Maybe because I’d like to think I’m too “literary” for that sort of thing–that my characters will come alive and simply take over, as writers are wont to claim their characters do. Unfortunately, my characters show that kind of initiative only rarely. Instead, what often happens is that I get to the end of a scene or a chapter and I have little or no idea what to do with them next. Eventually I think of something, but I can’t help wondering if there isn’t some better way to go about this.

With my first novel–which was in many ways a learning experience–after I thought I was done writing I ended up cutting about 200 pages, mostly from the beginning, on the advice of a publishing-industry professional who said the story started too slowly. With my second novel–also in many ways a learning experience–I was told by another publishing-industry professional, after I thought I was done, that I didn’t have a plot. While I think that was something of an exaggeration, I did end up shoehorning another plot into the novel (no mean feat, let me tell you). Both of these changes made the novels better, but it would have been a lot easier if I’d been able to just write them that way the first time around.

The problem, of course, is that difficult as it is to write a novel without a plot outline, in some ways it’s even more difficult to come up with the outline. An unfleshed-out plot can seem ridiculous and mechanical: first she does this, and then she does that, and then she realizes something else. It’s enough to make you lose faith in your own endeavor. Plus, all that stuff that makes the plot (one hopes) seem less ridiculous–dialogue, nuance, perceptions–sometimes leads to an unexpected twist. Even if the characters don’t exactly take over, on occasion you realize something as a result of having written a scene that causes you to rethink your plan: now that she’s said or done X, she would never go on to do Y.

On the other hand, that’s no reason not to try. There aren’t any plot police roaming around who will force you to stick to your outline–just as, if you pass some intriguing and unanticipated roadside attraction on a family vacation, no one will actually prevent you from stopping. That doesn’t mean you shouldn’t make a hotel reservation for that evening. And if the roadside attraction is really intriguing, you can always change the reservation.

Actually, I’ve already done some planning. Earlier this week I banged out a seven- or eight-page summary of the plot. It was helpful, but at the same time discouraging. There’s a lot I have to work out in terms of what happens when, and who does what to whom. And given that I’m working with some real people and real events, it can get pretty complicated.

So I just dug out an old artifact that I used to help me with my first novel, also based on the lives of historical figures: it’s a huge roll of paper I bought when my kids were little, because I read in some catalogue it would be perfect for art projects calling for … well, huge pieces of paper. My kids made a banner or two, but never used it much. Then, when I was writing A More Obedient Wife–a novel that was based on the lives of four real people and spanned a period of about eight years–I unrolled a length of it and turned it into a time line.

In that case, I ended up putting way too much information on it, but it was better than nothing. I’m hoping that this time I can be more disciplined. Now that I have a sense of what real events, and what people, are important to my story (which only covers about one year instead of eight), maybe I won’t throw in everything but the kitchen sink. And I’m focusing on one historical figure rather than four.

Of course, I’ll still be left with my other main character–a fictional one. What will I do about her? I’ll save that for another blog post. After all, there’s bound to be another point, soon, when I’d rather write a blog post than a novel.

The Beauty of Xiaohe

The other day I traveled to Philadelphia to see an exhibition that the Washington Post has described as “one of the hottest tickets on the East Coast.”

Hot it may be, but I have to admit that some of the artifacts I saw on display gave me the chills. The exhibit, which is at the Penn Museum, is called “Secrets of the Silk Road.” And what’s chilling about it is that it includes a number of extremely ancient things that really have no business being around anymore: textiles, foods like pastries and wontons, even a couple of amazingly well preserved dead bodies–all of them thousands of years old.

The main attraction is a mummy–one that was, unlike the deliberately embalmed Egyptian variety, naturally preserved by the dry climate of western China–that has been dubbed “the Beauty of Xiaohe,” and that dates from between 1800 and 1500 B.C.E. Again, unlike Egyptian mummies, she isn’t wrapped up in material that conceals her face and skin from view. She’s wearing what was presumably the latest fashion in what we now call the Tarim Basin in about 1500 B.C.E.–furry booties, a pointed felt hat, and (we’re told) a “string skirt” under the blanket she’s wrapped in.

She’s basically just lying there, almost as though she’s taking a 4000-year nap, albeit in a glass case. Her eyes are gone, but her eyelashes are still there, and quite lush. She has an adorable little pointed nose (as the Post points out, the kind of nose Michael Jackson spent his life pursuing) and prominent cheekbones (perhaps a bit more prominent now than when she was first buried), and an abundance of auburn hair that cascades around her shoulders. She is, or was, indeed a beauty, and it’s hard to take your eyes off her. But to me there’s something disturbing and frustrating about her as well, and about many of the other objects of daily life in the exhibit.

The Post review of the exhibit, which ran the day before I saw it, made an intriguing point about the “dichotomy between narrative history (which is obsessively interested in authority figures, emperors, kings, generals and the like) and the history of ordinary people (who strive for survival and, if they’re lucky, dignity).” I’m primarily interested in the latter–that’s what draws me to historical fiction, which allows you to get at lives that are otherwise opaque–and I looked forward to seeing an exhibit that focused on it. (The New York Times has also run a piece on the exhibit, which is surely the biggest thing to hit the Penn Museum in quite some time.)

But, as the Post article also pointed out, the fact is nobody really knows much about the people whose artifacts, and in a couple of cases bodies (there’s also a mummified infant), are on display. In fact, it’s all pretty confusing: a lot of different people apparently passed through this area, leaving traces of themselves behind, over the course of thousands of years. The signs accompanying the artifacts do the best they can with the little information available, but often they’re reduced to simply describing the object you’re looking at without giving you much in the way of context or background information. (Of course, sometimes the information wasn’t all that obvious: apparently the tall wooden objects situated at the graves where the Beauty of Xiaohe was found were phallic symbols, for the graves of females, and vulvas, for the graves of men. To me they looked more like oars and wooden renditions of those primitive stone faces on Easter Island, but what do I know.)

As the Post writer observed, the challenge of this exhibit is to get us to care about people to whom we can attach no names and no stories–in contrast to an exhibit about, say, Thomas Jefferson, or even Cleopatra. I did want to care, but I have to admit it would have been easier had there been more of a “narrative” available–something the Post article dismisses as a “now-cliched crutch” in the context of museum exhibits.

Cliched or not, it’s a crutch that seems to work, at least for me. Amazing as it is to see a 4000-year-old woman, complete with eyelashes and hair, she doesn’t really come alive, as it were, unless you know something of her story.

And I guess that’s why the historical figures I write about–even the ones whose appearance remains a mystery to me–seem far more real to me than the Beauty of Xiaohe. I can always imagine what their eyelashes looked like, or their hair. But without some facts about their lives, and without the letters that have preserved their voices, I would never be able to imagine them.

The Oscars and the Truth

As the Academy Awards approach–they’re now only a breathtaking few hours away–it’s interesting to consider that four of the ten nominees for best picture are, as they like to say, “based on a true story.”

Or are they? Two of them–The Fighter and 127 Hours–are presumably pretty accurate, or at least pretty close to the way the central characters remember the experience, since we see, at the end of each movie, the real people on whose experiences the stories are based.

But the other two–The Social Network and The King’s Speech–appear to have taken some liberties with the truth. In today’s Washington Post, film historian Jeanine Basinger dismisses what she acknowledges to be historical inaccuracies in The King’s Speech as “nitpicks.” What difference does it make if films get the historical facts wrong, she asks, as long as the end result is a good movie?

Okay, let’s go through the inaccuracies she lists. Colin Firth is taller and more handsome than King George VI. Okay, fine–that’s Hollywood. Winston Churchill wasn’t as fat as the actor who portrayed him. No big deal. The King didn’t actually stammer that badly. Hmm, well, that’s a pretty central element of the plot, but exaggeration in the pursuit of a good story is a minor sin. And Churchill didn’t really think it was necessary for Edward VIII to abdicate before marrying Wallis Simpson. Whoa. That seems like a pretty big nit to me. In fact, that seems like a distortion of the historical record. And was it really necessary to change that fact in order to make a good movie?

The Social Network, from what I’ve read, is even worse in terms of hewing to the truth. For instance, according to the movie, Mark Zuckerberg’s motive for inventing Facebook was to impress and/or avenge himself on a girlfriend who had dumped him. But in fact, Zuckerberg still has the same girlfriend he had at the time he started what became Facebook. Beyond that, the movie portrays Zuckerberg as a socially inept, social-climbing, obnoxious, opportunistic (albeit smart) nerd. Given that they changed the girlfriend thing, I don’t have too much confidence in the rest of the portrayal. And yet Zuckerberg, who isn’t even thirty yet, will have to spend the rest of his life living in the shadow of a fictional portrayal that I would bet the vast majority of moviegoers accept as the gospel truth.

Don’t get me wrong–I enjoyed and admired both of these movies. But I have to admit that when I learned about their cavalier attitude towards the truth I felt a little uneasy. Kind of the way I’ve felt about books that purport to be “true” but turn out to be largely, or entirely, fiction. If that sort of thing bothers people–and given the furor a few years ago surrounding the revelation that James Frey’s “memoir” was more like a novel, that sort of thing does bother people–why shouldn’t it bother them when the medium is a movie?

Maybe some people will say that it doesn’t really matter what Churchill thought about Edward VIII’s abdication, or that Mark Zuckerberg has enough money that he shouldn’t care how he’s portrayed in a movie. I happen to disagree with both those observations, but the question is, where do we draw the line? And who gets to decide? Do we really want our understanding of history, or of the characters of real people, to be determined by movie studio moguls whose primary concern is “telling a good story”? I’m pretty familiar with the story of someone named Betsy Bonaparte, a 19th-century celebrity who married–and then was abandoned by–Napoleon’s youngest brother. Back in the early days of Hollywood, two movies were made “based on” her life. In both of them, her errant husband returns to her. That may make a better story in the eyes of screenwriters (or at least it did then), but it’s about as far from the truth as you can get.

The thing is, stories get a boost from their association with reality–which is why that “based on a true story” label gets slapped onto whatever seems to qualify. We get a little added frisson from the idea that “this really happened.” But are movie-makers–or writers–entitled to take advantage of that frisson when they’ve rearranged the facts? Yes, it’s true that real events don’t always naturally fall into a convenient narrative arc, and that people don’t always behave quite the way fictional conventions would dictate. But that, it seems to me, is part of the challenge of writing about real people: you need to make sense of them and their lives, not just convert them into characters who follow the path that you’d like them to.

This may sound strange coming from someone who has written a novel–A More Obedient Wife–based on the lives of real people. But I chose to write about people who lived so long ago, and who were sufficiently obscure, that I didn’t have to alter the historical record to come up with a decent plot. All I had to do was fill in the many gaps in the record with my imagination. (Okay, I did eliminate a few of the many siblings a couple of my characters had, but given that so few people have heard of these historical figures, let alone the siblings I killed off, I don’t really see that as a major problem.)

I’ve heard other authors of historical novels make a similar point. Even if they write about well known historical personages, they often choose to write about parts of their lives that are cloaked in obscurity. Otherwise there’s nothing to play with.

There’s another way to write about, or portray, real people without confronting this dilemma: make them relatively minor characters and embed them in what is clearly a fictional narrative. A case in point is the current Masterpiece Theatre presentation, Any Human Heart (which, alas, conflicts with the Academy Awards tonight). Its central character, a sort of British upper-class Zelig figure named Logan Mountstuart, interacts with a delicious array of real personages from the 20th century–Ernest Hemingway, Ian Fleming, and a corrupt and vengeful Duke and Duchess of Windsor. We get that added frisson that comes from watching “a true story,” or at least a story peopled with real individuals, but at the same time we retain our awareness that what we’re watching is fiction.

All that having been said, I’ll be perfectly happy if either The King’s Speech or The Social Network wins the Oscar for best picture, because both of them were terrific movies. I just wish that when Hollywood powers-that-be are looking for a good story, they would either find a true one that doesn’t require tampering with significant facts, or else come up with something that’s NOT “based on a true story.” After all, although it’s probably true that the number of basic plots available to mankind is finite, the number of variations on those plots is pretty much unlimited. You just have to use your imagination.

Ablene and Aibileen

In a stunningly ironic development, this morning’s papers bring the news that the author of the best-selling novel The Help is being sued by a maid in Jackson, Mississippi, for allegedly using her name and physical description without permission.

Why so ironic? As those who’ve read The Help know, its plot hinges on the publication of a book that is a thinly disguised portrait of a group of middle-class white women in Jackson as seen through the eyes of their black maids. Much of the tension in the novel comes from wondering whether those white women will recognize themselves in the book, and what action they’ll take if they do. Only–this being the South in the early sixties–the maids who have told their stories are worried more about lynchings than lawsuits.

Ablene Cooper, the maid who is suing author Kathryn Stockett for supposedly portraying her in The Help, is seriously misguided for a number of reasons. Yes, there’s a similarity in names (the character in the book is named Aibileen, which is apparently how “Ablene” is pronounced) and appearance (both Ablene and Aibileen sport a gold tooth). And both have a son who has died. But Aibileen’s story is not Ablene’s, nor could it be. For one thing, Aibileen’s experience takes place fifty years ago, when Ms. Cooper (who is 60) was a child. For Ms. Cooper to argue that she is somehow “embarrassed” by the racial insults suffered by the fictional Aibileen in the early sixties is so convoluted as to boggle the mind.

One problem is that the book’s Aibileen is such a sympathetic, admirable character–unlike most of the white women portrayed in the book-within-the-book–that it’s hard to see how the real Ablene can credibly claim to have suffered any injury. According to the Wall Street Journal, one of Ms. Cooper’s claims is that the Aibileen character speaks in a “thick ethnic vernacular,” which embarrassed Ms. Cooper because she herself doesn’t speak like that. Ms. Cooper herself undercut that claim somewhat when she told the New York Times, “Ain’t too many Ablenes.”

Aside from the vernacular, she has a point. No matter how admirable the character she invented is, Ms. Stockett would have been wise to choose another name. (She apparently knows Ms. Cooper, who works for her brother and sister-in-law, only slightly; the real-life maid Ms. Stockett based the character on, and who was her family’s own maid when she was a child, was named Demetrie.) A writing teacher once advised me to change as many details as possible when a fictional character is modeled on a real person: if the real person is tall, make the character short; if the real person has blonde hair, make it brown; and so on. Surely it would have been easy to come up with some other name. But Stockett probably just started writing the book using the name Aibileen, and then the character and the name became inseparable. I know how that is.

But let’s leave all that to one side. The basic, and most obvious, point to make here–and the one Ms. Cooper has apparently failed to grasp–is that Aibileen is a fictional creation. That’s one essential difference between The Help, which is a novel, and the book-within-the-book about white women in Jackson, which was nonfiction (albeit with disguised names). Even if Aibileen were a lot closer to the real Ablene, Ms. Cooper would–or at least should–face a high bar in prevailing in court.

The fact is, ALL fictional characters are modeled on real people, to a greater or lesser extent. Even if an author doesn’t have an actual person in mind, he or she is probably borrowing various attributes of a character from different real people. And even if that’s not true, it’s likely that a fictional character, if done well, will at least remind some readers of an actual person. Should that person be able to sue? Clearly not. And where, exactly, do you draw the line?

But people can be quick to take offense, or at least to see the opportunity to make some money off a best-selling book. That’s why, if you’re going to write fiction based on the lives of real people, it’s a lot safer to choose people who are dead–preferably long dead. That’s what I did in my first novel, A More Obedient Wife, and that’s what I’m doing in the one I’m working on now. Not only are they themselves unable to sue, their descendants tend to be flattered rather than offended by an author’s imaginative riffs.

I confess to having some trepidation on this score about the novel that’s with my agent now. Its setting is contemporary, and let’s just say there are a few people who might think they recognize themselves in it if it should ever get published. But in that unlikely eventuality, I hope they’ll understand that the characters in the novel are no more “them” than the protagonist is “me”–and also that, in a way, ALL of them are me. They’re all bits and pieces of me and other real people and things that just came into my head. In other words, they’re fictional.

But who knows? A few weeks ago there was a chilling essay in the New York Times Book Review that told the tale of an unstable man who was convinced that a 1909 novel was a thinly veiled attack on his family, the Goldsboroughs of Maryland. He didn’t just sue. He stalked the novelist who wrote it, David Graham Phillips, and shot him six times near his home on Gramercy Park. After the shooting, when Phillips was asked whether he had in fact based the novel on Goldsborough’s family, he responded, “No, I don’t know the man.” They were, perhaps, his last words.

So if my novel doesn’t get published, I can at least console myself with the thought that I won’t have to worry about armed stalkers. Unless they’re zombies, risen from the dead. Now THERE’s a plot that would probably make the editors at major publishing houses sit up and take notice. Maybe I should give it a whirl.