I’m old enough to remember when everyone smoked—when college classrooms had ashtrays, airline meals came with a little package of two cigarettes and a book of matches, and athletes at Madison Square Garden (the old one) competed below a blue haze of smoke in the rafters. I can also remember how smoking was a social signifier. Taking out a pack of cigarettes signaled a pause in conversation and perhaps a move to another level of intimacy. The little rituals—tapping a cigarette on the pack, flicking open a Zippo or Dunhill, offering your companion a cigarette (and maybe lighting it)—all said what kind of people were there, and what they thought of each other. Even the kind of cigarette mattered; when I was an undergraduate, a whiff of the distinctive, burning-dung smell of Gauloise meant that a left-wing intellectual was in the room, the sight of the black paper and gold tips of Balkan Sobranies declared (to me at least) pretentious aestheticism, and the flat red pack of Dunhills meant that someone had money to burn.

I’m also old enough to remember when smoking became uncool, and how long it took. It was more than ten years after the Surgeon General’s Report of 1964 that Minnesota passed the first, tentative ban on smoking in public places, and it took another decade for such bans to become general.  When I began teaching at Episcopal Academy in 1985, the faculty lounge was still a smoke-filled room, and the Student-Faculty Senate passed a special dispensation every year so that students could smoke at the prom; it wasn’t long, though, before smoking was banned altogether, on campus and at any campus function. But the most effective thing was the force of social disapproval. Somehow smoking became Not Done–a sign of weakness and lack of proper regard for one’s own health and that of others.  It became OK to ask someone not to smoke in your presence, even if there was no formal ban on it, and to expect them to accede to your demand. Now nicotine is an expensive, slightly addictive drug of the lower classes.  The last time I looked, cigarettes in my neck of the woods cost nearly $10.00 a pack, and a pack a day will set you back enough to make a sizable dent in the $26,572 that, according to the Pew Research Center, marks the upper limit of the lower income bracket in Pennsylvania.

And so I come to Twitter (and its progenitor Facebook and cousin Instagram). They look a bit like smoking did fifty years ago. Everyone does it. Our little rituals—checking our phones, flicking open the case, tapping hearts, typing messages, and sending selfies—occupy our hands and fill our leisure. They proclaim class and affiliation, and in the academic world, some people take them seriously enough to suggest that an assistant professor’s tweets should be part of a tenure dossier.  But signs are beginning to appear that just as we paid for the pleasure of smoking with our health, so we pay for the delight of tweeting with our sanity, and that social media are at least as addictive as smoking. I wonder whether we’ll see some shift in elite opinion, followed by bans here and there (in schools, perhaps), followed by more and more people giving up on Facebook and its kin, until, sometime around 2070, Twitter becomes yet another way to keep the poor occupied.

Yesterday afternoon I spent a little time looking at “ReconTEXTILEize,” an exhibit on the second floor of Canaday Library at Bryn Mawr College. The exhibit centers on a group of Byzantine textiles in the collection of Thomas Jefferson University that have been loaned to Bryn Mawr for study in a year-long 360° Course Cluster, “Textiles in Context: Analysis, Interpretation, and Exhibition.” Like a lot of the work that Bryn Mawr students do, especially in archaeology, classics, and museum studies, the exhibit was professional and informative. It did a good job of something that classical studies excel at: taking small bits and scraps of evidence and extracting as much meaning as possible from them.

Then I noticed this sentence off by itself in bold type at the bottom of the placard introducing the exhibit: “This exhibition contains content on death, child mortality, burial practices, and colonialism.”

My first reaction was to laugh.  I’m used to trigger warnings, and I occasionally use them just to be polite—but really?

Then I thought about it a bit.  Child mortality is mercifully rare these days, but it was part of life for much of human history, and it remains hard to think about, especially if you’re a parent.  I know a man who cannot look at Charles Willson Peale’s “Rachel Weeping” in the Philadelphia Museum of Art because he and his wife lost an infant daughter.  In 1782 Peale showed the painting behind a curtain, along with a trigger warning: “Before you draw this curtain Consider whether you will afflict a Mother or Father who has lost a Child.” (Consider yourself warned before you follow this link.)  So maybe infant mortality, for a few people nowadays, will trigger unpleasant memories.

But the rest of the notice, like many similar well-meaning statements that I see at Bryn Mawr, remains incoherent, and even a bit anti-intellectual. Death can’t be avoided, and knowing that we will come to it is one of the things that makes us human.  Education—Bryn Mawr’s purpose, the last time I looked—exists, in part, to help us think about this one event that none of us can escape, and about the art and rituals (burial practices among them) that human beings have devised to manage their knowledge of it.  Those three items about mortality sit oddly with the fourth, “colonialism.”  Colonialism is a matter of history, and so a matter of argument.  We can’t argue about whether death is a good idea, but we can ask whether colonialism is. Education, again, exists to help us ask hard questions and think in clear and nuanced ways about history, and about things whose ethical status is debatable.

Finally, I was left wondering whether everything in my own academic subject might seem to some people to need a trigger warning.  Death? Life? Violence? Beauty? How to live, and how to die?  Slavery, and freedom?  Classics can’t avoid these things, and its clear gaze at them is one of its great strengths.  How big is the gap, also, between warning someone about something and warning them against it?  Between putting a curtain in front of “Rachel Weeping” and taking it off the wall?

-Lee Pearcy


Some new light on my post of January 12 may be shed by a letter from the SCS president following the Society’s review of video of the incident described in my post, by this update in Inside Higher Education, and by this first-hand account, which should be read along with this one.

LTP
Ἐξελαύνω Day, 2019

It’s like the arrow in the FedEx logo: you don’t notice it, but then something makes you switch between figure and ground, and there it is. That’s the way I felt after I returned on January 6 from the Society for Classical Studies meeting in San Diego. It had been, for me, an unusually pleasant meeting–I managed to schedule everything that I had to do on Friday, January 4, and so had all of the 5th to see some of the sights and visit old friends in the city.

So it wasn’t until I was back in Pennsylvania that I saw the headline: “After Racist Incidents Mire a Conference, Classicists Point to Bigger Problems.” (I’ll resist the philological impulse to emend “mire” to “mar.”) On Saturday, as the Chronicle of Higher Education reports, an independent scholar named Mary Frances Williams stood up during an open discussion at a panel celebrating the 150th anniversary of the SCS, formerly the American Philological Association, and suggested that Dan-el Padilla Peralta, an Afro-Latino assistant professor at Princeton, had gotten his job “because he’s black.” People in the room condemned her remark as racist, and some moved to take away her microphone. The SCS Board of Directors tweeted a condemnation of “the racist acts and speech that occurred” and expelled Williams from the meeting.  A few days later, SCS President Mary T. Boatwright issued a letter to the SCS membership affirming the Society’s need to “confront, meet, and remedy the problems so appallingly revealed in San Diego.”

I need to say at once that Williams’ remarks strike me as racist, as well as simply rude, and that they deserve condemnation. Classics, at least in higher education, does have a deserved reputation for being whiter and more privileged than many other disciplines in the humanities, and you’ll look a long way at an SCS convention before seeing any persons of color with badges hanging around their necks.

But just stand a step or two to one side and look at what happened through the lens of intersectionality. The situation at the SCS begins to look a bit more complicated once you recognize how different systems of social and professional stratification weave through each other. A white woman accused a black man of owing his job to racial preference, and so implied that he was unqualified; but also, a female, marginalized member of the academic precariat—an “independent scholar”—suggested that a member of academe’s elite, a male Ivy League professor, had not earned his privilege, whereupon she was promptly crushed by the profession’s establishment. The fact that her challenge was racist and wrong doesn’t, I think, change that dynamic.

Complicating our understanding of this incident doesn’t make the problems of racism and unjustified exclusivity in classics go away. Suddenly you see the arrow, but the E and X are still there. But the intersectional lens may help us see that the problems are not simply a matter of individual prejudice or personality, and that they are not confined to classics, or to racists.


As academic specialties go, “classical reception,” or the ways in which people have “received”—enjoyed, used, learned from—the cultures of ancient Greece and Rome, seems harmless enough, and even respectable. It is, after all, the umbrella under which Classicizing Philadelphia, the project that launched this blog, shelters. Lately, though, I’ve been wondering how classical reception relates to another phenomenon that doesn’t have a good reputation at all: cultural appropriation. “Cultural appropriation” happens when someone—usually someone perceived as in some way privileged or elite—enjoys, uses, or exploits something characteristic of another culture. The term seems to have originated with academic sociologists and been weaponized by indigenous peoples with histories of colonization. Lately it’s been applied to practices as diverse as yoga, wearing sombreros, and a poem written in Black English by a white poet.

From one point of view, classical reception and cultural appropriation look a lot alike: one culture takes over something from another one and uses it.  So what’s the difference, and why isn’t classical reception a bad thing?  One difference is obvious: cultural appropriation is thoughtless. It doesn’t give any consideration to what the appropriated object or practice means or does in the culture from which it has been appropriated, and it does not try to give the object new meaning within the appropriating culture.  The classic example is the acquisition of Native American artifacts and skeletal remains by nineteenth- and early twentieth-century museums.  In a museum case or on a warehouse shelf these objects become, as the title of a recent book has it, plundered skulls and stolen spirits.  (By this standard, Anders Carlson-Wee’s thoughtful poem in The Nation doesn’t qualify as cultural appropriation, while thinking it’s funny to wear a sombrero at your fraternity’s Halloween party does.)  The best-known examples of classical reception, on the other hand, depend on thinking about the matter being received and either trying to recover its meaning or giving it a new one.  Marsilio Ficino and his friends in fifteenth-century Florence thought deeply about Plato’s Academy before they imagined that they were re-creating it, and a century earlier Dante made Vergil mean something new.

But the distinction between thoughtful reception and thoughtless appropriation will take us only so far; for one thing, some examples of classical reception are pretty lacking in thought, like this lipstick ad from among the bizarre uses of antiquity in advertising that Edith Hall has been collecting in her Twitter feed lately.

Maybe this ad is thoughtless enough to qualify as cultural appropriation, or maybe the distinction between thoughtless appropriation and thoughtful reception doesn’t take us far enough. I want to suggest that another factor is in play when we draw a line between appropriation and reception: the presence or absence of a perceived cultural hierarchy.

Cultural appropriation depends on a perceived inequality. The culture doing the receiving is not only clueless about the cultural significance of the received material but also in a position to be clueless—the position of acknowledged cultural or political or social superiority.  The culture whose products are being appropriated, on the other hand, is acutely aware of the unequal status of the two cultures.  Only people who are aware of their lower position in a hierarchy of cultural status can complain of cultural appropriation or feel the pain it causes.  Having the Elgin Marbles in London does no harm to ancient Greece, but the modern Greeks can feel aggrieved because they believe that bullying Britain took their treasures when they were weak and oppressed by the Ottoman Empire. They are caught in a trap: every complaint about cultural appropriation affirms and reinforces their perception of inferior status. (Arguments about whether what Elgin did was lawful or not are another matter.)

Reception, in contrast, depends on an understanding that the culture being received will not be diminished or harmed by the other culture’s use of it.  And implicit in that understanding is an assumption, which doesn’t have to be explicit or even conscious, that the culture being received is in some way equal, or even superior, to the one doing the receiving.  It’s like potlatch, or Homeric gift-giving: giving only augments the prestige of the giver, and receiving a gift acknowledges the giver’s status. No one has yet (to my knowledge) accused Julia Child of cultural appropriation, first because cooking French recipes does no harm to the glories of la cuisine française, and second because no Frenchman believes that French culture is inferior to or of lesser status than American culture. It may be otherwise with burritos or collard greens.  Ancient Greece and Rome can be objects of reception not only because they are safely in the past and can’t object, but because of the perception that their material, literary, and political cultures are worth receiving and beyond harm. Every act of reception, even a lipstick ad, confirms their status.

–Lee T. Pearcy

9/2/2018:  And now Kwame Anthony Appiah has said it better, as usual, in this WSJ piece.

