Tear Them All Down

I’ve about had enough of these arguments defending Confederate monuments.

Yesterday, workers in New Orleans began the process of taking down the first of four monuments celebrating the memory of the Confederacy and white supremacy. And people are losing their minds. The first monument to come down was built to honor the Battle of Liberty Place (1874), a three-day hostile takeover of the New Orleans government by the white supremacist paramilitary group known as the White League. The monument went up in 1891 to honor this insurrection, and to make sure everyone knew exactly what these racists were fighting for, the inscription on the monument reads:

“[Democrats] McEnery and Penny having been elected governor and lieutenant-governor by the white people, were duly installed by this overthrow of carpetbag government, ousting the usurpers, Governor Kellogg (white) and Lieutenant-Governor Antoine (colored).
United States troops took over the state government and reinstated the usurpers but the national election of November 1876 recognized white supremacy in the South and gave us our state.”

(Inscription added in 1932. Emphasis added in 2017, by me.)

Pictured here: a marker celebrating racism. (Image source: Wikimedia Commons.)

This is a literal monument to white supremacy. And yet, a number of people got so upset that the city workers who took this monument down on Monday had to do so wearing flak jackets and helmets for fear of a possible hostile response.

Yeah. Chew on that.

Denouncements from racists aren’t surprising, let’s be honest. The arguments that really get under my skin come from the people who keep claiming that taking down these monuments will “erase the history.” No matter how bad that history is, they say, we should leave these monuments up, or risk “making the same mistakes.”

Really? That’s what you are worried about? You’re concerned people will forget about the atrocities of white supremacist terrorist organizations, or the damning legacy of slavery that shaped all aspects of Antebellum life?

Yes, taking down four monuments will erase all knowledge and memory of the deepest historical scars that continue to run through American society.

I tell my students all the time that people often use history to sell a perspective. Politicians certainly love to do it, but this is a perfect example of how people build faulty arguments, wrap them in the defensive guise of “history,” and cling to that until their dying days.

For this current issue, the approach spans the aisle. These monuments are healing the political divide in this country as I type. Self-proclaimed liberals and conservatives alike have come out of the woodwork to defend the continued existence of these marble markers of white supremacy. One comment, which has become my personal favorite, used this historical argument to voice support for the monuments, stating (paraphrased), “I am a liberal, but I am a proud southerner, and I will always be sad these historical markers are coming down. They commemorate an ugly time in history, but it’s history, so they need preserving.”

Well, honey darlin’, I have some bad news for you. These monuments aren’t preserving any history. The only thing these monuments preserve is that proud southern heritage of glorifying an Antebellum past when white supremacy could still claim to reign. Bless your heart.

I’ll end this here, because I’ve had enough. If you are upset these monuments are coming down, at least be honest about why you are. You are sad to see that romanticized memory of the “good ol’ days” come down. You may be angry people don’t want to see markers of the Confederacy as something worth celebrating. But stop using history as the basis for your claims. Four monuments coming down isn’t erasing anything. Outside of WWII, I don’t know if there is another subject historians have studied and published more books on than the Antebellum South and the American Civil War. Seriously. There are thousands of books out there on this very topic. Historians are preserving what happened in the past, so if that is your concern, rest easy.

Just be aware, it may not be the kind of narrative you hope to read.

An End to Silence

[Image: the three wise monkeys.]

“You cannot escape the responsibility of tomorrow by evading it today.”
– Abraham Lincoln

The future, while always uncertain, seems to carry a darker tint these days. Nearly every day now, I encounter someone lamenting the troubled nature of the times in which we live. Each day brings a new kind of chaos, making uncertainty one of the most certain aspects of life. Of course, nothing about this is new. We will never know what tomorrow might bring, but it seems safe to say that anxiety about the future continues to rise.

As a historian, I have grown used to looking back on periods of great change and upheaval with scholarly interest. It is a comfortable position – observing and analyzing tumultuous events from the safe distance brought by time. The era I specialize in includes what historians broadly refer to as “The Age of Atlantic Revolutions,” successive centuries of revolt, war, and at times shocking levels of violence. Like me, students find these events interesting. War and death are usually easy means for capturing and holding students’ attention. But even when students recognize the connections between past events and the present, the events remain distantly, and comfortably, in the past.

Living in the midst of great change doesn’t allow such separation. In particular, the present appears ever more to demand a call to action, one heeded by people from all sides of the social, economic, and political spectrum. The tenuous nature of the present has made me uneasy and, at times, hesitant. The emotional nature of the recent election has brought a new kind of charge into classroom discussions about race and privilege. Emotions, more than analysis, feed interpretations and perspectives. This shift in the nature of classroom discussions has given me pause. I found myself asking how, and whether, I should approach such topics.

My wariness moved beyond the classroom. During and in the immediate aftermath of the election, I debated whether or not to post (or repost) certain stories on social media. When creating and using various online platforms, I always kept the old adage in mind: never post anything online you wouldn’t want a potential employer to see. This was (and is) especially true, as I am actively seeking long-term employment. In the highly competitive academic job market, any possible misstep could have real repercussions.

Then, as fall classes came to an end, I asked my students to consider the ways the past informs our understanding of the present, and they began a discussion about the outcome of the election. I had skirted the topic up to that point, but in that moment, my creeping caution went out the window, and I let the students take the lead. During that discussion the students made it painfully clear how badly they needed a space to confront these real and pressing issues. Where, outside of social media accounts, can students exchange differing views with their peers? Our online spaces do not always allow voices of dissent or alternative perspectives to come through. We have the luxury to craft the discourse, to shut out those with whom we disagree, and to create a space that reinforces, rather than challenges, our present worldview. The classroom is something of a sacred space where students can challenge and learn from each other. It was then I recognized that my responsibility as an educator supersedes any concern about employment.

Avoiding, rather than initiating, discussion of difficult topics is not why I walk into the classroom each week. I looked into the faces of my students and saw real fear about the future. I thought about my former students. International students, students from a wide array of racial and ethnic groups, students who learned English as a second, third, or fourth language, students who are a part of the LGBTQ community. Students who now wonder what place they have in our present society, or who will stand by their side. Politicians and pundits often voice concern about the kind of world we will leave for children. Who is asking what kind of world these college students will encounter as they enter adulthood? Concern for the lives of children is valid and necessary, but we cannot ignore the pressing burdens that weigh on the newest generation of adults.

The time in which we live features very real forces that seek to silence opposition and dissent. An abhorrent “Professor Watch List” publicizes the names of professors who challenge certain political views in the classroom (challenging worldviews, a.k.a. doing their job). The rise in hate crimes seeks to intimidate and silence those who do not fit within a racist perspective of what and who qualifies as “American” – a perspective that fits comfortably within the era of the late-nineteenth century. The Ku Klux Klan has revived itself and marches again in celebration of recent political developments. Our students see all of this and wonder what it will take to live in, or even survive, this world.

And yet, every day, new dismissals of “millennials” appear, steadily undercutting the value and ingenuity of these emerging adults. Students repeatedly hear these critiques, and as a result so many have internalized this disbelief in their, and their generation’s, self-worth. It’s time we recognize these narratives as destructive and listen, carefully, to what students have to say. Instead of dismissing young adults as foolish and self-involved, it’s time we hear their voices, understand their fears, and lift them up as we progress through this era of uncertainty. This is not a time for hesitation. This is a call to action. Across the country, thousands of students are taking up that call. Who will stand by their side?

It is possible this will have repercussions. Interviews or job offers may not come (though, in academia, that was already a strong possibility). But, as I tell my students when we confront difficult conversations of the past and today, these issues are bigger than the self. Meaningful change will not occur if we are unwilling or unable to look beyond our personal feelings or concerns to understand the broader, structural forces that perpetuate this fear and anxiety. It’s time we accept the responsibility of today, to help and support those who truly do not know whether there will be a better tomorrow.

Selling History to the Public


A recent op-ed by Jim Grossman – “History isn’t a ‘useless’ major. It teaches critical thinking, something America needs plenty more of” – published earlier this week in the Los Angeles Times has made (and continues to make) the rounds on social media. Responses to the editorial are not surprising. Many, like me, reposted it to help circulate the message. Many more posted comments with the link to show their agreement with Grossman’s argument. Plenty of others challenged the claim that History is not a useless major; to quote one person who commented on the editorial (‘commenter2015’), “History is fun. But so is going to Six Flags, I wouldn’t major in it.”

What intrigued me were the handful of people on Twitter, fellow historians, who expressed frustration – not at the message of the editorial, but at the fact that historians have to make this case over and over again. It seems that historians, and those who specialize in the humanities in general, are always on the defensive. Politicians from both national parties mock the liberal arts (lest we forget President Obama’s infamous dig at Art History), funding for programs continues to dwindle, and now the soon-to-be nominee for the Republican Party is advocating an education plan that would actively discourage students from majoring in the liberal arts.

Those outside the profession may acknowledge the importance of learning past events, but the true meaning of studying history and the benefits it offers are still things a lot of people simply do not grasp. Coincidentally, on the same day Grossman’s op-ed appeared, Patrick Johnston, the Vice Chancellor of Queen’s University Belfast, dropped a dismissive line in an interview, stating, “Society doesn’t need a 21-year-old who is a sixth century historian.” But, according to the Vice Chancellor, society does need “a 21-year-old who really understands how to analyse things, understands the tenets of leadership and contributing to society, who is a thinker and someone who has the potential to help society drive forward.”


Statements like this, or like the one stating History is “fun” but not worth majoring in, make historians and humanists alike want to bang their heads (repeatedly) against a wall (or desk, or door… any hard surface will suffice). Patrick Johnston’s description of an ideal graduate is – wait for it – a liberal arts major. Someone who can analyze (check), who is a thinker (check), and who can contribute to society/drive society forward/insert other generic comment about leadership here. That last point can apply to anyone, but it does not in any way exclude liberal arts majors, especially historians of the sixth century (solidarity, my medieval historian friends).

This is what is so maddening about the conversation, but it is also why pieces like Grossman’s are so important. For decades, humanists have proclaimed endlessly the value of what we study, yet these words seem to fall on deaf ears every time. How do we change that? Grossman sells the economic value of majoring in history, because he has to. The idea of pursuing higher education that is not some blatant form of job training no longer makes sense to the general public. Degrees are investments, universities are businesses, and students go to college to get a job.

The discourse has grown so stark, it chills the bones.

Whether or not we will return to a day in which the value of education does not depend solely on economic returns is unclear. Maybe it will only happen in my dreams. Regardless, we (historians) know that history courses teach essential skills, and that history majors are in high demand, especially in the tech sector.

So why do we keep having this conversation?

Why are op-eds like Grossman’s piece still necessary, and why do they still receive such criticism? Why do historians and humanists alike have to keep beating that poor, dead horse with the same defensive claims? What about this isn’t working?

I wish I had the answers. In my mind, the front line of history and public engagement is the classroom. Maybe that is where those in the liberal arts should focus (but certainly not limit) their attention. The classroom is the space in which we, the instructors, can at least show the next generation of economic and civic leaders what the liberal arts bring to the table. This would also mean a fundamental shift in the way academia values teaching, but that is a topic for another day.

Unless something changes, I do not see how this conversation can move forward. We seem to be stuck in a loop with each side talking past each other but never listening. There must be different tactics available to achieve some kind of progress.

But what if progress is already happening? Maybe the conversation has begun to change. It isn’t clear. Until we gain the distance of time to examine the shifting patterns of discourse, we may not yet be able to decipher and understand these developments. But engaging in that kind of analysis would require a certain set of critical skills… now, where on earth could we find that?

On Graduating and the Loss of Identity

The months following graduation proved more challenging than expected. Not only because of my first attempt to take on the bewildering, crushing academic job market. Nor simply because I had lost the sense of purpose that writing and editing my dissertation once gave me. Those experiences were certainly part of the challenges that awaited me after my mentor placed the doctoral hood around my neck and after my official diploma arrived in the mail.

I felt lost. Time passed, and the sense of feeling suspended between worlds persisted. It followed me as I took on the role of adjunct and lined up classes to teach at my (now) alma mater. It remained stuck at the edge of my mind as I scrolled through posts on social media. Spaces once filled with familiar topics about academia and teaching, carried on by familiar names and faces, became alien. My usual retorts or curious inquiries fell silent. I had no words to offer.

This shift confounded me. I felt a persistent sense of dread that I no longer knew who I was.

But that didn’t make sense. I accomplished so much (so I told myself). Friends and family passed along their congratulations and well wishes. Their kind words and excitement sat heavy like a rock deep in my core. I felt ashamed I didn’t share in that sense of pride for what I did. I earned a doctorate, and that alone is a challenging feat. Less than 2% of the population in the United States has done the same, and being a woman, I was part of the less-than-one-percent. Or so I was told.

I struggled to understand this sense of loss – to understand why, after finally achieving success, my world seemed so out of sorts. Nothing seemed clear until, months after the fact, I understood.

For seven and a half years, I was in two graduate programs, in two different schools, in two different states. Goal-oriented and motivated by some undefined source of willpower, I devoted a portion of my life to earning two graduate degrees. For what purpose, I still struggle to know, but graduate school was more than what I did – it became who I was. It got under my skin. It changed the way I spoke. It changed the way I dressed and how I carried myself. It utterly redefined my very being.

And then it ended.

Why else would graduation and achieving the one goal I’ve worked toward for the past several years have such a disconcerting effect? I didn’t simply graduate from a PhD program, I lost an essential part of my identity.

It took me three months to realize that.

For years I’ve introduced myself as a graduate student. Doing so can carry so many different meanings, which I found to be a convenient means of explaining or excusing myself (why I didn’t have a full-time job, why I often bemoaned my economic standing, why I never had time to do ______ or go to _____). I never completely understood how doing so, year after year, reshaped the way I thought of myself, or how I perceived myself.

On social media outlets, especially Twitter, where I grew accustomed to conversing with an array of scholars, food and drink aficionados, and other intriguing, sharp-witted people, it was as though I had forgotten what to say. I followed conversations but seemed to lack the ability to join. Even in day-to-day conversations and encounters, I felt like part of my personality was gone, evaporated into the ether, and I was little more than a spiritless automaton.

I suppose that is the danger of making a temporary identity such a fundamental part of your being. It is now clear to me why the lack of success on the academic market is so emotionally destructive to so many. It isn’t about the job that didn’t pan out. I wonder now if it is ever about the job. People can research, write, and teach in many capacities outside of the academy. No – it is about coming to terms with the fact that you have to separate yourself from an all-encompassing identity. You have to acknowledge that your hope of turning your graduate-student identity into the different, but still familiar, assistant-professor identity won’t happen. It is the realization that you have to separate yourself from all that is recognizable and comfortable.

The realization that it is time not just to do something else, but to become someone else.

It took me three months to come to terms with this transition, and while there remain a few creases to iron out, I finally feel a sense of peace. I have new tasks and objectives now, offered in the form of full-time, “alternative-academic” employment. I see new ways to apply the knowledge and skills gained over the past decade. I also recognize that I am more than a grad student. I am more than my PhD. I am not my degree, and neither are you. Knowledge and education can shape us in powerful ways, but in the end it is up to us to chart our own paths and remain our own advocates.

Only time will tell what challenges come next.

Assessing the Year’s End

With 2015 coming to a close, people across the world, myself included, inevitably take stock of the past year and look ahead to what may come next. While many set goals for the coming year, I think it is good to take a moment and evaluate accomplishments from the past twelve months. In academia, the ‘to-do list’ takes heavy precedence, and often we forget to acknowledge and give ourselves proper credit for all the work we completed. I am especially guilty of this – brushing off accomplishments or passing milestones to focus instead, almost exclusively, on all that I did not do or achieve.

For this year, I aim to change that, and I plan to do so through an ‘I’ve done list.’ D’Lane Compton (@yourqueerprof) put forward this wonderful idea on Twitter. Dr. Compton suggested that, instead of focusing on what you need to do next, take some time to look at what you have done. Write out a list of all your accomplishments from the past year and use that as a platform to build awareness, as well as self-confidence. For grad students and recent graduates, both of these aspects are critical in what can be challenging times.

Focus on what you have done, not only on what you need to do. Matthew the Evangelist, miniature from the Grandes Heures of Anne of Brittany, by Bourdichon. Wikimedia Commons.

This year has been, without a doubt, one of the most challenging, both personally and professionally. Finishing, editing, and defending my dissertation was only one part of a busy year full of conference presentations, building and teaching new courses, and writing pieces for publication. It is easy to throw the dissertation and finishing graduate school into the mix as just another part of a busy year. In doing so, I forget the significance of that accomplishment. It becomes easy to focus instead on what I need to do next (first and foremost: find employment).

We become so conditioned to jump from one task to another – finish a chapter draft, grade papers, submit that article, teach a class, grade more papers – that looking ahead becomes a learned trait. There are always more books and articles to read, there are always – always – more papers to grade, and the publications list of your C.V. could always be longer… The noise surrounding all we feel we need to do threatens to drown out all that we’ve achieved. When that happens, it can leave you feeling rather empty and low. Instead, take a moment, shut out the noise of the never-ending to-do list, and write out what you accomplished this year. Give yourself proper credit and take pride in what you are capable of achieving. The list will likely end up longer than you expect.

I hope you all had a happy 2015. Best wishes for the new year.

Cheers.

Twitter and the College Classroom

In April, I had the fortunate experience of participating in a roundtable at the Organization of American Historians (OAH) Annual Meeting in St. Louis. The roundtable focused on uses of Twitter in the college classroom and featured presentations by Ian Aebel (panel organizer), Matt Hinckley, and me. Laura Fowler chaired the session and moderated the Q&A. Here is a video of the session:

[Video of the roundtable session.]

After the presentation at the OAH Annual Meeting, editors at Process – a blog operated by the OAH, the Journal of American History (JAH), and The American Historian (TAH) – asked if I might write a post on this topic. The post, “Don’t Put Away Your Phones: Bringing Twitter into the College Classroom,” became available today. In it, I discuss my views on the positive ways Twitter, and other social media, might operate in a classroom setting. While the use of Twitter as an assignment can involve a learning curve for nearly everyone, I believe that it has far more positives than negatives to offer.

Here is a snippet from the post:

Initially, most of the students seemed perplexed by the assignment. The majority of the students stated on the first day that they had never used Twitter before. Their response served as a good reminder that one cannot assume students are inherently familiar with social media or certain aspects of technology… Most of the class was reluctant at first, but within a few weeks, participation via the course hashtag began to warm up; by the midway point of the semester, online discussions were an established routine.

You can read the full post over at Process. Feel free to post any comments or questions below!

Subversive Brewers in Medieval England

This past week, I presented on the panel “Beer and Taxes: Nothing Can Be So Certain” at the Sixty-Second Annual Midwest Conference on British Studies in Detroit (hosted by Wayne State University). My paper, “Subversive Brewers: Ale and Tax Evasion in Medieval and Early Modern England,” featured research completed for my M.A. thesis. I had not returned to my thesis in roughly six years, so it was an amusing, and at times cringeworthy, experience. Still, dusting off my thesis reminded me of how much fun I had researching the history of beer. It remains a topic near and dear to my heart.

For those interested, my entire thesis is available online, either through the Proquest thesis database or at Academia.edu, but a snippet of the paper I presented in Detroit is included below.

Fig. 8: Jorg Prewmaister, Mendel Volume I (1437), page 60. (Source: Historical depictions, guild signs and symbols of the brewing and malting handcraft; online thesis by Matthias Trum.)

Segment from, “Subversive Brewers: Ale and Tax Evasion in Medieval and Early Modern England” (please contact me if you wish to read the full paper):

Along with setting the price of ale, the English government also regulated the size and cost of serving measurements. The London Aldermen only allowed three measurements for selling ale: the quart, the pottle, and the gallon.[1] An Assize from 1277 declares that “no brewster henceforth sell except by true measures, viz. the gallon, the pottle and the quart. And that they be marked by the seal of the Alderman.”[2] The law required all brewers to have their quart, pottle, and gallon containers inspected by an Alderman four times a year.[3] If a brewer brought any container to the Alderman that did not meet the measurement standard – the wooden serving containers shrank over time – the Alderman destroyed the vessel. Brewers who neglected to present their measures to the Aldermen had to pay a monetary fine.[4]

A section of the Liber Albus, the first book of English common law, lists the punishments for any brewer caught serving ale in a measurement that did not have the seal of an Alderman. According to the Liber Albus, those caught breaking the laws of appropriate measurements faced a fine of forty pence and the destruction of the brewer’s measures for the first offense. The punishments increased for multiple offenses: “The second time let her be amerced to the amount of half a mark. And the third time, let her be amerced to the amount of twenty shillings.”[5] As the government reissued the Assize of Ale, the punishments for breaking the law became harsher. A proclamation from 1316 set the cost of one gallon of ale at three farthings and one penny for the city of London.[6] Any brewer caught breaking the ale law lost her brewery for the first offense, and she lost the right to engage in the trade completely for the second offense. For the third offense, the guilty party faced exile from the city.[7]

Ale brewers resisted the regulations implemented by the English government, either by openly objecting to the prices set by officials, or by subversive means of fraudulence. Multiple accounts appear of brewers approaching the Mayor and Aldermen with complaints about the legal price of ale, and often the brewers threatened to discontinue their service of providing ale to the public. The Plea and Memoranda Rolls feature many accounts of London brewers who appeared in the Mayor’s Court challenging or refusing to abide by the ale laws. The brewers made these objections in hopes they would receive higher wages, but their attempts to alter the ale laws ultimately failed.[8] London officials viewed such demands as particularly harmful to society, as threats to cut off or diminish the ale supply would directly affect the ability of London citizens to obtain the necessary victual. The majority of the brewers who threatened to cease brewing faced imprisonment as a result. The Liber Albus outlines such punishment, stating:

And if any brewer or brewster be not willing to brew, or brew less than such person was wont to brew, let such person be held to be a withholder of victuals from the City, and for such disobedience and malice incur the penalty of imprisonment, at the will of the Mayor for the time being; and nevertheless, let such person foreswear the said trade within the franchise of the City for ever.[9]

Brewers not only had to adhere to measurement and cost restrictions, but they faced an obligation to engage in the trade regularly to ensure a steady supply of ale to the public.

As the English government tightened its control over the brewing trade through the Assize, problems between brewers and local authority figures developed during the fourteenth century. On May 19, 1350, Adam le Brewere proclaimed before both the Mayor and the Sheriffs of London that brewers deserved exemption from the Alderman’s regulation and control. Adam stated that he intended to “gather together the brewers, and they would agree not to take service except by day only and at the wage of 12d. [pence] a day.”[10] Adam’s threat to halt the availability of ale to the public resulted in his imprisonment, as, according to the Alderman, his demands directly displayed contempt for the King and the commonwealth of the people.[11]

Other brewers openly challenged and made threats in public and in the Mayor’s Court against ale taxation caused by the Assize. In 1375, a brewer named Simon Macchyng declared that the brewers of London “would or could not observe the recent proclamation” of the Assize, which put him in prison.[12] Like Simon, Thomas Goudsyre also faced imprisonment in the same year for refusing to sell a gallon of ale at the legal price.[13] Another troublesome brewer named William Ronyn made a public declaration in a market place in November of 1375 that he and all other brewers of London would stop brewing due to the price set by the Assize.[16] William faced additional charges for carrying out his threat, as he convinced a portion of the brewers to cease production or refuse the price of the Assize.[17]

Ale brewers made up a unique area of England’s economy during the Middle Ages and the Renaissance. Due to the wide practice of the trade and the high demand for ale, the English government regulated brewing more than most other crafts. While brewing and baking shared a similar importance, the early thirteenth-century formation of the Bakers’ Gild provided bakers with a greater advantage than brewers. Bakers faced public humiliation for providing small loaves of bread, as did other craftsmen, including brewers, caught breaking the law by short-changing their customers. Unlike brewers, however, neither bakers nor members of other trade gilds had to pay fines in order to engage in their ordinary work. Instead, the English government left control of the respective industries largely to the gilds.[18] Other craftsmen could freely manufacture goods in accordance with the law, but brewers had to pay standard fees simply because they made ale. The government regulated brewing more because it wanted to ensure the public had access to ale, but also because of the profits gained by taxing ale brewers.[19]

[1] A pottle measured around one-half of a gallon.

[2] Sharpe, ed., Letter Book A, folio 129 b, available online at http://www.british-history.ac.uk/report.aspx?compid=33031.

[3] Monckton, English Ale and Beer, 57.

[4] Monckton, English Ale and Beer, 57-58.

[5] Monckton, English Ale and Beer, 56-57. The feminine form appears in this declaration because women made up the majority of brewers during the thirteenth and fourteenth centuries. A half-mark was equal to six shillings and eight pence – eighty pence, or double the forty-pence fine for a first offense; twenty shillings came to 240 pence.

[6] A farthing was equal to a quarter of a penny, so three farthings and one penny put the price of a gallon of ale at 1 ¾ pence.

[7] Sharpe, ed., Letter Book E, folio lvii, available online at http://www.british-history.ac.uk/report.aspx?compid=33100.

[8] A.H. Thomas, ed., Calendar of Plea and Memoranda Rolls Preserved Among the Archives of the Corporation of the City of London at the Gildhall: Rolls A1a-A9, A.D. 1323-1364, Volume I (Cambridge: Cambridge University Press, 1926), 260-70; A.H. Thomas, ed., Calendar of the Plea and Memoranda Rolls of the City of London: Volume II (Cambridge: Cambridge University Press, 1929), Roll A 21, Membr. 3 and 3b, available online at http://www.british-history.ac.uk/report.aspx?compid=36685. Specific details pertaining to Brewere, Macchyng, Goudsyre, and other disruptive brewers’ appearances before the Mayor’s Court appear in chapter five.

[9] John Carpenter, Liber Albus: The White Book of the City of London, trans. and ed. Henry Thomas Riley (London, 1861), 311.

[10] A.H. Thomas, ed., Calendar of Plea and Memoranda Rolls Preserved Among the Archives of the Corporation of the City of London at the Gildhall: Rolls A1a-A9, A.D. 1323-1364, Volume I (Cambridge: Cambridge University Press, 1926), roll A6, membr. 5b, available online at http://www.british-history.ac.uk/report.aspx?compid=36660. Although wages increased following the outbreak of bubonic plague, Le Brewere’s demand of 12 d. is particularly high.

[11] Thomas, ed., Plea and Memoranda Rolls: Vol. I, roll A6, membr. 5b.

[12] A.H. Thomas, ed., Calendar of the Plea and Memoranda Rolls of the City of London: Volume II (Cambridge: Cambridge University Press, 1929), roll A21, membr. 3b, available online at http://www.british-history.ac.uk/report.aspx?compid=36685.

[13] Thomas, ed., Plea and Memoranda Rolls: Vol. II, roll A21, membr. 3b, available online at http://www.british-history.ac.uk/report.aspx?compid=36685.

[16] The price of ale listed in the record at this time was 1 ½ d. per gallon.

[17] Thomas, ed., Plea and Memoranda Rolls: Vol. II, roll A21, membr. 3b, available online at http://www.british-history.ac.uk/report.aspx?compid=36685.

[18] Salzman, English Industries of the Middle Ages, 313-15.

[19] Bennett, Ale, Beer, and Brewsters, 47; Salzman, English Industries of the Middle Ages, 297. Brewers remained below the ranks of other gilds because of a prevailing social perception at the time that brewers were public servants.

Finding Success while Dissertating

The dissertation.

It’s a term that can spark an array of emotions in PhD candidates: excitement, fear, dread, exhaustion, curiosity, or even panic. A feeling that also emerges as one nears the end of such a monumental project is a growing sense of accomplishment. It is a feeling that, regardless of what may happen after graduation, one can look at their completed dissertation and say, “I wrote that.”

Writing a dissertation is a remarkably difficult process. I can only speak of my experience working in the humanities, but I have no doubt that the difference of discipline in no way diminishes the challenges posed by constructing and writing original research. Oddly enough, before I took my comprehensive exams, I wasn’t that worried about the dissertation. I wrote a thesis while completing my Master’s degree in history, and I wrote so many papers for classes throughout my graduate career that the dissertation didn’t seem so daunting. Once I became immersed in the process, however, I felt a sense of shock at just how different the experience was.


The dissertation stage is the point when many PhD candidates falter. The structure of coursework and exams vanishes. While you have an advisor and a committee to guide you and provide feedback through the process, you are responsible for doing the research and – most importantly – writing a cohesive manuscript that presents (and proves!) original research. That requires a great deal of self-motivation, and finding that motivation is no easy feat.

While I still have a way to go – editing and writing the conclusion – I have drafted the bulk of my dissertation, and my committee chair felt confident in my ability to defend in the fall. It feels a bit surreal, and also underwhelming. The grind of editing is nothing to shrug off, and the conclusion, while I do have much of the research compiled and a thesis in mind, still does not exist. I know that steady work, though, will see me through to the end. By December of 2015, I should have my degree in hand.

Writing my dissertation has been an educational experience, to say the least. While the last thing the world needs is another blog post about writing a dissertation, I wanted to share some of the important lessons I learned throughout the process.

1.) Research takes a very long time.

Don’t underestimate the amount of time it takes to compile, read, and analyze the sources you need for a book-length project. I sorely misjudged how long it would take me to gather sources and determine how I might use them. Because of funding restrictions, some research trips forced (what I came to call) a ‘slash-and-burn’ approach to gathering materials. This was especially true for international research trips. Archives that allowed photographs became my favorite places in the world. I photographed as much as I could – anything that looked like it might be useful. I gathered, and gathered…

Researching at the National Archives UK.

When I returned home, the real challenge began. I converted the pictures into PDFs to make reading the picture files easier (I still have only done this for a fraction of the collections I looked at, simply due to the insane amount of time this converting process took). Then, I found myself staring at thousands of pictures. For weeks. Months. I stared at pictures – squinting, zooming – doing everything I could to decipher the words written in a letter five centuries ago. I took notes. So many notes. All of this was simply to see if any of the documents fit into the idea of what I imagined my dissertation would be about.
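(A side note for fellow researchers facing the same image-to-PDF slog: a few lines of scripting can take some of the tedium out of the conversion step. Below is a minimal sketch in Python – not the exact process I used, and the folder layout is hypothetical – assuming the Pillow imaging library is installed and the photos are grouped into one folder per archival collection.)

```python
# A minimal sketch (assumes the Pillow library: pip install Pillow).
# Bundles each folder of archive photos into a single PDF so a whole
# collection can be read and annotated as one document.
from pathlib import Path
from PIL import Image

def folder_to_pdf(folder: Path) -> None:
    """Combine every .jpg in `folder` into one PDF named after the folder."""
    pages = [Image.open(p).convert("RGB") for p in sorted(folder.glob("*.jpg"))]
    if not pages:
        return  # nothing to convert in this folder
    # Pillow writes multi-page PDFs via save_all/append_images.
    pages[0].save(folder.with_suffix(".pdf"), save_all=True, append_images=pages[1:])

if __name__ == "__main__":
    # Hypothetical layout: photos/CollectionA/, photos/CollectionB/, ...
    for collection in Path("photos").iterdir():
        if collection.is_dir():
            folder_to_pdf(collection)
```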

Respect the research, but more importantly – respect the amount of time it requires.

2.) The project will change as you write.

I wrote three chapters before the overarching narrative of my dissertation became clear. The idea of a dissertation prospectus (the lengthy written overview of the project) that departments often require is to help you go into the research and writing process with a clear idea of what you want to accomplish. There is value in writing a prospectus, but the project I wrote about in that document bears zero resemblance to the dissertation I produced.

Many PhD students are familiar with this. They know going into the dissertation stage that the project will evolve, and it should evolve, based on the documents they use. Still, what I didn’t expect was how much the project would change when I was deep into the writing. The first few chapters were more like seminar papers, linked together by topic and timeframe, but the overarching connections were not clear. You don’t always know where the story will go until you write it. It wasn’t until I wrote the second-to-last body chapter that it started to make sense. This means I need to go back and revise the earlier chapters to fit the narrative, but at least I now have that sense of direction in mind.

Don’t let the shifting nature of the project deter you, though. Writing without a sense of direction is challenging, but it can also reveal unexpected changes in the narrative.

3.) The information you gather doesn’t mean anything until you write about it.

This was the best advice I received during the entire process. While I worked as a research fellow at the Fred W. Smith National Library for the Study of George Washington at Mount Vernon, Dr. Doug Bradburn, the Founding Director of the Library, often checked in on the fellows and offered his advice, especially to the fellows who were graduate students. During a conversation we had, Doug said, “You have all of this information in your head, but it is just mush – it doesn’t mean anything – until you write about it.” While slightly paraphrased, the meaning of this statement stuck with me. I struggled a lot with the ‘I have all of this stuff, what am I supposed to do with it?’ questions, but Doug’s advice helped me kick things into gear. Any time I became stuck or felt overwhelmed with the amount of work I had left to do, I recalled that advice, and I sat down to write.

It is only in the writing process that the meaning of documents, the direction of an argument, or even the narrative of the story becomes clear. So…

4.) Shut up and write.

This is a popular phrase used by writers. A quick Google search will bring up a litany of blog posts and other web pages that promote the mantra: “Shut up and write.” There is even a #shutupandwrite hashtag on Twitter. I made my own “motivational monk” who constantly looks down on my desk with these words displayed in his hand.


Some have pushed back against this, viewing the phrase as a form of ‘shaming’ people into writing. Personally, I like it. It works for me. I combined that mantra with a daily word count goal popularized by L.D. Burnett’s “Grafton Line” challenge. The idea is to write every day with a word count goal in mind. I set mine at 500 words. I figured that writing 500 words a day roughly translated into a chapter per month (500 words a day over a month comes to roughly 15,000 words). Most days I met that goal; other days… well, I did what I could. There were writing days when things flowed so well, I embodied the song “Walking on Sunshine.” Other days, it took everything I had just to reach 500 words. Those days felt like true battles. But I stuck to it, and the results were surprising.

I kept track of my daily progress in the Notes app on my phone. This helped keep me accountable.

At the start of the summer, my committee chair and I agreed to a plan: write two chapter drafts (the last two body chapters of my dissertation) before the start of the fall semester, and I could plan on defending before December. It felt daunting. I knew going into this plan that the last two chapters would be the most difficult. In the midst of those summer writing goals, I moved to a new city, I got married – you know, life kept happening. Still, every day, I sat down and I wrote. The first of the two chapter drafts went in on July 3, and I submitted the second on August 14.

Daily goals and persistence resulted in over 100 pages of written work in about two months. While people may push back against feeling ‘shamed’ into writing, writing is the only way a dissertation will come into existence. If you don’t like ‘Shut up and Write,’ maybe try a ‘Just Do It’ approach – though Nike may want some royalties for that…

The dissertation is a beast. While mine is not yet complete, I can see the finish line. There will be days when finishing it seems impossible. There will be days when you question your life choices. But there will also be days when you can look back through all of your drafts, see the massive amount of work you’ve done, and feel a sense of pride that you wrote that.

Best of luck to all the ‘dissertaters’ out there! Now, back to work…


A (re)turn to the humanities?

To say the humanities have taken a beating the past several years would be a glorious understatement.

Slashed funding, shrinking departments, and the constant public rhetoric about STEM and “employable” degrees… the anti-humanities cycle continues to spin. As a result, college students are actively encouraged to avoid pursuing liberal arts degrees and instead focus on the degrees that “make sense” – you know, the ones that will land college graduates a job. This approach is understandable: as Scott Jaschik recently pointed out in Inside Higher Ed, the number of liberal arts majors tends to decline in times of recession. While the trend makes sense, the public discourse surrounding university degrees in the humanities has been brutal. These stories aren’t new. Each new development (Gov. Rick Scott’s plan to raise tuition for liberal arts majors, recent attacks on the education system in Wisconsin, and even backhanded remarks from the sitting President of the United States) seems to deepen the ever-present sense of dread that follows humanities advocates. That is, until recently.

“The heritage of the past is the seed that brings forth the harvest of the future.” Heritage statue outside the National Archives of the United States.

There now seems to be a turn back toward the humanities. Perhaps a glut of STEM majors flooding select job markets caused people to realize the necessity of a diverse education. Perhaps people are starting to remember that higher education is not simply pre-job training. Still, the number of articles coming out in recent weeks is a refreshing change in the discourse.

Last week, Forbes published an article on the value of liberal arts degrees in the tech industry (That ‘Useless’ Liberal Arts Degree Has Become Tech’s Hottest Ticket) that made waves across the Internet. Some responded to the article with surprise, but for those involved in the humanities, this news isn’t exactly… news. For an industry dependent upon creative thought, appealing design, and finding new ways to make software and computer programs an integral part of our daily lives, a humanistic touch is necessary.

Drew Gilpin Faust, an historian and President of Harvard, recently dropped a perfect summation of the societal value of the humanities.

The world needs scientifically sophisticated humanists and humanistically grounded scientists and engineers who can think beyond the immediate and instrumental to address the bigger picture and the longer term.
– Drew Gilpin Faust

The humanities are an indispensable component of higher education, and a foundational aspect of an educated populace capable of critical, creative thought. Every semester that I teach the introductory courses on the history of the United States, I face a challenge in which I must prove to a (usually substantial) portion of the class that the study of the past is essential. Students absorb the vitriol that surrounds public discussions about liberal arts degrees, causing some students to see taking courses unrelated to their STEM majors as frivolous. They perceive the study of history as disconnected from their lives, or from their ability to engage in the world as well-informed citizens, which could not be further from the truth. It is my hope that this recent turn in the nature of discussions about the humanities and the value of liberal arts degrees may alter students’ perceptions and increase enthusiasm for the critical knowledge and skills such degrees offer.

Toe-dipping in the mainstream

This is a nice piece on the power, reach, and speed of online publishing. Historians are plenty engaged with the public; putting debates like this online only makes them more accessible.

Reblogged from historywomble:

These days, academics are encouraged to appeal to big audiences. We are told that we must have ‘impact’ (teaching 2.2 million university students doesn’t seem to count, oddly enough – but let’s leave that for another time). Just over a fortnight ago I had my first proper taste of this when I got involved in a collective letter published in History Today, whose readership and twitter followers represent a much bigger audience than I am likely to reach through my academic writing and teaching.
