Yesterday, I finished a first draft of my dissertation. This is not to brag or invite people to congratulate me (although, to be honest, I did briefly consider posting something about my finishing on Facebook, primarily to receive congratulations). What I want to discuss here is the deep ambivalence I felt/feel upon finishing. If, six (!) years ago, you had told me that my dissertation’s first stage would end with a whimper, not a bang, I would have been surprised, if not shocked. No matter what, I certainly would not have expected to feel, frankly, so ambivalent.
Now that I’m done, what do I have? A 600-page tome that needs to be cut down by at least one-third, if not one-half; a sneaking suspicion that few people will ever read this thing; and nagging questions about whether it was worth the time and investment, given the abysmal academic job market. This is not to say that I don’t love what I do, or that I regret spending my 20s studying a relatively arcane subject. It’s just to say that, surprisingly, I do not feel the sense of accomplishment I expected to upon starting this endeavor.
Perhaps this is just the nature of completing a project that you’ve worked on for so long that it becomes a part of you. (Though one would expect feelings of sadness, rather than ambivalence, if this were the case.) I mean, I’m impressed with what I’ve done, certainly, and think that I did produce some relatively worthwhile new knowledge. But I think the major cause of my ambivalence is the deep difficulty that I have/will have communicating my dissertation’s argument to non-academics. And this leads me to a question that has been talked and blogged about a lot in the past decade: the relationship between academia and the non-academic world.
This past weekend, I returned to my graduate school for the first time in over a year. It was a typical visit; I met with my advisors, said hello to colleagues, and stayed with my little sister, who—hilariously and weirdly—is now a first-year in the same program and department as I am. It was great to be back, see the old haunts, and walk around the (soon-to-be) alma mater. Thankfully, I’m very close to finishing my dissertation, and the questions I received mostly concerned the project. In speaking with students in the years below me, I realized that the dissertation is a largely mystical product. It is spoken about as something tangible yet unknowable.
For this reason, I figured I’d post a short list of tips that I’ve learned while writing my dissertation. I don’t mean to imply that everyone will find these tips useful, and I’m well aware that people have very different writing processes. But I think any advice on the issue can perhaps help those who are beginning this arduous task. Some of these tips relate to picking a topic, some relate to research, and some relate to writing. I hope they might be useful to my colleagues in earlier years. In no particular order, here they are:
If you can, take courses related to your topic.
This is a semi-controversial tip, as one of the joys of graduate school is taking classes on topics with which you are unfamiliar and expanding your intellectual horizons. I very much support this. However, graduate school is also about pre-professional training, and getting a jumpstart on your dissertation by taking classes on topics broadly related to your interests is important for completing your dissertation in 5-6 years. Reading the secondary literature in your field will also help you situate your dissertation, which is important for both the prospectus and the final product.
Pick a topic in which you are incredibly interested.
You will probably be working on your dissertation for 3-5 years, so it is incredibly important to pick a topic that you can imagine reading, writing, and thinking about for thousands of hours. The last thing you want is to awaken in the middle of your fifth year, as you’re slogging through the Russian state archives, to realize that you don’t really care about the intersections between space travel and class in the 1950s Soviet Union.
Pick a topic that can be researched and written about in a timely manner.
Everyone enters graduate school wanting to write a dissertation like William Cronon’s Changes in the Land. Unfortunately, this is virtually impossible. In my opinion, it is much smarter to choose a topic that you know contributes to the literature, produces new knowledge, and can be written about in 5-6 years. In an era of dwindling funding, in which many graduate students are unsure whether they will have funding after their fifth year, this is perhaps the most important rule. Having ambition is important, but it is unlikely you will suddenly revise the way we understand the French Revolution. For a first project, modesty is best.
Know your topic.
This is why taking courses on your topic is important. A dissertation is very time-consuming, and you don’t want to reach your sixth year only to realize that you really aren’t adding much new information to the corpus of literature with which you are engaged. Having a good, general sense of where your work fits in will very much ease the writing of your dissertation. That being said …
Don’t feel compelled to know everything about your topic.
It is too easy to get distracted by the fact that, as someone who has spent only half a decade ensconced in your research field, in many ways you barely know the literature to which you are contributing. This is an unfortunate fact, and part of the reason why it takes such a long time to transform a dissertation into a book. However, you should be careful not to distract yourself by trying to read all of the secondary literature on every topic your dissertation touches. Be familiar with these literatures, of course, but don’t go down too many rabbit holes. If you do, you’ll never finish.
This past week saw the release of one of the most anticipated video games of the year: Mass Effect 3 (ME3). In this game, players take control of Commander John/Jane Shepard, a sort of futuristic Navy SEAL. Shepard is charged with defending not only humanity, but all organic existence, from the Reapers, “a highly advanced machine race of synthetic/organic starships” (think Cylons). The release of ME3 has been accompanied by the usual discussions about whether video games are art. (Roger Ebert says no. Everyone under 30 says yes.)
I’d like to sidestep this discussion for now, mainly because—save for taking an introduction to art history—I’m not very familiar with the history or theory of art. What struck me most about ME3 is its extensive focus on diplomacy. Unlike most action role-playing games, ME3 gives players a significant amount of choice over whether they become a “paragon” (basically a good guy) or a “renegade” (a devil-may-care, rough-around-the-edges good guy). The game centers on building a NATO-like alliance designed to combat the Reaper menace. Whether one becomes a paragon or a renegade therefore depends, essentially, on how the player conducts himself or herself diplomatically. For example, if one attempts to win a given planet over to the alliance through threats or blackmail, one earns renegade points; the opposite is true of paragons. In either case, the game is at heart about diplomacy, a fact that had me thinking about the relationship between gaming, history, and international relations.
Although ME3 is just a video game, today a similar game is regularly played by students, professors, and even policymakers. In these modern political war games, players adopt the perspective/persona of a given nation. For instance, Player 1 will play as Barack Obama, while Player 2 will become Vladimir Putin, each facing off against the other to address, say, an Iranian nuclear breakout. These games are designed to teach the players “to think like” policymakers. The idea is that practice, even fictional practice, enables one to either think about or prosecute diplomacy. Interestingly, although these games reached their apex in 1950s and 1960s America, their origins may be traced to Weimar Germany. Examining the history of war games not only sheds light on the transnational connections that shaped America’s Cold War foreign policy, but also illuminates important questions about the relationship between gaming, knowledge, and practice.
One of the game’s main developers was a man named Hans Speier, a forgotten, though important, German-American exile intellectual. Speier began his career as the first doctoral student of Karl Mannheim, the creator of the modern sociology of knowledge, at the University of Heidelberg (home of the famous Philosophenweg). After leaving Germany soon after the Machtergreifung, Speier became the youngest founding member of the New School for Social Research’s University in Exile (peruse that list for a who’s who of twentieth-century intellectual history). Speier spent the war years working for the Foreign Broadcast Information Service, the Office of War Information, and the Division for Occupied Areas, before helping found the RAND Corporation’s Social Science Division in 1948. Throughout the 1950s, he embodied a more general shift experienced by a generation of German exiles from socialist to Cold Warrior, routinely arguing for the United States to adopt an extremely hardline position vis-à-vis the Soviet Union. (As you’ve probably guessed by now, I’m writing my dissertation on Speier.)
While at RAND, Speier, along with another sociologist, Herbert Goldhamer, standardized the political war game described above. Through RAND connections, the game moved to MIT and then into the Joint Chiefs of Staff’s Joint Gaming Agency (see Chapter 6 of Sharon Ghamari-Tabrizi’s The Worlds of Herman Kahn). A number of well-known figures, including Maxwell Taylor, McGeorge Bundy, and Bobby Kennedy, played the game before making important policy decisions (although the relative influence of the game on decision-making remains obscure). The game’s meta purpose was to teach players about the importance of historical context in international relations; it was also a reaction against the game theory that dominated RAND’s Economics Division. But the game’s origins lay far from the Cold War context in which it came to fruition. In the late 1920s, Karl Mannheim developed a new pedagogy with the goal of harmonizing the incredibly contentious democratic politics of Weimar Germany. Mannheim argued that if Weimar democracy was to have a future, intellectuals needed to create a classroom environment in which students adopted the personas of representatives of different political parties and interest groups. By discussing and arguing with each other over the most pressing issues of the day, Mannheim maintained, students would learn to be democrats. Practice was the path to democracy.
Speier adopted and adapted this idea in the Cold War. Political war games are now played throughout the world, and are an important part of many security studies curricula. Moreover, it is the basic notion of the political war game—that gaming diplomacy can make players more astute negotiators—that undergirds ME3’s appeal as an “intellectual” video game. Clearly, individuals interested in politics want to have some way to practice the art. The question, of course, is whether this is ever possible. Can a political war game recreate the Cuban Missile Crisis? Can ME3 teach people what it’s like to form and maintain an international (or intergalactic) alliance?
The question that underlies this entire post concerns what role knowledge, however acquired, can play in teaching diplomacy and improving outcomes, however defined. A basic assumption of Speier’s, Goldhamer’s, and postwar security studies (and countless Model UN clubs) is that it can. But is this the case? Are games more than distractions? Happily, a number of academics have begun to address this and similar questions. Game studies is one of the newest fields in academia and potentially one of the most exciting. I, for one, very much look forward to seeing how this often-overlooked field develops in the coming years. Perhaps we will soon learn that certain games do indeed fulfill their intended, practical telos. Or perhaps we will learn that they don’t.