
Advice to DigHum Job Candidates: Don't Lead With HTML

When I was contacted yesterday by a third MLA colleague who reported being dismayed by some of the Digital Humanities candidates they were meeting and interviewing at MLA, I realized some of us Digital Humanities mentors had really missed the boat.  Are we sending our students into the world armed with a didactic approach to Digital Humanities that is turning off those outside this subfield?  I hasten to add that I also heard about several Digital Humanities doctoral students, including some HASTAC Scholars and some of my own students, who are on the job market, who did a brilliant and convincing job, and who are much sought after by departments.  I heard glowing comments about many of them---raves, in fact.  They will help to transform the profession.   But the fact that any DigHum students were going into MLA interviews saying that "knowing HTML" or, more generally, "being able to program" was their primary job qualification indicates to me that some people are not getting good mentoring, and that HASTAC has also missed the boat by not doing a HASTAC Scholars Career Forum, with lots of Q&A, anecdotes, and back-and-forth about what does and doesn't work at a job interview.   We'll make up for that and do one soon.


But in the meantime, here's the first piece of advice (and like all advice, it admits some exceptions but is still a darn good rule):   Unless the first question out of your interviewer's mouth is "Do you know HTML?"  please, please, next time make sure the first self-description out of your mouth isn't "I know HTML."   Of course HTML is intrinsically important; if Tim Berners-Lee hadn't invented it, the World Wide Web wouldn't exist.   But the wonder of HTML is not a useful selling point for someone applying for a job.  HTML has been around since the early 1990s.  It's not exactly cutting-edge anything, and it is certainly not the defining "brag" of professional web developers.  In fact, whether in HTML or any other kind of coding, the bare ability to program doesn't mean much.  There are elegant coders and clumsy ones, people who simply plug in code and others who invent whole new, brilliant programs, languages, and innovations.  And HTML?  Please!  At best, HTML mark-up is a baseline for some practitioners of Digital Humanities, but, as a job qualification, it's just a baseline.  It's like an English major bragging that they've read Shakespeare or Foucault.


The ability to do HTML mark-up or any single technical ability (such as programming, web development, scientific visualization, sonification engineering, or any other one skill in and of itself) is part of a portfolio and a job pitch, but it shouldn't be a stand-alone, manifesto-like statement at a general interview in any discipline, not if you want to get a job.  A computer science doctoral student applying for a job wouldn't lead with "I write C++."    Anyone who trumpets a single programming language or one technical proficiency as the first qualification for a job is likely not to get one.  Leading with "I know HTML" is likely to get you hired as an intern or assistant on someone else's project, doing their HTML mark-up on their grant, for their grand project.   The ability to code alone is not likely to get you hired as an assistant professor in an English Department.   Leading with HTML is analogous to the old High Theory days, when people who didn't have ideas or an in-depth knowledge of theoretical traditions would lead at job interviews with "I'm a Lacanian" or "I'm a deconstructionist" and then would have little more to say on the subject than a few key words, sometimes even reducible to lamentable, clichéd jargon.   If you want a job, you lead with great ideas; you walk the walk and talk the talk.   That you are a Lacanian, or a deconstructionist, or that you do your own mark-up should flow as part of the methodologies, tools, and skills that contribute to an argument so passionate and compelling that the only response on the interviewer's side is "You are hired.   I want you to be my colleague."


What you never, ever want to do as a job candidate is occasion what I like to call "Elevator Anxiety."   That is, you do not want your interviewer to sit waiting for the end of your interview thinking, "If I were ever in an elevator stuck between floors, this is the last person on earth I would want to be holed up with." You never, ever want to be the tiresome graduate student with only one note to sing over and over and over and over and over---even if it happens to be on key.    (More advice:  if you start every sentence with "In my dissertation . . . " you also generate Elevator Anxiety in every interviewer.)


Here's the big backstory.  Of all the inane hallmarks that could "stick" to a field, "does his/her own mark-up" versus "doesn't do his/her own mark-up" is becoming the reductio ad absurdum of Digital Humanities.  Those of us who are senior in the field should know better.   We senior Digital Humanities scholars (no matter what position we take, no matter what side we are on) cannot make knowing or not knowing mark-up the one thing everyone outside the field knows about us, or we will destroy our field by provincializing it---and by stigmatizing our students out of the one area where there are jobs right now.  Knowing or not knowing HTML absolutely cannot be the definitional argument of our field, any more than "knows the difference between 'signified' and 'signifier'" should have been the shibboleth for testing who did or didn't understand Barthes thirty years ago.  If that is the touchstone of our field, our field will be parodied--and is being parodied right now--by those outside it.   Our work is far too important to be reduced to one baseline element, and not even a very interesting, versatile, or expansive element.  There are so many brilliant Digital Humanities scholars who don't code, and so many exciting new forms of Digital Humanities where knowing code is irrelevant--or, as one of the World Famous Scientists (his lab created the browser that became Netscape Navigator) who helped to create HASTAC said at our first meeting at NSF back in 2003:  "I haven't written my own code in a quarter century.  I want to talk to brilliant humanists, not to mediocre coders!"  Ouch.  There are so many brilliant Digital Humanities scholars who do write elegant contemporary code but for whom "knowing HTML" is just about the least exciting thing one would ever think to say about their work.   Go back and look at the "Feel the Noise" HASTAC Scholars Forum, or any recent Forum.
There is a high degree of technical proficiency evidenced in that Forum, but what excites is the field-changing range of ideas.   I read the Forum and I do feel the noise.


In other words, the issue of "must-know or not-necessary-to-know HTML" (or code more generally) may well be a field-specific gripe that some of us have, and that different ones of us fall on different sides of, but if, as a field, we are defined by our programming skills, our brand-new Flavor of the Month status as the only hot field in the humanities will fall away.  There will be a backlash soon, if there isn't one already.  There are so many things wrong with leading with HTML, or coding more generally, instead of with substance, content, ideas, theory, imagination, arguments.   First, a skill is not your raison d'être; it is one of many, many tools in your kit.  Second, if you are talking to a more general audience, you don't open with something they don't have--in any interdisciplinary field, you, the interviewee, have to be a translator from what you know to what they envision for their department.  Defining yourself by one skill that, outside the field, is esoteric, rather than by a defining intellectual agenda--one that will help to shape a department in a new area but is still compatible with its mission--is a losing strategy.  You find points of commonality . . . and then have something extremely urgent, exciting, and indispensable to add.   That isn't mark-up.


One of the people who was in utter dismay over the quality of the Digital Humanities people she was running into at MLA happens to be a Digital Humanities advocate and pioneer at a famous university.   Recently, in an undergraduate class that involved a lot of race and gender and postcolonial theory, she did away with the typical final research paper in favor of a collaborative multimedia, multimodal project in which students were asked to meticulously research and then recreate a street from a specific time and place other than the present-day U.S.A.  Students could use any digital tools to do this.   And the projects I'd heard about from this class were beautiful, historically subtle, and precise visualizations that also incorporated, as a multimodal project can, different and even contradictory and competing points of view from the time period and that particular cultural site.   And they did so in a way that was illuminating for anyone who viewed, participated in, and contributed to the interactive project.   From her high excitement at pioneering such a fascinating undergraduate project at a university where there was little other interest in Digital Humanities, she found herself, at MLA, backing away, wondering if this was a "dead end field" that had "outlived its historical moment."   The HTML-or-not-HTML argument was so loud and persistent in the sessions she attended with huge enthusiasm, and in the interviews where she had fought to interview Digital Humanities candidates, that she was reconsidering her advocacy and reevaluating the strength of the field.   That's devastating for us.


Something similar happened with a distressed email that I and Fiona Barnett, Director of the HASTAC Scholars, received from someone who is actually signing up to be part of one of the most forward-looking and ambitious new-style Digital Humanities-inflected programs in the country.   She is a true believer.  She has directed brilliant Digital Humanities dissertations by students who have fantastic jobs and postdocs.   One of her students is a game designer who writes hypertext novels and teaches interactive classes using all the imaginable social media and game mechanics for classroom social organizing and assessment.   She was in distress that a brilliant and promising field was looking reductionist, rigid, dictatorial, didactic, and off-putting in its own defiant self-representations at MLA, in the interviewing rooms and in the sessions, and, worse, she was fearful that her brilliant students who were doing the most sophisticated kinds of Digital Humanities would suffer from the "If you don't do your own mark-up, you aren't a Digital Humanist" brand of rigidity and tiresomeness (i.e., Elevator Anxiety again).  Of course they do HTML.  She was concerned that other humanists would think that is all they and other Digital Humanists do, and would decide not to hire them after they'd been tarred by a rigid, narrow-minded subfield mentality.


The third encounter (I'm changing the field here, quite deliberately, so I don't cause anyone anxiety) was actually with two people, one of whom chaired a tenure-track assistant professor search; both were disgusted, dismissive, and derisive of several candidates who couldn't be budged off the HTML fixation.  The search committee chair said, "I actually tried to save one of them who seemed promising on paper.  I said, 'Okay, okay.  We get it that you know how to code.  That's fine.  What would you do differently if you were the candidate we hired for this 19th-century British job? What would you add to our department?' And they actually answered, after that set-up, where I was trying to save their candidacy:  'I'd teach my graduate students how to do my mark-up for me so I could create a new database of 19th-century British novels.'"   Guess who isn't getting a fly-back?

If you are on the job market and reading this, it is not too late to make a correction.  Even if you don't get a fly-back this year, I suspect next year there will still be jobs in Digital Humanities, because there is wonderful grant funding out there and there are so many amazing new ways to learn from the enormous store of digital resources the world is offering us.   But we who claim to be experts in the field have to be at least as creative as Google in our dexterous, imaginative, content-inspired, theoretically sophisticated, issue-rich, provocative, passionate new ways of thinking for a digital age . . . or we aren't going to convince anyone, and we will lose this moment for the humanities yet again.   That the enormously exciting Google Books project infamously had no humanists on its board this December (neither digital humanists nor humanists of any kind) suggests that we are not doing a good enough job firing the imaginations of those in charge of making the digital archives of all printed books available to the world.


We need to fire the imagination.  We need to be transformative not only in how we do what we do but in how we envision the future of a field and a profession--in this case, Modern Languages, but I mean the humanities more generally.  As we've been saying for over a decade now, the humanities should be the very center of the Information Age, if only we claim our importance, our expertise, and our theoretical and historical understanding of technological and historical change.  Digital Humanities should be leading the way, not with mark-up but with bold, transformative goals that help make the humanities central in a confusing time that no one fully understands or has mastered.   An ideal job candidate burns with the passion of making a field anew.   Vision, expansiveness, imagination, ideas, and brilliance are the requirements.   Knowing or not knowing HTML is way down the list of attributes that make colleagues know that you are the one they need for a better and brighter future . . . or even the attributes needed to get from the top to the bottom floor of the skyscraper at the MLA interview without the interviewer thinking, "Please, please, don't let me get stuck between floors with this pedantic, boring, tiresome person."   What you want is for those elevator doors to open with your interviewer still asking you follow-up questions, and for the logical next statement to be:  "By the way, are you free now for coffee?  I'd love to talk more about what you just told us:  that you are co-teaching an undergrad class in Kreyol literature with college students in Haiti, building a virtual environment as your meet-up space; that your own students in your Haitian literature seminar are in social networks with Haitian students; and that they are collaborating on an anthology of contemporary post-earthquake writings by displaced Haitian students living in the homeless camps.   People are saying that students at your university are deciding to major in French in greater numbers than at any time in the last decade.
That's amazing . . . " *




*P.S.  That's a real example, from the Haiti Lab at the John Hope Franklin Humanities Institute at Duke, and, yes, for the first time we are seeing new majors in French, some of whom are also majoring in biology, since the Haiti Lab is an interdisciplinary partnership with Global Health, the Dance Department, the Visual Studies program, Computer Science, the Nicholas School of the Environment, and the Law School too.    Virtual environments, creating an online archive of endangered historical documents, creating an online anthology of contemporary writings by Haitian students whose universities were destroyed in the earthquake, and partnering with students and scholars in Haiti are all part of this next generation of the humanities.   The "digital" is the least important part of a new vision of why the humanities are central, and of why you have to know everything from Haitian history (a grad student in the Lab made history last year when she found the only known extant copy of the Haitian Declaration of Independence) to critical race theory and globalization theory in order to do humanitarian rescue work--or anything else, for that matter.



1 comment

Mere words cannot describe how much I agree with this article--even with the benefit of markup.