Friday, December 23, 2005

On Being Played by a Student

I generally love my students, and I don't have much patience with the blogs (usually anonymous) that do nothing but complain about students, but every once in a while I get a really egregious example, and this is one.

For this particular class, one paper was due Friday the 9th and the other on Monday the 12th (the one on Friday was a revision, so it's not like I was piling the work on). On Monday the 19th, I received the following [n.b.: I've paraphrased the emails so as not to be posting the student's exact words, though I don't see why I shouldn't]:

Professor Drout,
Just writing to make sure you got my final paper. Also, what did I get on essays X and Y, I don't remember getting them back.

Now it's very strange that a student writes on Monday the 19th to ask if I received papers from a week before. A few students who were worried about dropping things off at my office emailed immediately afterwards; being unworried enough to wait a week but then worried enough to email was strange. Also, this student hadn't received essays X and Y back because the student had never turned them in.

Back when I was naive I would have said "Oh no! Your papers must have gotten lost! Do you still have an electronic copy?" Now I write:

I have gone through all of the student papers that were submitted to my office and I have not received yours.

Student replies:
I don't know what happened then, can I email/fax it to you or something?

Note that we're still talking about a final paper (singular) although two papers were due.

I write back:
where and when did you turn the papers in?

Student replies:
On last Monday, I thought I put them in your office. Maybe I accidentally put them in the wrong office.

Notice that now (that I've reminded the student) it's two papers, not one. Notice also that they were put "in" the wrong office (implying that the student walked in). Now what's in my office is very different from what's in the office of the film prof next door and the creative writing prof on the other side, and office doors aren't left completely open any more.

So I write:
What do you mean you put them "in my office"? Where exactly did you put them? Did you turn in both at the same time, even though one was due on Friday? Why would you put them in the wrong office when office doors are marked with names?

Student replies:
I'm not sure where I put them, I did turn them both in on Monday though. That is the reason I emailed you, because I checked my binder at the end of the week, and there was nothing there. But, I didn't really remember dropping them in your office. I honestly can't tell you where I put them, I might have spaced out and put them in the wrong place, I'm not sure.

Now the papers might have been put "in the wrong place," rather than in the wrong office. So that seems to limit it to the baskets on the doors of the professors on either side of my office (again, ridiculous, as I have Beowulf pictures on my door while the door material of the other two profs is very distinctive -- not to mention the names on the doors).

I write:
Well, we'll have to figure out where you put them. I will email the colleagues in the offices next to mine to see if they received them by accident. But generally if a paper gets turned in to the wrong office, the person who receives it posts it in the department secretary's office. There were no papers there when I checked on Monday.

At this point the student stops emailing, so perhaps the student has gotten the message that I'm not going to fall for the "I turned it in, but it disappeared" story. Students used to always blame the custodian, which I found particularly odious, since he is a friend of mine and has never in his life thrown away a student paper stuffed under an office door.

Now, before you think I am too mean or obnoxious (ok, I am, but not too much) note that I could have set a trap by accepting an e-submission and then checking the "created" date in the MS Word file. There is a small chance that this is entirely a cock-and-bull story (although the attempt to suggest that two previous missing papers had been turned in is pretty sleazy), but I'm skeptical, also because this semester students have tried to pull all kinds of stunts with electronic submissions (including the famous "I forgot to attach the attachment but now I'm home on mid-term break" -- which also may be true, but which is why I don't allow e-submissions without a hard-copy submission the same day).
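The trap mentioned above, checking a file's "created" date, is easy to automate for modern .docx files, which are just zip archives carrying their metadata in docProps/core.xml. Here is a minimal sketch (the `docx_created` helper is my own invention, not anything from the post; note that papers in 2005 would mostly have been binary .doc files, whose metadata lives elsewhere, and that such timestamps can be edited anyway, so this is suggestive evidence at best):

```python
import zipfile
import xml.etree.ElementTree as ET
from datetime import datetime

DCTERMS = "{http://purl.org/dc/terms/}"  # Dublin Core terms namespace

def docx_created(path):
    """Return the 'created' timestamp recorded in a .docx file, or None.

    A .docx file is a zip archive; its creation metadata lives in
    docProps/core.xml as a dcterms:created element.
    """
    with zipfile.ZipFile(path) as z:
        core = z.read("docProps/core.xml")
    node = ET.fromstring(core).find(DCTERMS + "created")
    if node is None or not node.text:
        return None
    # Timestamps use the W3C datetime format, e.g. 2005-12-12T14:30:00Z
    return datetime.strptime(node.text, "%Y-%m-%dT%H:%M:%SZ")
```

A paper allegedly finished on the 12th but with a "created" date of the 19th would tell its own story, though a savvy student could of course forge the field.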

Memo to any students reading this: Your professors are not as dumb as you think we are. And this isn't high school -- be a responsible adult and own up to your mistakes and you'll get a lot more slack than if you're trying to weasel something.

Monday, December 19, 2005

Dan Timmons, R.I.P.

I was very sad to get the news today that Dan Timmons, best known as the editor (with George Clark) of J. R. R. Tolkien and His Literary Resonances, passed away yesterday after a long illness.

I met Dan a number of years back at Kalamazoo, before Beowulf and the Critics or his book came out, and we had a wonderful, late, long and apparently too loud (someone yelled at us to get out of the courtyard because it was after 2 a.m.) conversation about our frustrations at getting Tolkien studies taken seriously by the academy.

Dan was an excellent scholar, a talented critic, and, most of all, a warm and generous person.

About a month ago, he sent me some very large and important entries for the J.R.R. Tolkien Encyclopedia (this led me to believe that he might be getting better, but, alas, that was not the case). His entries are testament to his excellent scholarship and, even more importantly, his essential fair-mindedness (a quality sadly lacking in too many critics -- and not my own strong suit). Dan wrote balanced and insightful treatments of controversial topics, and he combined a deep and abiding love of Tolkien's work with effective critical judgment. The field will miss the many additional contributions he would have made over the years, but even more, we will miss him.

Tuesday, December 06, 2005

Where are all the female orcs?

I have to admit that one of my favorite bits of speculation/inference in Middle-earth studies is the problem of the orcs, their origin and reproduction. Here are the inter-related suites of problems as succinctly as I can express them:

To have exciting battles, you need lots of cannon fodder (which is why in FRPGs the generic term for relatively weak bad guys who are easily killed is "orcs").

Morgoth and Sauron, by theological definition, "cannot make, they can only mock," that is, they cannot generate ex nihilo free-willed (or pseudo-free-willed) creatures to serve them.

So you need a source for orcs.

The only free-willed creatures in Middle-earth are Valar and Maiar ("powers," angelic/demonic spirits), the "Children of Iluvatar" (elves and men), and a kind of special dispensation given to the dwarves, ents, and apparently the eagles.

So, I think Tolkien reasoned, the orcs were somehow transformed elves, elves who had been subject to torture until they became orcs.

But this clashes, possibly, with his reincarnation of the elves idea.

And there's an associated problem, that the production of countless orcs for cannon fodder purposes then requires a very extensive operation of elf-torture over many years, even years when Morgoth and/or Sauron are not present in Middle-earth.

So, you could let the orcs breed (and at least in LotR and Silm there are hints that Tolkien adopts this idea).

But breeding is problematic, as it includes elements of sexual pleasure (we'd assume) and, even more problematically, someone would have to love and nurse the cuddly little baby orcs. And so when you're spearing Shagrat, you're killing some mother's beloved son.

I think towards the end of his life, JRRT was tending towards the idea that the orcs had been men who volunteered for Morgoth's evil and were thus transformed (which doesn't solve the problem of breeding and raising innocent little cuddlesome orc babies).

But I think there might be a simpler explanation, consistent with both the origins of the word "orc" and the later development of Tolkien's thought regarding the reincarnation of the elves.

"Orc" comes from Beowulf 112; a list of monsters includes "eotenas ond ylfe ond orc-neas", usually translated as ettins (trolls), elves and animated corpses (in some of his earlier work Tolkien translated orc-neas as "barrow wights"). So we have the concept of the animated corpse.

Now, what animates the corpse? Here I turn to the philosophical essays in Morgoth's Ring. Morgoth, in his attempt to control Arda, became like a glacier of evil, calving off bits of evil throughout the world. He puts forth some of his spirit, and gets Glaurung, and when Turin kills Glaurung, that little bit of evil is gone from Morgoth. Thus Morgoth grows smaller and less powerful the longer he is in Angband, but the amount of evil in the world stays the same or grows slightly larger as his evil creations work evil into the very fabric of the world.

Now, where does the corpse come from? I think from elves. Tolkien's original ideas about elves and orcs could work if the process of elvish reincarnation that Tolkien eventually settled on were to work. There are, for elves, two components in a living creature: a fea, or spirit, and a hroa, or body. And, Tolkien says, the fea creates the body (he got this idea from Charles Kingsley's The Water Babies, but don't you dare steal it because I'm writing it into an article for Chris Vacarro, whom I went to high school with, but I digress...).

So, if an elf's body, his hroa, is killed, his spirit, his fea, goes to Mandos to wait until such time as it is appropriate for him to grow a new body (Tolkien ended up rejecting the idea that the elf spirit would be born as a baby to its parents, or to some other parents, etc.).

So, if Morgoth captures an elf, and tortures it until its spirit leaves its body and goes off to Mandos to grow a new body, the old body remains in the possession of Morgoth. He can imbue that tortured body with his dark spirit, thus making an orc without creating something new.

I know there are some subtle problems still, but for the sake of argument, let's say that the above can explain the origin of orcs: Morgoth has calved off bits of himself and set them in elf bodies. It explains the "animated corpses" connection and the "not allowed to make" objection.

But once Morgoth was cast out of the Doors of Night, where did the new orcs come from? We're back to breeding in some way. I'd propose some kind of hideous rite in which the orcs hack off pieces of their limbs, which then re-grow into new creatures. These each have the same identity as the orc from which they were chopped, so they're immediately viciously rivalrous with that orc. That way you squeeze out sexual love and familial affection. Maybe orcs force other orcs to breed because the process is so miserably painful.

So really this is just a long train of bizarre speculation set off by Emma Goldman's interesting post on the possibility of female orcs in Middle-earth.

Quis custodiet ipsos custodes?

In this post, I mentioned my displeasure at a journal that took five months to reject a note with a one-line email. Such sloth and discourtesy is, it seems to me, a symptom of a very serious problem in academia that is not regularly addressed but is near the heart of the discontent that so many people (inside and outside) seem to have with academia: the keepers of many of our key academic institutions are failing in their duties.

I think this is as big a factor in the stress and discontent of junior faculty as the job crisis itself: when the keepers of important institutions (journal editorships, society presidents, press directors, chair holders) don't do their jobs efficiently, they don't pay the price: junior faculty do (when a press drops the ball and takes years too long to publish a book, it doesn't hurt my tenure prospects; if an article is out for a year, or gets accepted last year for the 2007 issue of a journal, I can wait. Not so for junior faculty).

If there is a backlog of work (which we've all had happen), then it is incumbent upon those who hold substantial privileges to buckle down and plow through the work. If there is such a backlog at presses and journals, then that fact should be made public so that we can have an academy-wide debate about what to do (rather than just letting the backlog lengthen).

Here at Wheaton there is, right now, a lot of the standard end-of-term griping from junior faculty (well, from everyone really, because it is end of term, but I've happened to notice the junior faculty griping). Because I am currently the chair of the Educational Policy Committee, I feel like I have to look into this griping to see if there are any real grievances. Almost everything I've been able to track down has the same source: people not doing their jobs in a timely manner and thus putting more stress on others (and if you think junior faculty have it bad, you should see what faculty obliviousness about deadlines does to the staff).

Where does this problem come from? See the title of this post. There's almost no supervision (which is good) and no recourse (which is bad). The system is supposed to work because scholarly peers, also of advanced standing, pressure those who are high up in the system to do a good job. But no one wants to be the bad guy, no one wants to be the heavy, and so everything just slips (and I will say that Wheaton is about 500 times better than any other place I've been).

What to do? Herewith a few simple rules of thumb.

1. The deadline actually does apply to you. [Faculty whine constantly about students missing deadlines but then constantly miss their own. And I mean all the time.]

2. Return phone calls within 24 hours and emails within 48.

3. 30 days for straight rejections/acceptance from a journal. Any later and the person gets a reader's report.

4. 90 days for rejection/acceptance with a reader's report. If your readers can't turn things around that quickly, get some new readers and fire the old ones. Book manuscripts may take a little longer, but you have to inform the person who submitted when the decision will be made.

5. Let people know when they have failed at the above (i.e., being collegial is important, but give feedback like "I'm really glad you accepted my article, but making me wait eleven months was a little out of hand, don't you think?" -- this kind of thing is hard to do, but it's essential. People respond to social pressure; the squeaky wheel gets the grease).

Or, to sum up: Do your job!

I think that following my few rules would do an enormous amount to pull angst out of the system and make people a little happier in their jobs (and to do these things is just a tiny bit easier than revising the entire academic labor system, by the way).

[Readers might reasonably ask "Drout, you have tenure. Why don't you name names?" It's a good question, and I had thought about saying that Notes and Queries was the journal that couldn't be bothered with a timely response or a reader's report. But to make specific charges against specific individuals would require a new level of research that I don't have time to do. For instance, if a journal loses an article or a reviewer takes eleven months to return a reader's report or a contributor bails out from a project three weeks before the deadline (after having eleven months to do the work), I have a right to be irritated, and I can assemble this material into a pattern and comment on it, but I don't want to blast the single person or institution and then find out that the specific example I've chosen happens to have a new baby or a financial crisis or, God forbid, a serious illness. I'm sure that not all of the people/institutions failing to do their jobs have these problems, but I don't know which ones do, and that makes naming names problematic.]

Sunday, November 27, 2005

A Helpful Tip

[UPDATE: In response to comments below: I am fortunate that I am not relying on this piece--or another one that has been at a journal for five months--for tenure or promotion. I'm tenured and already have enough scholarship for promotion when the time comes. My point is not "woe is me, I have been wronged," but "here's something seriously wrong with the academic system." I hope to do a post in a few days about how poor performance by the keepers of major institutions is doing as much to harm and frustrate junior faculty as the state of the job market.]

Dear Editor of Relatively Respected Journal,

A. It is perfectly acceptable to reject an inappropriate article with a one-line email.

B. It is perfectly acceptable to reject an inappropriate article after a thorough review that takes five solid months. The reader's report rejecting the piece is well worth the wait.

Combining the terse reply of A with the slothful response time of B is not at all acceptable. The article so rejected is a brief note. I could translate it into Gothic, translate it from Gothic into Old Norse, translate it from Old Norse to Latin and then paint it on the side of my house in less than five months. To get only one pro forma line out of that long wait is a fairly grotesque failure of courtesy and, more importantly, intellectual responsibility. I'll bet those slothful, lazy reviewers put their reviewing duties on their vitae.

If you wonder why so many younger scholars are starting to submit articles simultaneously to multiple journals (creating headaches and strife for editors), see above.


Mike Drout

P.S.: Now I'll be sending the article off to the journal where I thought it belonged in the first place.

Tuesday, November 22, 2005

Fanboys and Scholars (and Twenty-sided dice)

Jeff at Quid Nomen Illius? has a really good post on the influence of Dungeons and Dragons on the current conquest of popular culture by fantasy (and props for Jeff on his excellent post titles: "Bree-yark" for this one, and a very obscure Jethro Tull reference on the next).

I was one of those kids influenced by D&D, and if I'd been a bit older or younger, I would probably have had a happy career as a game designer: I was too young to be really caught up in the first wave of paper-and-dice D&D (I was mostly interested in it in 6th-8th grade; I don't think I played at all in high school), but a little too old to really get into on-line gaming and high quality computer RPGs. But I wrote a lot of "modules" for D&D, always with a very Tolkienian aspect, plumbing Unfinished Tales for details, sketching out details of weaponry or costume, and making my own weathered-looking maps (I once got into huge trouble because I'd been using my mother's iron to heat lemon-juice-impregnated paper in order to age it: I didn't clear the iron afterwards, and thus some piece of clothing ended up with a large lemon-juice stain on it).

I think all of that gives me at least a little fanboy cred when I use the term (despite a reader on The One Ring's amusement). I don't think fanboys are (any more) stalkers or hopelessly in love with movie characters (and I know not all self-identified fangirls are either), but they (we) are obsessive about aspects of our chosen afficion. I haven't had time to do the research, but I'm relatively certain that fanboy/girl were originally pejorative terms (which would fit Squire's interpretation of the word), but they have now been taken up by those inside a subculture as self-identifying labels (and "fanboy" is equivalent to Japanese "otaku," which seems to mean "person who still lives in his parents' house and is obsessed with gaming/anime/manga").

[I'm getting tired of typing fanboy/girl, so I'll use "otaku" from now on].

I think that being an otaku is a necessary, if not sufficient, condition for being a good scholar. Scott Nokes here says much the same thing, arguing that scholarship starts in otaku-ish appreciation (I'd add, 'and obsession over detail') and then progresses, through the addition of theory (and I'd add, 'method') and formal communication, to scholarship. I'm not entirely comfortable with the way Scott treats 'theory' here, but I think he's basically right, and his point could be justified historically: literary studies, particularly medievalism, grew out of the practices of gentleman scholars and antiquaries who were collecting manuscripts and trying to understand them and, essentially, writing fan newsletters about Old English texts instead of computer games.

All kinds of scholarship (not just literary work) require the focus of the otaku, the obsessive ability to spend twenty years studying one genus of dragonfly or, like Darwin, work for hours just about every day for a decade dissecting and classifying barnacles.

From these obsessions we generate new insights into the natural and human worlds and, since we've seen the enormous payoff from 18th-century and Victorian gentleman scientists and their journals, we have, as a culture, tried to copy them in the humanities, with many great results (and probably a lot of wasted time as well).

But the social structures built by those early, gentleman otaku--the departments of English, the journals, the lecture circuits--provide real-life benefits (salaries, tenure, publicity, security) that people who are not otaku also covet. And thus, I think, the evolution of problems in scholarship that are usually put under the umbrella of "professionalization." People no longer publish strictly from love of the subject and a desire to inform (though obviously that's important) but also to get financial rewards, respect, power and influence (and it's not like the original Victorian gentlemen were immune to such temptations, either).

Well, if you are attracted to academia because you want to be free to pursue your obsession, you're going to have a very different focus than if you are attracted to academia because you want summers "off," job security, and some initials after your name. Academia is, for many (obviously for me) a very, very appealing life.

But (and here comes the unprovable assertion) if you didn't go into academia because you were an otaku about your subject, then you begin to resent the obsessive work required to be good at it. And you do things like continually re-publish your dissertation (my pet peeve right now) with minor variants ("The bleeding saint in Andreas," "The bleeding saint in Judith," "The bleeding saint in Elene," "The subtextual figure of the bleeding saint in Beowulf" ... you get the idea). Or you stop writing after you get tenure. Or you become bitter and miserable about your students.

But, thankfully, a fairly high percentage of the scholars I know are, in one way or another, otaku. I know a Hawthorne otaku, a Poe otaku, a Louisa May Alcott otaku, a Victorian ghost story otaku. They have (to switch cultures) enthusiasmos or (to switch again) afficion, and it shows in their teaching and their scholarship and their happiness.

I think attempts to bring too much "method" or "theory" to literature can interfere with enthusiasmos, and likewise literary scholars' ill-starred forays into political interpretations undercut their own afficion (because once you move into politics, you are accepting the idea that there is something that is more important to you than your subject).

Being a fanboy/girl or otaku isn't quite socially mainstream, and being a literary scholar shouldn't be, either: you are doing something that is very weird, and to do it well, you have to be a little weird yourself. You have to get carried away, you have to have a little too much enthusiasm for your obsession than is completely normal.

"a little too much enthusiasm for your obsession than is completely normal" -- sound familiar?

Tuesday, November 15, 2005

The Horned Moses, The Twilight of the Gods, and the Spoiled Eel
Or, can an error improve a text? And if so, who owns it?

Part of the discussion that's been going on here, on Scott Nokes' Unlocked Wordhoard, and on The One Ring circles around (without, I think, actually mentioning it) the idea of "author intent." Now "author intent" is a phrase very out of favor in contemporary literary criticism. The loci classici for its discussion are essays by Roland Barthes ("The Death of the Author") and Michel Foucault ("What is an Author?"), which are mainstays of introductory theory classes. (I discuss these two essays in regard to Tolkien in my essay "Towards a Better Tolkien Criticism," which is in this book that is supposed to be out soon, so I won't rehearse the whole argument here.)

There are a lot of reasons why many (not all, by any means) scholars are wary of author intent: you can't ever be sure you've found it, and the text may have effects that the author did not intend but which nevertheless manifest themselves in the minds of some non-trivial subset of readers. For instance, Tolkien asserted that the name "Galadriel" had nothing to do with the root "galadh" (meaning "tree"), but it's not unreasonable to discuss what effects might be created in the minds of many readers when they notice the similarity (in, for example, the place-name Caras Galadhon or the name for the people of Lothlorien, the Galadhrim). Similarly (but not exactly the same), Tolkien took great exception to a critic's linking "Moria" with the biblical "Moriah," noting that in his invented languages the root "mor" = "black" was sufficient to explain the various "mor-" compound names.

But figuring out what Tolkien meant can't be the end of criticism. Whether or not he intended Moria to be like Moriah (or for readers to perceive an association between Mordor and Murder), in the minds of some readers there will be an association and hence some kind of effect while reading. These may be "mis-readings" or "misprisions" to use Harold Bloom's terms, but if criticism is going to describe what it's like to read The Lord of the Rings, then it has to take these associations and effects into account, regardless of what the author said he wanted.

Now the two instances I've given above are pretty trivial in that they might create some kind of literary resonance for the reader but they don't really change the interpretation of the text significantly, and they are not really significant "errors" for the reader who adopts them. But what about when something goes really wrong? For example, what if you translate a Hebrew word that means "beams" or "rays" of light as Latin "cornu," horns? And then you start a tradition of depicting Moses with horns (Michelangelo obviously followed this tradition). As I noted in this post, I am not a fan of the hand-waving school of criticism ("Oooh, this is great!! You know why? Because of its essential greatness!! Ooh, look, more greatness!!") that is a somewhat (only somewhat) unfair exaggeration of "conservative" art and literary criticism (the quote from Roger Kemble in the link above illustrates this phenomenon, and Harold Bloom's introduction to the second edition of his The Anxiety of Influence is almost entirely handwaving about Shakespeare's greatness without ever offering any analysis), but I think Kemble may be on to something in this quote:
What we see in Moses here--Moses the law-giver, Moses the chap who has just had an awful (in the old sense) encounter with God--is the results of an artist's effort to represent visually something that exceeds the boundaries of the representable: the horns are a sort of objective correlative of that overwhelming moral awesomeness: forbidding, grotesque, yet commanding.

I think he's on to the idea that the Moses with horns is better (that is, creating stronger emotional and intellectual responses in viewers) than the same Moses without horns would be. So in this case a translation error has led to an improvement in the quality of the art.

Likewise look at the phrase "Twilight of the Gods," used most famously, obviously, in Wagner's opera Götterdämmerung. This is a mis-translation of Old Norse "ragna-røkr" (it shows up in Snorri's Edda), which actually means "the doom of the Gods" [philology: røk = destined end; røkr = twilight. The "r" on the compound is a nominative ending in ON, but it caused early German scholars to mistranslate the word]. Now there's nothing particularly wrong with "Doom of the Gods," but (here I will hand-wave a little due to lack of time) it's not nearly as poetic, beautiful and unexpected as "Twilight of the Gods." A scholar's error has ramified through the culture to help produce a work of great beauty in Wagner's opera (and it would be impossible, I think, to tease out how much of the opera and the cycle as a whole--musically as well as in terms of plot--is due to Wagner's thinking of the final piece as a "twilight" rather than as just a "doom").

Finally, the spoiled eel. This is a famous, famous line (or it was, early in the 20th century) from Melville's Moby Dick: a net drops a load of fish on the deck, and there lies a "spoiled eel." Articles were written about the spoiled eel; great existentialist meanings were ascribed to it: had man "spoiled" the eel by wresting it from the sea? Was it "spoiled" intrinsically, the horror of nature apart from humanity? Didn't it show Melville's great poetic genius?

It was a printer's error for "coiled eel," a phrase that would attract just about zero attention because it is so conventional.*

So, is Moby Dick better if it contains "spoiled eel"? Early critical consensus would seem to so indicate. Yet "spoiled eel" was not the author's intent for the passage. So the printer's error is "better" than the author's intent?

What if Melville had noted the error in reading the proofs but had decided to let it stand because he thought it was better than his original? Who gets credit for the poetry of the line?

I could go on raising various questions, but this entry is getting too long, and I actually want to propose some kind of possible solution.

If my theory is correct in describing how traditions evolve (that they involve the accumulation of small differences over time), then an "error" like the horned Moses, the Twilight of the Gods or the spoiled eel provides a saltation, a jump from one portion of the adaptive landscape into another. The vast majority of these kinds of errors are going to be harmful, in that they'll make the text less "adaptive" (that is, aesthetically appealing to the reader): they will add mis-spellings or confusion. But some very small set of these errors will be "better" than the original, and they then may start their own, new, traditions, opening up new areas of adaptive space.

I think this theory might be right because it comes around from an entirely different direction (meme theory, evolutionary biology) and ends up at a place similar to Bloom's Anxiety of Influence, that "strong misreading" drives literary improvement and evolution.

And that's an awful lot of half-argued points to leave off in two paragraphs, but I have a few lectures to write. I'll look forward to further discussion and debate in the comments or on other blogs.

*I think I got the spoiled eel example from Donald Foster's Author Unknown, a book which would be much more interesting if Foster gave more methodology (and which also points to a serious crime that might have been committed by the White House during the Lewinsky hearings but which the special prosecutor was too sex-focused to notice).

Monday, November 14, 2005

King Alfred's Grammar Really Works!

Ok, that's probably a little bit of hyperbole, but I've used the latest iteration of my grammar book in my Anglo-Saxon class this semester, and we just cruised through 65 lines of Maldon today with plenty of time for in-between-line discussion and more than 80% of the students making acceptable out-loud translations.

When I suggested to a possible publisher of the grammar book (still waiting to hear) that one could teach an entire semester with King Alfred's Grammar and Pope's Eight Old English Poems (previously Seven Old English Poems), an anonymous reviewer stated that jumping right into poetry was too hard for introductory students, who really needed a slow introduction through prose.

Well, in your face, anonymous reviewer! Because my students have got the language down! They picked up subjunctives in Beorhtnoth's speech to the Viking herald. One of my best students caught a preterite-present verb without looking it up ("that looks like a strong verb past tense, but the sentence only makes sense if it's in the present tense" -- woot!). Other students caught subject deletion. They were making detailed stylistic comparisons between The Battle of Brunanburh, The Dream of the Rood, and The Battle of Maldon.

My favorite moment came when a student (who had been sick during the Brunanburh translation and so was just turning it in) said "Professor Drout, I don't really know if I have a right to say this, but I don't think the Brunanburh poet was nearly as good as the guy who wrote Rood or even the guy who wrote Maldon." -- She was picking up on style in poetry in a new language!!!

My point is not only to toot the King Alfred's Grammar horn, but also to point out that students respond very well to being pushed quite hard and challenged -- and as hard as the work has been for them, 13 of the 22 are signed up for all of Beowulf next semester.

So don't give in to the temptation--which will sometimes be pushed by your colleagues--to make things easier in order to keep students in a difficult subject. I heard the poet Robert Pinsky speak about his brilliant Dante translation one time, and he said "people are in love with difficulty." I think my students are right now (though their tune may, of course, change when we do a chunk of Beowulf as the semester's conclusion) and I am certainly enjoying having them carry me along for their adventure.

Wednesday, November 09, 2005

And I seyde his opinion was good

Over at The One Ring, there's a little thread referring to the discussion between Scott Nokes and Horace Jeffery Hodges and me. N. E. Brigand writes:

Nokes’s observation that “a deep understanding of medieval language or culture is a prerequisite to serious study” of Tolkien reminded me of something that Curious wrote here last fall in response to Lúthien Rising’s report on Tom Shippey’s Marquette presentation, History in Words: Tolkien’s Ruling Passion. Curious’s comments begin:

>>While I enjoy Shippey’s perspective I sometimes get the impression that he wouldn't take anything I said about Tolkien seriously unless I, too, were a philologist. I also get the impression that he overemphasizes the influence of philology on LotR because Shippey is, after all, a philologist. Of course, I could just be revealing my own biases, since I have no aptitude for languages or philology. But then Tolkien did not write LotR for an audience of philologists.<<

I think this raises a very interesting literary-theoretical question (one which I tried to deal with in my essay "Towards a Better Tolkien Criticism," which should be out any day now in a collection called Reading The Lord of the Rings): which reader's interpretation is more likely to be correct? I can come up with a number of types of readers, each of whom will have slightly different information with which to interpret. What is the authority of each reader?
For example, we can differentiate between readings generated by the Philologist, the Fanboy/girl, the Modernist, the High-Culture Reader, the Movie-Obsessed Fan, the Film Critic, etc., etc. Each will bring something different to the table, and each will work within a different interpretive community and follow a different set of interpretive practices.

So let's say, arguendo, that the meaning of the word "weapontake" has something important to do with the interpretation of the scene in which the word occurs (I don't think we have actually established this, of course). A philologist can then supply additional information that other kinds of readers might not have (that the word was a "Northern" variant, that it had a specific legal meaning in Old English, that it was likely to be subject to folk etymology). Theoretically, then, the philologist's interpretation would be richer and more likely to be correct than that of the fanboy/girl.

This is an attractive argument for several reasons. First, since Tolkien was a philologist, we might think that a similarly-trained critic would have more insight into Tolkien's mind. Second, and most compelling, more information (like that possessed by the philologist in this case) would seem to lead to a better interpretive result than less information. So we would then empower the philologist within the interpretive community by giving more weight to his or her interpretation than to the interpretation of the fanboy/girl.

Although the above argument empowers me, I think it has some logical gaps. If our only goal is to try to figure out 'what Tolkien meant / wanted the passage to mean,' the above works fairly well. But if we do not accept author intent as the only meaningful type of interpretation (and I think the philosophical critiques here by Barthes and Foucault, as annoying as they are, have not been effectively refuted), then we need to look at other kinds of interpretation. And even if we do accept author intent, we have to take into account the very insightful comment by Curious "Tolkien did not write LotR for an audience of philologists."

So, we might reasonably suppose that the other categories of readers might, through their interpretations, help us figure out how LotR works -- what meanings does it create in the minds of its different readers? What kinds of interesting, powerful "mis-readings" can they generate without philological knowledge? If philological knowledge is essential, what about knowledge about Roman Catholicism, or about WWI and WWII, or about what was going on at Oxford in terms of philosophy during Tolkien's life?

It's a natural reflex to say "more is better" and conclude that a good Tolkien critic needs to have all these things, but there is not world enough and time, and (to fall back on cliches) to a man with a hammer, all problems look like a nail. So the person who has invested in philology sees that as the best tool, but so does the person who knows WWI fiction and Tolkien's place within it; likewise the fanboy/girl who knows the History of Middle-earth and can point out errors (such as that I thought Elrond had the Ring of Water until a month ago, and I am, obviously, a huge fanboy).

I think the only solution is for critics to let readers know where they stand and what their backgrounds and assumptions are. If you are interpreting LotR for the naive reader who knows only the films, say so. If you are writing for the philologist, say so (and I'll assert that knowing the philological background greatly enriches your understanding of Tolkien's world). If you are writing for a hard-core Christian audience, say so. Then the other readers who come along can figure out where your interpretations come from.

In the end, then, I think that it takes all kinds to have a successful and living debate and discussion about the works of a dead author. That there are so many people who want to do this, who love to discuss the books so much that they undertake arduous study for its own sake and the sake of their minds, speaks very well for the possible resurgence of literary studies, if only they could be done in such a way as to invite in the intelligent, insightful and enthusiastic people who would love to discuss literature.

Saturday, November 05, 2005

Tolkien's Use of 'Weapontake'

[Update: I'm pasting Scott Kleinman's comment up to the end of this entry because it significantly expands and improves my discussion.]

Some interesting discussion at Unlocked-Wordhoard and Gypsy Scholar. In fact, the discussion is on far more substantive issues than this one word, and I hope to be able to discuss the actual content soon, but for now I'll have to be satisfied with being a philologist pedant.

The context of 'weapontake' in Tolkien is the "Muster of Rohan," when all the men able to bear weapons are assembled in preparation for the ride to Minas Tirith. It seems from the passage (RK, V, ii, 72) that Tolkien is using the word to mean the assembly of all the able-bodied men of Rohan in companies. Although a folk etymology might construe the passage as meaning that the king provided the weapons--i.e., the able-bodied men arrive and are issued weapons from the king's armory-- (as was the case in Peter Jackson's depiction of the Rohirrim in both The Two Towers and The Return of the King), I don't think Tolkien intended that meaning, and I don't think that would be accurate for Anglo-Saxon England (I could be wrong here).

"Wæpen-getæc" is an interesting word, and I wonder if Tolkien might not have worked on the definition of "weapontake" for the OED: he wrote the definitions of a number of words beginning with "w", including "walrus."

The etymon for the word seems to be ON "vápna-tak," though this is used in a different sense. It is, according to Bosworth-Toller, a Northern word; in the south "hundred" was used (which is why I think that folk-etymologizing the word and assuming that it means that men showed up somewhere to "take" their weapons is probably wrong), and its being Northern would explain the ON etymon. Bosworth-Toller gives the primary source as the laws of Edward the Confessor.

The point here is that Tolkien is likening the Rohirrim to the Anglo-Saxons yet again (even though the "Northern" word is a bit of a curve ball here, since the Rohirrim are linguistically Mercian), suggesting that at the king's call, all the men of Rohan were expected to assemble for military service and form themselves into companies that were, apparently, led by the professional soldiers of the king's household. This practice is in contra-distinction to the customs of Gondor, where a large, standing, professional army was in place.

The larger point is that Tolkien is not merely being archaic for the sake of archaism: he is being particularly precise, using exactly the right word (both in terms of definition and cultural connotations) that he needs for this particular situation. Tolkien's sense of the "right" word, which includes sound and etymology, is actually more 'theorized' than the word choices of the great Modernists to whom he is often unfavorably compared (do you really think Faulkner, or Hemingway, or Woolf knew anything of the History of English? They were great talents, but they were working by gut instinct. Joyce is a somewhat different case, but he was no historical philologist--although knowing and sampling so many languages made him more sensitive to the interconnections of European languages).

[Here's the comment from Scott Kleinman.]

Here are some thoughts on weapontake. The Old Norse use of vápnatak seems to have involved the confirmation of a vote at an assembly. In England, the word came to be used for the assembly itself and was, in areas with heavy Danish populations, applied to the local judicial body. It was probably also used for the place where the assembly met. As the administrative and judicial boundaries within English counties became more stable, the sense was extended to refer to areas of jurisdiction in addition to the bodies themselves and the places where they met. We can thus see a sort of evolution in the word from the taking up of weapons to an area of jurisdiction, and it might be possible to locate Tolkien’s usage on this timeline.

It seems to me that, when Tolkien writes that ‘all who could be spared were riding to the weapontake at Edoras’, he could mean a point of assembly, but probably not an official jurisdictional unit for the region of Edoras (the precise meaning of which is also quite interesting, especially given its use in two different senses in Beowulf 1035-1045). In other words, it’s in the middle of the timeline. But, of course, there is no necessity to assume that the weapontake of the Rohirrim accurately reflects the wapentake of Anglo-Saxon England, since the Rohirrim do not exactly match the Anglo-Saxons. Tolkien modernises the spelling (unlike Edoras), which conceivably takes it in a new—or an alternative—direction consistent with a possible folk etymology: a taking up of weapons as part of a military gathering. The modernisation of the spelling curiously gives the word a more archaic effect by restoring the original connection to weapons. Perhaps this was necessary. Words like Edoras don’t seem particularly archaic to anyone who doesn’t know Old English; they’re simply foreign words like Minas Tirith. But if Tolkien wanted to give the sense of something familiarly English, but archaic, the term weapontake worked pretty well.

As a side note, Higden’s definition is very interesting. Here it is in John of Trevisa’s 1387 translation (with the thorns changed to ‘th’ and yogh to ‘y’): ‘Wepentake and an hondred is al oon, for the contray of an hondred townes were i-woned to yilde vppe wepene in the comynge of the lord.’ I take this to mean: ‘A wapentake is the same as a hundred, for the country of a hundred towns were wont to present weapons at the coming of the lord’. Clearly the word was prone to folk etymology, and it is not impossible that Tolkien had something similar in mind; i.e. Théoden would take the weapons offered by those who owed fealty to him.

Monday, October 31, 2005

On Un-Articulated Standards

Here at Wheaton we did a lot of hiring starting about five years ago, so we have several large "classes" of junior faculty who are going to be coming up for tenure soon. Wheaton has a very good record for tenuring faculty (only two denials in the years I have been here, though that's slightly misleading, since a lot of people don't come up for tenure at all if, after a few years, things look bleak--i.e., if your research comes to a screeching halt when you come to Wheaton). Due to this "demographic" fact, there is a lot of discussion going on (mostly sub rosa) about standards.

Most of the junior faculty I've spoken to are very frustrated about the lack of formal, articulated standards. 'I want to know exactly what I am supposed to do,' one said. 'Just come out and tell me: "you need one book, two reviews, teaching evaluations in the range of x-y... whatever it is, just tell me."'

The senior faculty, and in particular the department chairs and the tenure committee, are unwilling to do this. When I started at Wheaton and asked about the research standards for tenure, I was told 'Probably about an article per year or a book, but that can vary.' When I pressed, I was told that 'We don't want to set a formal policy, because then if a very good person comes along who for whatever reason doesn't meet those standards, we'd have to deny tenure when we don't want to.'

Now I know that this was meant to be comforting: there was wiggle room, the colleague was saying. Don't worry so much. But that's not what I heard. I heard: "There are no fixed standards, so the committee can find an excuse to deny you tenure no matter what your research or teaching accomplishments are."

I think junior faculty are hearing the same thing. Rather than seeing the lack of clearly articulated standards as a safety valve, they interpret this fuzziness as a very real danger to their tenure. This open-endedness generates the feeling of always being on a treadmill. At the time of my tenure I had one book published, one under review (I got my contract two days after my tenure), seven articles, a software program, a grammar book, a couple of reviews and the beginnings of a new journal. I still didn't feel that I could be certain that I had done enough. I think most of the junior faculty would prefer to be given a set of clearly articulated standards, as difficult as they might be.

But I am pretty certain that the tenure committee would not create any fixed standards and that the faculty as a whole and Wheaton's AAUP chapter would try to strike down such standards if they were articulated. Many faculty would see such standards as a creation of the administration that would put at risk faculty control of the tenure process (at Wheaton we are possibly unique in having no administrative veto within the tenure committee: there are 7 members of the committee, 5 faculty plus the President and Provost. You need five positive votes to get tenure, so theoretically the faculty can over-rule the administrators. Of course the President and the Board of Trustees have ultimate veto power, but that has, as far as I know, never been used).

The trade-off between flexibility and predictability (and fairness) is difficult to judge, but personally I would favor high but articulated standards rather than the current flexible but fuzzy process. I know for a fact that the members of the tenure committee take their jobs very seriously and that the faculty as a whole take very seriously the task of electing a good tenure committee, and the process could not be more thorough. But it is not objective, and not at all predictable (except that I've predicted every tenure case correctly since I've been at Wheaton), the way it would be if there were clear, articulated standards.

Now the process could never be completely objective, because we have no agreed-upon metrics, and to a certain extent I don't want some kind of seemingly objective but actually still fuzzy metrics to be adopted (i.e., each page in a journal from this list counts as 1.5 pages of the journals from this other list; numeric teaching evaluations divided by departmental average divided by ratio of grade-point-average to college mean; service on committee X equals Y number of points, etc.) Such systems are always gamed, and they very often outlive their usefulness (for instance, supposedly due to its being founded only in the 1970's, the journal Anglo-Saxon England isn't on the Dean's approved list of first-tier journals at certain institutions, even though ASE is obviously the flagship journal for Anglo-Saxon studies).

The current process works if people trust the committee and trust the process. But that very trust is eroded by the paranoia (which is in a sense justified, if only because so much is on the line for junior faculty coming up for tenure) that is generated by un-articulated standards. It is a difficult situation that has no obvious solution, which is a bitter pill for my junior colleagues to swallow at this very stress-filled time in their lives.

Tuesday, October 25, 2005

But then are you really anonymous?
or, a question of propriety

[UPDATE: The consensus seems to be that people do list their anonymous reviewing. I'm still not sure that I am going to, if only because I really don't feel like combing through seven years of email to remind myself what I've reviewed. But maybe I will get a hankering for vita completeness at some point.]

A couple of weeks back there was a lot of discussion about Daniel Drezner's denial of tenure at UofC (my guess is that one factor was probably that he wasn't socially mal-adjusted enough). In reading up on the situation, I came across Drezner's vita (it's linked to the page above in pdf form) and noticed that at the end he lists the journals and presses for which he does anonymous review. I've been wondering about that (obviously not obsessively, just off and on) since then.

I'm sure there are differences field to field, but it seems to me if I say that I review for journals x, y, and z and presses m, n, and o that much of the veil of anonymity is removed. I myself have figured out who my reviewers were for various articles and books (Hint: when you are an anonymous reviewer it is probably a bad idea--and certainly bad form--to criticize someone for not citing one of your articles that hasn't appeared in print yet, and in any event it shows a lack of class to criticize someone for not citing you), and I'm sure people have figured me out occasionally.

I guess the idea would be to illustrate that you are involved in your field and respected by the academic community enough to be called upon to do anonymous review, but it seems to me similar to the same kind of resume-padding that happens when people list the societies they belong to. I guess listing anonymous review duties is a little more important, in that it shows that you're active in the field, but it seems like a bad idea to me. However, if it is standard practice and I'm hurting myself by not doing it, then I will try to dig up references to all the journals and presses I've done work for over the past eight years (what a fun job that will be).

So, do others out there put their anonymous review activities in their vitae?

Tuesday, October 18, 2005

All I need now is a nice garret window, a sputtering coal fire, and consumption

I've been out of commission for a few days with horrible, stabbing pain in my ear and jaw. On Monday I was actually sitting in my office grading papers with a hot pack on my head and thought, "Isn't this just pathetic and Victorian?"

Since when do 37-year-old adults get ear infections? I can't even remember having an ear infection as a child. Arghhhhh.

(I'm grateful that neither of my kids got this; better me than them).

Am now getting things under control thanks to geniuses who thought to combine an antibiotic with a ß-lactamase inhibitor. Blogging should return soon.

Sunday, October 09, 2005

A Problem Worth Solving

Scott Nokes concludes this very worthwhile post by asking:
"How do we [literature faculty] re-connect with the public? How do we encourage quality research over quantity? How do we move from philosophy-lite to depth of thought? How do we re-conceive literary studies to allow a re-naissance?"

These are big, important questions and not likely to be answered in one or a few blog posts (though perhaps some kind of open-source process could be attempted). But I have a suggestion for a starting point (keep in mind that my kids were sick all last week, and just as they got over the virus, they gave it to me, so I am both sick and exhausted after chasing them around all day--library, goose-chasing, climbing the 'Whomping Willow,' catching butterflies and swordfighting until someone was cracked across the bridge of the nose with an insect net. Also, I have, at last measurement, a 102 fever).

One problem with literary study, at least in the minds of the many scientists and engineers with whom I'm friends (and in one case, married to), is that it seems not to be going anywhere. Physics has a goal: a unified field theory. Biology has a goal: to explain life from molecules up through ecosystems. Chemistry has a goal: to be able to understand the workings of all possible molecules (and figure out what that set is). Even Mathematics has goals, though some of them, such as "really understanding prime numbers and why they are distributed the way they are," make my head hurt thinking of how you'd go about it. Engineering is almost entirely about measurable goals. But in English, my friends say, you read the same texts over and over again, for hundreds and hundreds of years. And you come up with one theory that replaces another theory that replaces yet another theory. That's good, in that it keeps you and your future grad students employed, but it doesn't seem to be going anywhere.

[I'm deliberately leaving out the counter-argument of medievalists who point out that, after a book like Mechthild Gretsch's The Intellectual Foundations of the English Benedictine Reform, we are moving towards a better understanding of the period. My friends and colleagues are reacting to the interpretive side of English, not the philological or literary-historical.]

Well, I think there are worthy problems that we should be attacking. Some of these have been sloughed off to the dreaded realm of the Linguists: chased out of mainstream literary studies by Barbara Herrnstein Smith and Stanley Fish in their jihad against Stylistics (some day I need to write a post about how the establishment rewards people like Herrnstein Smith, Fish and, in medieval studies, Larry Benson, who launch an attack against some new approach that makes the establishment uncomfortable). But others just aren't being dealt with at all.

The biggest, it seems to me, is the philosophical problem of enumerating formal criteria for determining the ‘meanings’ of words or works. Barthes, Foucault, Derrida and the host of other contemporary critics who have followed them have argued that there is no way to determine if a particular interpretation is correct (i.e., you can't appeal to the author's intent, you can't get outside the text). And yet criticism proceeds apace. Stanley Fish attempts to solve this conundrum without contradicting the French post-structuralists by arguing that meaning is determined by “interpretive communities,” groups of individuals who are socially and culturally authorized to confer ‘meaning’ upon utterances, interpretations and literary works.

Fish’s argument is deeply unsatisfying because it is merely a kind of anthropology of interpretation. Give him his due: ‘interpretive communities’ with varying memberships and degrees of political power do obviously help constrain or enable various meanings: convince enough Anglo-Saxonists that your interpretation of a text is correct, and that interpretation gets taught to a generation or so of students.

But Fish’s argument does not account for the differing formal characteristics of texts that might serve to limit the flexibility of interpretive communities in conferring meaning. For instance, no matter how powerful an interpretive community might be, it is hard to imagine that community being able to establish the meaning of Beowulf as a discussion of lemurs, elm trees or porridge. The free-floating meaning that Fish accepts is limited not only by the power of the interpretive community, but also by the text itself [I have an essay that discusses this problem in relation to Tolkien in this collection, which should be out any day].

To me, these formal, textual limits (whatever they are) are far more interesting than the fact that interpretive communities can constrain or enable various possible interpretations. And it seems to me that trying to figure these out is the kind of thing that literary scholars can and should do. I also don't know exactly how to go about doing this (so I'm not yet flacking my own work here), but I'd sure like to. If we can steal some terminology from Daniel Dennett: the number of interpretations of Beowulf that can be sustained by the text is Vast, but that number in comparison to the number of interpretations that can't be sustained by the text is Vanishingly small. And yet the great majority of proposed interpretations (i.e., those upon which the interpretive community can act to authorize or rule out of bounds) stay very much within the boundaries of the first space.

My hunch is that any possible solution to the problem will have elements of Wittgenstein in it and might also explain one of my major questions about Borges' "Library of Babel": how would the intelligible books in the library be arranged? I also think that we'll end up talking about replications and inheritances and the connected facts that a) there are infinitely more ways of being dead than being alive and b) all living things are descended from other living things, not from dead things (to translate into literary studies: there are infinitely more un-convincing interpretations than there are convincing interpretations, and convincing interpretations probably arise out of previously convincing interpretations) -- but this is coming close to flacking my own work, so I'll stop.

This obviously isn't the only problem (or perhaps even the most pressing problem) facing literary studies, but I sure would like someone to solve it. And I think efforts towards understanding the interaction of formal characteristics with "interpretive communities" and their desires would be much more easily communicated to (and taken seriously by) people outside of literary studies.

So, all you hotshot graduate students reading this, hurry up and solve this problem, ok?

Thursday, October 06, 2005

Gigging for Binaries

[UPDATE: I composed and posted the next post before I'd read the insightful comments below and a post by Scott at his site. I hope to be able to develop a response tomorrow, depending on how the grading / installing new kitchen faucet / re-gasketing woodstove goes]

Comments on this post and also a previous post (and isn't it sad that I can't find it on my own blog?) have argued that although there is much to criticize about literary theory, at least the analysis of binary oppositions is a valuable tool.

[very quick rundown for all of you normal people with real lives who aren't up on the terminology: it's a major trope of post-modern literary theory that logical systems or structures in Western culture (philosophy, Christianity, constitutional democracy, etc.) rely upon the separation of the world into (artificial, according to the theory) binary oppositions, such as light/dark, male/female, white/black, self/other. De-construction is an attempt to force these binary oppositions apart by arguing (well, most of the time asserting) that the first, culturally favored terms are actually reliant upon the second, culturally dis-favored terms. Once this is shown, presumably the logical structure of the "system" is called into question. So, for example, if the 'masculinity' of Christ is emphasized in the Anglo-Saxon poem "The Dream of the Rood" (Christ is an active warrior, not a sacrificial victim in the poem), then the cross itself is 'feminized' (I'm not mocking this particular argument; I think it is one of the best examples of the genre and actually points out something interesting about the poem).]

I've deconstructed binaries with the best of them, and I can locate an abject, dominated Other with my eyes closed and one hand tied behind my back. But I've begun to question whether the whole process is actually interesting. And my answer is no, no it's not interesting any more. Because all of these binary-opposition-deconstructions always map onto the same system: there's something powerful oppressing something not powerful but nevertheless relying upon it. Patriarchal / matriarchal, center / margin, straight / queer -- the analysis has created the exact kind of universalizing system that Derrida was trying to argue against (not that I care whether or not Derrida would be happy, but it is ironic).

My gut feeling (hey, at least I'm being honest) is that anything that is so easily applied is almost certainly wrong. Well, 'wrong' may be a smidge too harsh: anything so easily applied is likely to be operating at too superficial a level. It reminds me very much of biology back before Wright and Mayr and Dobzhansky: there's a lot of hand-waving about the "superiority" of this or that animal in the struggle for survival. Read enough pre-30's biology and natural history (ok, I have weird reading habits for an English Prof.), and you realize that any adaptation of an animal could be (and was) read as some kind of superiority. It's only when you get population genetics and the oscillations of predator/prey relationships and cost-benefit analysis that you escape that very easy dead end and get work like that of Rosemary and Peter Grant, which moves "adaptive superiority" out of tautology by giving very specific, detailed, historical analysis of both individuals and populations. Then the discussion gets even more interesting, with "Panda's thumbs" and "spandrels," etc.

Likewise I think it is all too easy to go out and spear a few binary oppositions and then convince yourself that you've helped to expose the unworkable logic of whatever evil system that you're trying to undermine. The whole process now just makes me uncomfortable: I feel like we're waiting for someone to pop out of the bushes and yell "tautology!!!" (well, that's how we played it where I grew up).

It seems to me that what we have here is a whole lot of people grasping around, desperately trying to find a method, and this is what they've come up with. It's easy, it's self-aggrandizing (you're not just noticing an interesting coincidence in an obscure poem; you're undercutting several thousand years of philosophical domination), and there aren't a lot of competitors now. Marxism did have a method, but it got tangled and endlessly complex and there was always some weirdo who would challenge you on some point of doctrine. I think most English Professors were relieved no longer to have to deal with someone yelling in lecture "You're either a bolshevik or a menshevik, make up your f-ing mind" (quoted from memory from some literary theory book; I think it's Stephen Greenblatt).

Also, gigging for binaries actually isn't that different from some of the methods of New Criticism (the slime-fanged bogeyman of all theory people): New Critics could go on and on about the shifting patterns of light and dark, or the ambiguity (favorite word) and multiplicity of meanings of the A on Hester Prynne's shirt. At some level it's the same process: here are two things that appear antithetical; let me show how they are instead inextricably linked.

You can see how, with the very idea of "method" in disrepute (because it's part of the "hegemonic" half of some massive binary opposition) and thus complex methods not being taught, something simple like gigging for binaries would fill the vacuum.

Now the argument I used back when I was writing my dissertation was that how these things happen (the center relying upon the margin even as it devalues the margin) is interesting. But I think even considering the quia has become tedious: as soon as I hear or read someone start in on the binary opposition thing I think "I already know how this movie ends" and my eyes glaze over.

I can't put my finger on when it happened, but it just isn't exciting the way it used to be. I think I need to move on. It's not you, binary opposition de-constructors, it's me.

Friday, September 30, 2005

Theory ( *shudder* )

Scott Nokes and I have been having a running discussion about our profession and its problems. His latest post references a discussion in which someone characterizes me as "largely anti-theory" (Scott says I'm not). Scott goes on to explain why he thinks "theory" is important and comes up with a very good metaphor likening literary theory to the foundation of a house: you need it to be there, but you're not particularly interested in looking closely at it every day:
All of which leads us to the purpose of foundations (or in the obvious metaphor here, theory). Foundations need to be applied to buildings. What would we think of architects who fetishized foundations to the point that they were laying foundations around the landscape, refusing to sully them by placing buildings over them? Naturally, we would think such a person mad or a fool. Yet there are those who refuse to apply their theories to actual works of literature. In some quarters (though it has gone out of vogue), you can still find people who haughtily reply that they "do theory," not literature, or even more pretentiously, "high theory." As a colleague of mine said recently, "We already have people who do that, and do a better job. We call them philosophers."

I don't disagree with Scott; I just want to add a few things about why theory has such a bad odor among all but its most ardent practitioners (and in passing, plug a book idea I've been brooding about).

Theory deals with important philosophical problems (as applied to literature) that often get passed over by "untheorized" approaches. For example: when attempting to figure out the 'meaning' of a text (scare quotes because different people mean different things by 'meaning,' and I'm not interested in getting into that argument right now), many people appeal to the authority of the author: The Beowulf poet put a specific meaning into the poem, and it's our job to figure out what it is. But, says the literary theorist, how do you know what the Beowulf poet put in? By reading his text, of course. Isn't that circular? The text tells you about the poet, who tells you about the text, which you then use to create your idea of the author.

Ok, what about an author, like Tolkien, who left criticism behind? The theorist points out that you still have to interpret the words that author wrote, whether they are criticism or literature, and when you make decisions about how to interpret them, you are often relying upon an idea of what the author "was really like" that is shaped in no small part by the author's texts.

You can see a possible source for that tiresome "X is impossible" cliche that I mocked in a previous post criticizing President of the MLA Domna C. Stanton. These are irritating philosophical problems about literary interpretation (and they've been around since Plato in one form or another).

But on the other hand, these are interesting problems: in my experience, smart, non-English-professor people are happy to debate these different approaches in informal circumstances if the questions are framed this way. The same goes for the "political" types of criticism: how much of Chaucer's creation is a work of his own, unique, individual genius, and how much is a reflection of certain political and social structures of his time? (Did Chaucer really hate Jews, or is the Prioress's Tale just a reflection of 14th-century English culture, or is Chaucer criticizing this mindset by attributing the Tale to the Prioress, whom he is perhaps satirizing? Etc.)

Nevertheless people are alienated by theory, and for good reason, I think. Theory is taught and communicated as a series of quotations from authorities. First you master what X said, and then Y's critique, and then Z's development... it is just like those incredibly annoying rambles through the authorities that we read in The Wife of Bath's Prologue or The Nun's Priest's Tale or the Tale of Melibee: Seneca says this but Macrobius says this but The Philosopher (Aristotle) says another thing and here's a quote from Paul's Letter to Timothy and here's another from Ecclesiasticus... you get the idea. For those in the know this may be (and I am not actually conceding this point, but arguendo...) an efficient means of communication, but it also serves to exclude everyone who has not mastered the authorities, even when those people might have something very interesting and relevant to contribute to the discussion if it were framed properly.

Also, at this stage in its development, theory doesn't provide any answers, or, more accurately, the only answer it provides is that everything can have a political interpretation. If that's the case, then, as you'd expect, each political school latches on to theory to undercut their opponents and support their positions.

So why, you ask, do academics stick to theory? Are they all wannabe political scientists or half-baked sociologists ("sociology without all that pesky data" is what I called one school of Tolkien criticism in an honest but perhaps impolite moment)? Theory supplies a "method" to replace the philological method, which was politically discredited after WWII, and the New Criticism, which ran out of steam after the five millionth celebration of "ambiguity." If there's one thing that all the branches of contemporary theory seem to agree upon (and Scott hits upon this in another post) it's that one can go gigging for binary oppositions (light/dark, good/evil, male/female, hot/cold) and then "deconstruct" them by showing that the first, privileged term requires the other term to make sense.

This was probably exciting the first couple of hundred times people did it, but now it is tedious beyond belief, and I immediately begin to do Anglo-Saxon calligraphy on my notepad when someone starts going on about binary oppositions.

But I do think that the big questions theory raises are interesting, and answers to them (and yes, I am a positivist: if we can figure out how to explain quantum tunneling, we can figure out how to deal with 'author intent') would be really desirable. But "Theory" doesn't have an answer, and I don't think, as it is now configured, that it ever will.

Which leads to two desiderata: First, a book that takes on all the big theoretical questions and lays them out for people without the whole gloppy mess of citation of authorities and academic jargon. Instead of trying to give a reader a grasp of who said what, the book should attempt to show the problems. For example, there's a continuum from universality (this book means the same thing to every reader) to solipsism (this book is about me), and we have no good way of carving up that continuum into philosophically defensible chunks. Or: various theorists assert that 'meaning' is constructed by social relations in the culture (i.e., all the authorized people decide that Beowulf is about "wisdom and strength" in balance, and so everyone thinks that), but there are obviously some formal characteristics of a work which limit the freedom of interpretation (an interpretive community can be as strong as it wants and people aren't going to believe that Beowulf is about cheese). I'm talking with a colleague about writing such a book, first for our students.

Second, a theory that actually can provide some new answers or can give a different take on various played-out questions. That's what I tried to do in How Tradition Works and if the press would ever get around to sending me my galley proofs, you could see for yourself if it was successful.

[Nota: it's somewhat ironic that someone would call me "anti-theory." When I started out in medieval studies everyone thought I was too theory-focused because of my academic pedigree (Allen Frantzen was my dissertation director and mentor, and he was the first person to bring theory to Anglo-Saxon studies). But I don't take "anti-theory" as an insult, since I'm hoping that the person who wrote this had picked up that I'm skeptical of theory's claims and that I'm willing to criticize the way too many people in my profession use it]

Wednesday, September 28, 2005

Job Here at Wheaton College in 18th Century

Here at Wheaton we are searching for a new colleague to teach 18th century literature. Here is the official ad:
Wheaton College, in Massachusetts, seeks an Assistant Professor for a tenure-track position of English literature and culture in the long eighteenth century and its borders, any area of specialization. All members of the department teach courses outside their primary specializations, and we look forward to hearing how the range of your interests can enrich our program. Teaching load is 3-2, and junior faculty may apply for a pre-tenure semester of fully-funded research leave. All members of the department are committed to teaching first-year writing. Wheaton continues its dedication to hiring a diverse faculty and encourages applications from women and people of color.
Send letter, vita, and self-addressed postcard for acknowledgment, postmarked by November 8, 2005, to Katherine Conway, Chair, Department of English, Wheaton College, Norton, Massachusetts 02766. Ph.D. expected by time of appointment. AA/EOE

[N.B.: We are not the Wheaton College in Illinois that has a strong religious focus. We are the Wheaton College in Massachusetts, half-way between Boston and Providence, RI].

To flesh out what I think we mean (I speak only for myself on this blog) by "the long eighteenth century and its borders": a candidate must love, study and teach the literature and culture of the eighteenth century, but he or she can also be interested in literature back into the late 17th and forward into the early 19th; a candidate's interests also do not in any way need to be limited strictly to British literature. In some departments someone who comes near to and crosses boundaries might be seen as a turf threat. Not to us.

I would strongly recommend that applicants NOT take my blog as any reflection of the department as a whole (except in the sense that the department supports an obvious pest, malcontent and loose cannon like myself). You should research the entire department (start here) and the college as a whole. I do feel confident in saying that to be happy and successful at Wheaton, you need to love and be committed to teaching.

Also, I can say with all honesty that I cannot imagine a better department of English, anywhere, in which to work. There is not a single person in my department whom I do not genuinely like, and that is rare (perhaps even unique) in English departments. I love the college, the department, my colleagues and my students, and so I am not unbiased when speaking of Wheaton.


For the remainder of the search I am going to refrain from commenting on the academic job market. This is entirely voluntary on my part; no one in my department has suggested it. I do so (out of the obvious self interest of wanting the best possible colleagues) only because I worry that a candidate could be led astray by imagining that the things I write here represent the department.

Monday, September 26, 2005

Domna C. Stanton, President of the MLA: You Are NOT Helping

Today, after reading a reply by Scott Nokes to this post, I went to my mailbox and found this month's MLA (Modern Language Association) newsletter.

Domna C. Stanton, a scholar in French and Women's Studies who apparently is now president of the MLA, in her "President's Column" manages to illustrate beautifully why my profession is slouching towards irrelevance.

Although there are many more than two problems with this column, I'm going to focus on two that should be particular embarrassments to the profession: ideas whose dumbness is illustrated by the column's own argument, and poor writing.

The title of the column is "The Paradox of Academic Freedom." Stanton argues (ok, that's being charitable, "asserts" would be a better description) that it is essential for "us" to defend "academic freedom" while at the same time "we" must admit that a definition of academic freedom is impossible.

So we're going to defend something we can't define, and we will do this by defining it, but it won't be a real definition. This campaign is sure to be a smashing success.

Let's look at Prof. Stanton's description of academic freedom:
It is historical, not a transcendent notion; its meanings are contextual, relational and open to change.

Well good luck defending it, then. I'm not a big fan of transcendence, myself, but if we can't assign transcendent value to freedoms that are at their foundation speech and thought, then we might as well pack up our stuff and quit. How can you possibly defend academic freedom if you won't argue that it is a universal good, a benefit not only to the individuals who practice it, but to the society (any society) in which it is embedded? Why should the Chinese or Cuban or Saudi Arabian authorities give their scholars freedom of thought and speech if we can't even make the argument that it is a fundamental human right to think about, teach and communicate what, to the best of your knowledge, you think is true?

Stanton stumbles around this point, bringing in the utterly useless Universal Declaration of Human Rights and trying to tie academic freedom to this particular piece of paper (and noting that it's not really possible). By choosing to argue from a textualist position (that the rights are Universal because some dead people declared it so in 1948), she chooses to occupy the weakest possible ground to make any kind of substantive argument. Beorhtnoth is a tactical genius compared to Stanton.

But this doesn't stop Stanton from coming up with a laundry list of specific US political developments that she doesn't like. Some of her objections are reasonable: the "Academic Bill of Rights" is a terrible idea, though an incredibly clever bit of political judo, turning idiotic speech codes and stifling "hostile environment" rules against their authors. Others are childish: the Federal government had the power to subpoena library records before the Patriot Act, so whining about this now seems a little, well, opportunistic (that said, the Patriot Act is a bureaucratic power grab and most of it should be allowed to lapse). But as best I can tell, her reasons for opposing these particular problems and not others don't arise out of any principle beyond 'I don't like these.' (She doesn't argue against speech codes or "hostile environment" lawsuits even though these things actually are in force and repress freedom, while the Academic Bill of Rights is proposed legislation that hasn't passed anywhere yet. But I digress...)

The point is: bad ideas have consequences. If you accept the bad idea that really, really important freedoms, like Academic Freedom, are merely historical and contingent, then you are going to lose really important arguments.

There is indeed a lot of knowledge whose "meanings are contextual, relational and open to change," but not all knowledge. A proton has a positive charge. One carbon and two oxygen atoms form one molecule of carbon dioxide. In Anglo-Saxon, "witan" is a preterite-present verb. These are facts. You can get all freshman-philosophy on me and argue that because they are embedded in language and convention that they are just language, but that view cuts no ice outside of the MLA, as well it shouldn't.

Stanton has made the mistake of apparently actually believing in a bad piece of contemporary dogma, and it has led her to undercut her own argument about a really important problem facing the profession.

Nice work.

Even worse, when the president of the MLA demonstrates poor writing skills, citizens not in the organization might reasonably conclude that perhaps they don't need to take the organization seriously. I quote:

As crucial as conceptual clarity may be, and as difficult--perhaps impossible--as it may be to realize this idea(l) in any society, academic freedom nonetheless needs to be defended whenever it is under attack. In this column, I examine some of the problematics of the concept in the present context and at the same time affirm the need to combat recent instances of the infringement of academic freedom that are central to the concerns of MLA members.

Where to begin?

Prof. Stanton, did you really just use "idea(l)"? What is this, 1988? That whole multiple-meanings-indicated-by-silly-typography shtick stopped being clever fifteen years ago. Why not just write The Secret of My Succes$ and be done with it? Embarrassing.

What about that parenthetical "--perhaps impossible--" : is this supposed to be a parody? I know you work in French, but come on, even the French aren't doing that whole "X is impossible" thing anymore. Does it embarrass you that the Postmodernism Generator can write this sentence?

You didn't really just make the played-out adjective "problematic" into a plural noun, did you? What is wrong with "problems" if you need a plural noun? What exactly are "problematics," and why do we need this term here?

Did you really mean to use that "I examine ... at the same time ... [I] affirm" construction? Are you really examining and affirming at the same time? Don't you examine first and then affirm later? Doesn't the phrase "in this column" make the "at the same time" phrase unnecessary?

Even worse: "As crucial as conceptual clarity may be... academic freedom nonetheless needs to be defended." This makes no sense. How is the cruciality of conceptual clarity in contrast to academic freedom's needing to be defended? Why did you begin that sentence with "as... may be" if you're not giving a contrast in the second half of the sentence?

Prof. Stanton, you write like a committee.

That's probably enough. I could continue through the whole column, but after a certain point it's just mean. Confused thinking, poor writing and political special pleading: is it any wonder no one pays attention to the MLA?

And I am not an MLA basher (Really. Stop laughing.) I think a strong, effective MLA would be very valuable to American society and to our profession. But what we have ain't that.

Instead, the people at the top of our profession (in the MLA as a whole, but also in my own field of Anglo-Saxon studies) are failing us. They are not communicating effectively to the public. They are involved in how-many-angels-on-the-head-of-a-pin debates within the academy that are visibly stupid to those outside (and that wouldn't withstand the intellectual scrutiny of even a first-year grad student in philosophy). And most damningly, they are letting us become irrelevant because by the time the intellectual bills need to be paid, they will be comfortably retired.

Friday, September 23, 2005

Why is Literary Scholarship Going Through a Dry Spell?
You are only allowed to say "a drought" if you smile when you say that

[UPDATE: Please go here and read everything that Scott Nokes has to say about this. I hope to have more later, though Mondays and Tuesdays are my busiest days.]

Over the past week or so, as I've been trying to put a whole pile of projects to bed (revised King Alfred's Grammar off to a potential publisher, Dark is Rising Companion proposal to another publisher, two sets of galleys for book chapters done, compilation of the works for David Bratman's "Year's Work in Tolkien Studies" finished, rough draft of bibliography for 2004 done...), I've been thinking very hard about my plan to write a short, straightforward Handbook of Philology for Students of Literature. I'm seeking the right collaborators (and may have found the right one, but that depends on job searches, etc.) and figuring out what will go into the book. I want it to preserve as much philological method as possible while at the same time being less than 200 pages and easy to follow -- no assuming that readers have internalized all of their ablaut before reading the book. I also want it to be up-to-date, not just a recapitulation of 19th-century work, but I don't want it to be a "linguistics" textbook; I want it to be a "philology" textbook (i.e., I don't care, or mind, that the field of linguistics, even historical linguistics, has moved on from the study of ancient texts. I'm not trying to reconfigure that field; I'm trying to pull together things useful to my field).

But that's not really the subject of this post; only the set-up. In at least laying some of the intellectual groundwork that I'll need to do before I even start thinking seriously about the book, I've been reading W. P. Lehmann's Historical Linguistics and his A Reader in Nineteenth-Century Historical Linguistics (you should have seen the looks I received when I pulled that one out of my bookbag and began reading it at my daughter's gymnastics class). I've been reading and going over the great works of Rasmus Rask, Jakob Grimm, Franz Bopp, Karl Verner, Eduard Sievers and Ferdinand de Saussure, seeing how their work built on each other and coalesced into an effective, intellectually powerful discipline. It must have been amazing to be at Leipzig during the early years, or later at Paris, when these things were being figured out.

Combine this with my reading all of Tolkien's published scholarship a few weeks ago: all that great work being done by him and R. W. Chambers and Lawrence and Klaeber in the 1930's and even into the 40's. Then add in the other book I'm finishing right now, Ernst Mayr's Systematics and the Origin of Species. Mayr uses the (Name date) citation format, and Tolkien's articles were almost all written within fifteen years. I started noting the incredible number of citations from the late 20's, through the 30's, up until about 1941 (I'm guessing that the papers published in '41 were those whose research and writing were done before the war started). It was a time of amazing intellectual accomplishment in both literary study and in evolutionary biology.

Why hasn't this happened, in literature, again during my lifetime? There have been little bursts of interest, little fads, and out of these we get the demi-gods of 80's and 90's academia: Derrida, Foucault, Butler, Said, Bhabha, Fish. But these folks didn't build anything coherent: there's no new method (well, I guess you can go gigging for binary oppositions), there's no edifice of knowledge comparable to what was built by the great nineteenth-century linguists, or by the "last philologists" of Tolkien's day, or by Mayr for biology. Maybe there were just giants in the earth in those days.

That's certainly possible, but I also think we might be seeing the results of a system that has been terribly stable (and hence ossified and boring) for a long time. There is no way a 33-year-old like Tolkien would be appointed Rawlinson and Bosworth Professor of Anglo-Saxon today. No way that a major endowed professorship will be given to a hot young scholar instead of someone in their early sixties, likely to retire in less than a decade. I understand the impulses (most of them laudable) that cause this.

But in the late 20's and early 30's, WWI and the Depression had changed everything. First of all, an awful lot of young men were dead, leading to expanded opportunities for those who were not. Hierarchies were shaken up and new blood was given new responsibilities. This led to a brief flowering of intellectual life in certain fields. Changes in technology, communication and economics also shook up established fields and led to their flowering.

Out of this ferment, men like Tolkien and Klaeber and Mayr rose to new prominence when normally they might have been expected to wait and take their turn for many long years. They then had opportunities to do great work and re-shape fields that were beginning to be moribund.

But what today can shake up English studies? One hopes to God that it won't be violence and the death of millions, as it was in 1918 and 1939. Economic pressure has exactly the opposite effect one would have hoped for: instead of improving the discipline and expanding its reach, everyone has hunkered down, protecting turf. I thought technology might help, but even though I can edit a book with contributors from five continents and never meet any of those contributors, technology isn't shaking up the fossilized intellectual system that still operates as if we were in 1974. We need some breakthroughs, some reconfiguration of the field around new ideas. But where that will come from, in this weird climate of intellectual timidity combined with grandiose claims, I don't know.

Wednesday, September 21, 2005

On Set Pieces

Since I am on a small trend of discussing things in academia that annoy me, might I mention the habit of many academics of using "set pieces"?

Now we all have set pieces that we use in lecture or even discussion: a certain, expected question comes along, and we can launch into an effective piece of oratory, with well-chosen examples and even memorized quotations.

For example, I have a set piece on Tolkien's engagement with the German race laws of the 1930's: in response to an inquiry as to whether or not he was Aryan, Tolkien replied that he was not aware of any ancestors who were Hindustani, Persian, Gypsy or speakers of related dialects, and then went on to say that if they were in fact asking whether or not he was Jewish, he did not know of any ancestors of that talented people, but if such impertinent questions became commonplace, he would no longer regard his German name with pride. Later he wrote to his son Christopher that he would have been a better soldier at 40 than he had been at 22 due to an abiding hatred of that ruddy little ignoramus Adolf Hitler: ruining, perverting and making forever accursed that noble northern spirit Tolkien had so admired...

I also have a set piece on Thomas Jefferson's thought to make Old English the legal language of the United States and his design for the great seal of the US to depict on one side the pillar of fire that led the Israelites to the promised land and on the other side the Anglo-Saxon warriors Hengest and Horsa, who led the "migration" or "invasion" of the Angles, Saxons and Jutes to England.

Set pieces work well, though you must be judicious: I have a few students who tend to "major in Drout" and thus may have seen a set piece before.

But, I assert, you must never use a set piece on one of your colleagues, particularly if that colleague has heard you do that set piece before on undergraduates. It happened during my dissertation defense, when one of my committee members launched into a well-prepared bit on Cædmon's Hymn that I had heard delivered to a class less than a year before. When I stopped gripping the edges of the desk, I decided the best thing to do was visibly zone out during the presentation -- NOTE: This is a terrible strategy. Do NOT do it at your Ph.D. oral, no matter how pissed off you are. Really. Bad. Idea.

Yesterday the same thing happened to me in a meeting, and once again, I was floored. It reminded me of the scene in The Blues Brothers, where Carrie Fisher has John Belushi crawling around in the mud while she points a machine gun at him, and he tries to charm her back, and for a moment there is a priceless look on Fisher's face, a look of sheer fury that you would even dare think of trying to explain (and then John Belushi, being John Belushi, briefly makes up with her). The look on Carrie's face was the look on my face while I sat there and listened to this set piece.

Patronizing is one of the deadliest of the academic deadly sins, and it is all the more dangerous because academics fall so naturally into the trap: it's a short, short trip from Knowledgeable to Patronizing--you hardly have time to find a good seat before you're there. So don't even board the Set Piece Train in front of your colleagues.