Russell Jacoby and Alan Jacobs take a stand against the tendency of academics to "complicate" or "problematize" an issue rather than come up with an interpretation of it. As Jacobs notes (why such similar last names!?), Jacoby himself is, in his denunciation, actually "trying to complicate our understanding of academic habits of mind." But is Jacobs not complicating what Jacoby has to say about complicating? Will my brain stop spinning, or will it keep on rotating furiously?
What's "complicating"? It's taking something that everyone (in your field; in the world) takes for granted and showing that it's not so simple. This is always a necessary first step in academia, so the question is whether you go on to show how it matters that things aren't what they seemed. If they simply aren't what they seemed, end of story, your professors will yell at you, and it's back to the drawing board. Or so you'd think, but then you see papers that never get further than: look, neat, who knew? But is this a problem?
I'm not sure. Do all college professors really need to be offering up new ideas? If someone is 100% fantastic at regurgitating every last detail of his field and related fields (and, say, proficient in many languages), is this such a disaster, as long as some other profs are doing the innovating? Plenty of innovator-types would have problems with the regurgitation once it meant leaving their narrow fields of interest.
I think the issue of "problematizing" or "complicating" is the result of the pressure for innovation, the idea that simply passing knowledge on to the next generation (what a teacher is expected to do in a high school) is not an end in itself. And while innovation's great, it's not for everyone, and, more to the point, an innovative idea doesn't necessarily pop into one's head just when the final paper, or, alas, the dissertation proposal, is due. So a rhetoric develops that lends the aura of innovation to projects that need it to float, functioning almost as a formality before the real announcement: and now I will tell you what happened during World War I, because how many people know this anymore, and it's important stuff.
However--and this strikes me as key--complicating/problematizing is typically a first step to coming up with something new. If listeners tune out as soon as they hear either expression, they will miss both the regurgitation (which, to repeat, I don't think is necessarily so bad) and the innovation. "I will complicate the notion of..." doesn't always signal "I will offer up a timeline of World War I, with a few small additions."
OK, I have more thoughts on this, but, off to complicate...
AGREED. I felt bad for hassling a certain student yesterday about why s/he was interested in his/her subject and what it would contribute, but I felt like I had to do it given the nature of our class.
After all, I myself was also given the third degree about whether or not I was saying anything "new". Why can't it be okay just to bring up something that hasn't been looked at in a while? Isn't that new enough?
J. H. Hexter has a good essay on why historians care so much about research. Basically, because it's the only way to establish a status hierarchy--there's no effective way to compare teaching skills, but peer-reviewed judgment of research allows professors to establish a pecking order. And originality is perhaps the easiest standard by which to judge the quality of research, so maybe the focus on originality can be defended on that basis.
P.S. In literary studies, original interpretation may command a greater premium, relative to archival research, than it does in history.
There's a trade-off, it seems, when it comes to whether one seeks originality or whether the goal is being 'learned.' My sense from grad school is that having an original idea matters more than, say, reading five languages and knowing the date each major war in your country of interest broke out. It's not to say that knowledge doesn't matter--it does, quite a bit--but it all comes down to the new idea, the new argument. If you don't have this, your efforts thus far were at best a hobby you can bring up at cocktail parties in your future life working in law or PR, like when George Costanza wishes he were a Civil War buff. If something goes wrong, I will be a Dreyfus Affair buff, by which I mean, "buff" does not equal career.
Personally, having a new take on something appeals to me, and much as I want to be able to read a novel in Dutch or Hebrew, this has yet to happen. (Not that I have a new idea, but I'm working on it, as we all are.) So in terms of temperament, the status quo works for me.
But I wonder whether this has always been the case--whether dissertations were always supposed to present something new, rather than something old presented in a manner comprehensible to readers of the writer's own time. Or whether professors were always people who'd had, at least at some point in their lives, a new idea. How many truly groundbreaking ideas can there be? And if everyone, writing each paper, at each level, must present their work as innovative, does this devalue what it means to have a new idea?
I feel like what I'm getting at might be a question of generalized versus specific knowledge, more than anything else.
Perry Miller, in The New England Mind (I believe), discusses Harvard theology theses, where the point seems to have been to construct an argument and defend it ably rather than to be original--resembling, therefore, the eight-legged essays of the Confucian exam system of Imperial China. The dissertation, it seems to me, preserves this craft requirement--but superimposes on it, yes, the goal of originality. (Again, in history we can substitute original research for original thought--an easier task.) I think you are correct that the incentive for originality over learnedness is too great in modern academia--particularly in 1) the search for tenure, and 2) the search for jobs at Big Research U. But I think there is space, once you have tenure, and if you don't yearn to profess at Harvard, to become learned.
I knew a guy who did his English dissertation at Rutgers by editing some poet's letters. It would help if there were more room for such dissertations.
ReplyDelete"But I think there is space, once you have tenure, and if you don't yearn to profess at Harvard, to become learned."
ReplyDeleteBut should it go in that order? At least in the sciences, it's supposed to be that one can only innovate when young, but how true is that for the humanities? And if you haven't read enough, how do you even know if you're original to begin with?
Can we stipulate a virtuous cycle where one's originality and learning increase in tandem? I think it's reasonable to say that gaining a PhD requires you to demonstrate a modicum of both learning and innovation. It's a question of weighting ... and I think I can testify that the search for originality requires you to gain learning! -- that is, I have engaged in an entirely unscrupulous and superficial process of skimming secondary literature, both to substantiate my original theses and to find out whether they're original at all, but it has served to make me a bit more learned. I'm even willing to venture a general thesis: the search for originality is as effective a way to inspire learning as any. It's subject to abuse by second-rate minds, same as any system--and at its least effective when reduced to a dogma--but not bad.
But the definition of originality matters here. We rhetoricians and traditionalists remember that, via imitatio, originality arises from learning--but there is a modern deformation that takes originality to involve erasing/forgetting/never learning the traditional bases of knowledge. I think your objections to the fetish for originality would lessen if a more traditional definition of originality applied.
When I have new ideas, it is often only to discover that they are neither new nor correct, and I would've known that had I been more learned. I'm pretty sure the educational system from grade school on up is biased towards innovation and against storing up great concentrations of knowledge (witness me: an American history major who knows nearly nothing about the Civil War but still graduated with good grades and honors), so once you're at the top of it (that is, getting a PhD), it seems wise to shore yourself up against this bias.