I began writing this from my room in a Travelodge. I was recently at a small conference where undergraduates came together to discuss their research, and I was presenting mine. But as the day went on, I noticed something: after some talks, people were walking out of rooms not fully understanding what had just been said. This, of course, is to be expected. I did not follow the presentations on physics because I have no great aptitude for the subject and thus didn’t study it past GCSE level, so third-year projects on the Higgs boson, though absolutely fascinating, were beyond my somewhat limited intellectual grasp.
As I was writing my notes for my speech, I decided that I was going to be as accessible as humanly possible. People were going to understand what I had to say and they were going to like it. For what it’s worth, I think I probably succeeded on the former, perhaps not the latter. But I would be damned if I didn’t try. The overall feeling I had when writing my speech was that if people couldn’t understand a word I was saying, then what on earth was the fucking point in saying it?
I’m not arguing in favour of academic dumbing down, or pitching always to our lowest common denominator. But the idea that what we produce as ‘thinkers’ (insofar as that is, in theory, what we are supposed to be while at university) should at all times be accessible only to people like us misses the point of what we are supposed to be doing. Just because the academy’s Actually Existing Focus is on specialism does not mean that this ought to be the case. The argument that we’re at a level where this has to be the case because we simply know so much seems to me to contain a degree of hubris about ourselves and our contemporary knowledge that might be worth stepping back from.
Without going full Spectator, looking back on my time at university, too often it has been the case that amongst students, obscurantism, linguistic orthodoxy and sheer pedantry are peddled in place of actual wisdom or attempts to be understood. I know that Orwell has covered this, but following in Orwell’s footsteps is never a bad thing for a writer, so here he is:
(ii) Never use a long word where a short one will do.
(iii) If it is possible to cut a word out, always cut it out.
(iv) Never use the passive where you can use the active.
(v) Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.
Off the top of my head: I’ve seen jargon deployed liberally, heard ‘contra’ used when ‘against’ would do, and Latin fired off at will. Personally, I am perennially guilty of using the passive.
Too often, too, the obscurantists retreat to safe linguistic ground rather than engage in any actual persuasion. Background is never considered, and a certain sort of vernacular (read: very often working-class vernacular) is pounced on as problematic before any critical engagement occurs or any attempt is made to engage in discourse. It need not be a material background, either – it can be a cultural one. Those of us who grew up on Tumblr had access to a certain sort of ‘liberal arts’ vocabulary – ‘discourse’, ‘problematic’, ‘critique’ and such – years before encountering the setting in which those words are most usually fired off, namely the seminar room, with its daily exposure to contemporary thinking on gender and other forms of identity discourse. I have seen students pounce on working-class peers for ‘classist’ language (which, outside of the internet, with its ‘Priest’s desire to excommunicate’, would scarcely pass muster as remotely bad) rather than attempt discourse about it.

In cases like these, the language of critical theory, designed to overturn power structures and challenge the world, serves as a sort of currency, and at worst as a weapon with which to beat down those with whom you disagree. The argument becomes not ‘I know more than you’ or ‘I have a better argument than you’ but rather ‘I have access to more words than you do, and we exist in a setting where these words have power.’ Students, on the whole, want to do good and be good, and exist in a setting that broadly speaking encourages these tendencies. This, of course, is a good thing, but there can be no denying that it can be utilised in utterly pernicious ways. ‘Fuck you, I can theory-craft an argument that positions you as a bad person’ is too often the basis of arguments, and anyone with a three-year humanities degree will have the ability to do this to someone without one. I mean, I can bend theory to support pretty much anything you want.
I can argue that up is down and black is white, and cite a 20th-century theorist to support me. Yes, I’ll be wrong and my argument will be tenuous and bad, but you (in this scenario, a younger student or someone without my cultural or academic background) will not have enough linguistic currency to call my bluff. It’s basically a game of Six Degrees of Kevin Bacon, but with the end goal of making you Bad, and me Good, and thus Right.
You could do more good with slightly kinder eyes and a willingness to help than you can with ten thousand instances of denunciation and expulsion. Right is not a zero-sum game. To tackle this, I think that the university needs to adapt slightly. For the best part of a millennium the university system has existed to contribute to and shape the sphere of human knowledge, passing it down to the next generation. Perhaps what is needed is a renewed focus on what to do with this knowledge – not in the market-driven, ‘hey, here’s what an employer wants’ sense, but in a more meaningful sense of what, ethically, knowledge and advanced learning ought to be for.