Sunday, February 26, 2017

Roundabout canons

Every academic discipline has a canon. That is to say, a series of texts that most of those who are active in the field have read, or at least have some sort of working understanding of. The exact composition of these texts varies from field to field (and over time), but at any given moment you can be sure that there is a set of books most practitioners within a particular field of knowledge know about. The canon as a general category, whilst undefined in its particulars, still exists.

It is markedly more defined at local levels. It is especially defined at local sites of education, where there are syllabi that explicitly specify which texts are included in the mandatory coursework. Teachers are expected to know these texts well enough to teach them, and students are expected to read them well enough to mobilize some parts of their content through some sort of practice, such as writing an essay on just what the mandatory texts have to say.

Invariably, there will be some students who are just not feeling it when it comes to going through the academic motions. Invariably, these students will turn to the internet for an easy way out. Invariably, some of these students will yoink a text from the internet and turn it in as if it were their own.

Thing is. If the texts and/or the subject matter remains the same over the years, patterns will emerge. Students will be faced with the same task of producing some work on a topic, and they will conduct the same web searches year after year. And, if general laziness is a constant, they will find the same first-page results and turn them in, unaware of their participation in an ever more established tradition. [A fun sidenote: I have a few blog posts which receive a boost in traffic two times a year, which coincide very closely to when their subject matter is taught at my local university.]

What I wonder is - how many times does a particular web-copied text need to be turned in before those in charge of grading start to recognize it? Or, phrased another way: how many iterations does it take for these easy-to-find texts to become part of the local canon?

A canon is wider than merely those lists found in official documents, such as syllabi. Informal inclusion is a very real phenomenon, and when a particular text keeps showing up again and again and again -

Now there is food for thought.

Wednesday, February 22, 2017

Postmodernism, a primer

There has been a lot of talk about postmodernism lately, and the only thing larger than the distaste for it is the confusion about what it actually is. While it might be tempting to label this as a postmodern state of things, it's not. It's just confused, and confusion is not postmodernism. The latter might lead to the former, but that is the extent of the connection between the two.

If you've ever read a textbook that in some way deals with postmodernism, then you've probably encountered the introductory statement that the word consists of two parts - post and modernism. Post- as a prefix means that whatever it is affixed to happened in the past. When it is affixed to modernism, we get a word that means "the stuff that happened after modernism". Modernism came first, then postmodernism - in that order.

There are two main reasons for including introductory remarks of this kind. The first is that it has become tradition and convention at this point, and it's easier to latch on to what has already been established than to be creative. The second is that you cannot treat postmodernism as an entity unto itself - it has to be understood in relation to what came before. If you do not understand modernity, you will not understand postmodernity. The one came from the other, and it could not have happened in any other way.

It is vitally important to underscore this intimate relationship. It is a historical progression which is not merely chronological - the tendencies and practices set in motion in the modern time period kept going in the postmodern time period. They are linked, similar and connected.

The modern project was (and is) one of enlightened critical thinking. Traditional institutions, mainly those of monarchies and churches, were no longer to be seen as the absolute authorities when it came to the truth. Instead of relying on ancient authorities (or very present authorities, as it were), the moderns wanted to rely on science and reason.

An example of this shift from ancient authority to a more modern way of thinking is Galileo and the notion that the Earth goes around the sun. Using the tools at hand, Galileo figured out that Earth is not the center of the solar system. The traditional authorities, who held that the Earth was in fact the center, did not agree, and much ado was made about it. In the end, you know how it all turned out.

This ambition to test things by means of science and reason wasn't limited to one person and one particular way of looking at things. Over time, it became the default mode for everything - everything could be questioned, measured, re-examined and put to the test. Those things that were found to not hold up to the standards of scientific testing were thrown out, and those things that did hold up were expanded upon.

The scientific implications of this are fairly obvious: you can get a whole lot more done if you are allowed to freely use the scientific method, without having to make sure everything you find corresponds to what the authorities want you to say. Science builds on science alone, and its findings are all the more robust for it.

The social implications, however, are less straightforward. If long-held beliefs about the cosmos as a whole could be questioned and challenged, then so could long-held beliefs about things of a smaller and more private nature. If the church was wrong about the Earth being at the center of the solar system, then it might also be wrong about marriage, sexuality, and other social institutions. Everything is up for questioning. Everything.

This process of questioning everything kept going, and over time more and more things that were once taken for granted were put to the task of defending themselves. Everything that was once solid melted away, and what came instead was something completely different. Where once kings and bishops ruled, there are now scientists and bureaucrats. And marketers.

Mind you, this is all part of modernity. This is the part that came before postmodernism became a thing. Postmodernism is what happened after this process had been around for a while and become the status quo.

The thing about questioning everything is that you can't really keep doing it forever. At some point, you arrive at the conclusion that some questions have been answered once and for all, and thus that there is no need to go back to them. You begin to take things for granted, and enshrine them as the way things are supposed to be. There are other, more important things to do than reinventing the wheel. There is an order to things and a tradition to consider, both of which are as they should be. The product of modernity is a new range of authorities which dictate what is to be taken for granted and what is to be questioned.

Postmodernism is a return to the very modern urge to question everything and make present institutions answer for themselves. It is, in essence, a return to the modern impulse to trust reason and science rather than tradition or authority - even if these very same traditions and authorities have used reason and science in the process of becoming what they are. But instead of asking whether the Earth revolves around the sun or not, it asks: why do we do the things we do the way we do them, and might there not be a better way to go about it?

Postmodernism happened after the modern project. Post-modernism. But it is still very modern. It is modernity turned upon itself.

If you, after having read this, are slightly more confused about postmodernism, then that is good. It will have primed you for this next statement:

Academics stopped talking about postmodernism some decades ago, and are baffled at its return to fame in news and popular culture.

As final words, I say only this: its resurgence is not postmodern. It is merely confusing. -

Friday, February 17, 2017

All is good that is good

It is often said that it is impossible to argue about taste. De gustibus non est disputandum. Some people like some things, other people like other things, and no amount of arguing is going to change this one indisputable state of things. This is where it is at, and thus here we are.

Nevertheless, we often find ourselves in situations where we want to convey why we like something. In matters of literal taste, the argument is simple: just present the person we want to convince with a tasting of the good stuff, and let the taste buds do their thing. Either we succeed or we do not; the outcome depends entirely on factors outside our control. Regardless of outcome, the attempt was made.

When it comes to more abstract things, such as music or writing, a similar approach is also available. Give someone a tasting of the music or writing, and see how they react. Either they get it, and your work is done, or they don't get it, and -

It is possible you at this point want to argue why that thing you like is good. Why the poem your friend is utterly indifferent to is actually amazing, why that song owns the sky and everything below it - why they should like it, too.

This situation presents something of a problem. If you really really like something, then its awesomeness is so self-evident and obvious that it is difficult to find some means of reducing it to mere words or communicative motions. No discursive gesture would convey just how good it is, and attempts to convey it anyway often stray into unrelated territories, causing confusion or disagreement. Which, one might reasonably assume, is the opposite of what you wanted to accomplish.

A first move from here might be to simply state that you like the thing. This may or may not be useful information to the other person - it all depends on your particular relationship and suchlike. But it provides a baseline for further attempts to convey the goodness.

A second move might be to say that someone else likes the thing. Preferably, this third person is someone you both like and acknowledge as someone whose opinion matters. If they like it, then there's got to be something to it, right?

A third move might be to make a more generalized claim about mass (or niche) appeal. If it's famous, then it must be good, or it wouldn't be famous; if it's niche, then it must also be good, as it is an expression of the virtues of the niche.

As lines of argument go, these are rather flawed. But they are also very common. They are human.

Thing is. Giving reasons for why things are good or bad is hard. There are no readily available frameworks for it, and those frameworks that do exist require a non-trivial amount of effort to get into. Most of them hide behind camouflage strategies such as the name "literary critique", and get progressively more invisible from there.

Maybe the proper thing to do is to cut our friends some slack. Give them the benefit of the doubt when their eyes get that enthusiastic gleam. -

Wednesday, February 8, 2017

A thought

The strange thing about thoughts is that most of them are irrelevant. You think them, they flow through the mechanisms of cognition, and then nothing. Nothing comes of it. In the grand scheme of things, whatever thought happened in those irrelevant moments could be replaced by any other thought, and nothing would have changed. Thoughts occupy time, and that is about all they do.

Except, of course, when they do more than that.

Thing is. Most thoughts are never recorded. They happen, take place, and are gone. Some of them are important, some are irrelevant, some would make a difference if only they were jotted down somewhere.

But we never get around to thinking we ought to record them. And then they are gone.

Just thought I'd remind you that you still have the option.