I Don't Want To Be A Nerd!

The blog of Nicholas Paul Sheppard

Retrovolution

2014-04-11 by Nick S., tagged as buzzwords, education

I had intended to wait until completing my Coursera course on university teaching before writing another comment on massive open on-line courses, but today I read some words from former Vice-Chancellor Jim Barber in The Australian's higher education supplement (Seven lessons of survival in an online world, 9 April 2014, p. 29) that prompted me to write earlier. Barber, along with some elements of the course I'm studying, seems to believe that there is something terribly wrong with university education, and has been for years if not centuries, and that the whole business is about to be swept away by fantastic new approaches and technologies that will have our students dancing in the streets with joy, not to mention heads full of knowledge.

With Coursera's course being presented by American-accented lecturers at Johns Hopkins University, I was much reminded of the complaint that "Americans don't understand irony" as I learned about the need for innovative teaching techniques from a talking head with slides, and "discussed" the value of learning in small groups with hundreds of other learners on the discussion boards. Barber himself thinks that anyone "who continues to believe that the purpose of a lecture is to transmit information [needs] to be dispatched to a re-education camp", though he doesn't state what he thinks the purpose of a lecture actually is. (He may mean "class" rather than "lecture", since I take the very definition of the latter to be the oral transmission of information from the speaker to the listeners.)

Teaching Americans about irony and lecturers about communication aside, I actually found the course interesting and informative, with a good balance between delivering established knowledge, enabling student thought and discussion, and providing exercises that put relevant skills into practice. But this leaves me only more puzzled as to what Barber and his fellow revolutionaries are on about: I learned engineering using much the same combination of techniques twenty years ago, and it's no surprise to me that people use them, because they work (at least for me, and the many students who've successfully completed courses at my current institution). Sure, they're on a web site instead of in a classroom now, but I wonder where the revolutionaries have been if they think that such techniques appear only in science fiction.

Sorel Reisman, writing in the April 2014 issue of IEEE Computer (The Future of Online Instruction, Part 1, pp. 92-93), seems to me to have a much better grip on the state of on-line education than many of its enthusiasts: he observes that MOOCs are simply learning management systems that support large enrolments, and that learning management systems are themselves simply content delivery systems tailored for educational content. He himself thinks that any real advances in on-line education must come from what he calls "adaptive learning", where the learning system adapts to the needs of individual learners. (Coursera's course recommends more or less the same idea under the name "personalisation", but the focus there is on how human teachers can provide it.)

In a recent conversation, an experienced high school teacher told me that the same phenomenon exists in schools: every now and again, revolutionaries come to the school and ask teachers to "update" their methods with techniques that teachers have been using for decades, possibly under a different name. Perhaps such practices could use a buzzword of their own: retrovolution, maybe?

Winning the lottery and other certainties

2014-04-06 by Nick S., tagged as employment

Every now and again, someone who knows that I'm a software developer, but is not a software developer him- or herself, tells me about how much money certain software developers make by developing mobile applications (or, in one recent case, web sites). Knowing quite a few software developers (including myself) who are somewhat less wealthy than the heroes of these stories, I've come to suspect that telling the stories is akin to telling an out-of-work actor that Tom Cruise makes millions of dollars starring in Hollywood blockbusters. Sure, a select few people do make millions of dollars by hitting upon the right software or being picked up for the right films, but most of us surely have much more modest prospects — just ask your average actor.

The Sydney Morning Herald recently ran a happy, but measured, story along these lines in its "My Career" section (Dotty over phone apps, 29-30 March 2014, p. 13). It's not completely clear to me what the hero of the article is actually employed to do, but the thrust of the article is that he has successfully converted an interest in computing into a career by finding work with an iPhone app developer. In that sense, he doesn't seem much different from me or a lot of other computer scientists: we started out with an interest in computing, got degrees in it, and ended up working with computers in one capacity or another. I imagine similar things happen for people with interests in other things, at least some of the time.

To its credit, the SMH story doesn't make it out to be quite so easy as my non-developer colleagues sometimes seem to think it is, and nor is its hero made out to be particularly wealthy. Nonetheless, My Career and similar publications only interview winners: people reading the employment pages of the newspaper aren't likely to be seduced by the lives of sessional academics, out-of-work actors and other frustrated folks.

I suppose a gung-ho business type might say that I lack the entrepreneurial spirit required to create an opportunity that would make me rich. Perhaps I do: I'm typically risk-averse and I didn't take up engineering because I was interested in running businesses (or, for that matter, because I expected to be wealthy). But I think such gung-ho-ness might also be glossing over the part of the definition of entrepreneur that involves being the one who bears the risk of an enterprise. Cherry-picking the fates of successful entrepreneurs, who accepted the risk and won, ignores the fate of the many entrepreneurs who accepted the risk and lost. The result is the dubious supposition that entrepreneurial behaviour is a sure path to wealth.

Coincidentally, the April 2014 issue of IEEE Spectrum contained an insert full of advice to job-seeking graduates that nicely illustrates this sort of cherry-picking, apparently unconsciously: three short articles outlining all the wonderful job opportunities that exist throughout the world are followed by a long article (How to Stand Out in Your Job Search) describing how graduates can stand out from the hundreds of other graduates also seeking such opportunities. Sure, someone out of those hundreds is going to end up with the job, and you have to be in it to win it, but surely no level-headed assessment of such competition could conclude that everyone, or even many people, are destined to win.

Unwieldy communities

2014-03-31 by Nick S., tagged as communication, education

I recently signed up for an on-line course in university teaching with Coursera, in part because I was curious to see how massive open on-line courses ("MOOCs") work and in part as a lower-commitment alternative to studying for a full-scale graduate certificate in higher education, which I decided I wasn't currently able to afford or commit to. I might write more about the MOOC experience when the course is over, but I was first inspired to make a few observations on very large communities.

Logging in to the course for the first time, I was immediately impressed by my own smallness. There's nothing like glancing over hundreds of posts from other learners introducing themselves to remind oneself of what a tiny part of the world one occupies, even the relatively elite world of university teachers. Having happened to re-read Douglas Adams' Hitchhiker's Guide to the Galaxy recently, I readily identified with the Total Perspective Vortex used to torture prisoners by showing them just how insignificant they are compared to the universe in its entirety.

I quickly saw that I was only going to be able to skim over the posts made by other learners, and that I couldn't expect other learners to spend any more time appreciating whatever I was going to contribute. I came to similar realisations reading Usenet articles in the 1990s, and observing the comments sections of popular news web sites more recently: with so many articles and comments out there to read, and many of them being less than enlightening, reading all of them is a fool's task.

Here lies a problem for the idea that blogs and comments would radically democratise media and political discussion: it simply isn't feasible to hold a conversation with millions of participants. Matthew Hindman details the result for political blogs in The Myth of Digital Democracy (2008): only a tiny handful of blogs have a wide readership, and they're mostly written by the same kind of people who previously wrote widely-read newspaper columns.

Going back to my course, I came to see the main value of posting to the discussion board to be not in intimate conversation with hundreds of my fellow learners, but in working through my own thoughts and putting them into a form in which they might be digested should someone happen to read them. (I take much the same view of this blog.) When reading the discussion board, I can only hope to get an overview of what everyone else is talking about, with only the occasional pause to read an eye-catching item in more depth.

So I hope my classmates won't be too offended if I miss any posts that they've slaved over, only to have them drown in a sea of other posts. It'll take more than a nice web site to expand our brains to encompass conversations with a hundred other people.

On the myth of the machine

2014-03-19 by Nick S., tagged as philosophy

I've recently been reading a bit about the sciences versus the humanities, having worked my way through Neil Postman's Technopoly (1993), Joseph Weizenbaum's Computer Power and Human Reason (1976), Lewis Mumford's The Myth of the Machine (1967, 1970) and finally something of a rant about the alleged STEM crisis from Hal Berghel in the March 2013 issue of IEEE Computer (pp. 70-73). Each complains about what they see as a "mechanisation" (as suggested by Mumford's title) of society, driven by a narrow pursuit of economic efficiency and technological progress at the expense of real human interests.

I've never quite understood some of the antagonism that seemed to exist between disciples of the sciences and the humanities around the middle of the twentieth century, and arguments over the merits of quantitative vs qualitative research. Maybe everyone was over it by the time I began studying for my undergraduate degree in the 1990s, having finally accepted that there are many interesting fields of endeavour and many valid approaches to research with their own strengths and weaknesses.

Like a lot of other scientists and engineers, I have considerable affinity for hierarchical reductionism, in which any particular system is studied and explained in terms of its immediate sub-components: biology is explained in terms of biochemistry, which is explained in terms of chemistry, which is explained in terms of physics, for example. As far as any credible science can tell, humans are indeed made up of sub-atomic particles and the forces that act on them, but hardly anyone supposes that sub-atomic physics is an effective tool for describing or understanding, say, art or politics. At the same time, to claim that humans somehow transcend or defy the laws of physics is a likely recipe for bullshit.

The real problem for humanists, perhaps, is that few people feel the need to hire historians, philosophers and art critics in the way that they hire accountants, physicians and engineers. Yet almost everyone is interested in art and history to some degree, and meaningful participation in society surely requires some knowledge of that society's culture, history, philosophy and much else besides. In a sense, we're all amateur humanists, but we leave science and engineering to the professionals. Consequently, the humanities become invisible to narrow economic analyses that track only the transfer of material wealth from one person to another.

The real enemy here is narrowness, whether it be an economist's pre-occupation with material wealth, an engineer's pre-occupation with machines, or a humanist's pre-occupation with soul. If you want to achieve some narrow task, a machine is indeed likely to be an excellent tool for performing it efficiently and well. But who wants to be a machine?

Who's greedy?

2014-02-19 by Nick S., tagged as commerce, digital media

The Conversation's David Glance recently argued that copyright reform, rather than attempts to stop copyright infringement, will drive innovation. It's not clear to me why copyright reform and prosecution of copyright infringers should be the mutually exclusive exercise that the article's title implies — after all, why not just get rid of the law altogether if no enforcement of it is necessary? — but my main quibble with the article is its treatment of what Glance calls the "de facto application of copyright law", that is, the rules that people actually apply in practice irrespective of what the Copyright Act might say.

Glance claims that "society is behaving collectively to determine what they consider fair and reasonable" in the making of recordings. His examples, however, refer only to the behaviour of copyright users, and it seems to me that he is really describing situations in which "copyright users are behaving collectively...", since the copyright-owning part of society is nowhere to be seen. Said examples include the claim that television viewers don't download infringing copies of television programmes (specifically, Game of Thrones) because they don't want to pay, but rather in order to circumvent time and place restrictions that they don't like.

Ed Hotan's comment contradicts this, though Hotan himself seems to support the downloaders: Game of Thrones was, in fact, available in Australia at the same time as it was in the US; its viewers just didn't want to pay the fee being asked by Foxtel. For Glance and Hotan, this seems to justify (or at least excuse) a television viewer who makes the unilateral determination that zero is a "fair" price according to some unspecified theory of fairness.

Now, the price being asked by Foxtel was pretty steep if all you wanted was Game of Thrones, since Foxtel's pricing is based on purchasing a whole year's worth of television. But the price being offered by downloaders was also pretty low, and doesn't appear to be justified by anything other than "I want it now". So who's being greedy here?

I've come to sense a certain loss of perspective in these debates, especially since the security of digital media ceased being my day-to-day concern at the conclusion of my last research contract. Hearing the howls of outrage at Australians' alleged inability to watch Game of Thrones at the same time as viewers in the US, one might take the watching of US television to be a human right. But jeez, guys, it's just a television programme, and not much more than an aimless compilation of sex and violence at that. George R. R. Martin himself seems to have lost interest in the original characters and plot by the time he got to Book 4.

If we don't want to pay the producer's price, we do have the choice to do without, just as most of us do without luxury yachts because we don't want to pay the prices being asked for them. So, if the media industry really is composed of greedy pigs not worthy of the fees that they want us to pay, why not really stick it to them by ignoring their output?