Retrovolution
I had intended to wait until completing my Coursera course on university teaching before writing another comment on massive open on-line courses, but today I read some words from recent ex-Vice Chancellor Jim Barber in The Australian's higher education supplement (Seven lessons of survival in an online world, 9 April 2014, p. 29) that prompted me to write earlier. Barber, along with some elements of the course I'm studying, seems to believe that there is something terribly wrong with university education (and has been for years, if not centuries), and that the whole business is about to be swept away by fantastic new approaches and technologies that will have our students dancing in the streets with joy, not to mention heads full of knowledge.
Coursera's course being presented by American-accented lecturers at Johns Hopkins University, I was much reminded of the complaint that "Americans don't understand irony" as I learned about the need for innovative teaching techniques from a talking head with slides, and "discussed" the value of learning in small groups with hundreds of other learners on the discussion boards. Barber himself thinks that anyone "who continues to believe that the purpose of a lecture is to transmit information [needs] to be dispatched to a re-education camp" though he doesn't state what he thinks the purpose of a lecture actually is. (He may mean "class" rather than "lecture", since I take the very definition of the latter to be the oral transmission of information from the speaker to the listeners.)
Teaching Americans about irony and lecturers about communication aside, I actually found the course interesting and informative, with a good balance between delivering established knowledge, enabling student thought and discussion, and providing exercises that put relevant skills into practice. But this leaves me only more puzzled as to what Barber and his fellow revolutionaries are on about: I learned engineering using much the same combination of techniques twenty years ago, and it's no surprise to me that people use them, because they work (at least for me, and the many students who've successfully completed courses at my current institution). Sure, they're on a web site instead of in a classroom now, but I wonder where the revolutionaries have been if they think that such techniques appear only in science fiction.
Sorel Reisman, writing in the April 2014 issue of IEEE Computer (The Future of Online Instruction, Part 1, pp. 92-93), seems to me to have a much better grip on the state of on-line education than many of its enthusiasts: he observes that MOOCs are simply learning management systems that support large enrolments, and that learning management systems are themselves simply content delivery systems tailored for educational content. He himself thinks that any real advances in on-line education must come from what he calls "adaptive learning", where the learning system adapts to the needs of individual learners. (Coursera's course recommends more or less the same idea under the name "personalisation", but the focus there is on how human teachers can provide it.)
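As a rough sketch of what such adaptive learning might involve (this is my own toy illustration, not anything described by Reisman or implemented by Coursera), imagine a quiz engine that keeps a running score per topic and always serves up the topic the learner is currently weakest on:

    from collections import defaultdict
    import random

    class AdaptiveQuiz:
        def __init__(self, exercises):
            # exercises: a mapping of topic name -> list of questions (made-up structure)
            self.exercises = exercises
            self.scores = defaultdict(lambda: {"right": 0, "wrong": 0})

        def record(self, topic, correct):
            # Update the learner model after each answer.
            self.scores[topic]["right" if correct else "wrong"] += 1

        def next_topic(self):
            # Choose the topic with the lowest success rate so far;
            # topics the learner hasn't attempted count as 0% and come first.
            def success_rate(topic):
                s = self.scores[topic]
                attempts = s["right"] + s["wrong"]
                return s["right"] / attempts if attempts else 0.0
            return min(self.exercises, key=success_rate)

        def next_question(self):
            return random.choice(self.exercises[self.next_topic()])

A serious system would need a far richer model of the learner than a per-topic success ratio, but the basic loop is the same: observe the learner, update a model of what they know, and choose the next activity accordingly.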
A recent conversation with an experienced high school teacher suggested that the same phenomenon exists in schools: every now and again, revolutionaries arrive and ask teachers to "update" their methods with techniques that teachers have been using for decades, possibly under a different name. Perhaps such practices could use a buzzword of their own, maybe retrovolution?
Unwieldy communities
I recently signed up for an on-line course in university teaching with Coursera, in part because I was curious to see how massive open on-line courses ("MOOCs") work and in part as a lower-commitment alternative to studying for a full-scale graduate certificate in higher education that I decided I wasn't currently able to afford or commit to. I might write more about the MOOC experience when the course is over, but I was first inspired to make a few observations on very large communities.
Logging in to the course for the first time, I was immediately impressed by my own smallness. There's nothing like glancing over hundreds of posts from other learners introducing themselves to remind oneself of what a tiny part of the world one occupies, even the relatively elite world of university teachers. Having happened to re-read Douglas Adams' Hitchhiker's Guide to the Galaxy recently, I readily identified with the Total Perspective Vortex used to torture prisoners by showing them just how insignificant they are compared to the universe in its entirety.
I quickly saw that I was only going to be able to skim over the posts made by other learners, and that I couldn't expect other learners to spend any more time appreciating whatever I was going to contribute. I came to similar realisations reading Usenet articles in the 1990s, and again observing the comments sections of popular news web sites more recently: with so many articles and comments out there to read, and many of them being less than enlightening, reading all of them is a fool's task.
Here lies a problem for the idea that blogs and comments would radically democratise media and political discussion: it simply isn't feasible to hold a conversation with millions of participants. Matthew Hindman details the result for political blogs in The Myth of Digital Democracy (2008): only a tiny handful of blogs have a wide readership, and they're mostly written by the same kind of people who previously wrote widely-read newspaper columns.
Going back to my course, I came to see the main value of posting to the discussion board to be not in intimate conversation with hundreds of my fellow learners, but in working through my own thoughts and putting them into a form in which they might be digested should someone happen to read them. (I take much the same view of this blog.) When reading the discussion board, I can only hope to get an overview of what everyone else is talking about, with only the occasional pause to read an eye-catching item in more depth.
So I hope my classmates won't be too offended if I miss any posts that they've slaved over, only to have them drown in a sea of other posts. It'll take more than a nice web site to expand our brains to encompass conversations with a hundred other people.
Software development without coding
I recently found myself with contradictory reactions after reading an article on "citizen developers" on the ABC's Technology & Games site. Peter Fuller writes about the potential for ordinary users to develop their own software using "application platform-as-a-service" (aPaaS) technology. This technology is supposed to allow what software engineers call "domain experts" to construct their own domain-specific software without recourse to professional analysts and developers, at least for relatively simple applications.
My first reaction was that Fuller might be making a misguided attempt to promote software development as fun and easy. This reaction was probably also influenced by hearing, in another recent ABC report, the assertion that maths and science ought to be "fun" in order to attract school students; that report had Australia's Chief Scientist bemoaning an alleged fall in education standards. I've long wondered if such advice might be misguided: mastering mathematics, science and engineering requires substantial effort, and anyone expecting fun and games is surely kidding themselves. Mastery might be all of rewarding, interesting and useful, but it's not fun in any conventional sense. As one of my harder-partying friends observed during our undergraduate days: "I can't really call myself a hedonist; I'm studying engineering."
Upon further reflection, I began to wonder if aPaaS or similar technology might also provide an opportunity for users to take control of their computers where they are willing to make an effort, but don't have the time to turn themselves into professional software developers. Could aPaaS be one approach to the critical computing that I pondered last month?
Not having investigated any aPaaS software myself, or observed any non-developers using it, I can't say for sure which reaction comes closer to the truth. My experience with integrated development environments hasn't been encouraging: they seem to encourage even professional software developers, let alone students and amateurs, to produce lazy code that satisfies the formal syntax of the language but omits error handling, meaningful comments and other qualities of well-made software. And I'm pretty sure I've heard similar claims about non-programming developers before — most recently, in the form of "mash-ups" — but I'm yet to see much useful software that is actually made this way.
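To make concrete what I mean by code that satisfies the syntax while omitting the rest, here's a contrived Python example of my own (the function names and file contents are invented for illustration): the first version runs happily in a demo, while the second states its assumptions and says something useful when the input is bad.

    # The lazy version: syntactically valid, and it even works on the happy path,
    # but there is no error handling and no hint of what the input should look like.
    def load_scores(path):
        return [int(line) for line in open(path)]

    # A more careful version of the same function: it documents its assumptions,
    # closes the file, tolerates blank lines, and reports exactly which line was
    # bad instead of dying with a bare ValueError inside a list comprehension.
    def load_scores_carefully(path):
        """Read one integer score per line from the text file at `path`."""
        scores = []
        with open(path) as f:
            for lineno, line in enumerate(f, start=1):
                line = line.strip()
                if not line:
                    continue
                try:
                    scores.append(int(line))
                except ValueError:
                    raise ValueError(f"{path}:{lineno}: expected an integer, got {line!r}")
        return scores

Both versions pass the syntax check, and both "work" in a quick demonstration; the difference only shows up when something goes wrong, which is precisely what a quick drag-and-drop demonstration never exercises.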
Most likely, though, aPaaS can be used in both modes (as can integrated development environments): careful users can use them to increase their productivity and the control that they have over their computers, while superficial observers confuse cobbling together a few Lego blocks with engineering. Fuller makes a similar point with a cooking analogy: many of us can put together a satisfying meal for a few friends, but we employ professional caterers when it comes to preparing a six-course meal for a hundred guests at a big event.
What's a STEM crisis?
I've recently been reading a bit about a possible "STEM crisis", or lack of one, mostly in IEEE Spectrum, but also on The Conversation. "STEM" is an acronym for "Science, Technology, Engineering and Mathematics", and the crisis, if it exists, is supposed to be caused by a shortage of graduates in STEM disciplines.
The disputants seem to me to be asking two somewhat different questions. STEM enthusiasts like professional societies and chief scientists start with the assumption that STEM is a good thing that we should be doing more of, and argue that we should therefore have more STEM graduates to do it. Economists and out-of-work STEM graduates start with the observation that there are already numerous un- and under-employed STEM graduates, and argue that we should therefore have fewer of them.
These two views are perfectly consistent if one accepts that we, as a society, ought to be doing more STEM. If so, the enthusiasts are really saying that there is a crisis in the amount of STEM being undertaken. STEM graduates experience this crisis as an inability to find work.
How much STEM should we be doing? In the Conversation article cited above, Andrew Norton assumes that we should be doing exactly that STEM for which buyers are willing to pay (manifested in Norton's article by how many STEM graduates employers are willing to hire). Taken at face value, this is more or less the answer one gets from basic free market economics: if doing some STEM gets the greatest value out of all the ways buyers could use the resources involved, the buyers will pay for that STEM. If doing something else with those resources gives the buyers a greater benefit, the buyers will do something else.
I think that most people, however, would agree that a significant proportion of STEM has the character of what economists call a "public good". Public goods are items like defence forces and street lighting for which it is difficult or impossible to charge people according to their individual use. Markets may under-invest in public goods since would-be investors can't extract payment for them even though buyers exist who would actually use them.
Norton implicitly assumes that the government has estimated the value of public STEM and invested a suitable amount of tax money into it, creating a matching demand for STEM graduates in government-funded programmes. I suspect that the enthusiasts, however, place more or less infinite value on STEM. For them, there will always be a "STEM crisis" because no amount of government or industry investment can ever realise such a value.
Boldly going where many have been before
A couple of weeks ago, The Australian's higher education section quoted Anant Agarwal, president of edX, saying that "education hadn't really changed for hundreds of years" (27 March 2013, p. 26). I don't know which schools and universities Agarwal has visited over the past two or three hundred years, but the statement drew my attention to a tried-and-true technique of would-be revolutionaries: deny that anything that happened before today was of any consequence.
For those dreaming of the day that computers revolutionise education, university lecturers are apparently still getting about in black robes and discussing the finer points of Galenic medicine in Latin with their exclusively white male students. It makes me wonder who's really out of touch here.
Of course modern schools and universities also continue some practices that would be familiar to their mediaeval forebears, including the human teachers and lecturing and tutoring that I take to be the subject of on-line educational scorn. But perhaps there's a reason for this continuity: it works. Would anyone suggest that the Roman alphabet is due for a shake-up just because "it hasn't really changed for hundreds of years"?
Writing about older computer workers' difficulties in finding employment in her book Cyberselfish, Paulina Borsook speculates that older workers might be disadvantaged by the lack of excitement they show when presented with a new buzzword that looks suspiciously like technology they worked with ten or twenty years ago. So we're using thin clients to access our data in the cloud now? Sounds rather like the dumb terminals and mainframes we covered when discussing the history of computing in my operating systems class last week. Unhampered by any knowledge (or at least experience) of history, younger workers impress by the excitement they show when encountering an idea for the first time.
Similarly, massive open on-line courseware seems so much more exciting if one has never encountered — or makes a habit of ignoring — the textbooks, video lectures, educational software and on-line learning management systems that existed before it. And Heaven forbid that any of our ancestors ever had a good idea.
