I Don't Want To Be A Nerd!

The blog of Nicholas Paul Sheppard
Posts tagged as education

Dream jobs and nightmares at home

2013-02-08 by Nick S., tagged as education, employment

Today I read two articles expressing more or less opposite views (or at least hopes) of employment in engineering. On one hand, IEEE Spectrum presented its annual round-up of dream jobs, beginning with a lament that "unflattering stereotypes persist, and they're tired [and] out of touch with reality". The Register, on the other hand, reports the views of one Mike Laverick that you need a home lab to keep your job — that is, you need to be exactly the kind of technology-bound stereotype that Spectrum wishes to take on.

Spectrum is, of course, cherry-picking a very small number of individuals who have what it describes as "dream jobs" involving travelling around the world, working on exotic projects and/or making noble contributions to humankind. The Register might be more representative of the common mass of engineers working on (presumably) worthwhile but unglamorous projects for mundane employers in their home towns. The commenters on The Register's article certainly sound a lot more like the lab-at-home folks than the high-flying Renaissance men and women featured in Spectrum.

I don't have a lab at home, and, indeed, resent the notion that I ought to spend my spare time training up for a job for which I have high formal qualifications and years of experience, and in which I continue to work day in, day out. Laverick's attitude seems to me to pander to what I think are wrong-headed views of programming that prioritise familiarity with the latest buzzword over the fundamental engineering skills possessed by truly competent programmers. Yet buzzwords are doomed to come and go, and I, at least, feel I have better things to do than pursue them in a lab at home. What other trade or profession demands that its members practice their craft not just in their professional lives, but in their spare time also? (One even hesitates to imagine the goings-on in the home labs of, say, surgeons and nuclear engineers.)

So I'm probably more typical of the readership of Spectrum than of The Register. Indeed, it isn't immediately obvious that Spectrum's dream workers have a better lot than I do (though I suspect they earn more money). Simon Hauger appears to do more or less the same kind of work that I and numerous teachers do, while Marcia Lee appears to do more or less the same kind of work that I and numerous other software developers do, albeit for a better-known employer than most of us have (the Khan Academy).

Perhaps the most important aspect of Spectrum's dream jobs is that they show some vision for engineering beyond engineering itself. While Laverick and his followers are beavering away in their home labs in pursuit of yet more engineering cred (or at least buzzwords), Hauger, Lee and the rest are thinking about how science and engineering serve education, art, adventure and human development. This might sound like a utopian dream, but, if engineering isn't serving something like this, why are we doing it?

The University of Western Sydney set to deploy black boxes

2013-01-06 by Nick S., tagged as education

The University of Western Sydney ("UWS") recently announced that it would give all new students an iPad. Numerous commentators on The Conversation and elsewhere have — probably rightly, in my view — panned the initiative as an example of marketing over substance.

UWS' own information on the initiative provides a vague assurance that "the iPad initiative will assist academic staff in the delivery of cutting edge learning and teaching." The concrete examples that follow are limited to online lectures and library services, which have been available for a decade or more at universities around the world, and work fine with devices that existed long before the iPad.

The Conversation quotes one Phillip Dawson observing that the iPad may help bridge the "digital divide" (though he thinks it is an expensive option). I can certainly see a lot of sense in providing facilities that allow students, no matter what their background, to participate in their courses and complete the work required of them. UWS, however, seems to have fallen victim to the black box fallacy in thinking that iPads are the solution for all courses. Given that much university work involves writing essays, doing mathematics and (in the courses that I teach) writing computer programs, what are students expected to do with a device without a keyboard?

Dawson goes on to observe that students can expect "this sort of technology will be an integral part of the learning experience at UWS", which seems consistent with UWS' own announcements as well as the comments of Simon Pyke on a similar initiative at the University of Adelaide. If so, I pity the academics at UWS (and the University of Adelaide) who I suppose are being asked to teach to the technology instead of being offered the technology that best supports their teaching. I fear to write what I would think if someone told me that I had to teach programming using an iPad, which I understand to have no keyboard, no compiler, and no ability to run programs until they have been approved by Apple.

I'm pretty sure that Apple will be the biggest winner out of UWS' purchase. Apple will sell thousands of devices, and add UWS' imprimatur to its educational credentials. Maybe the students will get a piece of equipment with some value as a content delivery and communications tool, but to what are they going to turn when they want to practice the critical thinking, scientific skills, art and communication skills that they actually came to university to develop?

Checking facts and faking expertise

2012-12-21 by Nick S., tagged as dependence, education

Jason Lodge recently asked on The Conversation: is technology making us stupid?

Of course this depends somewhat on what one considers to be "stupid". As Sue Ieraci's comment observes, "every generation appears to value its own ways of knowing and relating above those of the generations above and below." Lodge's article starts by asking whether or not rote learning has been displaced by ready access to sources of information such as Google. If so, we might be becoming "stupid" insofar as intelligence is measured by an ability to remember facts.

I, and probably Jason Lodge also, would be surprised if anyone still considered rote learning to be the pinnacle of "intelligence". Well before the World Wide Web even existed, there was far more information in the world than any one person could be expected to remember, and how many teachers these days would consider their students to be "intelligent" merely for copying something into an essay or computer program? Modern educators therefore prize skills like knowing how to find information, determining whether or not it is reliable, and synthesising it into a coherent response to a question.

I think that being able to recall a certain breadth of factual information is nonetheless useful: imagine that you had to resort to a dictionary to look up the spelling and meaning of every noun you came across! And imagine what a teacher I would be if I had to look up the textbook every time a student asked a question!

I suppose that knowing what needs to be remembered, and what can be left for looking up, is a skill of its own. A Java programmer who can remember the difference between "int", "double" and "String" is surely going to be far more productive than one who can't, for example, but it's probably safe for that same programmer to leave the business of parsing hexadecimal numbers with the java.util.Scanner class to a quick look at the documentation, should the need ever arise.
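
A moment with that documentation turns up something like the following minimal sketch. (The class name and the input string are mine, purely for illustration.)

    import java.util.Scanner;

    public class HexLookup {
        public static void main(String[] args) {
            // Scanner can read integers in a radix other than ten; the
            // details are easy to look up rather than memorise.
            Scanner scanner = new Scanner("ff 1a 2b");
            scanner.useRadix(16);                      // treat tokens as base 16
            while (scanner.hasNextInt()) {
                System.out.println(scanner.nextInt()); // prints 255, 26 and 43
            }
            scanner.close();
        }
    }

Whether it's useRadix() that one wants here, or Integer.parseInt(token, 16), is exactly the sort of detail that can safely be left to a quick look-up.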

When giving advice about presentations to my research students, I often advise them that they ought to be able to talk knowledgeably about their subject without having to look everything up as they go. The title of the article aside, I guess Lodge is really asking whether or not technology has made us complacent about what constitutes "knowledgeable". Has ready access to search engines and the like, he asks, made us imagine we are experts in subjects that we can't actually talk about except insofar as we can look them up?

Universities to be replaced by technology?

2012-08-20 by Nick S., tagged as education

The Conversation recently published a couple of articles on "massive open online courses" (MOOCs), firstly by David Glance and later by Simon Marginson. These courses, created and made available by various prestigious universities in the US, are supposed to enable any willing student to undertake a subject for free and obtain a "statement of accomplishment", or similar document. Both Glance and Marginson appear to believe that MOOCs are "disruptive", and wonder if such things will mean the end of university education as we know it.

Being employed as a teacher in a university, I might, of course, not have much to look forward to in being replaced by online courseware produced by teachers with much more prestige than I have. As Gavin Moodie documents in his comments on the above articles, however, various sorts of courses have been available before -- all the way back to textbooks -- and people like myself are yet to be replaced.

Much of what I read about technology in education, particularly at the pop level, says a lot about technology and not much about education. I consequently found a lot to like in Tony Harland's critique of the supposed rise of a "net generation" amongst students and the resulting technophilia in Chapter 6 of his recent book University Teaching: An Introductory Guide. Harland quite rightly points out that it is highly unlikely that "students have undergone rapid evolution into some new type of hominid" whose learning needs are radically different from those of students of previous generations.

Why teach when I could be developing software?

Applying for lecturing positions, I've sometimes found myself responding to selection criteria like "An interest in developing the use of new technologies and approaches in teaching and learning" (this particular example comes from a position description for a Lecturer in Computer Science at Charles Sturt University in 2011). At the risk of making myself unemployable at such institutions, I'll admit to feeling unsure of how best to answer criteria that seem to me to make technology an end in itself.

Of course I use technology in my teaching where appropriate technology is available and I believe it will help the students learn or meet the administrative needs of the university. I'm sure I'd have a very tough time teaching programming without a computer lab. I've even had a thought or two about how I might write some software to provide a useful teaching tool. But, as a teacher, am I supposed to be focusing on the development of technology, or on the development of teaching?

A few years ago, I went to a research seminar presented by a mathematician friend of mine, in which he took the now-extraordinary step of writing out his material on a whiteboard instead of bringing a computer pre-loaded with slides. I thought it worked fantastically well: I, at least, find it much easier to follow mathematics by watching someone write it out line-by-line rather than being confronted with a slide full of equations.

Should my friend be chastised for failing to develop technology to support his presentation, or for ignoring disruptive trends in presentation technology? Or did he just use the best technology for the job?