"Feels free" and the data collection business
Ashlin Lee and Peta Cook contributed another article on surveillance to The Conversation this week, this one highlighting what they see as the inadequacies of the Reset the Net campaign. They say that "while the campaign is laudable in its efforts to raise the issue of surveillance, there are some glaring oversights present", mainly because the campaign neglects the huge amount of data collection undertaken by non-government actors, including some of the campaign's own supporters.
All this drew the usual cluster of comments bemoaning the surveillance society in which we supposedly live. The trouble, as I saw it, is that the targeted advertising for which this data collection is essential is what enables all the "free" services that are so popular with Internet users. Consequently, avoiding or eliminating it is not as straightforward as naïve anti-surveillance commenters (and, indeed, Reset the Net) seem to suppose.
George Burns followed up with a suggestion that early "cypherpunks" and academic free-content advocates provided the foundation for the present dominance of corporate advertising by insisting that content be provided free of charge. It's hard to say whether or not cypherpunks and academics in particular were responsible for the preponderance of advertising on the Internet, but the widespread expectation that Internet services be provided free of charge is surely a major contributor to it.
Working in copyright and technology, I occasionally heard someone suggest that music retailers could combat copyright infringement with a business model that "feels free", which I took to mean some sort of comes-with-music or ad-supported approach in which buyers don't pay for individual tracks. There may be some merit in such models, and "feeling free" certainly works well for Google even if its success in many other endeavours might be debatable. But "feels free" implies "ignorant of the cost", leaving Google and Facebook users acting surprised and offended whenever the data collection activities of these services are mentioned.
I've previously contemplated re-badging so-called "free content" as "ad-enabled content" to more accurately reflect the mechanism by which it is resourced. A harsher critic might suggest that "surveillance-enabled services" would make the message even balder. Either way, it's hard to see how data collection, corporate messaging and other annoyances can be addressed without confronting the business models by which the services in question are delivered.
On being replaced by technology
By way of celebrating fifty years of IEEE Spectrum, the June 2014 issue investigates some technological trends that it hopes will bring us "the future we deserve". Tekla S. Perry (pp. 40-45) describes a part of this future in which computer-generated humans become indistinguishable from actors captured on film. Explaining why we need to create fake humans when we already have seven thousand million real ones — and plenty of them out-of-work actors to boot — takes some doing. Perry makes some interesting points in this direction, but I nonetheless winced on behalf of all of those already-underemployed actors who might be wondering if Perry's future leaves them with anything to do.
Fears that we'll all be put out of work by automation go back a long way. Contemptuous dismissals of such fears, and attendant references to Luddism, probably go back nearly as far. The really interesting thing about replacing the work of actors (if it were to happen) is that we'd be replacing something that people actually enjoy doing, not just some tedious chore that they do for the money. As much as an anti-Luddite might assure me, for example, that the growing economy will find me a new job if university teaching were to be replaced by technology, would I find the new job as inspiring as the old one?
One solution for those who enjoy now-automated tasks is to simply continue to do them as a hobby, just as I and other mediaevalists hand-make costumes, beer, embroidery, and other things even though machines can make the same with much less effort. But that does seem to doom us to spending the best eight hours of every day in uninspiring work done just for the money, fitting our passions into our spare time.
By coincidence, The Drum had Alan Kohler take on automation and unemployment in the same week that I read Spectrum. According to Kohler, "automation is suppressing employment, wages and inflation and will do so for a decade or more to come", giving headaches to central bankers attempting to set policies that increase employment while controlling inflation. This is all great for the owners of said machinery, though, who can obtain all of the revenue from their output without having to pay any workers.
Kohler's argument is too sketchy, and my knowledge of economics too weak, for me to say much about his claim. But the potential for automation to create inequality is also a recurring theme in Spectrum's examination of the possible downsides of its futures: those who control technology can use that power to create even more technology and gain even more power, while the rest languish in technological powerlessness.
The threat in Kohler's and Spectrum's dystopias isn't that automation will one day throw masses of people out of work, as the archetypal Luddites might have feared. It's that automation will slowly transfer dignity and power from the broad mass of people to an elite few who control the system. I doubt that many people miss the drudgery faced by mediaeval peasants, who have now been largely replaced by machinery in developed nations. But will we be so glad to give up the passion, autonomy and self-respect that inspire artistic and professional lifestyles?
On the corporatisation of social spaces
Writing in The Social Interface last year, I supposed that the mainstream media's attention to Facebook and Twitter rather than synthetic worlds came down to the numbers involved: the former simply have many more users than any single instance of the latter. A second possible explanation occurred to me after reading Wagner James Au's The Making of Second Life (2008) recently. The book was written at around the time that I heard all those stories of entrepreneurs and companies opening for business in Second Life, and Chapter 10 has a bit to say about their fates.
Au records that Second Life users largely ignored the corporate spaces, preferring to remain in the areas created by traditional non-corporate Second Life users. The owners of these spaces, one might therefore suppose, have little incentive to talk about Second Life in their own spaces. Meanwhile, many corporations have thousands of Facebook "likes" and Twitter followers, so why wouldn't they prefer to talk about those? Could it be that Facebook and Twitter's visibility comes about because they turned out to be better homes for major media corporations, or at least because their user community was more welcoming to said corporations than the user community of Second Life?
Off-hand, I can't think of any way to test such an hypothesis — certainly not from the armchair in which I write this. But I did happen across a couple of observations consistent with it.
Firstly, I recalled my own recent observations about very large "communities": as nice as it sounds for everyone to participate with everyone else, it just isn't possible to do it. We therefore conduct our public business through large institutions, even if few people have much affection for them. Facebook provided a home where Second Life did not, so there the institutions are, and so is everyone else but a few corporation-averse hold-outs.
Secondly, The Register's Richard Chirgwin drew readers' attention to some marketers' lament that up to 80% of sharing of links and articles occurs via e-mail and text messaging, which marketers have no means of tracking. So the marketers would certainly prefer it if we were all on Facebook. At least one of the marketers involved seems to be so impressed with Facebook et al. that Business Review Weekly quotes her as saying that "dark social [e-mail] is a very interesting development" even though, as Chirgwin observes, people of sufficient age had been using e-mail and text messaging for at least a decade before anyone had heard of what we now call a "social network".
For many of us, I'm sure that keeping 80% of our communications unmonitored by marketers is a sign of hope. And we can remind ourselves that we aren't defined solely by our profiles in major media outlets: Second Lives and e-mails aren't failures just because they don't enjoy the media profile of Facebook or Google, any more than my local baker is a failure because he only sells bread to people in my suburb. If Second Life and World of Warcraft entertain millions of people and keep their operators in business, why worry if some other corporation isn't paying much attention to them?
Unlearning the quest for the latest fad
Towards the end of Here Comes Everybody (2008), Clay Shirky writes about the differences that young people and old people face in adapting to new technologies and circumstances. He seems to think that older people are at a disadvantage because they need to "unlearn" the conventions that they learned when older technologies were in vogue, while young people are ready to take up new technologies from the get-go. On the other hand, he acknowledges that young people are prone to seeing revolutions everywhere they turn.
Shirky might be right within the confines of his discussion, which refers to a particular set of communication and collaboration technologies. I nonetheless think I'd prefer to use the word adapt rather than unlearn: the latter word suggests to me that we've somehow been stuffing our heads full of useless knowledge. But any unlearning seems to me to do at least two disservices to our skills and knowledge of yesteryear.
Firstly, it suggests that those skills and knowledge were pretty superficial to begin with. It's the kind of thinking that presumes that programmers of my vintage, for example, must be totally unable to write Java or mobile applications since we learned to program in C and C++ on desktops. But what we really learned was object-oriented programming and problem-solving, which are just as useful now as they were in 1980. Anyone hung up on the name and syntax of a language probably wasn't a very good programmer in the first place.
Secondly, it's a surrender to technological determinism. We place ourselves at the mercy of the latest technology and its purveyors, unable or unwilling to decide for ourselves which technology (or abstention from it) is really the most effective one for our needs.
I read Jared Diamond's The World Until Yesterday (2012) at around the same time, and found his views on aging somewhat more heartening. Diamond argues that younger people have greater vitality and creativity, while older people have greater depth and breadth of knowledge. These qualities, he thinks, ought to complement each other rather than have us all pining to be twenty. Amongst academics, for example, it's the under-forties who produce the brilliant new theories and proofs in narrow fields, while it's the over-forties who synthesise the knowledge across multiple fields. (Admittedly he's unclear on how this applies to less heady occupations like construction work, athletics and hospitality.)
In this vein, one of my students recently asked how the lecturing staff at my institution were able to teach so many different subjects. Because we've had twenty years to learn it all, I suggested. Furthermore (I might have continued had I been talking to Shirky), we don't need to forget everything we know in order to learn some cool new skill or piece of knowledge: we add the new skills to the old.
I suppose that an enthusiast of the latest technology might say that Diamond, being seventy-something, would say that, and I, being forty-something, would agree with him. Then again, Diamond and I might equally say that our critics, being twenty-something, would say the opposite.
Adventure, human nature and the fashion of the day
I was recently without Internet access at home for a week, apparently due to flooding at my local telephone exchange. I've heard that some people get very upset at losing their connectivity even for periods much shorter than a week, most recently in a Conversation article from Michael Cowling claiming that "we are all connected, every minute of every day, and without your phone you are on the outskirts of everybody else’s new, more digital, world." The local newspaper also ran a suitably angry headline on a stand outside my local newsagent towards the end of the outage. (I didn't read the newspaper itself.)
Frustrating as the lack of connectivity might have been on occasions, I actually found myself enjoying the adventure of a daily trip to the local library or city mall, where I could check my e-mail using WiFi services provided by the local council. (I used to wonder what use public WiFi would be given that we all have Internet connections at home anyway, but now I know.) I was reminded of the days of dial-up modems, when connecting to the Internet was a minor treat, and I maintained a list of Internet-things-to-do to be serviced by dialling in for a couple of hours every day or two. The only really annoying thing, in fact, was that I fell behind in my Coursera studies due to an inability to download course videos over the public WiFi network. I was almost disappointed when the fault came to an end and the adventure was over (though I did catch up on my studies).
One might suppose that I'm quite a different person to the smartphone-driven folk that inhabit the world described in Cowling's article. I'm certainly older. On the other hand, I presume that the video that Cowling presents to support the quote at the beginning of this entry is staged — not even the youngest and most gadget-conscious of my acquaintances or students behaves anything like the folks shown in it, and I'm sure that most people would regard those folks' behaviour as anti-social and obnoxious.
I recently went on a camping trip during which I was told that a young camper fitting Cowling's description had, in the course of the camp, discovered that she could, in fact, enjoy time without her gadget. One can speculate that I've just had twenty years longer than her to find this out, not to mention first-hand experience of a time when everybody went without a mobile phone all the time.
Perhaps being without the Internet appeals to the same part of us that camping does. I don't suppose I'd want to be camping indefinitely, though maybe I could if I had to, given that I'm of the same species as ancestral humans who reached every scrap of land except Antarctica without motorised transport, electricity, or even agriculture. Similarly, my younger acquaintances can surely go without their phones for a bit, and might even enjoy it up to a point, given that all of us did just that only twenty years ago. We just need to remember that there's more to us than the fashion of the day.
