I Don't Want To Be A Nerd!

The blog of Nicholas Paul Sheppard

On embedding morality

2014-10-27 by Nick S., tagged as freedom, philosophy

I've just finished reading Evgeny Morozov's To Save Everything, Click Here (2013), which is something of a rant against what he calls "technological solutionism", or what I might otherwise call "techno-utopianism". Morozov is against a lot of things — so many and in such wide variety that it's hard to know what he is actually for — but one of them is technological systems designed to encourage or coerce good behaviour. Being a researcher in information security, the entire purpose of which might be said to be to coerce behaviour, I felt this idea required closer examination.

Morozov fears that deploying technological and psychological tools (he seems to find Richard Thaler and Cass Sunstein's Nudge (2008) at least as disagreeable as techno-utopians) that affect behaviour might rob humans of their moral responsibilities. Not only might such systems deprive humans of the ability to engage in civil disobedience, he imagines, but they might cause our moral sense to wither away altogether from lack of any opportunity to apply it.

Thaler and Sunstein themselves offer what I think is the most devastating critique of this line of reasoning: the designers of any system, technological or otherwise, cannot choose not to choose. The designer(s) of a system can make various things more or less difficult, or more or less prominent, or more or less valued, and so on, but they cannot design a system with no design. (And refusing to design anything is just accepting whatever choices are embodied in the status quo.)

Deep down, Morozov probably knows this, and he does make a few suggestions as to how he thinks certain systems might be improved. But what about the danger that our moral senses will atrophy through lack of exercise?

I heard a similar thought expressed in regard to digital rights management during a seminar in about 2009. The speaker (whose name I forget) told us that certain critics of digital rights management claim that it inhibits the moral expression of media users by not allowing them to decide for themselves whether or not to obey copyright law. This might sound noble enough, the speaker noted, but not many of us worry that the locks on our doors might inhibit the moral expression of burglars. Most people really do want to inhibit moral expressions that they deem harmful; they just disagree over what is harmful, or over the most effective way of dealing with any particular harmful expression.

In any case, I was recently wondering if establishing a prohibition might exercise our moral sense just as much as (or even more than) not establishing one. When confronted with a rule that I don't understand, I ask: why does this rule exist? The answer may enlighten me about the point of view of the person who made the rule, or may cause me to suggest an improvement to the rule. Perhaps this is my engineering brain trying to figure out how things work. But I generally only feel comfortable with breaking the rule if I've consciously determined it to be a bad one, or myself to be in an exceptional situation.

No one is likely to advocate establishing prohibitions on everything just to make people think harder before they do something. But nor is anyone likely to advocate removing all rules in order to give everyone the opportunity for such thinking. For a start, what guarantee is there that they will think about whatever moral principles might be at stake? And what if someone (such as a burglar) exercises his or her freedom to impose rules on other people?

A better answer is that we need to think when we design the system, which is surely what any good engineer or lawmaker strives to do. There are numerous examples of designers getting it wrong — but also many examples of designers getting it right, or at least better than not doing anything at all. Because refusing to design anything is surely abandoning our moral sense just as thoroughly as unthinking submission to someone else's design.