A Blog by Jonathan Low

 

Dec 9, 2015

Why Does A Field As Dense and Boring As IT Provoke Such Emotion?

Humans have a tendency to fear and hate that which they don't understand and cannot control. JL

John Naughton comments in The Guardian:

One of the most demanding occupations in the contemporary world is that of “IT support”. This is because it involves dealing with people who hover on the brink of suicide, derangement or gibbering rage; others begin to harbor murderous thoughts about the company that made the equipment that now torments them. (It is) a field whose work is intellectually impressive and rapidly produced, but also quite inbred and divorced from real-world concerns.
The IT Crowd notwithstanding, one of the most demanding occupations in the contemporary organisational world is that of “IT support”. This is because it involves dealing with people who hover on the brink of suicide, derangement or gibbering rage. For strange things happen to people when their computers go wrong: some become acutely depressed, suicidal almost – especially when they think that the novel or thesis on which they have laboured for years may have vanished into the ether; others begin to harbour murderous thoughts about the company that made the equipment that now torments them – which is why, even today, Bill Gates stars in many nightmares; and a few entertain conspiracy theories about the IT support staff who, they suspect, have arranged the malfunction simply to humiliate them. Accordingly, the sensible IT support person approaches his or her work circumspectly.
The reason that IT causes such distress is that, to the average human being, a computer is an incomprehensible device. Or, to use a technical term, it’s a “black box”, ie “a device, system or object which can be viewed in terms of its inputs and outputs (or transfer characteristics), without any knowledge of its internal workings”. In fact, of course, the main objective of the computer business is to turn every object it produces into a black box. That’s why taking a screwdriver to your iPhone may void the warranty, and why so many other devices have “no user-serviceable parts”. And it would be foolish to deny that that is exactly how most consumers want things to be. They do not value the “freedom to tinker” that geeks esteem so highly. They just want stuff to work.

So far, so understandable. But there is a direction of travel here – one that is taking us towards what an American legal scholar, Frank Pasquale, has christened the “black box society”. You might think that the subtitle – “the secret algorithms that control money and information” – says it all, except that it’s not just about money and information but increasingly about most aspects of contemporary life, at least in industrialised countries. For example, we know that Facebook algorithms can influence the moods and the voting behaviour of the service’s users. And we also know that Google’s search algorithms can effectively render people invisible. In some US cities, algorithms determine whether you are likely to be stopped and searched in the street. For the most part, it’s an algorithm that decides whether a bank will seriously consider your application for a mortgage or a loan. And the chances are that it’s a machine-learning or network-analysis algorithm that flags internet or smartphone users as being worthy of further examination. Uber drivers may think that they are working for themselves, but in reality they are managed by an algorithm. And so on.
Without us noticing it, therefore, a new kind of power – algorithmic power – has arrived in our societies. And for most citizens, these algorithms are black boxes – their inner logic is opaque to us. But they have values and priorities embedded in them, and those values are likewise opaque to us: we cannot interrogate them.

This poses two questions. First of all, who has legal responsibility for the decisions made by algorithms? The company that runs the services that are enabled by them? Maybe – depending on how smart their lawyers are.
But what about the programmers who wrote the code? Don’t they also have some responsibilities? Pasquale reports that some micro-targeting algorithms (the programs that decide what is shown on your browser screen, such as advertising) sort web users into categories which include “probably bipolar”, “daughter killed in car crash”, “rape victim”, and “gullible elderly”. A programmer wrote that code. Did he (for it was almost certainly a male) not have some ethical qualms about his handiwork?
These thoughts are stimulated by reading a remarkable essay, The Moral Character of Cryptographic Work, by Phillip Rogaway, a computer scientist at the University of California, Davis. “Most academic cryptographers,” he writes, “seem to think that our field is a fun, deep, and politically neutral game – a set of puzzles involving communicating parties and notional adversaries. This vision of who we are animates a field whose work is intellectually impressive and rapidly produced, but also quite inbred and divorced from real-world concerns. Is this what cryptography should be like?”
His answer – and mine – is “no”. And it applies not just to cryptography, but to software in general.
