The Glass Cage: Automation and Its Consequences
The obvious answer to this problem is to build fail-proof technologies, but we all know that does not happen. Even the most ardent automation advocates would accept that we cannot know every eventuality in advance and plan for every contingency. Indeed, a more realistic appraisal would also acknowledge that sometimes we choose not to engineer for a contingency even when we know of it, because the economics of production must weigh the likelihood of an event against the cost of guarding against it. So we will almost always build automated systems that can fail, and then attach a human operator tasked with acting during such failures; yet this operator will be, by the very design of things, utterly incapable of doing so. Automation is therefore a self-destructive paradigm, unless we learn to think about technology in a different way.
Mr Carr becomes really interesting at this point. He is not a technology-averse eccentric, but a sensible advocate of human-centred technology. His point is that we have got our priorities wrong in automation and are asking the wrong questions. This is because technologies are not value-neutral; they reflect the priorities of their makers. The automation technologies we are designing are not about making lives better, but about displacing labour and human intervention in general, which is, to use a popular expression, a project of the 1% against the other 99%. This is why we cannot find a cure for Ebola but can boast about engineered babies, and can send a mission to Mars but cannot solve the problem of hunger. This is not territory Mr Carr covers, but this debate about priorities is central to his argument. Technologies in the hands of workers themselves are an emancipatory force; in the hands of detached owners of capital, a tool of dehumanisation.
I perhaps stray from the argument of the book, but I am not reviewing it, just reflecting on my reading. And this connects to my own practice, in education, where the questions we face are similar. How do we put teachers in control of technologies, rather than designing technologies to replace or reduce the role of the teacher? How do we transform the goal of using technology in education from cost-saving to creative engagement? How do we move beyond institutional or investor priorities in technology deployment, and make it a human thing? Mr Carr provides a persuasive argument for rethinking our ideas about technology and what it does to humans.