In my previous post, I described technocracy as a positive force in project and product management, and in team organization. In this post, to set a boundary on that argument, I will make the case for the opposite. Technocracy is government by technical experts: engineers or medical doctors, for example. Such experts are notoriously blind to the limits of their expertise. It is that blindness which, during the Second World War, allowed an expert in logistics to play a pivotal role in the deportation of millions of Jews to the German death camps.

This is not to say that all engineers, medical doctors, and experts in general would, if given the chance, commit war crimes and atrocities. It is, however, a glaring historical example of how technocracy can go horribly wrong.

The reason for this, in my view, is that there is no moral, ethical, or hermeneutical force driving technocracy: at its core, technocracy has no moral compass to guide it, only a target, a short-term goal to reach. Whether that goal is to efficiently and profitably run a hospital (leading to non-paying patients dying in the street), to efficiently remove a perceived threat to the German way of life (leading to the Holocaust), or to efficiently implement some less objectionable project is of little import to the technocrat.

Norman Cousins once said: “History is a vast early-warning system”. He was, of course, right, but history is also routinely ignored. We ignore it, however, at our own peril: those who do not learn from the mistakes of the past are bound to repeat them, to paraphrase George Santayana.

In project management, government by experts has its merits. Those experts, however, need to be aware of the requirements the project is intended to meet and the need it is meant to serve. They must further be bound by ethics: software architects and software engineers need to keep in mind that their architecture and the software they build serve the purposes of the business they work for and of that business's customers, however annoying the demands of the marketing or sales departments may be, as well as the needs of society and of humanity.

Losing track of the ethical implications of one’s work may lead, for example, to human lives being valued less than an $11 “extra” cost for a fuel system that would have kept the car from exploding in a collision, as in the Ford Pinto case. The Ford engineers and decision-makers who made that particular decision were not evil people: they arguably simply, but criminally, lost track of their overarching responsibility, which was not to the company but to the safety of the people using the car.

This is not a repudiation of my previous post: it is a mise en garde, a caution that my previous post should not be taken out of context.