The BBC today discusses Urban OS, an operating system intended to control traffic lights, air conditioning, water pumps and other features of the urban environment without human intervention.
While I don't subscribe to the view that sooner or later a vast computer system ('Skynet') will begin to wage war on humans, I do worry about the removal of human intervention from areas such as traffic management and the water supply.
Quite apart from the threat of software bugs which cause crashes, and human exploitation of the system (see The Italian Job), what about the deeper logic flaws which cause the system to operate in an unexpected way?
It was reported earlier that an algorithm on Amazon caused a book's price to suddenly rocket to $23 million. Does Amazon not test its software? Of course it does. But no software can be tested against every possible situation. That would be like trying to get an infinite number of monkeys to type the complete works of Shakespeare.
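The Amazon incident is worth dwelling on, because no single algorithm was buggy: published accounts attribute it to two sellers' repricing rules reacting to each other, one pricing just below its rival and one pricing well above. Here is a minimal sketch of that feedback loop; the multipliers (0.9983 and 1.27059) are the reported figures, and the starting prices are purely illustrative:

```python
def reprice(price_a: float, price_b: float) -> tuple[float, float]:
    """One round of automatic repricing between two sellers.

    Seller A undercuts its rival slightly (0.9983 x); seller B, which
    reportedly held no stock and planned to buy A's copy to fulfil any
    order, prices above its rival (1.27059 x) to cover that cost.
    """
    new_a = price_b * 0.9983
    new_b = price_a * 1.27059
    return new_a, new_b

# Illustrative starting prices for an academic textbook.
price_a, price_b = 40.0, 50.0
for day in range(120):
    price_a, price_b = reprice(price_a, price_b)

# Because 0.9983 * 1.27059 > 1, each rule looks sensible in isolation
# but together they multiply prices exponentially: both pass $1 million
# well before 120 rounds.
print(f"after 120 rounds: A = ${price_a:,.2f}, B = ${price_b:,.2f}")
```

Each rule is locally defensible, which is exactly why testing either algorithm on its own would never have caught it: the flaw lives in the interaction, not in any one component.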
I suggest we don't lose sight of the need for human supervision and intervention in these basic systems.