Over 10 years ago I attended one of my first Abnormal Situation Management (ASM) Consortium quarterly meetings. One of the heated topics discussed at that meeting was the "lights-out" scenario for future plants. Time has passed and we still have humans "in the loop". Recently, this topic is once again gaining attention, and for some very good reasons. There have been significant improvements in remote monitoring that now enable companies to reduce human exposure to potentially hazardous environments. Risk reduction is certainly a direction we want to go, but are there hidden problems? How does this really work? Did we once again forget that there is always a human component in the system? The advanced technology we are seeing today comes with a tsunami of data and complexity that means something to someone - hopefully. Failing to take into consideration how the human fits into this system could have serious consequences.
There was recently a thought-provoking article on "lights-out" automation as a potential competitive advantage:
http://www.automation.com/automation-news/article/training-improving-ope...
So I asked the Acuite team for their thoughts on the arguments made in the article.
This is the rebuttal:
This is a classic case of the "substitution myth" of automation - the idea that you can just replace humans with automation, one-for-one, without impacting the rest of the system. The author is right insofar as focusing solely on operator effectiveness measures is not always the right strategy, and many tasks could legitimately benefit from automation. But what he misses is that any reasonably complex automated system will itself have to include measures to support operator effectiveness. It's not an either-or proposition. Thirty years of research on automated systems has shown that if people have to monitor, maintain, and handle exceptions for automated systems (which they always do), then you need to design for effective coordination (e.g., displays that show what the automation is doing and why, methods for intervening and re-directing the automation in exceptional situations, and training that allows operators to use them). The greater the complexity and scope of the automation (e.g., the "lights-out" variety), the more critical this need becomes. Automation doesn't remove the need for human-centered design; it just shifts the target.