
When Computers Fail


We rely on computers to fly our planes, find our cancers, design our buildings, audit our businesses. That's all well and good. But what happens when the computer fails?
So begins Nicholas Carr's article on automation in November's Atlantic Monthly, "The Great Forgetting."

0 1  T H E  P R O B L E M  O F  A U T O M A T I O N

Carr illustrates his thesis primarily with pilots, whose training and practice rely on automation so extensively that their expertise erodes and their reflexes dull. While automation has improved overall flight safety significantly, a new type of accident has come on the scene: autopilot systems fail, rusty pilots take the controls, and they falter because they have forgotten how to fly.
[Automation] alters the character of the entire task, including the roles, attitudes, and skills of the people taking part... Automation complacency occurs when a computer lulls us into a false sense of security... When a computer provides incorrect or insufficient data, we remain oblivious to the error.
Carr describes similar challenges in medicine and accounting. Interestingly enough, although Carr mentions "automated building design" in the header, he doesn't include an example in the article.

Perhaps it's because I'm closer to buildings than I am to medicine or accounting, but I find fault with this. I would argue that the use of digital design and construction has increased understanding of building tectonics. I'm not the first person to make this argument; many firms have seen industry learning and acclimation accelerate when entry-level employees model building components and details.

In short, I don't think we automate. I don't think we go on autopilot.

But with processes like robotic building layout and concrete printers, might we be on the cusp of this?

0 2  E N G A G E M E N T  A S  S O L U T I O N 

Carr describes a modification to automation that comes from psychologists.
You can put limits on the scope of automation, making sure that people working with computers perform challenging tasks rather than merely observing.
We must be purposeful, requiring human intervention (perhaps at random intervals!) to maintain our chops.
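
To make that idea concrete, here is a minimal sketch of my own (not from Carr's article) of what a purposefully interrupted workflow could look like: an automated batch job that, at random intervals, stops and asks a person to spot-check a result before continuing. The task names and the run_with_random_handoffs function are hypothetical.

    import random

    def run_automated_step(task):
        # Placeholder for the routine work the computer normally handles.
        print("Computer completed: " + task)

    def run_with_random_handoffs(tasks, handoff_rate=0.2):
        # Run a batch of automated tasks, but at random intervals stop
        # and require a person to review and confirm the result.
        for task in tasks:
            run_automated_step(task)
            if random.random() < handoff_rate:
                answer = input("Spot-check '" + task + "' by hand, then type OK to continue: ")
                if answer.strip().lower() != "ok":
                    raise RuntimeError("Human review flagged a problem with " + task)

    run_with_random_handoffs(["layout grid A", "layout grid B", "layout grid C"])

The point of the random handoff is exactly what the psychologists suggest: the person can never settle into merely observing, because the next spot-check could come at any time.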

2 comments:

AUS Adam said...

Laura

I disagree. On numerous occasions I have seen scheduling and some cost estimation software incorrectly calculate counts, areas, and volumes, fail to filter, and have other such issues.

In most cases it is the fault of the user, but many fail to double-check. Harder to catch, though I have seen it, is certain estimation software providing inaccurate estimates for complex geometries, curved walls, and other such items, which has caused issues in the long run.

With certain engineering systems, engineers are aware of these bugs and train new users to work around them to get accurate results.

In many instances analysis software simply cannot provide the most efficient design, because the system does not make allowances for all factors or parameters.

Engineers and designers shouldn't take things for granted, because they are responsible for them. A pilot is not responsible for his autopilot going down or malfunctioning.
Engineers cannot blame their software if a building falls over :) or the costing is out by large margins or construction is delayed due to inadequate supply of concrete. Well, they can, but it won't go down well in court.

However, I think machines taking more control will only serve to highlight this issue.

Laura Handler said...

Adam, that is an excellent point. Funny that I failed to consider those aspects of automation as I went on about automation complacency.

The article proposes some changes to software design:

1. Include an easier way to validate that automation is successful.
2. Provide random interruptions to automation that require user input, so users don't lose their skills.

(By the way... although the pilot is not responsible for his autopilot malfunctioning, he is responsible for his reaction and timing when it does.)