Adonis Diaries


Reflecting again: On design errors in human-machine interfaces

Note: I occasionally edit, translate, and re-publish articles that I deem worth disseminating: worthy articles are meant to be read.

Matthew Squair posted this May: “Having recently bought a new car, I was driving home and noticed that the illuminated lighting controls were reflected in the right-hand wing mirror. These sorts of reflections are at best annoying, but in the worst case they could mask the lights of a car in the right-hand lane and lead to a side-swipe during a lane change.

This is one of the classic system design errors that is well understood in domains such as aerospace. Not so much in the car industry, apparently.

But what really interests me is the fractured nature of engineering knowledge that this problem illustrates. I guess there is an implicit assumption we make that “we’re all getting smarter.” But if that’s the case, why are the same errors committed over and over again?

Henry Petroski points to a study by Sibly and Walker (1977) of bridge failures: the study shows a 30-year cycle between major bridge collapses and posits that, in any technology, we go through a cycle of learning, mastery, overconfidence, and subsequent failure due to overreach.

I’d point to the fragility of corporate memory within organizations and design teams: I recognize that in the current environment of rapid organizational change, it’s extremely hard to provide mentoring and oversight for young engineers, who unfortunately “don’t know what they don’t know”!

This remorseless cycle of destruction is exacerbated by codes and standards that record ‘what’ must be done from a compliance standpoint, but not the ‘why’. Without the reason for compliance there is always the temptation…

I do agree with Petroski that failure breeds reflection, insight, and knowledge, and that engineers (especially young engineers) need in many ways to experience failure themselves or learn through the failures of others.

Evaluations of cockpit transparencies for reflections are required as part of the development of a new aircraft. These effects are particularly a problem for fighter aircraft with large curved canopies, where the pilots’ displays sit comparatively close to the canopy.” (End of quote)

I have published over 30 articles related to human factors in design.

Human factors professionals have attempted to establish various error taxonomies: some within a specific context, such as the study and analysis of errors that might be committed in the operation of nuclear power plants, and others independent of any specific context.

One alternative classification of human errors is based on human behavior and the level of comprehension: mainly skill-based, rule-based, or knowledge-based behavioral patterns. This taxonomy identifies 13 types of errors and discriminates among the stages and strength of the controlled routines in the mind that precipitate the occurrence of an error during execution of a task: omitting steps, changing the order or sequence of steps, timing errors, or inadequate analysis and decision making.
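One way to make such a taxonomy concrete is as a lookup from a recorded error type to the behavior level at which it typically arises. The sketch below is purely illustrative: the error names and their level assignments are my own assumptions for the example, not the canonical 13-type list from the taxonomy.

```python
from enum import Enum


class BehaviorLevel(Enum):
    """Levels of behavioral control, as in skill/rule/knowledge taxonomies."""
    SKILL = "skill-based"
    RULE = "rule-based"
    KNOWLEDGE = "knowledge-based"


# Illustrative mapping (assumed, not the taxonomy's canonical list):
# execution slips tend to sit at the skill level, misapplied procedures
# at the rule level, and faulty analysis at the knowledge level.
ERROR_TYPES = {
    "omitted step": BehaviorLevel.SKILL,
    "steps out of sequence": BehaviorLevel.SKILL,
    "timing error": BehaviorLevel.SKILL,
    "wrong rule applied": BehaviorLevel.RULE,
    "rule misapplied to an exception": BehaviorLevel.RULE,
    "inadequate analysis": BehaviorLevel.KNOWLEDGE,
    "faulty decision making": BehaviorLevel.KNOWLEDGE,
}


def classify(error_type: str) -> BehaviorLevel:
    """Return the behavior level associated with a recorded error type."""
    return ERROR_TYPES[error_type]
```

An analyst could then tally incident reports by level, which is exactly the kind of discrimination the taxonomy is meant to support.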

With strong knowledge of the behavior of a system, and provided that the mental model is not deficient and the rules are applied consistently, most of the remaining errors will be concentrated at the level of skill achieved in performing a job.

Another taxonomy relies on the theory of information processing and is a literal transcription of the experimental process; mainly: observation of system status, choice of hypothesis, testing of hypothesis, choice of goal, choice of procedure, and execution of procedure. Basically, this taxonomy may answer the problems in rule-based and knowledge-based behavior.

It is useful to specify in the final steps of a taxonomy whether an error is one of omission or of commission. I suggest that the errors of commission also be fine-tuned to differentiate among errors of sequence, the kind of sequence, and the timing of execution.
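The omission/commission split, with the finer commission subtypes suggested above, can be sketched as a small record type. The class names, subtype labels, and validation rule here are assumptions made for the illustration, not part of any published taxonomy.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class ErrorClass(Enum):
    OMISSION = "omission"        # a required step was not performed at all
    COMMISSION = "commission"    # a step was performed, but incorrectly


class CommissionSubtype(Enum):
    WRONG_SEQUENCE = "wrong sequence"  # steps executed out of order
    WRONG_TIMING = "wrong timing"      # step executed too early or too late
    WRONG_ACTION = "wrong action"      # an unintended action was substituted


@dataclass
class RecordedError:
    description: str
    error_class: ErrorClass
    subtype: Optional[CommissionSubtype] = None  # only meaningful for commission

    def __post_init__(self):
        # An omitted step was never executed, so it cannot carry a
        # commission subtype (sequence/timing/action).
        if self.error_class is ErrorClass.OMISSION and self.subtype is not None:
            raise ValueError("omission errors take no commission subtype")
```

Recording errors this way forces the analyst to decide, for every commission error, whether sequence, timing, or the action itself was at fault.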

There are alternative strategies for reducing human errors: training, selection of the appropriate applicants, or redesigning a system to fit the capabilities of end users and to accommodate their limitations through preventive designs, exclusion designs, and fail-safe designs.
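The design-side strategies can be illustrated with a toy controller: an exclusion design makes the unsafe input impossible to pass through, and a fail-safe design drives the system to a safe state whenever anything goes wrong. The heater example below, including its safe band and setpoints, is entirely made up for illustration and is not drawn from the original post.

```python
def command_heater(requested_temp_c: float, sensor_read) -> float:
    """Return the heater setpoint (deg C) to apply; 0.0 means de-energized.

    Exclusion design: requested setpoints are clamped into a safe band,
    so an out-of-range request can never reach the hardware.
    Fail-safe design: a sensor fault or an implausible reading shuts the
    heater off rather than continuing with stale or invalid data.
    """
    SAFE_MIN, SAFE_MAX = 0.0, 80.0  # assumed safe operating band

    try:
        current = sensor_read()
    except Exception:
        return 0.0  # fail safe: de-energize on any sensor fault

    if not (SAFE_MIN <= current <= SAFE_MAX):
        return 0.0  # implausible reading: also fail safe

    # exclusion: clamp the operator's request into the safe band
    return max(SAFE_MIN, min(requested_temp_c, SAFE_MAX))
```

Note the asymmetry: the clamp quietly corrects an operator slip, while the sensor checks refuse to proceed at all; both remove the opportunity for a human error to propagate into a hazard.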


Note 1: Petroski, H., Success through Failure: The Paradox of Design, Princeton University Press, 2008.

Note 2: Sibly, P.G., Walker, A.C., “Structural Accidents and their Causes.” Proc. Inst. Civil Engineers, 62, part 1 (May 1977), pp. 191–208.




June 2023
