COGNITIVE SYSTEMS ENGINEERING LABORATORY

Learning from Automation Surprises and "Going Sour" Accidents: Progress on Human-Centered Automation

David D. Woods
Cognitive Systems Engineering Laboratory
Institute for Ergonomics
The Ohio State University

Nadine B. Sarter
Aviation Research Laboratory
Institute of Aviation
University of Illinois at Urbana-Champaign

Final Report
Cognitive Engineering in Aerospace Applications
NASA Ames Research Center
NCC 2-592

January 19, 1998

1 Introduction

Advances in technology and new levels of automation on commercial jet transports have had many effects. There have been positive effects from both an economic and a safety point of view. The technology change on the flight deck also has had reverberating effects on many other aspects of the aviation system and on different aspects of human performance. Operational experience, research investigations, incidents, and occasionally accidents have shown that new and sometimes surprising problems have arisen as well (Figure 1).

What are these problems with cockpit automation, and what should we learn from them? Do they represent over-automation or human error? Or do they point to a third possibility: coordination breakdowns between operators and the automation? Are the problems just a few small, independent glitches revealed by a few accidents or near misses, a record of outstanding issues in otherwise positive designs that have been or are being patched? Or do these glitches provide us with evidence about deeper, generic problems, reverberations of technology change that we need to address if we are to maintain and improve aviation safety in a changing world? Do they provide insight into where there are cracks in the coordination between people and new levels of automation, and into what human-centered automated systems ought to be?

Based on a series of investigations of pilot interaction with cockpit automation (Sarter and Woods, 1992; 1994; 1995; 1997a, b), supplemented by surveys, operational experience, and incident data from other studies (e.g., Degani et al., 1995; Eldredge et al., 1991; Tenney et al., 1995; Wiener, 1989), we have found that the problems are more than a series of individual glitches. They are symptoms of deeper patterns in human-machine coordination. In addition, we were able to find the same kinds of problems in studies of physician interaction with computer-based systems in critical care medicine (e.g., Moll van Charante et al., 1993; Cook and Woods, 1996; Obradovich and Woods, 1996), which indicates that the phenomena are generic rather than specific to the aviation domain. Many of the results concerning crew interaction with cockpit automation have been synthesized in a series of broad and comprehensive volumes (e.g., Billings, 1996; Abbott et al., 1996; Woods et al., 1994).

This report summarizes the results of this research and discusses their implications for investments to enhance aviation safety. The research was part of a larger effort involving many groups, and the participation of many different stakeholders (pilots, instructors, training managers, operational and line personnel from a carrier, manufacturers, industry groups, researchers, and FAA personnel) tremendously facilitated our ability to step back from individual incident reports and accident investigations and to see the patterns that surround and lie behind automation glitches. Stepping back in this way helped us move towards a deeper understanding of why crew-automation coordination breakdowns arise and of the paths and strategies that can help us deal with them. The report uses these results to discuss the areas where new investments in research, design, training, and the changing roles of flight crews and operational organizations are perceived to be needed, and to examine the implications for how the aviation industry can maintain and improve safety as technology and roles continue to change.

[Insert Figure 1 here.]

Figure 1. The Reverberations of Technology Change on the Flightdeck for Human Performance.

2 The Impact of Technology Change on Cognition and Collaboration

One way to recognize the pattern that underlies the different judgments about cockpit automation and human error is to listen to the many different voices we heard in our investigations: the voices of operational people expressed directly in conversations and in surveys about the impact of automation, expressed through their reported behavior in incidents that occurred on the line, and expressed through their performance in simulator studies that examined crew coordination with automated systems in specific flight contexts. We summarize the research results by adopting the point of view of these many different stakeholders and the statements they made to us in different contexts. The statements are paraphrases of converging results from the different studies, expressed in operational people's own words.

2.1 Automation Surprises: Breakdowns in Coordination Between Crews and Automation

The statements made by pilots and instructors described the strengths but also the clumsiness and complexity of modern cockpit automation: tools that are powerful, but also silent and difficult to direct when time is short. The users' perspective is best expressed by the questions pilots asked when describing incidents with cockpit automation (extended from Wiener, 1989):

• What is it doing now?
• What will it do next?
• How did I get into this mode?
• Why did it do this?
• Stop interrupting me while I am busy.
• I know there is some way to get it to do what I want.
• How do I stop this machine from doing this?
• Unless you stare at it, changes can creep in.

The new generation of automated systems provides real capabilities, but these questions show that the challenges it poses go beyond simple "added functionality." They point to automation surprises: situations in which crews are surprised by the behavior of the automation. Automation surprises begin with miscommunication and misassessments between the automation and its users, which lead to a gap between the user's understanding of what the automated systems are set up to do, what they are doing, and what they are going to do next. Users come to think of the automation as a quirky but powerful agent whose internal logic can be obscure (cf. Lanir, 1995). The initial mismatch can arise from several sources, for example erroneous inputs such as mode errors, or indirect mode changes in which the automation changes its status and behavior autonomously based on its interpretation of pilot inputs, its internal logic, and sensed environmental conditions (Sarter and Woods, 1995; Sarter, Woods, and Billings, 1997).

The gap typically becomes evident when the aircraft's behavior does not match the crew's expectations. This is where questions like "Why won't it do what I want?" and "How did I get into this mode?" arise. In most cases, crews do not detect the problem from the displays of data on automation status and behavior; detection occurs when the aircraft behaves in an unexpected and undesired way relative to the actual situation, for example flying past a target, initiating a descent, or failing to level off at the expected altitude. In other words, the funnel of evidence that reveals the misassessment is the aircraft's behavior itself rather than feedback about the automation (Sarter and Woods, 1997; cf. Billings, 1996). Unfortunately, this means detection tends to occur late. If the misassessment has persisted too long, or if the unexpected behavior is noticed without sufficient time to respond or recover, the gap can contribute to an incident or, in combination with other factors, to disaster.

The potential for automation surprises is greatest when three factors converge:
1. automated systems act on their own without immediately preceding directions from their human partners;
2. there are gaps or misconceptions in users' mental models of how their machine partners work in different situations; and
3. there is weak feedback about the activities and future behavior of the automated agent relative to the state of the world.

Our investigations, which included full-mission simulation studies, incident reports, surveys, and observations of training, revealed a variety of these coordination breakdowns between crews and automated systems. The breakdowns observed are not simply "human error" or isolated interface design flaws; they are symptoms of the coordination demands created by the automation and of the feedback users receive. In most cases the problems are precursor events with no significant negative consequences. But occasionally, in combination with other circumstances, these same kinds of breakdowns become significant contributors to incidents, near misses, or accidents. In other words, we find "classic" human-computer coordination problems where:
• breakdowns between crews and automation occur in predictable circumstances and affect performance in predictable ways;
• a series of small, seemingly innocuous events interact and occasionally lead to larger negative consequences; and
• when other contributors are present, the sequence of events can spiral towards disaster.
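The late-detection pattern described above can be made concrete with a small illustration. The sketch below is not from the report; it is a hypothetical monitor that flags an automation surprise the way crews usually detect one today, by noticing that actual aircraft behavior has diverged from what the crew expected (for example, failing to level off at a target altitude), rather than from any annunciation of the automation's own configuration. The class names and thresholds are assumptions for illustration only.

```python
# Hypothetical illustration (not from the report): detecting an "automation
# surprise" the way crews typically do today -- from unexpected aircraft
# behavior -- rather than from feedback about the automation's configuration.

from dataclasses import dataclass

@dataclass
class CrewExpectation:
    target_altitude_ft: float   # altitude the crew believes the automation will capture
    expect_level_off: bool      # crew expects the climb/descent to stop at the target

@dataclass
class AircraftState:
    altitude_ft: float
    vertical_speed_fpm: float   # negative = descending

def surprise_detected(expect: CrewExpectation, state: AircraftState,
                      capture_band_ft: float = 300.0) -> bool:
    """Return True when actual behavior diverges from the crew's expectation.

    Example divergence: the crew expects a level-off at the target altitude,
    but the aircraft has flown through the capture band and is still moving
    away from the target -- the classic "altitude bust" automation surprise.
    """
    if not expect.expect_level_off:
        return False
    past_target = abs(state.altitude_ft - expect.target_altitude_ft) > capture_band_ft
    still_moving = abs(state.vertical_speed_fpm) > 100.0
    heading_wrong_way = (
        (state.altitude_ft < expect.target_altitude_ft and state.vertical_speed_fpm < 0) or
        (state.altitude_ft > expect.target_altitude_ft and state.vertical_speed_fpm > 0))
    return past_target and still_moving and heading_wrong_way

# Example: crew expects a level-off at 10,000 ft, but the aircraft is at
# 9,500 ft and still descending at 1,200 fpm.
print(surprise_detected(CrewExpectation(10_000, True),
                        AircraftState(9_500, -1_200)))   # True
```

The point of the sketch is the asymmetry the section describes: the trigger for detection is observed aircraft behavior, so detection necessarily comes late, after the gap between the crew's expectation and the automation's activity has already produced an undesired result.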

2.2 The Going Sour Accident

These kinds of coordination breakdowns contribute to a particular class of accident, the "going sour" accident (the term comes originally from studying incidents in the hospital operating room; Cook, Woods, and McDonald, 1991). In this general class, an event occurs or a set of circumstances comes together that appears to be minor, at least when viewed in isolation or in hindsight. This triggers an evolving situation that is, in principle, possible to recover from. But through a series of commissions and omissions, misassessments and miscommunications, the crew or the human-automation team manages the situation into a much more serious and risky incident or even an accident. In effect, the situation is managed into hazard.

Several recent accidents involving breakdowns in human-automation coordination fit this going sour signature (for a brief overview see Sarter et al., 1997; Billings, 1996). While some of these cases are classically described as controlled flight into terrain, they may be better described as managed flight into terrain, since the crew was supervising automated systems that were flying the aircraft when its path developed towards terrain (Billings, 1996). One concern raised by the incident data is that new technology, developed and introduced in a technology-driven rather than human-centered way, is increasing the operational complexity of the flight deck and may be increasing the potential for this accident signature even as it provides better capabilities when flights proceed as planned.

Since the going sour scenario is managed into hazard, after-the-fact reviewers, who have complete knowledge of the actual state of affairs and the benefit of hindsight, find it easy to see the opportunities to break the progression towards disaster. With hindsight, the crew's path looks mysterious, and reviewers tend to comment:
• "How could they have missed X? It was the critical piece of information."
• "How could they have misunderstood Y? It is so logical to us."
• "Why didn't they understand that X would lead to Y, given the inputs, past instructions, and internal logic of the system?"
In fact, one test of whether an incident is a going sour scenario is whether reviewers, with the advantage of hindsight, make comments such as, "All of the necessary data were available; it was easy to see what it all meant if only they had been able to put it all together" (Woods et al., 1994, chapter 6).

The critical lesson from recent accidents involving breakdowns in coordination between flight crews and automation is this: the going sour scenario is an important residual risk in aviation. Only data on future incidents and events will reveal whether this category represents a significant and growing portion of the overall risk. Investments in turning cockpit automation into a team player and in training crews to better manage automated resources pay off by guarding against this wide category of accident scenario.

Luckily, going sour accidents are relatively rare, even in very complex systems, because the progression is usually blocked or stopped by two factors: the expertise embodied in operational systems and personnel, and the fact that it takes a collection of factors coming together to produce a significant breakdown. Practitioners and organizations develop strategies and workarounds, through policies, procedures, training, job aids, and informal tricks, to get the job done successfully despite the clumsiness of some features of the automation suite. We heard a great deal of this cautionary expertise, often in the form of reminders to pilots to "be careful, or the automation will burn you." Some of the operators' expertise is embodied in ways to restrict the use of portions of the automation in particularly difficult or non-routine situations; some of it is a collection of recipes and workarounds for individual features. In other words, human expertise in the operational system usually blocks or stops the going sour progression, although that expertise can erode when individual and organizational investments in it are not maintained. Significant breakdowns come only in the relatively rare circumstances where these defenses are weakened and several contributors line up together.

3 Human Expertise and Technology-Induced Complexity

In our investigations we saw, from a human factors point of view, two broad reasons for concern about the going sour scenario on advanced flight decks: pressures that limit the growth of human expertise, and the complexity that technology change induces.

Crew skills and expertise are a primary resource for managing the automated flight deck, but there are limits on the amount of resources organizations can devote to training, and pilots and organizations adapt and tailor their strategies to get the job done within those limits (Woods et al., 1994, chapter 5). When pilots transition to a new "glass cockpit" aircraft, the demands of developing the knowledge and skills needed to manage the automated systems are high. As one training manager put it, "They're building a system that takes more time than we have people." We heard many comments of this kind from pilots, instructors, and training managers:

• "There is more to know: how the system works and how to work the system in different operational situations."
• "The most important thing to learn is when to click it off."
• "We need a recipe, a policy."
• "Well, we teach them the basic modes in training; they learn the rest of the modes, features, and capabilities on the line."
• "We've shrunk the training footprint" (reduced the time and resources invested in transition training).
• "We are forced to rely on recurrent training to get people to the same level of proficiency."
• "There isn't time to explore how it works, just how to use it."

People consistently seem to believe that investments in new technology should reduce training requirements (a smaller training footprint, the same proficiency in less time), when the data show that the knowledge and skill requirements are in fact greater. Competitive and economic pressures push in the same direction: the promise of productivity gains and improvements in quality is used to justify lower training expenditures rather than greater ones. The result is an ironic mismatch. The automation changes the human role to that of a manager of automated resources and a handler of anomalies, and meeting the demands of that role requires that we expand, not shrink, the investment in human expertise (e.g., Sarter et al., 1997).

3.1 Performance Breakdowns Arise from a Collection of Factors

The second reason for concern is complexity. Significant performance breakdowns and accidents occur only when a collection of factors come together, that is, when human expertise and crew-automation coordination are eroded and the circumstances are exceptional. For example, we see breakdowns in coordination between the flight crew and the automation when:
• practice is weak, due to local factors (e.g., fatigue) or systemic factors (e.g., limited training investments);
• circumstances are unusual and not well matched to the crew's training and operational experiences;
• transfer of control between crew and automation is late or bumpy; and
• small, seemingly recoverable erroneous actions and miscommunications interact and evolve into larger trouble.

Because a going sour incident evolves through a sequence of small stages and always involves multiple contributors, it is easy after the fact to identify a host of local places where a small change in human, team, or machine behavior could have re-directed the sequence of events away from trouble. Focusing on any one contributor in isolation leads to local remedial actions: issue a bulletin, add a note to a policy or a checklist, modify a display slightly, remind crews of how mode X works in circumstance Y, add a small piece of training. While each of these changes may be constructive, they miss the larger point. When a crew mismanages a minor occurrence or a non-routine situation into larger trouble, it is a symptom of overall flight deck complexity and of problems in human-automation coordination. It is a symptom that all of the parties involved (design, training, operational policies and procedures, certification) need to be better coordinated.

3.2 The Escalation Principle

An underlying contributor to these breakdowns is the escalation principle (Woods et al., 1994). There is a fundamental relationship in which the greater the trouble in the underlying process, or the higher the tempo of operations, the greater the demands placed on the practitioner. Demands for monitoring, attentional control, information processing, and communication and coordination among team members, including communication with machine agents, all tend to go up with the tempo, criticality, and unusualness of the situation. Situations that depart from textbook normality are exactly the situations in which the demands for cognitive work and coordination are concentrated and the margins are smallest.

This exposes a fundamental trap associated with clumsy automation. The burdens an automated system or its interface imposes, such as the attentional demands of monitoring and interpreting its behavior, the memory demands of tracking modes, and the interface management tasks required to instruct it, tend to congregate at the very times when the practitioner can least afford new tasks, new memory demands, or diversions of attention away from the job at hand to the interface per se (Wiener, 1989; Woods et al., 1994).

4 Reactions to Coordination Breakdowns: Erratic Human Behavior or Designer Error?

Listen to how designers and other stakeholders respond when they are confronted with evidence of a breakdown in the coordination between people and automation:

• "The hardware/software system performed as designed" (that is, the trouble lies with the people: crashes of otherwise "trouble free" aircraft).
• "Erratic" human behavior: the breakdown is attributed to quasi-random degradations in otherwise skillful human performance ("brain burps") or to something irrational in people.
• "Those other people": people who are too old, too computer-phobic, too set in their old ways, who just don't understand the logic of the system, or organizations and countries that "have trouble with modern technology."
• "We only provided what the customer asked for!" (or "we tried to be customer-centered," or "we tried to talk them out of it, but ...").
• "I wanted to do it differently, but I was constrained by ..." the supplier's standard designs, compatibility with previous designs, cost control, time pressure, regulations.
• "Other parts of the industry haven't kept up with the capabilities of our systems" (for example, the ATC system does not recognize or accommodate what the newer, highly automated aircraft does well or does with difficulty under time pressure, such as a late change to an arrival or departure clearance).

Overall, these kinds of comments show how developers, reviewers, and other stakeholders can remain locked into thinking of the human and the technology as independent components: after the fact, the failure is attributed either to the electronic box ("the technology failed") or to the human "box" ("human error" or misuse of the automation). The design pressures and economic constraints behind the comments are real, and the stakeholders involved (users, developers, carriers, regulators, across many cultures) are diverse. But this either/or mindset is too simple a frame for the trouble that lies behind many going sour incidents, and it blocks learning from them.

4.1 Escaping from Attributions of Human Error versus Over-Automation

Attributions of "human error" or "over-automation" show how the debate can remain trapped in an opposition between the human and the automation "per se." One commentator has remarked that statements for or against automation rest on an arbitrary demarcation set up between the contributions of people and technology (Kelly-Bootle, 1995, p. 101). In aviation the debate often reduces to claims that pilot error contributed to crashes of highly automated aircraft versus claims of over-automation (La Burthe, 1997). Similarly, accident statistics are frequently interpreted as showing that 70 or 75% of mishaps come from human error, and some view such tabulations as an indication of erratic human performance, to be addressed by eliminating the human element where possible or by increased attention, remedial training, and expanded procedures. However, research on how complex systems fail shows that the label "human error" should be the starting point of analysis, not its conclusion: it is a symptom of deeper factors, and we learn from these breakdowns only by examining the design, organizational, and technological context that shapes human performance (Woods et al., 1994).

The primary lesson from research on automation in a number of industries is that the human and machine parts of a complex system cannot be considered in isolation. People and technology function together as a joint system (Hutchins, 1995a). Technological artifacts can enhance human expertise or degrade it: they can "make us smart" or "make us dumb" (Norman, 1993). Accident analyses show that an important path to disaster is a breakdown in the coordination of the joint human-machine system, not simply a human failure or a machine malfunction considered separately (Woods et al., 1994). Apportioning percentages of blame between the people and the technology after the fact makes no real contribution to safety; as a result, the same kinds of breakdowns remain in the system and similar incidents recur.

Mode error is the classic illustration. A mode error requires both a user who loses track of the current system configuration and a system in which the same user input, or the same indication, is interpreted differently depending on the current mode of operation (Sarter and Woods, 1995a; Woods et al., 1994). One cannot even describe the error without referring jointly to the human and the machine. Mode errors are in this sense design-induced: their likelihood increases as a consequence of a proliferation of modes, complex interactions and coupling across modes, weak feedback about mode status and transitions, and the resulting gaps and misconceptions in users' mental models. The human factors concern with design-induced error goes back to the beginning of the field, when classic studies showed that "pilot errors" in operating aircraft controls were predictable consequences of design characteristics (Fitts, 1946; Fitts and Jones, 1947; Fitts, 1951). To analyze such errors constructively, we need to go behind the label "human error" and identify the design, training, organizational, and technological factors that influence human performance in predictable ways.

In other words, attributing an incident simply to "pilot error," or shifting the attribution from "the pilot erred" to "the designer erred" or "the manager erred," makes no progress. There are always multiple contributors, each necessary but only jointly sufficient; pilots, designers, and managers all work under pressures and constraints, and clumsy technology can inadvertently shape the potential for predictable forms of error.

4.2 Strategies for Human-Centered Design

If diagnoses such as human error (whether by the operator, the designer, or the manager) or over-automation are misleading and unproductive, then how do we make progress? A necessary first step is to adopt a human-centered approach to research and design, which has three basic attributes (Billings, 1996): human-centered research and design is problem-driven, activity-centered, and context-bound.

1. Human-centered design is problem-driven. It begins with an investment in understanding the difficulties and challenges that people face in a field of practice: the nature of expertise and of error in that setting, the demands that arise in routine and exceptional situations, and how people and the artifacts they use meet those demands. This understanding is the basis for modeling what would be useful and for judging whether proposed systems actually help (Winograd and Woods, 1997).

2. Human-centered design is activity-centered. In building new systems, designers often see the activity only in terms of the computer and of human-computer interaction. But new technology enters a field of ongoing activity in which people pursue goals, coordinate with other people, and use many tools together. The focus therefore needs to be on (a) how the field of activity shapes the use of the new artifacts and (b) how the artifacts, in turn, shape cognition, collaboration, and the activities of practitioners, for better or worse. A classic finding from Cognitive Science makes the point: the same formal problem can be presented in different representations, and the change of representation can make the problem dramatically easier or harder for people to solve (e.g., Zhang and Norman, 1994). The computer does not just deliver data; the representation is part of the cognitive work.

3. Human-centered design is context-bound. Human performance and the value of design concepts cannot be assessed in isolation from the context of activity. Data overload is an example: the difficulty is not simply the amount of data, but the ability to find and extract what is informative, and what is informative depends on the context in which the data appear, on the state of the problem, the practitioner's goals, and the other activities under way. Problems and solutions therefore cannot be evaluated one piece at a time. Even when new systems degrade some aspects of performance, people adapt: practitioners tailor their strategies and their artifacts to cope with the pressures and constraints of the operating domain, so the actual impact of technology change emerges only in the context of practice (e.g., Ehn, 1988; Flach and Dominguez, 1995; Woods, in press).

Most development organizations have user-centered intentions, but good intentions are not enough. Knowledge about human performance, cognition, and collaboration typically comes into play, if at all, only late in development, at the stage of usability testing of an essentially finished system. Development remains fundamentally technology-centered: a technology is pushed forward because it is a new capability hypothesized to hold promise for the flight deck, interfaces are built around what the technology makes accessible, and improvements in human performance are simply expected to follow. Eventually the operational complexities emerge, and a gap opens between the designers' intentions and the system's actual contribution to practice. In other words, the road to technology-centered systems is paved with user-centered intentions (Sarter and Woods, in press; Winograd and Woods, 1997).

5 Progress Depends on ...

Despite the technology-driven character of most development, researchers have today identified a few basic directions that organizations can follow in an effort to increase the human contribution to safety:
• increase the error tolerance of the system;
• increase error detection through improved feedback;
• avoid excess complexity in automated systems and their operation;
• evaluate specific designs and features for their impact on human performance; and
• increase the investment in human expertise.
Acting on these directions requires the design, operational, training, and certification communities to work together, but the payoff is progress on the factors that lie behind going sour scenarios.

5.1 Avoiding Excess Complexity

The results of this research point to the complexity of the automated flight deck, the proliferation of modes, options, and features and the coupling among them, as a basic source of the vulnerabilities that later get categorized as "erratic" human behavior or "human error." Excess complexity is a difficult problem precisely because no single person or organization decides to create it. It accumulates gradually: each new mode, option, or feature may be locally valid and economical, a response to a customer request, a way to create market diversity, a patch for a specific operational problem. But the features, modes, and options accumulate, their interactions multiply, and the result is predictable kinds of coordination breakdowns and erosion of the crew's ability to track what the system is doing.

Neither of the simple responses works. Trying to eliminate "human error" by banishing people or by layering on more automation only moves the complexity around; trying to manage the consequences purely through remedial training pushes the cost of complexity onto operational organizations and individual pilots. Instead, the human factors contribution is to treat excess complexity as a systemic issue: to provide methods for detecting when complexity is accumulating, to analyze its impact on human performance in terms of error potential, observability, and error detection, and to improve coordination across the design, training, operational, and regulatory organizations that each hold only one piece of the problem. This is a challenge for the industry and the research community as a whole, because the decisions that add complexity are made locally by many parties while the safety consequences appear only downstream in operations.

One illustrative area is mode simplification. Not all modes are used by all carriers; not all modes are taught in transition training; different carriers define different subsets of modes as "basic" and leave the others to be learned on the line, in ways shaped partly by competitive preferences. The disarray across the industry in which modes are taught, which are used, and even in what the modes are named is itself an indication of excess complexity. Which modes and mode features are essential for safe and efficient operation, and which represent excess complexity that predictably produces gaps in pilots' mental models and mode errors? Simplifying the set of modes, and especially simplifying indirect mode changes and automatic reversions that occur without direct pilot input, has been identified as a very high priority area for improvement in human-automation interaction. Making progress requires a multi-party, international, collaborative effort among manufacturers, carriers, training departments, and regulators, because purely local fixes by any one party will not change the underlying complexity.

5.2 Error Detection through Improved Feedback

Research on high-reliability human-machine systems has shown that effective error detection is a very important aspect of safe operation, and that error detection is improved by providing better feedback, especially feedback about the current and future behavior of the aircraft, its systems, and the automation. One area of need, discussed earlier in the context of automation surprises, is improved feedback about the activities of the automated systems. As automation has become more autonomous and more authoritative, the need for crews to be able to track its behavior has increased; low observability of automation activity is the concomitant of "strong and silent" automation, and it leads directly to automation surprises and to late detection of developing trouble.

The critical property is observability, not mere data availability. Data availability refers only to the technical presence of data in some form at some location in the cockpit. Observability refers to the processes involved in extracting useful information from those data: it results from the interplay between a human observer who knows when to look for what information at what point in time and a display suite that structures the data so that their meaning can actually be picked up, rather than merely being physically in front of the observer's eyes (see Rasmussen, 1985; O'Regan, 1992, p. 475; Sarter, Woods, and Billings, 1997). One practical test of observability is whether the displays help practitioners notice more than what they were specifically looking for or expecting.

The horizontal navigation (map) display on glass cockpit aircraft is an example that generally meets this test, and the industry regards it as a tremendous success. It integrates individual pieces of data into a coherent, pattern-based picture of the situation that crews can read at a glance. The same level of support does not yet exist for the vertical dimension of flight: comprehensible, integrated displays of the vertical profile, current and future automation targets, and the conditions that will trigger transitions are inherently more difficult to develop, and this is one place where new work is needed.

Feedback about the current and especially the future behavior of the automation is generally weak. Current flight mode annunciations are a case in point: a mode transition is indicated by a small change in a label on the primary flight display, which the pilot must notice, read, and mentally integrate with other indications in order to track what the automation is doing and what it will do next. Simple injunctions to crews to check the annunciations and call out changes, or minor tuning of the current indications, are not likely to produce significant improvement. Based on these results, the criteria for better feedback about automation activity appear to be:
• transition-oriented: highlight significant events and changes, especially uncommanded or indirect mode transitions, rather than requiring pilots to deduce them from a changed label (see the sketch below);
• future-oriented: reveal what the automation is set up to do next and when operationally significant transitions will happen, rather than showing only the current configuration; and
• pattern-based: allow crews to pick up the overall picture and unexpected conditions at a glance, rather than having to read and mentally integrate many individual pieces of data.

As demands on the flight deck grow, for example with new ATC capabilities and four-dimensional navigation, prudence demands that the industry prototype and test new forms of feedback and ensure that observability keeps pace, rather than waiting for incident and accident data to make the case. Economic pressures push in the other direction, which is why a coordinated commitment from manufacturers, carriers, regulators, and researchers is needed.
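As one way to picture what "transition-oriented" feedback could mean, the following sketch announces mode transitions as discrete events and flags the uncommanded ones for stronger annunciation, instead of relying on the pilot to notice a silently updated label. It is a hypothetical illustration, not a design from the report; the class, mode names, and callback are assumptions.

```python
# Hypothetical sketch: event-based annunciation of automation mode transitions.
# Instead of a silently updated label the pilot must stare at, each transition
# is emitted as an event, and indirect (uncommanded) transitions are flagged.

from typing import Callable, Optional

class ModeTransitionAnnunciator:
    def __init__(self, announce: Callable[[str], None]):
        self._announce = announce
        self._current_mode: Optional[str] = None
        self._last_pilot_selection: Optional[str] = None

    def pilot_selected(self, mode: str) -> None:
        """Record an explicit crew selection (e.g., pushing a mode button)."""
        self._last_pilot_selection = mode

    def system_mode_changed(self, new_mode: str) -> None:
        """Called whenever the autoflight system changes its active mode."""
        old_mode = self._current_mode
        self._current_mode = new_mode
        if new_mode == self._last_pilot_selection:
            # Commanded transition: confirm it quietly.
            self._announce(f"mode {old_mode} -> {new_mode} (as selected)")
        else:
            # Indirect/uncommanded transition: the case the text singles out
            # as needing a clearly announced event, not a small label change.
            self._announce(f"UNCOMMANDED mode change: {old_mode} -> {new_mode}")

# Example: the crew selects one vertical mode, but the automation later
# reverts to a different mode without any crew input.
ann = ModeTransitionAnnunciator(print)
ann.pilot_selected("VNAV PATH")
ann.system_mode_changed("VNAV PATH")   # quiet confirmation
ann.system_mode_changed("VNAV SPD")    # flagged as uncommanded
```

The design choice illustrated here is simply to treat a transition as an event with its own salience, so that indirect mode changes do not depend on the crew happening to scan the annunciator at the right moment.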

5.3 How to Provide Better Feedback

One example of the problem, and of the kind of feedback that is needed, is the bumpy transfer of control. Automation can silently compensate for a developing disturbance, such as asymmetric lift caused by icing or an engine problem, up to the limits of its authority or capability. Because the compensation is silent, the crew can remain unaware of the developing trouble until the automation nears the limits of its capability to compensate or until flight parameters begin to diverge. At that point the crew may notice the problem too late, take over in a bumpy, poorly prepared way, and be unable to handle the resulting excursions; this pattern has been a part of several incidents and accidents. In contrast, a well-coordinated human crew member would comment on the extra effort he or she was exerting to keep the relevant parameters on target, and that comment would cue the rest of the crew that something is going wrong (Norman, 1990). Better feedback about the automation's activities and the effort it is exerting would open a window of opportunity for early detection and recovery.

How, then, can automated systems be made better team players? By analogy to a well-coordinated human team, one direction is to provide displays and warnings that indicate:
• when the automation is having trouble handling the situation (for example, in turbulence or other unusual conditions);
• when the automation is taking extreme action or is moving towards the extreme end of its range of authority; and
• when agents are in conflict, for example when the automation is working against the crew's current goals or against another subsystem.

This concept is easy to state but hard to specify. From operational experience and research we often know roughly what matters, but we do not yet know how to define the thresholds: what counts as "extreme"? When is the automation "having trouble" rather than simply active? A signal that comes on whenever the automation is performing a function (as with simple trim-in-motion indications) may be too sensitive and occur too often; a single threshold-crossing alarm set near the limit comes too late and simply adds one more warning in an environment that already has many. Experience in aviation and in other domains shows the ways such feedback can be misdesigned: nuisance signals and excessive false alarms; distracting indications that capture attention at the wrong times or during more serious tasks; systems that talk too much or too soon; and systems that stay silent and then erupt with alarms too late. In other words, one cannot improve observability simply by adding more signals on top of an already data-rich environment; misdesigned feedback can make overall performance worse.

Working out how to provide this kind of feedback is therefore a genuine design problem. It requires decisions about:
• the perceptual salience of an indication relative to the larger context of other possible events and signals;
• a strength or priority dimension: when an indication should merely be available, when it should draw attention, and when it should interrupt;
• a temporal dimension: staged or gradual escalation of the indication as trouble develops, rather than a single threshold crossing;
• the level of abstraction and integration: whether the indication should be woven into existing displays or stand as a separate new annunciation; and
• the channel: auditory (including voice), visual, or some combination.

These choices cannot be made for each signal in isolation; new indications have to be designed in the context of the other signals, displays, and ongoing crew activities, and then prototyped and tested against realistic scenarios. Our point is that the need for this kind of feedback has been identified; the bounds of effective design now need to be explored in detail, balanced against the risk of data overload. A sketch of one of the simplest ingredients, staged escalation tied to the automation's use of its authority, follows.
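The sketch below is a hypothetical illustration of the "temporal dimension" discussed above, not an implementation from the report: it maps how much of its control authority the automation is using, and for how long, onto a staged feedback level rather than a single threshold-crossing alarm. The stage names and thresholds are assumptions for illustration only.

```python
# Hypothetical sketch of staged feedback about automation "effort":
# the closer the automation gets to the limits of its authority, and the
# longer it stays there, the stronger the annunciation -- instead of staying
# silent and then firing a single late alarm. Thresholds are illustrative.

def feedback_stage(authority_used_fraction: float,
                   seconds_near_limit: float) -> str:
    """Return a staged feedback level for display/alerting logic.

    authority_used_fraction: 0.0..1.0, how much of the available control
        authority (e.g., trim or thrust range) the automation is using now.
    seconds_near_limit: how long it has been using more than ~80% of it.
    """
    if authority_used_fraction < 0.5:
        return "none"      # routine compensation, no extra annunciation
    if authority_used_fraction < 0.8:
        return "status"    # peripheral indication: automation working harder
    if seconds_near_limit < 30:
        return "caution"   # salient but not interrupting: nearing its limit
    return "warning"       # persistent operation at the edge of authority

# Example: sustained compensation for an asymmetry (e.g., icing) gradually
# escalates the feedback well before the automation runs out of authority.
for used, held in [(0.3, 0), (0.7, 0), (0.85, 10), (0.9, 60)]:
    print(used, held, feedback_stage(used, held))
```

Even this toy version makes the design trade-offs visible: the stage boundaries, the dwell time, and the mapping from stage to perceptual salience all have to be tuned against the other signals competing for the crew's attention.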

5.4 Mechanisms to Manage Automated Resources

Giving users more visibility into the machine agent's reasoning processes is only one side of the coin in making automation a team player. The other side is giving users the ability to direct the machine agent as a resource in their own problem solving. A commonly proposed architecture has the automated agent do the entire job, generating the solution and carrying it out, while the user is left either to accept the result or to take over completely. This "either you do it or I'll do it" arrangement is a poor one. Previous work on cooperative problem solving with intelligent systems, including expert systems for troubleshooting, electronic cockpit aids, and planning systems in domains like space operations and aviation, has shown several drawbacks: the machine agent's solution can be brittle in situations that fall outside its knowledge or algorithms; the user may be unable to judge what is wrong with the machine's solution, or to benefit from it, without having done some of the legwork; and users who are cast only as recipients or critics of a finished machine solution remain powerless to influence the process at exactly the points where their knowledge of the particular, possibly deteriorating, situation is most relevant.

Instead, the machine agent should function as a resource that the practitioner directs. Users need to be able to specify goals and constraints, to ask the agent to work on some portion of the problem, to make corrections or amend the problem as the situation develops, and to take over at various levels, not only in an all-or-nothing takeover. Control of the joint problem-solving process remains with the human, while the automation provides computational power, does the mundane legwork, and generates options, so that the practitioner can make use of the machine's capabilities without being forced to accept its entire solution or to do without it entirely. Accommodating this kind of interaction requires careful analysis of the nature of the problems in the domain, of what kinds of interaction between human and machine agents are meaningful, and of the trade-offs involved, followed by iterative design and testing. The challenge is that the space of possible interactions is much richer than the fully automatic versus fully manual architecture, and some of the middle-ground designs (for example, machine critiquing of human plans, or humans critiquing machine plans) raise their own difficulties. In terms of the flight deck, the implication is to design automation modes and levels so that crews can delegate portions of a task, constrain how the automation performs it, and redirect it as circumstances change.
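One way to picture "the machine agent as a resource the practitioner directs" is sketched below. It is a hypothetical interaction pattern, not an architecture from the report: the human keeps control of goals and constraints, can amend them as the situation develops, and lets the agent redo the legwork, instead of choosing between accepting a finished machine solution and working fully manually. The toy route-planning domain and the names are assumptions.

```python
# Hypothetical sketch of a cooperative architecture: the automated planner is
# a resource the practitioner directs, not an all-or-nothing replacement.

from typing import Dict, List

class RoutePlanningAgent:
    """Toy planner: keeps waypoints that satisfy the constraints it is given."""
    def propose(self, candidates: List[str],
                constraints: Dict[str, List[str]]) -> List[str]:
        avoid = set(constraints.get("avoid", []))
        return [wp for wp in candidates if wp not in avoid]

class PractitionerSession:
    """The human holds the goals and constraints; the agent does the legwork."""
    def __init__(self, agent: RoutePlanningAgent, candidates: List[str]):
        self.agent = agent
        self.candidates = candidates
        self.constraints: Dict[str, List[str]] = {"avoid": []}

    def ask_for_plan(self) -> List[str]:
        return self.agent.propose(self.candidates, self.constraints)

    def add_constraint(self, waypoint: str) -> None:
        # The human amends the problem rather than taking over the whole task
        # or accepting the machine's entire solution.
        self.constraints["avoid"].append(waypoint)

session = PractitionerSession(RoutePlanningAgent(),
                              ["ALPHA", "BRAVO", "CHARLIE", "DELTA"])
print(session.ask_for_plan())      # agent's first proposal
session.add_constraint("BRAVO")    # human redirects: e.g., weather over BRAVO
print(session.ask_for_plan())      # revised proposal under the new constraint
```

The point of the sketch is the division of labor: the practitioner never has to step outside the loop to benefit from the automation, and the automation never has to "own" the whole problem to be useful.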

5.5 Enhancing Human Expertise

The last area we will comment on is the contribution of human expertise to safety and the investment needed to sustain it. One of the myths heard in the industry about the impact of automation is that, as investment in automated systems increases, the investment needed in human expertise can be reduced. Our investigations point to the opposite conclusion. Expertise is crucial to how well the joint human-machine system performs across the full range of operational scenarios, and new levels of automation create new knowledge and skill requirements rather than reducing them. It is ironic that interest in investing in human expertise seems to decline just as the complexity that operational personnel must manage increases.

In our investigations of pilot interaction with automated flight deck systems, we heard from many sources about crews struggling to get the automation to do what they wanted it to do. We heard statements such as, "It didn't do what it was supposed to, so I tried to get it to do what I had programmed it to do." We heard about events in which crews got into trouble because they persisted too long in trying to get a particular mode or level of automation to carry out their intentions, instead of switching to another means of accomplishing their flight path goals or to more direct means of control ("Why didn't you just turn it off?"). We heard how pilots' mental models of the automation's many modes, and of the logics that govern transitions between them, are often oversimplified or erroneous. These knowledge gaps, combined with misassessments of the situation and miscommunications about who is doing what, form the tangled web from which going sour incidents emerge.

The point is that new automation creates new knowledge and skill requirements. As the complexity of the automated systems on the flight deck increases, pilots need to know more about how the different modes and subsystems function, how they interact, and how to manage them as a resource across a wide range of situations - including the relatively rare or non-routine circumstances in which knowing when and how to transition to another mode, or to more direct means of flight path control, matters most. Mode management is a skill that must be developed through knowledge and practice; it does not come for free with the automated systems. It is ironic that systems offered, in part, as a way to reduce demands on flightcrews end up increasing the investment in human expertise needed for the joint system to perform successfully. Operational personnel are a resource for improving performance, and investment in that resource is a dominant contributor to safety.

Here training departments and operational managers find themselves in a double bind. The demands placed on training are increasing: new automation means there is more to be learned in transition training - more modes, more subsystems, more interactions among them - and a wider range of situations into which this knowledge must be fit. At the same time, economic pressures are shrinking the training footprint, pushing training departments to do as much or more with less time and fewer resources.

One tactic for coping with this double bind is to focus the limited training time on a small set of basics, deferring the more complicated parts of managing the automated systems to be learned later, on the line, on the pilots' own. This tactic works only:
• if the basics provide a coherent foundation for learning the more complicated parts of the system and for coordinating its many modes in actual line operations, and
• if there is continued support for learning on the line, beyond the minimum needed to pass the checks.

Another tactic is to teach recipes - fixed sequences that help students who would otherwise be overwhelmed cope with the demands of transition training in the limited time allowed. Recipes support efficient performance in routine situations, but a recipe-oriented approach limits the knowledge base and the practice needed to manage the automated systems in more difficult, less frequent circumstances - just those combinations of circumstances that are most vulnerable to the going sour scenario. We saw and heard evidence of both tactics in the training settings we examined, and the instructors we spoke with acknowledged their limits; still, the competitive forces squeezing the training footprint leave little freedom to go beyond minimum requirements.

Economic pressure on training is likely to increase rather than ease. The US industry, with the FAA, has taken one laudable step in the form of advanced qualification programs (AQP), which allow more freedom to orient training toward operational needs. However, as the training footprint shrinks, the same pressures can inadvertently turn this freedom into a means of reaching qualification faster - more "efficient" utilization of training resources, reduced recurrent practice, checks focused on a minimum set of items. Trying to extract more benefits from a shrinking investment in human expertise will not prevent the kinds of incidents and accidents that result from crews struggling with clumsy automation, even though it is easy to label such events as human error after the fact.

Escaping from this double bind means recognizing that initial or transition training is not the end of learning but the beginning. Training should be oriented toward producing an initial platform on which continued growth of expertise can build throughout a pilot's career, rather than toward reaching a minimum proficiency at the lowest cost. Developing accurate mental models of how the highly automated systems work, and the skills needed to manage them across a wide variety of situations, depends on continuous learning and on opportunities to practice - in line-oriented, mission-oriented scenarios as well as in part-task settings. Pilots, in general, want to improve their knowledge of the automation, as evidenced by the pilot-created guides and notes we encountered; a training culture that supports this continued improvement is essential. A first step is to recognize the full range of knowledge and skill requirements created by automated flight decks; the question then becomes how pilots can be prepared to manage the automation effectively across that range, not simply whether they have mastered a set of basics.

The aviation industry is well positioned to support this emphasis on continuous learning. It has already invested heavily in training technology, for example in less expensive but high fidelity part-task training devices and in line-oriented simulation, as we noticed in several training centers. Results from research can help ensure that this investment is utilized more effectively.

5.6 Coordination Among Human Factors Areas

These comments on the changes and pressures in training also illustrate a more general theme. The glitches observed on automated flight decks are symptomatic of a deeper, interconnected set of demands that cut across design, training, operations, and certification. Traditionally, each of these human factors areas has functioned mostly autonomously. Each group knows a great deal about its own area, but solutions are generally considered in isolation, and problems are handed from one group to the next in a 'throw-it-over-the-wall' philosophy.

Each group is also under pressure - market and economic constraints for designers and manufacturers, a shrinking footprint and limited time for trainers, throughput and productivity pressures for operators, limited resources for regulators - and each is under pressure to minimize what is placed on them. So when evidence of glitches in crew-automation coordination arises, it is natural for each group to advocate solutions that fall in the other areas, for example:
• designers may advocate training people to cope with the idiosyncrasies of their designs, so that they are not forced, as a result of this evidence, to re-design parts of the flight deck;
• trainers, with precious little time to spend on any one item, may lobby for modified procedures or for re-designing features of the automation;
• operational groups may look to air traffic control to provide clearances that better accommodate the capabilities and limits of highly automated ('glass cockpit') aircraft;
• others may advocate changes to regulations or look for solutions through certification.
But this kind of circular reaction, in which each group reads the evidence of glitches as a problem for someone else to solve, is symptomatic of the deeper issue: none of these stakeholder groups, acting alone, has the scope to address a set of interconnected demands on the overall system. Solutions considered in isolation, however well worked out in detail, may simply move the problem to another part of the system.

Coordination across these areas becomes more important because the aviation system is becoming more tightly coupled. The advanced-technology aircraft no longer operates as an isolated entity; it functions as one member of a larger system in which design, training, operational procedures, air traffic control (ATC), certification, and economics interact. The capabilities of advanced automation create linkages between parts of the system that used to seem independent - for example, between how a departure procedure is specified and managed by ATC and how the automation must be set up and managed on the flight deck. A change in any one part of the system can produce significant effects in other parts, and it increases the risk that deficiencies in one area will interact with conditions in another, even though it may be extremely difficult to specify precisely the exact nature or degree of these effects in advance. Clumsiness in one area - in the learnability of the automation, for instance - exacts a price in other areas, eroding margins that the rest of the system counts on. While the degree of coupling varies, all parts of the industry are at the same time under intense economic pressure - pressure to increase throughput and productivity, to shrink training, to trim margins - which leaves little room to make changes and makes it tempting to defer dealing with deficiencies.

This means that progress depends, in part, on a closer integration of the many inter-related perspectives involved - designers, trainers, operational managers, air traffic service providers, and regulators - rather than on isolated improvements within individual areas. We heard many comments on how the going sour scenario emerges from the interaction of multiple factors spread across these areas; guarding against it demands a correspondingly coordinated effort. Design, training, operational practices, and certification need to be developed and evaluated with the linkages between them in view, because the coupling between parts of the aviation system will only increase as levels of automation increase.

6 Conclusion

Overall, two broad points stand out. First, the incidents and accidents that seem to involve trouble with automation are, for the most part, instances of the going sour scenario: a minor problem or non-routine occurrence is mismanaged, and people and machines together gradually turn a manageable situation into an incident or accident. This kind of outcome is a symptom of breakdowns in coordination between people and automated systems, and behind those breakdowns lie broader organizational and design factors (Woods, 1996). The details of particular incidents matter less than the patterns visible behind them, at both the operational and the organizational levels.

Second, guarding against the going sour scenario means improving the ability of the overall system - operational personnel and automated systems functioning as a team - to detect that it is heading toward trouble and to recover before the situation gets out of hand. We can tame the complexity that has been created on the automated flight deck:
• through better feedback about the activities and behavior of the automation,
• through more practice at managing automated systems as a resource across a wide range of circumstances, including particular kinds of non-routine situations,
• by making the automated systems function better as team players, and
• by creating better mechanisms for coordination between the coupled organizational areas involved.
In general, progress will come not from automation that merely looks more "intuitive," but from checking the growth of excess complexity and from designs and investments that support coordination among the people and machines involved.

Will the industry make these investments? Going sour scenarios are particularly difficult to guard against because they are hard to detect or predict: each emerges from a particular combination of factors, and any change needed to limit them requires some parties to pay costs that are local, certain, and immediate, while the safety benefits accrue at a system level and in the aggregate. Since going sour accidents are rare (the actuarial risk is low), it is easy to claim that the status quo is sufficient ("it was safe enough before"). Since each such incident, in hindsight, looks like a unique combination of local factors, it is easy to treat each one as a one-of-a-kind case rather than as a symptom of deeper patterns. Since the people involved can be said to have mismanaged the situation, it is easy to stop with the label "human error" and look no further. And since design and certification practices are based on what has been approved in the past ("you have approved designs like this before"), it is easy for each party to argue that the problem belongs to someone else. In a climate dominated by competitive pressure and by concerns about legal and financial exposure, constructive movement can crawl to a halt.

Yet the difficulties are specific, and they can be addressed if the observed patterns are treated as risk factors rather than explained away. The question - and the goal of our research - has been to point out constructive directions in which the industry can build forward: to help create a collaborative environment in which operators, manufacturers, regulators, and researchers can acknowledge the system's pains, learn from the pattern of events despite the pressures of the competitive and legal climate, and invest in a coordinated way in the areas where continuing progress is demanded. Creating such a climate is exactly what is needed if constructive movement forward is to continue, and it is possible.

Acknowledgments

The research on which this report is based was sponsored by NASA Ames Research Center (under Cooperative Agreement NCC 2-592; Technical Monitors Dr. Everett Palmer and Dr. Kevin Corker) and by NASA Langley Research Center (Grant NCC 1-209; Technical Monitor Dr. Kathy Abbott). Our investigations, and our assessment of their implications, were sparked in many ways by discussions and debates with the FAA team that examined the interfaces between flightcrews and modern flight deck systems. We also wish to thank the many pilots, instructors, designers, and fellow crew members who shared their experiences and views with us; their specific comments and critical reflections on the research results contributed to this work in many different ways, and we are indebted to them.

References

Abbott, K., Slotte, S., Stimson, D., Amalberti, R., Bollin, E., Fabre, F., Hecht, S., Helmreich, R., Imrich, T., Lalley, R., Lyddane, G., Newman, T., Pearson, R., Sarter, N., Thiel, G., Tigchelaar, H., and Woods, D. (1996). The Interfaces Between Flightcrews and Modern Flight Deck Systems. Federal Aviation Administration, Washington D.C., June 18, 1996.

Billings, C. E. (1996). Aviation Automation: The Search For A Human-Centered Approach. Hillsdale, N.J.: Lawrence Erlbaum Associates.

Cook, R. I. and Woods, D. D. (1996). Implications of automation surprises in aviation for the future of total intravenous anesthesia (TIVA). Journal of Clinical Anesthesia, 8:29s-37s.

Cook, R. I., Woods, D. D., and McDonald, J. S. (1991). Human Performance in Anesthesia: A Corpus of Cases. Cognitive Systems Engineering Laboratory Report, prepared for the Anesthesia Patient Safety Foundation, April 1991.

Degani, A., Shafto, M., and Kirlik, A. (1995). Mode usage in automated cockpits: Some initial observations. In T. B. Sheridan (Ed.), Proceedings of the International Federation of Automatic Control Man-Machine Systems (IFAC-MMS) Conference. Boston, MA: IFAC.

Ehn, P. (1988). Work-Oriented Design of Computer Artifacts. Stockholm, Sweden: Arbetslivscentrum.

Eldredge, D., Dodd, R. S., and Mangold, S. J. (1991). A review and discussion of Flight Management System incidents reported to the Aviation Safety Reporting System. (Battelle Report, prepared for the Department of Transportation). Columbus, OH: Volpe National Transportation Systems Center.

Fitts, P. M. (1946). Psychological requirements in aviation equipment design. Journal of Aviation Medicine, 17(3), 270-275.

Fitts, P. M. (1951). Engineering psychology and equipment design. In S. S. Stevens (ed.), Handbook of Experimental Psychology, 1287-1340. New York: Wiley.

Fitts, P. M. and Jones, R. E. (1947). Analysis of factors contributing to 460 "pilot-error" experiences in operating aircraft controls. Memorandum Report TSEAA-694-12, Aero Medical Laboratory, Air Materiel Command, Dayton, OH.

Flach, J. M. and Dominguez, C. O. (1995). Use-centered design: Integrating the user, instrument, and goal. Ergonomics in Design, July.

Hutchins, E. (1995a). How a cockpit remembers its speeds. Cognitive Science, 19, 265-288.

Hutchins, E. (1995b). Cognition in the Wild. Cambridge, MA: MIT Press.

Hutchins, E. (1996). The Integrated Mode Management Interface. Technical Report, Department of Cognitive Science, University of California, San Diego.

Kelly-Bootle, S. (1995). The Computer Contradictionary (2nd edition). Cambridge, MA: MIT Press.

La Burthe, C. (1997). Human Factors perspective at Airbus Industrie. Presentation at the International Conference on Aviation Safety and Security in the 21st Century, Washington D.C., January 13-16, 1997.

Lanir, Z. (1986). Fundamental Surprise. Eugene, OR: Decision Research.

Moll van Charante, E., Cook, R. I., Woods, D. D., Yue, L., and Howie, M. B. (1993). Human-computer interaction in context: Physician interaction with automated intravenous controllers in the heart room. In H. G. Stassen (editor), Analysis, Design and Evaluation of Man-Machine Systems 1992. Pergamon Press, p. 263-274.

Norman, D. A. (1990). The 'problem' of automation: Inappropriate feedback and interaction, not 'over-automation.' Philosophical Transactions of the Royal Society of London, B 327:585-593.

Norman, D. A. (1993). Things That Make Us Smart. Reading, MA: Addison-Wesley.

Obradovich, J. H. and Woods, D. D. (1996). Users as designers: How people cope with poor HCI design in computer-based medical devices. Human Factors, 38(4), 574-592.

O'Regan, J. K. (1992). Solving the "real" mysteries of visual perception: The world as an outside memory. Canadian Journal of Psychology, 46, 461-488.

Rasmussen, J. (1985). Trends in human reliability analysis. Ergonomics, 28(8), 1185-1196.

Sarter, N. B. (1996). From quantity to quality, from individual pilot to multiple agents: Trends in research on cockpit automation. In R. Parasuraman and M. Mouloua (Eds.), Automation and Human Performance: Theory and Applications. Erlbaum, p. 267-280.

Sarter, N. B. and Woods, D. D. (1992). Pilot interaction with cockpit automation I: Operational experiences with the Flight Management System. International Journal of Aviation Psychology, 2:303-321.

Sarter, N. B. and Woods, D. D. (1994). Pilot interaction with cockpit automation II: An experimental study of pilots' model and awareness of the Flight Management System. International Journal of Aviation Psychology, 4:1-28.

Sarter, N. B. and Woods, D. D. (1995). "How in the world did we get into that mode?" Mode error and awareness in supervisory control. Human Factors, 37:5-19.

Sarter, N. B. and Woods, D. D. (1997a). Teamplay with a powerful and independent agent: A corpus of operational experiences and automation surprises on the Airbus A-320. Human Factors, in press.

Sarter, N. B. and Woods, D. D. (1997b). Mode errors and errors of omission and commission: Breakdowns in pilot-automation coordination in a full mission simulation study. Manuscript submitted for publication.

Sarter, N. B., Woods, D. D., and Billings, C. E. (1997). Automation surprises. In G. Salvendy (editor), Handbook of Human Factors/Ergonomics, second edition. New York: Wiley.

Tenney, Y. J., Rogers, W. H., and Pew, R. W. (1995). Pilot Opinions on High Level Flight Deck Automation Issues: Toward the Development of a Design Philosophy. NASA Contractor Report 4669, NASA Langley Research Center, Hampton, VA.

Wiener, E. L. (1989). Human factors of advanced technology ("glass cockpit") transport aircraft. NASA Contractor Report 177528. Moffett Field, CA: NASA-Ames Research Center.

Winograd, T. and Woods, D. D. (1997). Challenges for Human-Centered Design. In J. Flanagan, T. Huang, P. Jones, and S. Kasif (editors), Human-Centered Systems: Information, Interactivity, and Intelligence. National Science Foundation, Washington D.C., July 1997.

Woods, D. D. (1996). Decomposing automation: Apparent simplicity, real complexity. In R. Parasuraman and M. Mouloua (Eds.), Automation and Human Performance: Theory and Applications. Erlbaum, p. 3-17.

Woods, D. D., Johannesen, L., Cook, R. I., and Sarter, N. B. (1994). Behind Human Error: Cognitive Systems, Computers and Hindsight. Crew Systems Ergonomics Information Analysis Center, WPAFB, Dayton, OH.

Zhang, J. and Norman, D. A. (1994). Representations in distributed cognitive tasks. Cognitive Science, 18:87-122.

Complete List of Publications Based on NASA Ames Research Center Cooperative Agreement NCC 2-592 (Cognitive Engineering in Aerospace Applications)

Book Chapters

D.D. Woods and N. Sarter. Learning from Automation Surprises and Going Sour Accidents. In N. Sarter and R. Amalberti, editors, Cognitive Engineering in the Aviation Domain, Erlbaum, Hillsdale NJ, in press.

N. Sarter, D.D. Woods and C. Billings. Automation Surprises. In G. Salvendy, editor, Handbook of Human Factors/Ergonomics, second edition, Wiley, New York, 1997.

D.D. Woods. Decomposing Automation: Apparent Simplicity and Real Complexity. In R. Parasuraman and M. Mouloua, editors, Automation and Human Performance: Theory and Applications, Erlbaum, p. 3-17, 1996.

N. Sarter. From Quantity to Quality, From Individual Pilot to Multiple Agents: Trends in Research on Cockpit Automation. In R. Parasuraman and M. Mouloua, editors, Automation and Human Performance: Theory and Applications, Erlbaum, p. 267-280, 1996.

D.D. Woods. Cognitive Demands and Activities in Dynamic Fault Management: Abduction and Disturbance Management. In N. Stanton, editor, Human Factors in Alarm Design, Taylor & Francis, London, 1994.

D.D. Woods and N. Sarter. Evaluating the Impact of New Technology on Human-Machine Cooperation. In J. Wise, V. D. Hopkin and P. Stager, editors, Verification and Validation of Complex Systems: Human Factors Issues, Springer-Verlag, Berlin, 1993.

D.D. Woods. Process Tracing Methods for the Study of Cognition Outside of the Experimental Psychology Laboratory. In G. A. Klein, J. Orasanu and R. Calderwood, editors, Decision Making in Action: Models and Methods, Ablex, New Jersey, p. 228-251, 1993.

D.D. Woods. Modeling and predicting human error. In J. Elkind, S. Card, J. Hochberg, and B. Huey, editors, Human Performance Models for Computer-Aided Engineering, Academic Press, New York, 1990.

Journal Papers

N. Sarter and D.D. Woods. Teamplay with a Powerful and Independent Agent: A Corpus of Operational Experiences and Automation Surprises on the Airbus A-320. Human Factors, in press.

N. Sarter and D.D. Woods. Mode Errors and Errors of Omission and Commission: Breakdowns in Pilot-Automation Coordination in a Full Mission Simulation Study. Manuscript submitted for publication, 1997.

R.I. Cook and D.D. Woods. Adapting to new technology in the operating room. Human Factors, 38(4), 593-613, 1996.

D.D. Woods. The alarm problem and directed attention in dynamic fault management. Ergonomics, 38(11), 2371-2393, 1995.

N. Sarter and D.D. Woods. "How in the world did we get into that mode?" Mode error and awareness in supervisory control. Human Factors, 37:5-19, 1995.

N. Sarter and D.D. Woods. Pilot Interaction with Cockpit Automation II: An Experimental Study of Pilot's Model and Awareness of the Flight Management System. International Journal of Aviation Psychology, 4:1-28, 1994.

D.D. Woods. The price of flexibility in intelligent interfaces. Knowledge-Based Systems, 6:1-8, 1993.

N. Sarter and D.D. Woods. Pilot Interaction with Cockpit Automation I: Operational Experiences with the Flight Management System. International Journal of Aviation Psychology, 2:303-321, 1992.

N. Sarter and D.D. Woods. Situation Awareness: A critical but ill-defined phenomenon. International Journal of Aviation Psychology, 1(1):43-55, 1991.

Reports

K. Abbott, S. Slotte, D. Stimson, R. Amalberti, E. Bollin, F. Fabre, S. Hecht, R. Helmreich, T. Imrich, R. Lalley, G. Lyddane, T. Newman, R. Pearson, N. Sarter, G. Thiel, H. Tigchelaar and D. Woods. The Interfaces Between Flightcrews and Modern Flight Deck Systems. Federal Aviation Administration, Washington D.C., June 18, 1996.