Psychology and the Instrument Panel (Apr, 1953)

This is a really interesting early article about usability design, specifically about designing user interfaces that reduce error rates and speed up operations. I think people most commonly associate bad user interfaces with software, but this article shows that they have a long and distinguished history.

Be sure to check out the insane electric meters on the fourth page. It wasn’t enough to make the dials all go in alternating directions, no, they had to share numerals between them as well!


Psychology and the Instrument Panel

Designing indicators, switches and other controls to fit the abilities of the men who will use them is a joint problem for psychologists and engineers

by Alphonse Chapanis

OUR MACHINES have become so complicated that we have been forced in recent years to start a new branch of technology: namely, re-tailoring the machines to the abilities and limitations of human beings. This activity, called human engineering, is a new departure in the application of psychological principles to industry. Up to now the main emphasis has been on selecting and training the best man for the job. Human engineering tries to fit the job to the man—any man.

It received its big impetus during World War II, and anyone who looks at the instrument panel of a military plane will instantly know why. The maze of dials, indicators, switches and controls in a modern aircraft eloquently explains why human error is the largest single cause of accidents. Air Force psychologists have systematically interviewed many experienced pilots to probe the specific sources of errors. In one study they asked each man if he had ever made or seen anyone else make an “error in reading or interpreting an aircraft instrument, detecting a signal or understanding instructions.” Here are two typical answers:

“It was an extremely dark night. My copilot was at the controls. I gave him instructions to take the ship, a B-25, into the traffic pattern and land. He began letting down from an altitude of 4,000 feet. At 1,000 feet above the ground, I expected him to level off. Instead, he kept right on letting down until I finally had to take over. His trouble was that he had misread the altimeter by 1,000 feet. This incident may seem extremely stupid, but it was not the first time that I have seen it happen. Pilots are pushing up plenty of daisies today because they read their altimeter wrong while letting down on dark nights.”

“We had an alert one morning about 11 o’clock. About 35 Japanese planes had been picked up on the radar screen. In the mad scramble for planes, the one I happened to pick out was a brand new ship which had arrived about two days previously. I climbed in, and it seemed the whole cockpit was rearranged. … I took a look at that instrument panel and viewed the gauges around me, sweat falling off my brow. Just then the first Japanese bomb dropped. I figured then and there I wasn’t going to get my plane up, but I could run it on the ground. That’s exactly what I did—ran it all around the field, up and down the runway, during the attack.”

The 624 pilots questioned in the Air Force survey recounted 270 “pilot error” experiences like these; there were undoubtedly many more they had forgotten. Some errors are not important enough to be noticed; others are never reported because the pilots do not live to tell of them. Of the remembered errors the most common were misreading the pointers, reversals in interpretation of the readings, inability to see the instrument properly (because of dirt, poor position, poor lighting, etc.) and mistaking one instrument for another.

INSTRUMENT dials are not the only objects in an airplane that cause “human element” accidents. Many pilot errors stem from the design and position of controls. A recent study by Wright Field psychologists showed that by far the largest source of pilot error was in the confusion of controls. The pilot would pull the throttle when he intended to pull the propeller control, or change the gasoline mixture when he meant to pull the throttle. The reason is that the military planes of the late war were fiendishly inconsistent in the placement of controls. The throttle was on the left in the B-25, in the center on the C-47 and C-82. The propeller control was in the center on the B-25, on the left in the C-47, on the right in the C-82. The gas-mixture control was on the right in the B-25 and C-47, on the left in the C-82. Sometimes controls varied among models of the same airplane.

Another major source of trouble was that controls for opposite purposes were placed too close together. On many planes the controls for the wing flap and the landing gear were side by side. To land a plane, you must put one up and the other down; to take off, you do just the opposite. Serious accidents could be traced to confusing these two controls. Sometimes pilots moved controls in the wrong direction. This was not entirely the pilot’s fault: he was required to move a control in an “unnatural” direction—to flip a lever to the right when he wanted to go left, to push one control in while he was pulling another control out.

Certain directions of movement are psychologically natural. At least they go along with other things we are doing. For instance, toggle switches should move up for on, go or increase, and down for off, stop or decrease. Above all, the motions of related controls should be consistent. If a man must flip one switch up to turn something on and another down to turn something else on, he is apt to make mistakes. The point seems obvious, but it is frequently disregarded.
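The consistency rule lends itself to a mechanical check. As a minimal sketch, with hypothetical control names and an assumed convention (the majority direction counts as the panel's standard), one could lint a panel's switch mappings for odd ones out:

```python
# Hypothetical sketch: flag toggle switches whose "on" direction
# disagrees with the panel's dominant convention.

def find_inconsistent_switches(panel):
    """panel maps switch name -> direction that turns it ON ('up' or 'down')."""
    directions = list(panel.values())
    # Treat the majority direction as the panel's convention.
    dominant = max(set(directions), key=directions.count)
    return [name for name, direction in panel.items() if direction != dominant]

panel = {
    "landing_lights": "up",
    "pitot_heat": "up",
    "fuel_pump": "down",  # violates the panel-wide up-for-on convention
}
print(find_inconsistent_switches(panel))  # ['fuel_pump']
```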

That is one kind of problem in control design. There are others that have no logic but must be solved by experiment. For example, what is the best gear ratio for a control knob and indicator, such as is used in radio tuning? We have two factors in opposition here: a fast-moving needle gets us to the neighborhood of the station quickly but makes precise tuning difficult, while a slow-moving needle is good for fine tuning but time-wasting for changing stations. By experiment we have found that the best compromise is reached when the pointer moves 1.5 inches for every revolution of the knob.
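The tradeoff is easy to make concrete. In this illustrative sketch the 10-inch scale length is an assumed figure (only the 1.5-inches-per-revolution optimum comes from the article): a high ratio crosses the scale in few turns but makes each degree of knob rotation move the pointer further, coarsening fine adjustment.

```python
# Illustrative arithmetic for the knob/pointer gear-ratio tradeoff.
SCALE_INCHES = 10.0  # assumed scale length, not from the article

for ratio in (0.5, 1.5, 5.0):  # pointer inches per knob revolution
    turns_to_cross = SCALE_INCHES / ratio   # slewing cost: fewer turns is faster
    mils_per_degree = ratio / 360.0 * 1000  # fine-tuning coarseness
    print(f"{ratio:3.1f} in/rev: {turns_to_cross:4.1f} turns to cross the scale, "
          f"{mils_per_degree:.2f} mils of pointer travel per degree of knob")
```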

THE COMBAT information center (CIC) of modern warships has been extensively studied from the human engineering point of view. Into such a station pours information from various sources—sonobuoy, sonar, telephone, television, wireless telegraphy, radar, voice radio, teletype, infrared viewing devices and human observers. The system taxes the capabilities of the human beings working in it. In battle the gunnery officer must deal rapidly with a bewildering variety of information about targets. The air combat officer similarly has a host of things to think about—the number of combat patrols he has in the air, their reserves of fuel and ammunition, and so on. And decisions must be taken with somewhat less deliberation than was allowed to the captain of an 18th-century frigate. An officer in the CIC may consider himself lucky if he is given half a minute to make up his mind.

Let us say it takes 18 seconds from the moment a target is detected to communicate and evaluate the information and start firing. In that time a target traveling at 20 knots, which used to be considered fast, will have moved 200 yards. Today a gunnery officer must deal with speeds of a different order of magnitude. In 18 seconds a target traveling at 200 knots, which is slow as aircraft go, will have moved one nautical mile, and a target at 1,000 knots will have traveled five nautical miles!
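The arithmetic checks out. A knot is one nautical mile per hour, and a nautical mile is about 2,025 yards, so a quick sketch confirms the article's figures:

```python
# Distance covered during an 18-second detect-to-fire interval.
YARDS_PER_NM = 2025.37  # yards per nautical mile
REACTION_S = 18.0

for knots in (20, 200, 1000):
    nm = knots * REACTION_S / 3600.0  # 1 knot = 1 nautical mile per hour
    print(f"{knots:5d} kn -> {nm:4.2f} nm ({nm * YARDS_PER_NM:6.0f} yd)")
# 20 kn covers ~200 yd; 200 kn covers 1 nm; 1,000 kn covers 5 nm
```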

To display the CIC information so that it can be grasped and acted upon quickly enough is a formidable problem in instrument design. The dials and indicators on the instruments must be analyzed in terms of psychological function. Engineering psychologists ask: Do these instruments tell the operator what he needs to know—and neither more nor less than he needs to know? Too much data can be as bad as too little; it slows communication and encourages errors.

Instruments provide three kinds of information. The simplest type merely indicates that something is or is not working. The turn-signaling lights on your car are wired to a small blinking light on the instrument panel in front of you, to tell you that the signals are working. Instruments of this type have merely a yes-or-no function and rarely need a dial.

The next stage of complexity is an instrument that gives a qualitative reading; it tells you whether conditions are satisfactory, and if not, in what direction they are off. An example is the temperature gauge on your automobile. All you need to know is whether the engine is too hot or too cold; if the radiator springs a leak, you are warned of it by the temperature needle going up. As long as the temperature is within the allowable limits, you are not really interested in whether it is 130, 140 or 150 degrees. Most cars nowadays carry temperature gauges without numbers, and drivers do not miss them. Many instruments are essentially of this type, and engineers are surprised to find how often the fancy indicators and readings they put on machines turn out on analysis to be “unnecessary.”

The third type of instrument requires precise quantitative readings. A compass must tell direction exactly if the navigator is to bring his ship to port. The altimeter on an aircraft is both a qualitative and a quantitative instrument. Most of the time the pilot needs to know his altitude only in a general way. But when he comes in for a landing he must know just how far from the ground he is.
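Chapanis's three-way taxonomy translates naturally into how one might model indicators in software. A minimal sketch (the type and function names are mine, not the article's):

```python
# The article's three kinds of instrument information.
from enum import Enum

class IndicatorType(Enum):
    CHECK = "yes/no: is it working?"                  # e.g. turn-signal lamp
    QUALITATIVE = "direction: too high, OK, too low"  # e.g. temperature gauge
    QUANTITATIVE = "exact reading required"           # e.g. compass, landing altimeter

def needs_numbered_dial(kind: IndicatorType) -> bool:
    """Only the quantitative type justifies a full numbered scale."""
    return kind is IndicatorType.QUANTITATIVE

print(needs_numbered_dial(IndicatorType.QUALITATIVE))  # False
```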

HAVING DECIDED on what information is needed, the human engineer next turns his attention to the problem of conveying it most effectively. In the past, dials have been designed for convenience or appearance, not necessarily for readability. The round dial is convenient because, as one engineer puts it, “you can wrap 10 inches of scale around a three-inch dial.” Automobile speedometers are designed primarily to look nice. Some time ago the psychologist Robert Sleight, then at Purdue University, tested for readability the five dials shown on page 75. Notice that the size of numbers and pointers and the distance between numbers is the same on all the dials. Observers were given only .12 second—just long enough for a quick glance—to read a dial. The pointer was set either on a number or on a small mark between two numbers, and the subject was required to read the dial to the nearest half unit. Each dial was read a total of 1,020 times in Sleight’s tests. He found statistically significant differences in their readability. The open-window dial was the best: in 1,020 trials the subjects made only five errors in reading it. The round dial was next best (112 errors); the horizontal and vertical dials were poorest.
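Put on a common footing, Sleight's counts make the gap vivid; only the 5-in-1,020 and 112-in-1,020 figures below come from the article:

```python
# Error rates for the two best dials in Sleight's tests.
TRIALS = 1020
errors = {"open-window": 5, "round": 112}

for dial, n in errors.items():
    print(f"{dial:12s}: {n:3d}/{TRIALS} = {100 * n / TRIALS:5.2f}% error rate")
# open-window :   5/1020 =  0.49%
# round       : 112/1020 = 10.98%
```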

The way in which the scale is graduated is even more important than the shape of the dial. I have studied this factor by testing different scales on radar scopes. In one such test I compared 2.5 miles with 5,000 yards as the unit of distance marked on the scope (the two distances are practically the same). The average error in estimating ranges was only about half as great with the 5,000-yard marking as with the 2.5-mile. We can read 10s, 100s and 1,000s much more rapidly and accurately than other units. In my radar-scope tests the four best scales were 1,000, 10,000, 2,000 and 5,000 yards, in that order.
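The near-equivalence of the two scales holds if the article means nautical miles, the natural unit for shipboard radar (an assumption on my part; with statute miles the figures diverge by about 12 per cent):

```python
# Check that 2.5 (nautical) miles and 5,000 yards are "practically the same".
YARDS_PER_NM = 2025.37
print(2.5 * YARDS_PER_NM)  # ~5063 yards, within about 1.3% of 5,000
```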

From many studies of scales, engineering psychologists have formulated some general principles about good and bad number systems. They have found that the scale of 10, subdivided into units of one, is the best of all. That scale-numbering is not a theoretical problem was impressed on me when I took a look around my own laboratory. There I found dials variously scaled with 1, 3, 5, 10 and 30 as the unit. The gas and electric meters in your own cellar are particularly frustrating examples of illegibility [see photograph at the top of page 78]. Part of the reading is indicated by pointers that turn clockwise, part by pointers that go counterclockwise. The meters violate two principles of efficient dial design: namely, that pointers should move clockwise to show increases, and that all in a group should go in the same direction. During the war, when there was a shortage of meter-readers, some utility companies asked the public to read its own meters. The results were so chaotic that the companies quickly compromised by supplying cards on which the dials were printed and on which the householder was asked to draw the positions of the pointers. One wonders why the public should not be provided with meter dials it can understand.

SOMETIMES a dial is not as good as another kind of indicator. A case in point is the reading of the position of a target on a radar screen. To locate the direction of the target the operator rotates a cursor line until it falls on the bright spot made by the target. Usually he reads the bearing from a scale around the edge of the screen [see drawing on next page]. But a counter of the window type, which gives the degrees directly in numbers so that the operator need not read the scale, is more efficient. In a series of tests I found that the substitution of a direct-reading counter for the circular scale reduced reading errors from 10 per cent to 2 per cent and the time required for the reading from 3.3 seconds to 1.7 seconds.
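Expressed as ratios (all four figures are the article's), the counter's advantage is stark:

```python
# Relative gains from replacing the circular scale with a counter.
error_scale, error_counter = 0.10, 0.02  # fraction of readings in error
time_scale, time_counter = 3.3, 1.7      # seconds per reading

print(f"errors cut {error_scale / error_counter:.0f}x, "
      f"time cut {time_scale / time_counter:.1f}x")  # errors cut 5x, time cut 1.9x
```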

The scope at the bottom of the opposite page illustrates another simple improvement in design. The concentric circles on a radar scope indicate distances from the observer. Generally he has to count the circles to the target pip to find its distance. But if, as in the screen illustrated, the circles are drawn differently, the first four with solid lines and the next four with dotted lines, he can read the distance much more quickly. Most of us can grasp four or five identical objects at a glance; beyond that number we must count.

Human engineering has also done a good deal of work in redesigning symbols, such as are used, for example, on road signs. For reading with maximum speed and accuracy the symbols should suggest the objects they stand for and be clearly distinguishable from one another. Coding systems based on shape, color, size and brightness have been developed, but we are not yet certain which type will produce the best results. One practical outcome of this research was Sleight’s design of traffic arrows which, size for size, can be recognized twice as far away as the ones now commonly seen [see drawings in center of page 76].

HUMAN engineering has already reached a breadth of investigation that can only be suggested here. It covers auditory studies for the improvement of voice communication, explorations of conditions in the working environment, investigations of the design and size of machines. An operator must be able not only to see the dials and reach the controls but to find space for his knees and toes!

Engineering psychologists are also attempting to answer some broader questions. Most of the cases discussed in this article apply to separate components in man-machine systems, but engineering psychology has much to contribute to the over-all design of these systems. The automatic machines of which we have heard so much lately open up a new host of man-machine problems. In what sense is man superior to computing machines; in what sense inferior? What should man’s role be in complex man-machine systems? Should he be simply a monitor? Should he be used as an integrator of sensing mechanisms? Should he be in a position to exercise executive decisions? These and many similar questions must be answered before we can design the most efficient man-machine combinations. The human engineer already has a great deal of information to work with, for psychologists have been studying human capacities for over 70 years.

4 comments
  1. glindsey says: July 6, 2007 8:38 am

    Of course, the real reason for the different directions on the electric meter was mechanical: the dials were hooked to interlocking gears, so each gear moved in the opposite direction as the ones it was locked to.

  2. Charlie says: July 6, 2007 9:46 am

    No doubt, but it doesn’t seem like it would be that hard to reverse it. Now they are digital, but I wonder if they ever decided to change that.

  3. Blurgle says: July 6, 2007 11:14 pm

    Usability has been a factor in aircraft design and aircraft accident investigation for decades. One notable accident near Seattle in 1956 took place because the flight engineer left the cowl gills open on takeoff, causing flutter that made the handling pilot think the flaps were dangerously asymmetrical. Afraid to turn (which with asymmetric flaps would have caused a crash), the pilot ditched the aircraft in Puget Sound; everyone got out, but five people died of hypothermia in the 15 minutes it took rescuers to reach them. The engineer had previously worked on aircraft where the cowl gill switch was moved upwards to close them, but on this Stratocruiser the cowl gill switch had to be moved downwards. He knew that but when under pressure (during takeoff) he reverted back to his prior learning.

    Many aircraft accidents in the 40s and 50s were caused by designers changing specifications for no real reason, causing pilots and flight engineers to make mistakes based on years of prior experience. Altimeters would be moved from place to place, airspeed indicators would be calibrated slightly differently, etc., etc. Designers had to learn not to try to fix what wasn’t broken, and not to deliberately move things around just for the sake of making a new model look like a new model.

  4. jayessell says: July 7, 2007 9:40 am

    Not aircraft related, but the Andrea Doria sinking is blamed on the knob on the radarscope!

    According to “Engineering Disasters”, a $0.10 lightbulb to illuminate the range multiplier knob would have prevented the collision with the other ship.
