
Proceedings of the Human Factors and Ergonomics Society 46th Annual Meeting, 2002


HUMAN-COMPUTER INTERACTION AS COGNITIVE SCIENCE

Ronald Laurids Boring
Human Oriented Technology Lab & Cognitive Science Program, Carleton University, Ottawa, Ontario, Canada

Human-computer interaction and cognitive science share historical interdisciplinary roots in human factors, but the two fields have largely diverged. Many attempts have been made to apply cognitive science to human-computer interaction, but the reverse is curiously not the case. This paper outlines ways in which human-computer interaction can serve as a unifying framework for cognitive science.

INTRODUCTION

Considerable attention has been paid to the task of infusing human-computer interaction (HCI) with cognitive science (Barnard, 1995; Card, Moran & Newell, 1983; Carroll, 1997; Gardiner & Christie, 1987; Green, Davies & Gilmore, 1996; Guindon, 1988; Landauer, 1995; Pollitzer & Edmonds, 1996). There is, however, a dearth of information about the reverse situation (Boring, 2001). Little has been written about the role that HCI can play in shaping cognitive science. In this paper, I will attempt to redress this shortcoming by introducing an HCI framework that serves as a unifying approach for the subdisciplines of cognitive science. I will begin by defining HCI and cognitive science in a human factors context.

WHAT IS HCI?

Broadly speaking, HCI is a subfield of human factors (Perlman, Green & Wogalter, 1995). This placement of HCI in a larger context is acknowledged by the Association for Computing Machinery, whose popular annual CHI conference is properly called the Conference on Human Factors in Computing Systems. According to the Human Factors and Ergonomics Society (HFES), human factors is (1998, p. 1):

...the science that explores human capabilities and behavior and how these characteristics are incorporated into the design, evaluation, operation, and maintenance of products and systems that are intended for safe, effective, satisfying use by people.

In lieu of the phrase "products and systems," the HCI specialist might insert the phrase "computer interfaces." Despite the broader field of human factors, Perlman et al. concede that computer technology and use have advanced so substantially that research in HCI can stand on its own, apart from the other domains of human factors research. So, what exactly is HCI? Succinctly defined, HCI is the science of designing usable, discoverable, and satisfying software interfaces (Dillon, 1983). As a science, HCI is pre-theoretic: there is no single or unified approach to conducting HCI research, nor is there a central perspective or skill set shared by HCI researchers. In fact, the design of interfaces requires many perspectives and skills, making HCI a truly interdisciplinary field. As Dix, Finlay, Abowd, and Beale suggest (1993, p. 3):

The ideal designer of an interactive system would have expertise in a range of topics: psychology and cognitive science to give her knowledge of the user's perceptual, cognitive and problem-solving skills; ergonomics for the user's physical capabilities; sociology to help her understand the wider context of the interaction; computer science and engineering to be able to build the necessary technology; business to be able to market it; graphic design to produce an effective interface presentation; technical writing to produce the manuals; and so it goes on.

Despite this motley mixture of disciplines and perspectives, there is nonetheless a unifying theme across HCI subdisciplines. This theme, the application of knowledge to the common domain of user interface design, is what holds HCI together as a cohesive discipline and a distinct subdiscipline of human factors.

WHAT IS COGNITIVE SCIENCE?

The lack of a common theme such as is found in HCI has hindered the interdisciplinary cohesion of cognitive science. Ask any group of cognitive scientists to define cognitive science, and there will be a multitude of answers. Some will say that cognitive science is the science of the mind; others will say it is the science of humans as information processors; others will say it is a behavioral science; still others will say it is nothing more than cognitive psychology. Cognitive science is difficult to define because it encompasses many subdisciplines with different research questions and methods. The sum of these subdisciplines is an eclectic and often disjointed field.

While there is no unified cognitive science, the cognitive sciences share threads of the same fabric. The subdisciplines of cognitive science share a topical centerpiece, namely the human mind. Of course, mind means different things to different disciplines. To neuroscience, it refers to the brain; to psychology, linguistics, and philosophy, it refers to a representational system responsible for behavior, language, and thoughts; to artificial intelligence, it refers to a simulatable central processing system. Beyond this agreement on the general topic of mind, today's cognitive subdisciplines differ considerably from one another.


The classification of mental phenomena is the guiding force of cognitive science, but often to such an extent that it undermines the cohesiveness of the discipline. As the simple questions of cognitive science were answered and it became clear that not all the big questions could be answered right away, research began to focus on smaller and smaller mental phenomena. Newell estimated in 1992 that cognitive science had amassed approximately 3,000 facts about mental processes. The danger with such a proliferation of facts is that they often point in contradictory directions, making it difficult to synthesize data into theories (Newell, 1990). As new fields such as neuroscience, philosophy, education, and anthropology have been added to existing cognitive stalwarts, the proliferation of data continues at an astounding rate. But, since there is no central theory to unify these facts, cognitive science risks becoming a collection of isolated facts about mental phenomena.

Fact overload has become such a staple of cognitive science that several researchers have proposed the need for multiple levels of explanation for mental phenomena (Anderson, 1993; Dawson, 1998; Pylyshyn, 1984). Based on Marr's (1982) explanation of vision at several distinct levels, the proponents of multiple-level hypotheses suggest that cognitive science may best be integrated by acknowledging that mental phenomena can be explained biologically, procedurally, and computationally at the same time, with each cognitive subdiscipline contributing its appropriate level of explanation. While there are certainly good, integrative intentions behind the multiple-level approach, such an approach risks exacerbating the fact-finding that presently characterizes cognitive science. By offering a means to incorporate even more facts, the multiple-level approach might even defeat the ability of cognitive science to formulate general, theoretic conclusions about mental phenomena.

What cognitive science needs much more than a continued proliferation of facts is a way to tie together its disparate subdisciplines through a common research theme. It is my belief that HCI provides exactly such a unifying theme.

HCI AND COGNITIVE SCIENCE

Common Origins

HCI and cognitive science share a common early history. The impetus for both fields came with World War II. In the case of human factors, there was a desperate need for academic researchers to contribute to the War effort in Britain and the USA. A large number of psychologists were conscripted to devise better and faster methods to train the large numbers of soldiers and pilots needed to fight the War. What began largely as an educational endeavor changed emphasis as these researchers realized that their training could only be made as effective as the machines their pupils were trained to use. As researchers became directly involved in making machines easier to operate, the field of human factors emerged (Meister, 1999).

Wartime researchers who contributed to the emergence of human factors included, among others, Frederick Bartlett and Donald Broadbent at the Applied Psychology Laboratory at Cambridge University, Alphonse Chapanis and Paul Fitts at Wright Field, and S.S. Stevens and George Miller at the Harvard University Psycho-Acoustic Laboratory (Roscoe, 1997). While some of these researchers continued to work on the emerging field of human factors after the War, several returned to traditional academia. It is important to note that those who returned to academia, notably Broadbent and Miller, are counted among the founding fathers of the cognitive revolution. These researchers' experiences working on applied problems helped foster their interest in the underlying cognitive dimensions of human behavior.

While both the Human Factors Society (now HFES) and the Society of Engineering Psychologists (now American Psychological Association Division 21) were founded in 1957, it was not until the 1960s that research specifically addressed the interaction of humans with computers. Then-novel computing ideas such as text editing, graphical interfaces, and pointing devices (Myers, 1998) paved the way for a gradual delineation of human factors and HCI. Whereas human factors was shaped largely by a collaboration between psychologists and engineers, HCI emerged out of the interaction between psychologists and computer scientists. The proliferation of computers subsequently fueled the growth of HCI as a distinct field.

The impetus for cognitive science came in the 1940s and 1950s, shortly after human factors came into being. Up to and during World War II, North American psychology was firmly in the midst of the behaviorist school of thought. Two wartime factors helped to overturn behaviorism, ever so slowly (Gardner, 1987). I have already discussed the first factor, which was the enlistment of experimental psychologists to aid the War cause. While behaviorist educational perspectives flourished during this period, there were strains put on this stimulus-response approach to learning. Behaviorism proved itself unable to incorporate the types of applied problems that were giving rise to human factors. It was equally inflexible in treating or explaining the psychological traumas that soldiers experienced. While neither of these two challenges in themselves pointed to outright cognitivism, they both helped to demonstrate the limitations of behaviorism.

A second wartime factor in the emergence of cognitive science was the development of early computers. The basic principles of information processing, the underpinnings of computing systems to this day, were developed and formalized during and immediately subsequent to World War II. Though the influence of computers on psychology and other disciplines was still some time off, it was this information processing framework that shaped much of later cognitive science.


The rest of the history of cognitive science is well documented and needs only a cursory mention here. The final blow to behaviorism came in the 1950s, when early cognitivists including Chomsky, Miller, Minsky, Newell, and Simon laid the groundwork for the so-called cognitive revolution (Gardner, 1987). No single event or person marked the turning point from behaviorism to cognition, but by the 1970s, cognitive psychology, Chomskyan linguistics, and artificial intelligence took the research spotlight. Loosely joined under the guise of cognitive science, these disciplines engaged in the task of describing and modeling human mental processes.

While HCI and cognitive science share common origins in human factors, their subsequent development diverged. HCI continued along a largely applied course, while cognitive science focused primarily on theoretic concerns. Recently, several researchers (e.g., Card et al., 1983; Green et al., 1996) have suggested it might be time to bring the two disciplines together by applying cognitive science to HCI.

The Traditional Approach: Cognitive Science in HCI

Why does HCI need cognitive science? The answer is quite straightforward. In order to understand the user, the human in HCI, it is useful to have a firm grounding in the cognitive processes that the user undergoes (Preece, Rogers, Sharp, Benyon, Holland & Carey, 1994). The user perceives information, uses knowledge, and makes decisions, all of which are the domains of cognitive science. Understanding the user allows the HCI researcher to predict and explain the interaction that occurs between the user and the computer system. Without such an understanding, it is difficult to facilitate optimized interaction in the design of software.

Cognitive science already makes a strong contribution to HCI, a fact that HCI researchers readily acknowledge. Alongside computer scientists, cognitivists comprise the largest proportion of HCI researchers. Perhaps it is because of the unmistakable presence of cognition in HCI that so many researchers have addressed the role of cognitive science in HCI.

Card et al. (1983) are the forerunners of applying cognitive science to HCI. They propose the model human processor, a simplified version of the general information processing paradigm in cognitive science. The model human processor has three information processing systems, which interact to simulate the cognitive processes an actual human would undergo. These systems, the perceptual, motor, and cognitive systems, allow a rich modeling of human performance on interaction tasks. Using Card et al.'s GOMS (Goals, Operators, Methods, and Selection rules) model, it is possible to quantify the tasks, decisions, and time required by a user of a particular software interface. The GOMS model provides a flowchart of the cognitive processes in HCI, thereby affording the HCI researcher insight into the human response to a particular system.
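To give a concrete, if simplified, flavor of this kind of quantification, the sketch below implements a Keystroke-Level-Model-style time estimate in the spirit of the GOMS family. The operator durations are commonly cited approximate values, and the task breakdown and function names are illustrative assumptions rather than anything prescribed by Card et al.

```python
# Illustrative sketch only: a Keystroke-Level-Model (KLM) style time estimate
# in the spirit of the GOMS family. Operator durations are commonly cited
# approximations (in seconds); the task sequence below is a hypothetical example.

KLM_OPERATORS = {
    "K": 0.28,  # press a key or button (average skilled typist)
    "P": 1.10,  # point to a target with a mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
}

def estimate_task_time(operator_sequence):
    """Sum the durations of the operators that make up one unit task."""
    return sum(KLM_OPERATORS[op] for op in operator_sequence)

# Hypothetical unit task: delete a file via a menu.
# Home to mouse, think, point to the menu, click, think, point to the item, click.
delete_via_menu = ["H", "M", "P", "K", "M", "P", "K"]

if __name__ == "__main__":
    print(f"Predicted execution time: {estimate_task_time(delete_via_menu):.2f} s")
```

Even a toy estimate of this sort makes explicit which steps (here, the mental preparation and pointing operators) dominate the predicted execution time, which is the kind of insight a full GOMS analysis provides in much richer detail.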

In practice, GOMS has proven a mixed bag for HCI researchers. The level of information provided by a GOMS analysis is extensive. However, there is debate regarding how informative GOMS-derived information is to the actual design of a software interface (Green et al., 1996). Outlining cognitive processes in terms of goals and selection rules may not always instruct the HCI researcher on why a particular interface is inherently more usable, more satisfying, or more discoverable. Green et al. point out that the failure of GOMS and similar cognitive models to be adopted by HCI practitioners has sometimes led to the disfavor of cognitivism in HCI. Cognitivism, thus construed, is seen as too academic and too far removed from the dynamic design cycle to contribute extensively to HCI. As Hammond, Gardiner, Christie, and Marshall (1987) point out, traditional cognitive research is time consuming, and time is something that is not always available to a practitioner of HCI. Hammond et al. suggest that models like GOMS be used to develop general principles for use in HCI, rather than to answer mission-critical design questions.

GOMS is not the only cognitive approach to HCI. Hewett and Adelson (1998) suggest that HCI is guided by two concurrent approaches. In the first approach, the design of interfaces stems from cognitive principles. For example, Miller's (1956) constant for the number of items that may be stored in short-term memory, 7 ± 2, may be used as a guiding principle in the design of a software menu system. A design principle derived from cognition might stipulate that there should be no more than seven items on a menu. In the second approach, which is influenced by engineering, interface analogies are drawn to existing solutions. When designing a particular software interface, the designer may reach a design impasse. The solution to the interface problem is then drawn based on the solution to the real-world problem.

Hewett and Adelson (1998) give the example of an online help system for the design-by-analogy approach. When confronted with the problem of how to select the appropriate level of help information, the designer draws an analogy to looking for an article in a stack of journals. She realizes that it is not always necessary to have all the information available. She sees that although she began performing a serial search for the article in the journals, cues in the journals remind her of the context in which she first read the article. This allows her to abandon the serial search and go directly to the journal that represents the time frame in which she first read the article. By analogy, there may be times when hints are sufficient to trigger the appropriate recall of how to accomplish a given task. Using design by analogy, a help system might be designed with two levels of help. The first level encompasses brief hints to assist experienced users to retrieve from memory the context in which they previously were able to achieve the task. A second level of help offers detailed instructions, in case the contextual hint fails to assist the user.
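A minimal sketch of this two-level help idea appears below; the topic data, strings, and function name are hypothetical illustrations and not part of Hewett and Adelson's account. The system surfaces a brief contextual hint first and escalates to full instructions only when the hint is not enough.

```python
# Minimal sketch of a two-level help system: a brief contextual hint first,
# with detailed instructions only if the hint fails. The topic, strings, and
# function name are hypothetical illustrations.

HELP_TOPICS = {
    "export_report": {
        "hint": "Exporting works much like printing: look under File > Export.",
        "detail": (
            "1. Open the report you want to export.\n"
            "2. Choose File > Export.\n"
            "3. Pick a format and a destination folder.\n"
            "4. Click Export."
        ),
    },
}

def get_help(topic, need_detail=False):
    """Return the level-1 hint by default; return level-2 instructions on request."""
    entry = HELP_TOPICS[topic]
    return entry["detail"] if need_detail else entry["hint"]

if __name__ == "__main__":
    print(get_help("export_report"))                    # level 1: contextual hint
    print(get_help("export_report", need_detail=True))  # level 2: full instructions
```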


Just as GOMS exhibits some shortcomings of the cognitive approach to HCI, applying cognitive principles to design is fraught with limitations. Hewett and Adelson (1998) argue that this approach easily results in the misapplication of cognitive principles. For example, using Miller's constant as a guide to menu design may be inherently misguided, because menu items are usually grouped together. Such grouping of items, or chunking as it is widely known, can easily overcome the limitations of the number 7 ± 2. If items are chunked, they become a single entity in memory. Designing around Miller's constant proves problematic, because it is difficult to anticipate the extent of memory chunking for menus. Hewett and Adelson instead suggest the design-by-analogy approach may prove more fruitful for HCI design. In their examples, the design-by-analogy approach provides cognitive insights into users, whereas applying cognitive principles to design does not readily elucidate the design process.

Such shortcomings of the cognition-to-HCI approach may be unavoidable. Cognitive science cannot fully inform HCI, because it is cognitive science that stands to benefit from the model used by HCI. Perhaps the student is instructing the teacher: cognitive science is being applied to HCI. I believe the more informative lessons will be learned when the roles are reversed, when HCI instructs cognitive science.

The Novel Approach: HCI as Cognitive Science

The research literature treats the relationship between cognitive science and HCI in a one-way fashion, with cognitive science contributing to HCI. I suggest that there is much to be gained by looking at the two-way relationship, including the contributions that HCI can make to cognitive science. What can be gained by using HCI as a model for cognitive science research? Isn't it a bit presumptuous for a field that admittedly has only a minimal theoretical framework to guide another field that has striven for forty years to develop a theory?

I have already established my view that cognitive science has failed to bring together its subdisciplines into a cohesive entity. I have also mentioned the glue that holds the subdisciplines of HCI together, namely the theme of applying knowledge to the common domain of user interface design. I suggest that this theme might prove one way to unify the cognitive subdisciplines in a manner heretofore unseen in cognitive science.

A crucial component of the unifying theme of HCI, and of my proposed theme for cognitive science, is applied research. Applied research is simply research that is put to the task of solving a real-world problem. Barnard explains that in a cognitive context (1995, p. 641):


Applied cognitive [science] attempts to bridge the gap between the properties of cognition as studied in the more abstract laboratory tasks and those phenomena that are characteristic of cognition in the tasks of everyday life. There is considerable conceptual and methodological overlap between the core discipline of cognitive [science] and its applied counterpart. There are also differences. In the laboratory, hotly debated theoretical issues may focus upon predictions concerning relatively small differences in behavior whose detection requires carefully controlled experimental conditions. The variables that give rise to those effects may be contributing relatively little to the overall ease or difficulty of carrying out the tasks of everyday life.

Application is what is currently missing from cognitive science. The lack of common methods, terminology, or goals has fortified the distinctiveness of the individual subdisciplines of cognitive science. Having an application for cognitive science would give the subdisciplines a starting point toward a unified approach. Application provides a framework for common methods, terminology, and goals, without abandoning the unique or distinct contributions of the individual subdisciplines.

Applied research also overcomes the artificiality associated with much current cognitive research. Hutchins (1995) recommends what he calls "cognition in the wild." This is an understanding of mental processes that occur naturally, in the wild, not under carefully controlled laboratory settings. By focusing research too narrowly on internal mental phenomena, cognitive science has developed research methods that isolate mental phenomena from the context of the external world. Much of what is known about cognition is, in Hutchins' view, an artifact of laboratory experimentation. The solution is to remove cognition from the laboratory and put it in a real-world context. Applied cognitive research is one means to accomplish cognition in the wild. By opening up cognitive research to real-world domains, the artificiality of such research is diminished.

There is yet another important byproduct of applied research. Typically, one would expect application to flow from theory. The reverse scenario is equally feasible. With cognitive researchers working on applied problems, it is certainly true that new hypotheses and theories will result. In fact, the lack of a viable unifying theory in current cognitive science suggests that the current research strategy is not proving entirely fertile ground for theory building. The move to applied research certainly cannot hurt the general cause of theory building.

How do I envision unifying cognitive science under the guise of applied HCI research? The application framework is already in place in current HCI. Cognitive psychology can apply its expertise in knowledge structures to create better models of the user. Linguistics can apply its expertise in language to the development of voice recognition and speech synthesis interfaces. Artificial intelligence can apply its expertise to the development of intelligent user interfaces. Neuroscience can work to create biologically sound simulations of human perceptual and cognitive processes.


Philosophy can contribute to the ongoing ethical debates about the accessibility of computer technologies, and for those more inclined to philosophy of mind, there is even the as yet uncharted domain of the philosophy of HCI, whereby philosophy could serve a moderating, definitional, and theoretical role in HCI. To some degree, these subdisciplines are already involved in HCI, but without the banner of a unified cognitive science.

Cognitive science stands to learn a great deal from HCI, not by applying its theories to HCI, but by allowing new theories to result from applied HCI research. With HCI as the catalyst for cognitive science, there would emerge a more cohesive cognitive science as well as a wealth of new theories fueled by applied findings.

CONCLUSIONS

I do not wish to conclude in this paper that there can be no fruitful cognitive science without HCI. Of course cognitive science has managed thus far without the intervention of HCI. But there are important lessons that can be learned by applying HCI to cognitive science, lessons which have been absent in the one-way push to apply cognitive science to HCI. I am advocating one approach to cognitive science, not a unifying theory to which all cognitive disciplines must ultimately subscribe. In fields as diverse as cognitive science and HCI, there is ample room for multiple frameworks. In this paper, I have simply endeavored to set the stage for one possible framework in which cognitive science and HCI converge.

REFERENCES

Anderson, J.R. (1993). Production systems and the ACT-R architecture. In Rules of the Mind (pp. 1-14). Hillsdale, NJ: Lawrence Erlbaum.

Barnard, P. (1995). The contributions of applied cognitive psychology toward the study of human-computer interaction. In R.M. Baecker, J. Grudin, W.A.S. Buxton, and S. Greenberg (eds.), Readings in Human-Computer Interaction: Toward the Year 2000, Second Edition (pp. 640-658). San Francisco, CA: Morgan Kaufmann Publishers.

Boring, R.L. (2001). User-interface design principles for experimental control software. CHI 2001: Extended Abstracts, pp. 399-400.

Card, S.K., Moran, T.P., & Newell, A. (1983). The Psychology of Human-Computer Interaction. Hillsdale, NJ: Lawrence Erlbaum.

Dawson, M.R.W. (1998). Understanding Cognitive Science. Oxford, UK: Blackwell Publishers.

Dillon, R.F. (1983). Human factors in user-computer interaction: An introduction. Behavior Research Methods & Instrumentation, 15, 195-199.

Dix, A., Finlay, J., Abowd, G., & Beale, R. (1993). Human-Computer Interaction. Hemel Hempstead, UK: Prentice Hall International.

Gardiner, M.M., & Christie, B. (1987). Applying Cognitive Psychology to User-Interface Design. Chichester, UK: John Wiley & Sons.

Gardner, H. (1987). The Mind's New Science: A History of the Cognitive Revolution. New York, NY: Basic Books.

Green, T.R.G., Davies, S.P., & Gilmore, D.J. (1996). Delivering cognitive psychology to HCI: The problems of common language and of knowledge transfer. Interacting with Computers, 8, 89-111.

Guindon, R. (1988). Cognitive Science and its Applications for Human-Computer Interaction. Hillsdale, NJ: Lawrence Erlbaum.

Hammond, N., Gardiner, M.M., Christie, B., & Marshall, C. (1987). The role of cognitive psychology in user-interface design. In M.M. Gardiner and B. Christie (eds.), Applying Cognitive Psychology to User-Interface Design (pp. 13-54). Chichester, UK: John Wiley & Sons.

Hewett, T.T., & Adelson, B. (1998). Psychological science and analogical reminding in the design of artifacts. Behavior Research Methods, Instruments & Computers, 30, 314-319.

Human Factors and Ergonomics Society. (1998). Human Factors & Ergonomics: Designing for Human Use. Santa Monica, CA: Human Factors and Ergonomics Society.

Hutchins, E. (1995). Cognition in the Wild. Cambridge, MA: MIT Press.

Landauer, T.K. (1995). Let's get real: A position paper on the role of cognitive psychology in the design of humanly useful and usable systems. In R.M. Baecker, J. Grudin, W.A.S. Buxton, and S. Greenberg (eds.), Readings in Human-Computer Interaction: Toward the Year 2000, Second Edition (pp. 659-665). San Francisco, CA: Morgan Kaufmann Publishers.

Marr, D. (1982). Vision. San Francisco, CA: W.H. Freeman.

Meister, D. (1999). The History of Human Factors and Ergonomics. Mahwah, NJ: Lawrence Erlbaum.

Miller, G.A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63, 81-97.

Myers, B.A. (1998). A brief history of human-computer interaction technology. Interactions, 5, 44-54.

Newell, A. (1990). Unified Theories of Cognition. Cambridge, MA: Harvard University Press.

Newell, A. (1992). Unified theories of cognition and the role of Soar. In J.A. Michon and A. Akyuerek (eds.), Soar: A Cognitive Architecture in Perspective: A Tribute to Allen Newell (pp. 25-79). Dordrecht, Netherlands: Kluwer Academic Press.

Perlman, G., Green, G., & Wogalter, M.S. (1995). Preface. In G. Perlman, G.K. Green, and M.S. Wogalter (eds.), Human Factors Perspectives on Human-Computer Interaction: Selections from Proceedings of Human Factors and Ergonomics Society Annual Meetings 1983-1994 (pp. vii-x). Santa Monica, CA: Human Factors and Ergonomics Society.

Pollitzer, E., & Edmonds, E. (1996). Editorial: The evolving partnership between cognitive science and HCI. International Journal of Human-Computer Studies, 44, 731-741.

Preece, J., Rogers, Y., Sharp, H., Benyon, D., Holland, S., & Carey, T. (1994). Human-Computer Interaction. Reading, MA: Addison-Wesley.

Pylyshyn, Z.W. (1984). Computation and Cognition. Cambridge, MA: MIT Press.

Roscoe, S.N. (1997). The Adolescence of Engineering Psychology. Volume 1, Human Factors History Monograph Series. Santa Monica, CA: Human Factors and Ergonomics Society.
