
Engineering expert critics for cooperative systems

Published online by Cambridge University Press:  07 July 2009

Barry G. Silverman
Affiliation:
Institute for Artificial Intelligence, George Washington University, Staughton Hall, Room 206, Washington, DC 20052, USA
R. Gregory Wenig
Affiliation:
Institute for Artificial Intelligence, George Washington University, Staughton Hall, Room 206, Washington, DC 20052, USA

Abstract

Knowledge collection systems often assume they are cooperating with an unbiased expert. They have few functions for checking and fixing the realism of the expertise transferred to the knowledge base, plan, document or other product of the interaction. The same problem arises when human knowledge engineers interview experts. The knowledge engineer may suffer from the same biases as the domain expert. Such biases remain in the knowledge base and cause difficulties for years to come.

To prevent such difficulties, this paper introduces the reader to “critic engineering”, a methodology that is useful when it is necessary to doubt, trap and repair expert judgment during a knowledge collection process. With the use of this method, the human expert and knowledge-based critic form a cooperative system. Neither agent alone can complete the task as well as the two together.

The methodology suggested here offers a number of extensions to traditional knowledge engineering techniques. Traditional knowledge engineering often answers the questions delineated in generic task (GT) theory, yet GT theory fails to provide four additional sets of questions that one must answer to engineer a knowledge base, plan, design or diagnosis when the expert is prone to error. This extended methodology is called “critic engineering”.
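To make the doubt, trap and repair cycle described above concrete, the following is a minimal illustrative sketch in Python. It is not the authors' COPE implementation; the rule format, the single confirmation-bias check, and all names (ExpertRule, Critique, critique_rule) are hypothetical assumptions introduced only to show how a knowledge-based critic might question an expert's input and prompt a repair.

# A minimal sketch (not from the paper) of a doubt-trap-repair cycle.
# The bias check, rule format, and all names below are hypothetical
# illustrations, not the authors' COPE implementation.

from dataclasses import dataclass, field


@dataclass
class ExpertRule:
    """A piece of elicited expertise: a conclusion and its evidence."""
    conclusion: str
    supporting_evidence: list = field(default_factory=list)
    disconfirming_tests: list = field(default_factory=list)


@dataclass
class Critique:
    rule: ExpertRule
    bias: str
    repair_prompt: str


def critique_rule(rule: ExpertRule) -> list:
    """Doubt the rule, trap a likely judgment bias, and suggest a repair."""
    critiques = []
    # Trap a confirmation-bias pattern: confirming evidence was collected,
    # but no disconfirming test was ever recorded.
    if rule.supporting_evidence and not rule.disconfirming_tests:
        critiques.append(Critique(
            rule=rule,
            bias="confirmation bias (suspected)",
            repair_prompt=(
                f"Rule for '{rule.conclusion}' cites only confirming evidence. "
                "Name at least one observation that would falsify it."
            ),
        ))
    return critiques


if __name__ == "__main__":
    # The expert proposes a rule; the critic doubts it and asks for repair.
    proposed = ExpertRule(
        conclusion="pump failure",
        supporting_evidence=["high vibration", "low output pressure"],
    )
    for c in critique_rule(proposed):
        print(f"[{c.bias}] {c.repair_prompt}")

In this sketch neither agent completes the task alone: the expert supplies the rule, and the critic withholds acceptance until the expert has supplied the repair it requests.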

Type
Research Article
Copyright
Copyright © Cambridge University Press 1993


References

Bailley, D, 1991. Developing a Rule Base for Identifying Human Judgement Bias in Information Acquisition Tasks, Master's Thesis, Engineering Management Department, George Washington University, Washington, DC, April.
Berger, M, Coleman, L, Muilenburg, B and Rohrer, R, 1991. Influencer Effects on Confirmation Bias, Experiment Report, Institute for Artificial Intelligence, George Washington University, Washington, DC, April.
Bingham, WP, Datko, L, Jackson, G and Oh, S-H, 1990. Establishing and Approving Quality Critical Operational Issues: An Application Using the COPE Shell, Experiment Report, Institute for Artificial Intelligence, George Washington University, April.
Boose, J, 1984. "Personal construct theory and the transfer of human expertise". In: Proceedings of the National Conference on AI, Morgan Kaufmann.
Bylander, T and Chandrasekaran, B, 1987. "Generic tasks for knowledge based reasoning: The right level of abstraction for knowledge acquisition". Int. J. Man-Machine Stud. 26, 231–243.
Creevy, L, Neal, R, Reichard, E and Turrentine, B, 1991. Comparison of a Dynamic Critic to a Static Critic Using the Wason's 2 4 6 Problem, Experiment Report, Institute for Artificial Intelligence, George Washington University, April.
Fischer, G, 1987. "A critic for Lisp". In: Proceedings 10th International Joint Conference on Artificial Intelligence, pp 177–184, Morgan Kaufmann.
Gingrich, S, Lehner, G, Lepich, C, Powell, D and Vautier, J, 1990. The Mission: A Group One Final Report, Experiment Report, Institute for Artificial Intelligence, George Washington University, April.
Herron, T, Mackoy, R, Mohan, et al., 1990. Critics for Expert Statisticians, Experiment Report, Institute for Artificial Intelligence, George Washington University, December.
Langlotz, CP and Shortliffe, EH, 1983. "Adapting a consultation system to critique user plans". Int. J. Man-Machine Stud. 19, 479–496.
Miller, PL, 1983. "ATTENDING: Critiquing a physician's management plan". IEEE Trans. PAMI 5 (5), September, 449–461.
Ratte, D, Crowe, C, Skarpness, P and Allen, T, 1991. Wason's 2 4 6 Confirmation Bias, Experiment Report, Institute for Artificial Intelligence, George Washington University, April.
Silverman, BG, 1990. "Critiquing expert judgment via knowledge acquisition systems". AI Magazine 11 (3), Fall, 60–79.
Silverman, BG, 1991a. "Expert critics: Operationalizing the judgment/decision-making literature as a theory of 'bugs' and repair strategies". Knowledge Acquisition 3 (2), June, 175–214.
Silverman, BG, 1991b. "Criticism based knowledge acquisition for document generation". Innovative Applications of Artificial Intelligence, pp 291–319, AAAI Press.
Silverman, BG, 1992a. Critiquing Human Error: A Knowledge-Based Human-Computer Collaboration Approach, Academic Press.
Silverman, BG, 1992b. "Human-computer collaboration". Human-Computer Interaction 7 (2), Summer, 165–196.
Silverman, BG, 1992c. "Modeling and critiquing the confirmation bias in human reasoning". IEEE Trans. Systems, Man and Cybernetics, September/October, 973–983.
Silverman, BG, Donnell, ML and Bailley, D, 1992. Toward the Implementation of Cognitive Bias Theory: A Methodology and a Rule Base, Institute for AI Technology Report, Washington, DC.
Silverman, BG and Mezher, T, 1992. "Expert critics for engineering design applications". AI Magazine 13 (1), April, 45–62.
Spickelmier, RL and Newton, AR, 1988. "Critic: A knowledge-based program for critiquing circuit designs". Proceedings of the IEEE International Conference on Computer Design: VLSI in Computers and Processors, pp 324–327, IEEE Computer Society Press.
Staff, 1989. Knowledge Engineers Guide to COPE, Potomac: IntelliTek.
Staff, 1990. COPE Users Guide, Potomac: IntelliTek.
Zhou, H, Simkol, J and Silverman, BG, 1989. "Configuration assessment logics for electromagnetic effects reduction (CLEER) of equipment on naval ships". Naval Engineers J. 101 (3), May, 127–137.