It’s the System’s Decision
Uncovering Expert Systems and Ethics in the Workplace
“I think, to be perfectly candid, the objective isn't to support managers, but to eliminate them. Management is a very interesting, but possibly not
a very important job in the future. We have used technology brilliantly to
eliminate primarily clerical jobs. Suddenly, the beast is going to turn on
those people who have funded projects and felt as if their skill-set was not
a replaceable skill-set. The managerial skill-set.” 1
The implementation of Expert Systems (ES) has broadened from
the automation of structured and repetitive business processes to
systems that require a very high degree of expertise and control,
generating fundamental changes in the way businesses operate. As
evidenced in Randall Fields’ statement, some corporations are moving
into a domain in which humans are replaced by and subject to the
“intelligence” of a machine. There is a growing sense that "more than
being helped by computers, companies will live by them, shaping strategy
and structure to fit new information technology.” 2 While the importance
of the technical design and implementation of ES is evidenced by the
considerable literature on the subject, there is a lack of comprehensive
analysis of the ethical issues that arise from employee interaction with
these systems.
Organizations today are facing new types of ethical dilemmas
associated with the information technologies they are increasingly
depending upon. An ethical dilemma, according to Abramson, is a
situation in which “there are conflicts and tensions concerning the right
and the good, when choosing one course of action will uphold one moral
principle while violating another.” 3 Hence, it is important to understand
the ethical implications of a gradually more common situation in which
many employees are increasingly depending upon automated systems for
decision making purposes. When acting as “passive acceptors”,
employees can practice moral disengagement, 4 which may have
dangerous organizational and social consequences. At the same time, it is
critical to understand that there are no “ethically neutral” ES and to identify
potential sources of conflict between the user’s and the system’s moral
standards.
This paper aims to open the debate on the ethics of employee
interaction with Expert Systems. While the importance of the technical
design of ES is evidenced by the considerable literature on the subject,
the press and academia have virtually ignored the ethical issues
pertaining to the actual user’s interaction with the system. The core
problem arises when employees perceive these “intelligent” systems as
legitimate authorities, disengaging morality from their conduct 5 and
decreasing their autonomy.
On the nature of Expert Systems
Expert Systems, also called Knowledge-based systems (KBS), are
computer programs or systems which emulate the decision-making and
problem solving abilities of a human expert. 6 More formally defined, ES
are computer programs “which use non-numerical domain-specific
knowledge to solve problems with a competence comparable with that of
human experts.” 7 By using “human knowledge” stored in the system’s
database, ES intend to solve problems that would otherwise require an
expert. 8
Expert systems have three main components: a knowledge base,
a reasoning or inference engine, and an interaction interface. 9 The
knowledge base will provide the “raw material” and will define what
information the system will use. It is defined as a finite stock of
information that the ES will use for problem solving. The inference engine will
determine how to use that information, thus defining the system’s
reasoning chains. Although all the elements are important for the
usability and quality of the system, the knowledge base occupies a
fundamental role. Some authors go even further claiming that the
knowledge base is what will define the success or failure of an ES. 10 Nonetheless, the component that makes the system “intelligent” is not
the database or the ability to store very large amounts of data but the
“reasoning or inference” engine. This engine will determine the rules that
the system will utilize for solving a specific problem. One of the most
common problem-solving models involves the chaining of IF-THEN rules to
form a line of reasoning. If the chaining starts from a set of conditions and
moves toward some conclusion, the method is called forward chaining. If
the conclusion is known but the steps to that conclusion remain unknown,
then the method is backward chaining.
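To make the distinction concrete, the following sketch shows how forward chaining might derive a conclusion from a tiny rule base. It is purely illustrative: the rules, fact names, and function are assumptions made for exposition, not drawn from any particular ES. Backward chaining would instead start from a desired conclusion (say, reduce_dough_batches) and work backwards to check whether its conditions can be satisfied.

# A minimal forward-chaining sketch over hypothetical IF-THEN rules.
# Each rule is a pair: (set of condition facts, concluded fact).
rules = [
    ({"heavy_snowstorm"}, "low_foot_traffic"),
    ({"low_foot_traffic"}, "reduce_dough_batches"),
]

def forward_chain(facts, rules):
    """Repeatedly fire rules whose conditions are met until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Starting from an observed condition, the engine chains forward to a conclusion.
print(forward_chain({"heavy_snowstorm"}, rules))
# -> {'heavy_snowstorm', 'low_foot_traffic', 'reduce_dough_batches'}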
Because expert systems intend to simulate human intelligent
behavior, ES development occurs within the Artificial Intelligence (AI)
domain. The latter is defined as “the art of creating machines that
perform functions which require intelligence when performed by
people” 11 and it arose from the first AI conference, held at Dartmouth
College in 1956. One of the most prominent researchers in the AI
domain, Edward Feigenbaum, defines expert systems as intelligent
computer programs that use knowledge and inference procedures to
solve problems that were difficult enough to require significant human
expertise for their solutions. 12
Currently, ES are able to complete a number of tasks that were
traditionally accomplished by human managers. These systems can
schedule crews, interview applicants, produce sales projections, manage
inventory, administer skill tests and even assess employee knowledge.
Hence, ES are taking over more comprehensive functions within the
organization, increasing their span of influence significantly. Formerly, as
decision makers, managers applied their own reasoning and ethical
standards when resolving a problem. Thus, if ES replace managers as
decision makers, will the managers still be able to influence the ethics of
those decisions?
Case Study: Mrs. Fields Cookies
The use of expert systems in Mrs. Fields Cookies’ operations is the
key to its business success. 13 14 The ES at Mrs. Fields was
developed by Park City Group (owned by the company founder’s former
husband, Randall Fields) and is called ROI (Retail Operations Management). Its designers
claim that this software has reduced administrative work by almost 70%,
liberating managers from “less challenging tasks.” 15
By analyzing Mrs. Fields operations, it is evident that management
has not been eliminated but instead has been limited to the domain of
execution. This shift has left most of the reasoning and judgment
considerations to the ES. Thus, the store’s business decisions such as the
number of cookie dough batches to mix, the cookies per hour to bake or
the supply orders are made by the interactive, inter-adjusted system
that each Mrs. Fields retail shop has. Hence, as soon as the store manager
opens the store, (s)he will enter basic information such as the weather
conditions, and the program will respond with an outline of the day's
schedule and a sales projection. In this way, the company’s ES
standardizes the goals and type of operations management of each of
the more than 700 stores. 16
Mrs. Fields’ ES also operates as an important database
asset, allowing the managers and especially the store controllers (at the
headquarters in Utah) to monitor financial records, look at the computer
reports of sales at each store and identify current and potential
problems. 17 But most importantly, this holistic management information
system allows each Mrs. Fields store to bake the same quality of cookies
with equal taste, whether they are purchased in New York or London.
Initially, ES were implemented to maintain control over
the company’s different stores as it expanded its operations. The
company founder, Debbi Fields, wanted to remain personally involved in
the various retail outlets. Her extraordinary amount of control over the
store’s operations is accomplished by the system’s standardization and
pre-programmed decision making features. So, day to day, Mrs. Fields can watch over all of her kitchens worldwide and keep the
chocolate chips melting in exactly the same way in every shop. 18
With the automation of the production process and
standardization of myriad daily decisions, the company is able to reduce
the average cost of each cookie. This enables it to bake the
optimal number of cookies per hour, taking into account store-specific
variables. Thus, the system will ask the store manager to input
information related to the weather and special events to project
production and sales. For instance, the store’s expected sales will not be
projected high when there is an intense snowstorm in town. The
interesting fact is that the program will also “recommend” explicit
marketing moves, such as free samples when sales are lower than expected
(based on the computer’s forecast). Thus, the program eliminates the cost
associated with having an “expert” manager who can forecast demand.
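As a purely illustrative sketch of the kind of rule described above, the comparison between forecast and actual sales could look like the following. The threshold, variable names, and recommendation text are assumptions made for exposition, not the behavior of the actual ROI software.

# Hypothetical illustration: compare actual sales against the system's
# forecast and, when sales lag, recommend a marketing action.
def recommend_action(forecast_sales: float, actual_sales: float) -> str:
    """Return a marketing suggestion when sales fall well below the forecast."""
    # Assumed threshold: flag sales more than 10% below forecast.
    if actual_sales < 0.9 * forecast_sales:
        return "offer free samples to passers-by"
    return "no action needed"

# Example: a store selling well below its forecast triggers the recommendation.
print(recommend_action(forecast_sales=500, actual_sales=380))
# -> offer free samples to passers-by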
This decision-making program is a competitive advantage to the
firm, as it sets the standards of quality and performance of all their stores,
reducing inefficiencies and saving time by nearly eliminating the
bureaucratic issues arising from paperwork and personal interviews. The
system will schedule employees’ timetables and record their work
hours. By enabling the staff to punch in and out, the time clock facilitates
the payroll process. All employees considered for raises and
promotions take a variety of multiple-choice tests that create
an "equal-opportunity" work policy. When it’s time to hire new
employees, the company has a specific program that compares a
potential worker's skills with those of present workers. Basically, the program
filters the most eligible candidates through a number of interactive
interviews and tests. The final decision of whether or not to hire a
candidate is ultimately based on the program's criteria.
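As a rough, hypothetical illustration of that filtering idea, candidates could be ranked by how closely their skill profiles match those of current staff. The skill labels, applicant names, and scoring scheme below are invented for exposition and do not describe the company's actual hiring program.

# Hypothetical sketch: rank applicants by overlap with current staff skills.
current_staff_skills = {"customer_service", "cash_handling", "baking"}

applicants = {
    "applicant_a": {"customer_service", "baking"},
    "applicant_b": {"inventory", "driving"},
}

def rank_applicants(applicants, reference_skills):
    """Order applicants by the number of reference skills they share."""
    scores = {name: len(skills & reference_skills)
              for name, skills in applicants.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank_applicants(applicants, current_staff_skills))
# -> [('applicant_a', 2), ('applicant_b', 0)]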
The importance of preventing the user from becoming a passive
acceptor or letting “the system” be in charge of a decision is not always
evident. However, ignoring the consequences of these attitudes towards
ES can create various sources of conflict. Decision making in an
organization cannot be delegated solely to a computer because computers
cannot assume responsibility or liability for the consequences of their
conclusions. Hence, businesses have to implement the necessary buffers
to prevent moral disengagement from occurring. For instance, if a
physician misdiagnoses a patient after using an ES, and the latter dies,
who is going to be held liable? Before the introduction of ES, the
responsibility remained vested in the physician. 19 Today, there is no clear
answer to that issue as other people could be held responsible, such as
the expert system’s developers and even the experts who provide the
information in the first place. Hence, reducing workers’ autonomy
has major implications, as it triggers a dangerous disengagement from the
consequences of their decisions. Under the commonly accepted fallacy that
the “computer is always right,” workers might become overly dependent on
the decisions suggested by the system. When assuming an intrinsic
reliability in the ES’s decisions, workers will separate themselves from the
consequences of their actions.
The concept of Moral Disengagement includes a number of
psychological processes by which a person can disengage morality from
their conduct and thus prevent self-condemnation when acting against
individual ethical standards. 20 According to Bandura, individuals adopt
moral standards in order to self-sanction and control their behavior,
allowing for a distinction between right and wrong. Overall, he considers
this self-monitoring process as part of a person’s development of
individual morality. In this self-regulatory practice, he claims that
individuals are motivated to act in ways that promote satisfaction and a
sense of self-worth; this is because people act according to their moral
beliefs to promote positive self-sanctions.
At the same time, Bandura states that people will “refrain from
behaving in ways that violate their moral standards because such conduct
will bring self-condemnation.” Hence, the underlying motive in
Bandura’s research is to comprehend how people act against their
own moral standards yet still avoid the self-sanctions that would
otherwise apply. After many years of research, Bandura identified nine basic
mechanisms through which moral self-sanctions were selectively
activated and disengaged within a person’s self-regulatory process. 21
In the case of ES, employees’ exercise of moral control is
weakened by two mechanisms that Bandura defines as “Displacement of
Responsibility” and “Diffusion of Responsibility.” 22 In the first case, when
employees regard ES as authorities, and their recommendations as
dictates, their sense of personal accountability will erode or, in many cases,
simply cease to exist. This means that the users of the ES do not recognize
themselves as personally responsible for their actions when following the
system’s advice or recommendations. This displacement of responsibility
has the potential of allowing employees to act unethically while claiming
that they were just “following orders from the system.”
With the standardization of business processes and decision
making, ES promote the diffusion of employees’ responsibility over their
conduct. The fact that ES provide suggestions or recommendations upon
which all employees have to act operates to the detriment of personal
moral control. This means that when employees can defer some or all of their
personal accountability to the system, they have the potential to
behave more cruelly or inhumanely. Bandura states that “Where
everyone is responsible no one is responsible,” identifying one of the most
dangerous “unintended consequences” of ES implementation in the
workplace.
Loss of Autonomy
In the process of “facilitating” decision making, ES also erode
employees’ autonomy. Autonomy is “the degree to which a job provides
an employee with the discretion and independence to schedule their
work and determine how it is to be done.” 23 To operate smoothly,
organizations need to limit employees’ autonomy to a certain extent for
the purpose of establishing common work practices and rules. However,
when the limits on employee judgment are too severe, a number of ethical
concerns arise.
First, autonomy is correlated with responsibility. If employees
utilizing the ES are to be held morally responsible for the decisions they
make, the system must be developed to provide for the user’s autonomy.
The underlying principle behind this position is that employees can only
be held accountable when acting as autonomous agents. Hence, when ES
excessively restrict or limit the user’s autonomy, a corresponding
employee limit to responsibility will result.
Although pre-designed decision making processes provide
“consistency” and homogeneity to the businesses’ operations, 24 they are
double-edged swords. On the one hand, they may reduce or avoid
common human errors. On the other, ES have the potential to convert the
user into a passive “acceptor” of the computer’s suggestions. As a matter of fact, ES are purposely designed to enable employees’ rapid
endorsement of the automated solution. 25 When employees are
encouraged to perceive the computer’s reasoning and decisions as the
“right” ones, they will diminish their individual reasoning capabilities.
According to Aristotle, reasoning is what makes us human and the only
way to reach eudaimonia, or happiness. 26 Thus, replacing human judgment
with automated reasoning has consequences at the employee’s personal
level as well. When workers notice that their reasoning has no importance
in the ES decision-making process, they numb themselves, becoming
something other than a human being.
In Mrs. Fields’ case, the store manager was reduced to an executor
of the system’s recommendations, over which (s)he had little to no control.
There is no place for breakthrough innovations and managers encounter
a very low degree of flexibility and personal "add-ons" when it comes to
hiring, retaining and managing their staff. In many ways, the expert
system is the store manager's supervisor. This advanced computer
program dictates what the manager should do, every day, every hour, and
every minute. It is no surprise, with that degree of paternalism built into
the expert system, that Mrs. Fields has a 100% turnover rate among store
managers. Generally, people, and especially managers, want to make a
difference in the place they work; they want to make important
decisions that will considerably influence the firm’s strategy. However,
that is practically impossible for Mrs. Fields’ store managers, who are not even
entitled to decide when to mix the batches or when to discard the cookie
dough.
It is precisely this “automated influence” over employees’
decisions and performance that workers ultimately perceive as
restrictions and limitations to their autonomy. 28 A study by Klein and Jiang examining novice decision makers using an expert system showed
that although employees had a positive impression of the system, they
did not feel a sense of accomplishment when utilizing ES. Klein and Jiang
also found out that employees tend to view ES favorably during their
training period. 29 In many ways, the ES becomes the “real” store manager.
ES can indeed provide great advantages for novice practitioners of a
particular field who have not yet acquired enough experience to make
confident decisions. Oz et al. claim that ES could potentially narrow the
knowledge gap between novices and experts. 30 However, as employees
become more knowledgeable, they are confronted by the fact that the
solution was not a product of their own reasoning process but the
system’s.
Sometimes employees are willing to give up their autonomy when
acknowledging that they can separate themselves from the system’s
decisions. This means that a user could be attracted by the level of
support that these paternalistic systems offer. As Craig et al. suggest,
by telling the users what they should be looking at, the designer of the
standard reports “removes the burden of deciding what is important and
what is not.” 31
Aspiring to foster “morally neutral” employees will only
strengthen the ES’s ethical standards. The programmer’s desire to design
an “objective” system avoiding the ethical dimensions within the system’s
inference engine is an unobtainable goal. There are no “ethically neutral”
ES. 32 Waldrop argues that expert systems represent the values, assumptions,
and purposes of their designers. 33 Thus, the ethical standards of ES
developers are built into the system’s functions and inference rules. At the
same time, there is no “universal” set of values that could apply to all employees. By imposing one way of thinking and resolving a problem, ES
will operate on the basis of specific ethical assumptions and approaches
to solving a moral problem. This situation can generate a potential conflict
with the user as their moral standards could differ from those of the
system. 34 Hence, it is important to understand that although currently
designed ES do not include explicit ethical decision models, they still
apply moral standards through their inference mechanisms. 35
Expert Systems are designed to enhance human decision making
and should never replace individual judgment. Organizations need to be
aware of this in order to prevent potential organizational conflicts arising from
employees’ loss of autonomy and moral disengagement. Although ES have
multiple economic benefits, as seen in the Mrs. Fields Cookies example, these
systems can also operate as moral inhibitors when interacting with
employees. When deciding to implement an ES, it is important to consider
the potential sources of employees’ moral disengagement and autonomy
loss. Once the roots are identified, organizations will be able to design
mechanisms to avoid an accountability vacuum around employees’ actions.
1. Fields, R. (2004, April 9). Randy Fields Interview: Automating
'Administrivia' Decisions. (D. Power, Interviewer)
2. Main, J. (1988, September 26). The Winning Organization. Fortune, 50-
52. Newman, M. (1988). Professionals and expert systems: a meeting of
minds? Computers and Society, 18(3), 14-27.
3. Abramson, M. (1990). Ethics and Technological Advances: Contributions
of Social Work Practice. Social Work in Health Care, 15(2), 5-16.
4. Bandura, A. (1999). Moral Disengagement in the Perpetuation of
Inhumanities. Personality and Social Psychology Review, 3(3), 192-209.
5. Ibid 4.
6. Durkin, J. (1994). Expert systems. Design and Development. London:
Prentice Hall International.
7. Doran, J. (1988). Expert systems and Archeology: What lies ahead? In
Computer and Quantitative Methods in Archaeology (pp. 235-241). BAR
International Series.
8. Turban, E., & Aronson, J. (2001). Decision Support Systems and
Intelligent Systems. Upper Saddle River, NJ: Prentice Hall.
9. Feigenbaum, E., & Engelmore, R. (1993, May). Knowledge-Based
Systems in Japan. Retrieved from World Technology Evaluation Center:
10. Feigenbaum, E. (1977). The Art of Artificial Intelligence: I. Fifth
International Joint Conference on Artificial Intelligence (pp. 1014-1029).
Massachusetts: Massachusetts Institute of Technology.
11. Kurzweil, R. (2001). The Age of Intelligent Machines: A (Kind of) Turing
Test. Retrieved from Kurzweil CyberArt Technologies:
12. Ibid 8.
13. Hopper, M. (1990). Rattling SABRE: New ways to compete on information. Harvard Business Review, 64(4), 118.
14. Gill, G. (1995). High-Tech hidebound: Case studies of information
technologies that inhibited organizational learning. Accounting,
Management and Information Technologies, 5(1), 41-60.
15. Ibid 1.
16. Fitzsimmons, J., & Fitzsimmons, M. (2006). Service
Management: Operations, Strategy, Information Technology. New York:
McGraw-Hill.
17. Ibid 16.
18. Ibid 16.
19. Newman, M. (1988). Professionals and Expert Systems: A Meeting of
Minds? Computers and Society, 18(3), 14-27.
20. Ibid 4.
21. Ibid 4.
22. Bandura, A. (2002). Selective moral disengagement in the exercise of
moral agency. Journal of Moral Education, 31(2), 101-116.
23. Simmering, M. (2005). Autonomy. Encyclopedia of
Management. Retrieved October 4, 2007.
24. Harmon, P., & King, D. (1985). Expert Systems: Artificial intelligence in
business. New York: Wiley & Sons.
25. Holsapple, C., & Whinston, A. (1985). Management support through
artificial intelligence. Human Systems Management, 163-171.
26. Biondi, C. (2005). Aristotle's Moral Expert: The Phronimos. In L.
Rasmussen, Ethics Expertise: History, Contemporary Perspectives, And
Applications (pp. 125-132). New York: Springer.
27. Ibid 13.
28. Argote, L., & Goodman, P. (1986). The organizational implications of
robotics. In D. Davis (Ed.), Managing technological innovation:
Organizational strategies for implementing advanced technologies (pp.
127-153). San Francisco: Jossey-Bass.
29. Klein, G., & Jiang, J. (1999). User perception of expert system advice.
Journal of Systems and Software, 48(2), 155-161.
30. Oz, E., Fedorowicz, J., & Stapleton, T. (1993). Improving quality, speed
and confidence in decision making: measuring expert systems benefits.
Information and Management, 24(2), 71-82.
31. Craig, R., Berkovich, D., & Vivona, J. (1999). Microsoft Data
Warehousing: Building Distributed Decision Support Systems. New York:
32. Carlson, J., Carlson, D., & Wadsworth, L. (1999). On the relationship
between DSS design characteristics and ethical decision making. Journal
of Managerial Issues, 180-197.
33. Waldrop, M. (1987). Man-made minds: The promise of artificial
intelligence. New York: Walker & Co.
34. Boland, R. (1987). The In-Formation of Information Systems. In R.
Boland, & R. Hirschheim, Critical Issues in Information Systems Research
(pp. 363-370). New York: Wiley.
35. Chae, B., Paradice, D., Courtney, J., & Cagle, C. (2005). Incorporating
an ethical perspective into problem formulation: implications for
decision support system design. Decision Support Systems, 40(2), 197-