
Negligence, Risk and the Professional Debate

Over Responsibility for Design


(ESSAY #5)

The Implicit Social Contract Between Engineers and Society

Engineering is a form of social experimentation. As with every experiment, there is a risk of negative
consequences as well as positive ones. We need only examine the way the quality and structure of life has
changed over recent decades with the arrival of such phenomena as high-speed computers, new composite
materials, new modes of high-speed transportation, and new advances in bio-engineering to appreciate the
potential impact on society of technological change. Unfortunately, such advances are not always accompanied
by unalloyed blessings, and we must deal with the possible side effects. We must, for example, consider our
need for protection against invasions of privacy through easily accessible computerized records, or our need for
protection from the many drunken drivers who cause over 20,000 deaths per year.

As engineers test designs for ever-increasing speeds, loads, capacities and the like, they must always remember
their larger societal obligation: protecting the public welfare. After all, the public has provided engineers, through
the tax base, with an educational opportunity, and, through legislation, with the means for licensing and regulating
themselves. In return, engineers have a responsibility for protecting the safety and well-being of the public in all of
their design efforts. This is part of the implicit social contract all engineers agree to when they accept admission to
an engineering college.

The Issue of Public Risk and Informed Consent

As technology advances, risks are unavoidable. Thus, the issues of risk and decision making confront all
engineering professionals. Recognizing there will always be some measure of risk associated with engineering
design, how do engineers know when those risks outweigh the possible benefits gained from their work? How
do they make informed decisions?

Engineering, more than any other profession, involves social experimentation. Often one engineer's decision
affects the safety of countless lives. It is, therefore, important that engineers constantly remember that their first
obligation is to ensure the public's safety. This is a difficult assignment, for engineers are not typically autonomous
professionals. Most of them work for salaries within a structured environment where budgets, schedules and
multiple projects are important factors in the decision-making process.

The decision-making process is often complicated by the fact that most engineers have multiple responsibilities
attached to their job descriptions. They are responsible for actual engineering practice (including research,
development and design), for making proposals and writing reports, for managing projects and personnel, and
often for sales and client liaison. In other words, engineers, by the very nature of their professional stature both
outside and inside the corporate structure, cannot work in a vacuum. Graduation is not a license to merely tinker
in an engineering laboratory. As an engineer advances, she will be given more authority for directing projects.

This is a natural phenomenon in the engineering community. Most engineers aspire to managerial positions, even
if only on one specific project. There are many benefits associated with managerial authority, not the least of
which are increased financial remuneration and job satisfaction. But authority comes with a heavy price tag:
increased responsibility for decisions made. It is important to remember that responsibility always rests with the
project leader.

Eventually, engineers and engineering managers have to make tough decisions about whether a product is safe
for public use. Sometimes those decisions involve conflicts over technical problems versus budgets, and
problems with schedules and personnel allocations. The engineering manager must first be an engineering
professional. Before attending to profits, she must meet professional engineering code requirements and
obligations to public safety. This requirement can create difficulties for the engineer.

The problems engineering professionals face involve how to define, assess and manage risk in the light of
obligations to the public at large, the employer, and the engineering profession as a whole. The following
literature review acts as a catalyst for discussion on risk and the decision-making process as it relates to the
cases you are studying. Bear in mind that, above all, risk assessment is closely tied to the perspective that
engineering is a social experiment, that engineers have an implicit social contract with the public they serve, and
that professional societies and their codes of ethics play important roles in helping shape the engineering decision-
making process.

Risk

In 1990 Theodore S. Glickman and Michael Gough assembled a number of papers, Readings in Risk, as an
introduction to risk definition, assessment, evaluation and management.1 Three papers in the series offer
definitions of what we mean by risk, as it relates to the engineering profession.2

These three papers pose the classic question about technology-induced risk: "How safe is safe enough?" The
answer to this question often depends on who is asked, whether the risk taken is voluntary or involuntary, what
the near-term and long-term consequences are, and what the spatial distribution as well as expected probability
of the risk is. As an adjunct, how we perceive and evaluate each given risk depends on factors such as the
magnitude of exposure to the risk, whether the effects are reversible, and whether threshold levels of those risks
exist. Even if we can establish these parameters, we must remember that computations are based on our best
available evidence, not necessarily the appropriate evidence. Our analyses often neglect the realities of uncertain
inputs and exogenous disturbances to the system under study; therefore, probabilities entered in a numerical risk
assessment are, at best, judicious engineering estimates. Nevertheless, we must perform risk assessments to
establish what public risks can be undertaken, given the possible benefits from the technological advances being
pursued. Where do we find our starting point in assessing risk?

Assessing Risk

We start with things we know. Risk assessment deals with setting magnitudes on the risks we know exist. This
involves using tools such as fault- or event-tree analyses, and requires probability-based estimates (or sometimes
knowledge) of the likelihood of a given fault or event. Once we have estimates, we can look at how best to
minimize the risk.
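In their simplest form, the fault- and event-tree analyses mentioned above amount to combining basic-event probabilities through AND and OR gates. The following sketch assumes statistically independent events; the component names and failure probabilities are purely hypothetical:

```python
# Minimal fault-tree sketch. Assumes independent basic events;
# all failure probabilities below are hypothetical.

def and_gate(*probs):
    """All inputs must fail: multiply independent probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Any input failing triggers the gate: 1 minus the product of survivals."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical annual failure probabilities for a pumping system:
pump_a, pump_b, power = 0.01, 0.01, 0.001

# Loss of flow if both pumps fail, or if the shared power supply fails.
top_event = or_gate(and_gate(pump_a, pump_b), power)
print(f"{top_event:.6f}")  # prints 0.001100
```

Note how the shared power supply dominates: the redundant pumps push their joint failure probability down to 10^-4, but the single common-cause input keeps the top event near 10^-3.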

Risk abatement addresses problems associated with how risks can be regulated or minimized. This means we
have to address political questions associated with managing risk. Who should manage the risks? How should
the risk be managed, and at what level in the hierarchy should risk be managed? This political analysis can only
begin after the most accurate risk assessment possible is undertaken. Of course, this puts the greatest burden of
proof on the engineers, and often stretches their capabilities.

In his "Social Benefits Versus Technological Risk,"3 Chauncey Starr concludes that we are generally willing to
accept voluntary risks roughly 1000 times (three orders of magnitude) greater than involuntary risks. In Starr's
studies, acceptable risk is generally proportional to the cube of the incremental wages offered in compensation for
taking that risk. For example, doubling wages would tend to convince a worker to take eight times the risk.
Further, there is a perceived separation of three orders of magnitude between involuntary risks (such as a
corporation's placing a toxic waste dump in your area) and voluntary risks such as smoking. Studies also show
that people tend to overestimate the likelihood of low-probability causes of death, and to underestimate
high-probability ones. Such tendencies lead to overconfident biasing (or "anchoring") in personal risk
assessments, which in turn suppresses adjusting the assessment to the realities of the situation. In his study of 57
risk abatement programs at five different government agencies in Washington (including the Environmental
Protection Agency and the Occupational Safety and Health Administration), Starr shows that risk abatement
programs cost from $170,000 to $3 million per life saved, depending on the agency. But this presents a new
problem: how do you affix a price tag to a human life? This is not an easy task.
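Starr's cube relation can be sketched as a rule of thumb. The function below is an illustration of the scaling described above, not Starr's actual model; the base risk figure is invented:

```python
# Illustrative sketch of Starr's cube relation: the risk a worker
# tends to accept scales roughly with the cube of the incremental
# benefit (here, a wage multiplier). A rule of thumb, not a model.

def acceptable_risk(base_risk, wage_multiplier):
    """Risk accepted at a given wage multiplier (hypothetical base rate)."""
    return base_risk * wage_multiplier ** 3

# Doubling wages -> roughly eight times the accepted risk.
print(acceptable_risk(1.0e-4, 2) / 1.0e-4)  # -> 8.0
```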

To assist in solving this problem, Norman C. Rasmussen, in "The Application of Probabilistic Risk Assessment
Techniques to Energy Technologies,"4 suggests a more basic definition for "risk," namely, "consequences/unit
time." Rasmussen argues that risk is the product of frequency (events/unit time) and magnitude
(consequences/event). For example, to estimate the approximately 50,000 auto fatalities in a given year, multiply
the number of auto accidents per year (about 15 x 10^6) by the death rate per accident (about one in 300). Yet, to
make Rasmussen's formula work, we must be able to differentiate between the unknown versus the "dreaded"
risks. And within each category we must be able to differentiate whether those risks are observable, controllable,
voluntary, short-term, fatal, increasing and so on, and attach an appropriate weighting factor to each of these
considerations as they enter into our numerical risk assessment. According to Rasmussen's study, these weighting
factors can vary between zero and 100, thereby making accurate risk predictions problematical indeed.
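Rasmussen's product formula, applied to the auto-fatality figures quoted above, works out as follows:

```python
# Rasmussen's definition with the essay's own numbers:
# risk = frequency (events/unit time) x magnitude (consequences/event).

accidents_per_year = 15e6        # about 15 x 10^6 auto accidents/year
deaths_per_accident = 1 / 300    # death rate of about one in 300

risk = accidents_per_year * deaths_per_accident  # expected deaths/year
print(round(risk))  # -> 50000
```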

One of the most serious drawbacks to any risk assessment is perhaps the omission of totally random and
exogenous inputs. Such random and exogenous inputs cannot be predicted at all, and yet, when performing the
final analysis of a catastrophe, they are often the key items in the sequence of events that lead to the disaster.
Most airplane crash and explosion disaster investigations have pointed to one or two exogenous causal factors,
the absence of which would have prevented the disaster. These same causal factors, however, could not have
been foreseen, and therefore the likelihood of their occurring could not have been predicted. One example is the
real-world scenario of a pilot who accidentally spilled his coffee over the control console, unwittingly setting off
an "engine on fire" alarm, which in turn led to catastrophe. (This was popularized in the 1950s book and film Fate
Is the Hunter.)

With all these drawbacks, you might ask, "Well, why bother?" The answer is that, we must still perform risk
assessments to the best of our abilities in order to protect ourselves from those problems we do know about, or
can foresee.

So, how can we answer the question, "How Safe Is Safe Enough?" We must first recognize that there is no
single, simple answer. We are often choosing between unpleasant alternatives. There will always be some level of
risk associated with engineering and innovation; therefore, any risk analysis should involve the following five
steps:
1. Define all the possible alternatives.

2. Specify the objectives and measure the effects.

3. Identify the consequences of the actions taken.

4. Quantify the alternatives based on the best available information.

5. Analyze the alternatives to arrive at the best choice for cost/risk.
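Steps 4 and 5 above can be sketched as a simple cost/risk comparison. Every name and figure below is hypothetical, including the dollar value assigned per expected fatality, which is precisely the quantity the essay notes is so hard to set:

```python
# Hypothetical sketch of steps 4-5: quantify each alternative, then
# choose the one with the lowest combined cost. All figures invented.

VALUE_PER_LIFE = 3_000_000  # assumed dollar value per expected fatality

# (name, annualized cost in $, expected fatalities/year)
alternatives = [
    ("no extra safeguards",    0,       5.0e-2),
    ("redundant sensor",       50_000,  2.0e-3),
    ("fully redundant system", 400_000, 5.0e-4),
]

def combined_cost(cost, expected_fatalities):
    """Annualized cost plus the monetized expected losses."""
    return cost + expected_fatalities * VALUE_PER_LIFE

best = min(alternatives, key=lambda a: combined_cost(a[1], a[2]))
print(best[0])  # -> redundant sensor
```

Note that the "best" choice is entirely driven by the assumed value per life: raise it tenfold and the fully redundant system wins, which is exactly why the pricing question raised in the Starr discussion is so contentious.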

Once engineers and managers have established, to the best of their abilities, the costs versus benefits of the risks
involved, they must manage that risk.

Managing Risk

In a recent article,5 Michael Davis notes there are five major problems associated with managing risk:

1. We must deal with uncertainties.

2. We are often forced into too narrow a focus on specific classes of risk.

3. A commitment is required to provide immediate solutions, instead of "going slow."

4. Our adversarial world promotes inflexibility in our risk analyses.

5. The nature of a risk management program does not promote consensus, collaboration or cooperation between
all involved.

The last point, in particular, is important for the engineering professional. In recent discussions6 with engineering
professionals at a large research and development firm, engineers (both managerial engineers and non-managerial
engineers) note that engineering team players need to know all the particulars associated with the problem at
hand. This includes budgetary and time constraints, promises made to clients, and realistic assessments of
technological flaws.

Risk management is perhaps the most important aspect of the engineer's professional tool kit. It is important for
the public, our corporations, and for the engineering profession at large. As Michael Davis notes, managers and
engineers approach risk in different ways. Managers have to factor in such things as schedules, budgets and
contract requirements. Engineers tend to place safety considerations above all others. Engineers need more
training in balancing risk versus benefit, so they can better communicate their legitimate concerns about public
safety. For managers and engineers alike, the tendency is to look at their work through a microscope and,
accordingly, to miss the many complications that fall just outside the field of view.

Microscopic vision is enhanced vision, a giving up of information not likely to be useful under the circumstances
for information more likely to be useful. ... Microscopic vision is like looking into a microscope at things
otherwise too small to see. [Microscopia is neither nearsightedness nor myopia; it's not a kind of blindness;
rather, it's a kind of insight. A person with microscopic vision need only look up from the microscope.]
... Microscopic vision is a power, not a handicap, but even power has its price. You cannot both look into the
microscope and see what you would see if you did not.7
Thus, as Davis argues, risk management programs force us to look up from the microscope, so we are better
equipped for avoiding tragedies due to failed innovation.

The decision-making process is never an easy one. The important thing all engineers must remember is that their
first obligation is to public safety. Some risk is unavoidable. Engineering professionals must learn how to convince
their managers that minimizing that risk is worth the effort involved.

Negligence and the Codes Of Ethics Of Professional Societies

What guidelines do the professional codes of ethics give engineers? The following pages contain two such codes:
the ASME code, and the recently revised IEEE code adopted by the IEEE Board of Directors in October 1990.
Note the emphasis in the first clause of the new IEEE code on the public's welfare, and the responsibility cited in
the second clause for engineers to inform all affected parties of any inherent dangers or risks. This is very
different from engineering codes of ethics written in past decades, where the primary concern was competitive
bidding, advertising, obligations to employers and clients, and so on. While these are all important issues for
professionals, they are less important than obligations to the public: those obligations implicit in the engineers'
adopted social contract.

In their recent book on engineering and ethics,8 Mike Martin and Roland Schinzinger cite earlier studies on the
responsibilities of engineers. They summarize these responsibilities as involving the following considerations:

1. A primary obligation to protect the safety and respect the right of consent of human subjects.

2. A constant awareness of the experimental nature of any project, imaginative forecasting of its possible side
effects, and a reasonable effort to monitor them.

3. Autonomous, personal involvement in all steps of a project.

4. Accepting accountability for the results of a project.9

Martin and Schinzinger's implicit assumptions are that: a) engineers are and should be held responsible for past
actions; b) engineers are responsible for the roles they have played in projects; c) engineers are capable of
making moral (and certainly technically correct) decisions autonomously; and d) as such, each individual engineer
is accountable for the projects she/he works on. Thus, an engineer can be deemed negligent if she/he does not
meet these criteria.

ASME Code Of Ethics Of Engineers

The Fundamental Principles

Engineers uphold and advance the integrity, honor, and dignity of the Engineering profession by:

I. using their knowledge and skill for the enhancement of human welfare;

II. being honest and impartial, and serving with fidelity the public, their employers and clients; and

III. striving to increase the competence and prestige of the engineering profession.

The Fundamental Canons


1. Engineers shall hold paramount the safety, health and welfare of the public in the performance of their
professional duties.

2. Engineers shall perform services only in areas of their competence.

3. Engineers shall continue their professional development throughout their careers and shall provide
opportunities for the professional development of those engineers under their supervision.

4. Engineers shall act in professional matters for each employer or client as faithful agents or trustees, and shall
avoid conflicts of interest.

5. Engineers shall build their professional reputation on the merit of their services and shall not compete unfairly
with others.

6. Engineers shall associate only with reputable persons or organizations.

7. Engineers shall issue public statements only in an objective and truthful manner.

IEEE Code Of Ethics (Revised October 1990)

We, the members of the IEEE, in recognition of the importance of our technologies in affecting the quality of life
throughout the world, and in accepting a personal obligation to our profession, its members and the communities
we serve, do hereby commit ourselves to the highest ethical and professional conduct and agree:

1. to accept responsibility in making engineering decisions consistent with the safety, health, and welfare of the
public, and to disclose promptly factors that might endanger the public or the environment;

2. to avoid real or perceived conflicts of interest whenever possible, and to disclose them to affected parties
when they do exist;

3. to be honest and realistic in stating claims or estimates based on available data;

4. to reject bribery in all its forms;

5. to improve the understanding of technology, its appropriate application, and potential consequences;

6. to maintain and improve our technical competence and to undertake technological tasks for others only if
qualified by training or experience, or after full disclosure of pertinent limitations;

7. to seek, accept, and offer honest criticism of technical work, to acknowledge and correct errors, and to credit
properly the contributions of others;

8. to treat fairly all persons regardless of such factors as race, religion, gender, disability, age, or national origin;

9. to avoid injuring others, their property, reputation, or employment by false or malicious action;

10. to assist colleagues and coworkers in their professional development and to support them in following this
code of ethics.
Both the legal and medical ethics literature look at negligence in terms of what society and the law consider
"reasonable" types of professional behavior, as well as the consequences of professional failures to act. In his
recent article, Kenneth McK. Norrie notes that negligence,

through the standard of reasonableness, imports into the law an ethical command as an attempt to encourage
certain types of safe ("reasonable") behavior and discourage other types of unsafe ("unreasonable") behavior. ...
The more knowledge, skill and experience a person has, the higher standard the law subjects that person to.10

In his article John C. Hall argues that we must take responsibility not only for our negligent acts, but for our
failures to act as well. What is it the negligent man has to answer for? For Hall,

What we blame the negligent man for is his decision, at some time in the past, not to take the steps he knows
from experience to be necessary to ensure that when the time comes he will remember to do his duty.11

In the end, engineers must be ever cognizant of both their actions and inactions as professionals.

Notes

