
Risk Analysis, Vol. 18, No. 5, 1998

An Approach to Vulnerability Analysis of Complex Industrial Systems

Stefan Einarsson¹ and Marvin Rausand¹

Received January 14, 1998; revised February 9, 1998

¹ Department of Production and Quality Engineering, Norwegian University of Science and Technology, N-7034 Trondheim, Norway.

The concept of vulnerability of complex industrial systems is defined and discussed in relation to risk and system survivability. The discussion is illustrated by referring to a number of previous industrial accidents. The various risk factors, or threats, influencing an industrial system's vulnerability are classified and discussed. Both internal and external threats are covered. The general scope of vulnerability analysis is compared to traditional risk analysis approaches and main differences are illustrated. A general procedure for vulnerability analysis in two steps, including building of scenarios and preparation of relevant worksheets, is described and discussed.

KEY WORDS: Vulnerability; robustness; risk factors; risk analysis; system analysis.

1. INTRODUCTION

Two main trends may be seen in most types of modern man-made systems: centralization and automation.(1) These trends have been amplified by the rapid development of computer systems and networks. Today, there is hardly any major company that does not depend on its computer systems. In many cases, the computer networks of one company are connected through a complex web into those of suppliers and customers, and the dependency on the Internet is significantly increasing.

These developments are claimed to take place to secure an effective and economical operation, but at the same time they lead to an increased vulnerability. The vulnerability concept is used to characterize a system's lack of robustness or resilience with respect to various threats, both within and outside the boundaries of the system.

In military applications(2,3) the term has been used to describe how vulnerable for example an aircraft or a ship's hull is with respect to physical impacts. In recent years the term vulnerability has also been used in a more general context in connection with human and societal vulnerability,(4-7) biological or eco-systems,(8,9) databases and computer systems,(10,11) structures,(12-14) and complex industrial systems.(15-18)

The vulnerability concept has not yet been given a generally accepted definition for technological applications. In some of the references cited above, vulnerability is considered to be similar to the risk concept, although with a somewhat broader interpretation. The threats referred to are often external to the system, and may involve deliberate actions.

In this paper, we will use the term vulnerability to describe the properties of an industrial system that may weaken its ability to survive and perform its mission in the presence of threats. A system is said to survive an accident (or some other disturbance) if it is able to operate from a specified time after the accident, and to regain a similar market position as it had prior to the accident.

The objective of this paper is to discuss the vulnerability of complex industrial systems, to clarify necessary concepts, and to outline a framework for vulnerability analysis.

Section 1 of the paper presents an introduction to the vulnerability concept and outlines the objectives of the paper. The vulnerability concept is discussed in more detail in Sect. 2, together with associated concepts like robustness, resilience, and damage tolerance.

A number of internal and external risk factors, or threats, influencing a system's vulnerability are discussed in Sect. 3. A general approach to vulnerability analysis of an industrial system is outlined in Sect. 4. The approach comprises two steps and is based on two specific worksheets. The analysis is based on a scenario approach and is focused on the system's survivability. Some concluding remarks are given in Sect. 5.

The paper is focused on an entire company. Disturbances to the "production line" must be based on a detailed modeling of the various steps in the line and analyzed by a modified reliability analysis.(19) Understanding, predicting, mitigating, and preventing high-consequence failures are the keys to a more efficient loss prevention and control.

2. WHAT IS VULNERABILITY?

An industrial system may be vulnerable with respect to as diverse threats as: technical failures, human errors, criminal acts, environmental impacts, accidents, loss of key personnel, strikes, product liability claims, hostile takeover attempts, inflation, variations in energy prices, etc. According to Powers(20): "the modern world is rife with danger and an almost infinite number of perils beset any business."

We will use the term accidental event to denote the manifestation of one or more threats. An accidental event is the first identifiable adverse event in the system caused by one or more threats, and is a random event that may occur at a more or less predictable point of time. The consequences of the accidental event may destabilize the system for a period of time long enough to threaten the survivability of the system. The consequence chains may in some cases be stopped or mitigated by various barriers and safety functions, and by emergency procedures. Various threats may also influence the barriers, safety functions, and emergency procedures.

In this paper we define the vulnerability of an industrial system as:

The properties of an industrial system; its premises, facilities, and production equipment, including its human resources, human organization and all its software, hardware, and net-ware, that may weaken or limit its ability to endure threats and survive accidental events that originate both within and outside the system boundaries.

Vulnerability may be considered as the "opposite" of robustness and resilience, in the same way as risk is the "opposite" of safety.

Robustness, or damage tolerance,(12) is a desirable quality of any system. Any system should be able to sustain some damage without failure. Both robustness and resilience refer to a system's ability to accept and withstand unexpected applications and operational conditions. Robustness is a "static" concept, while resilience means that the system may change and adapt to the new situation.

In some applications it may be beneficial to distinguish between internal and external threats.(17) Internal threats are threats inside the system boundaries (component failures, operator errors, etc.), while external threats are threats outside the system boundaries.

External threats may in some cases be obvious due to the geographical location or closely linked to the system's input/output interfaces. In other cases, the external threats may seem rather remote, and at a first glance independent of the system.

The consequences of an accident may be manifested within the physical boundaries of a company, or elsewhere connected to the products or services supplied by the company. Design and fabrication errors may in this context be considered as accidents. Such accidents may imply immediate or delayed consequences, inside or outside the company premises. The "outside" consequences may in many cases be far more devastating for the survivability of the company than "inside" consequences. The Norwegian Contractors' design error that led to the capsizing of the Sleipner platform is an obvious example. Less dramatic design or fabrication errors may lead to product liability claims and/or that products have to be called back for rework. In some cases the consequences of an accident may be manifested several years after the design or fabrication error took place.

3. RISK FACTORS INFLUENCING A SYSTEM'S VULNERABILITY

In this section, various risk factors, or threats, that may influence an industrial system's vulnerability are briefly discussed and classified into broad categories. The overview does not aim at giving a complete description of all possible risk factors, but rather highlights some important types.

The main categories of risk factors covered in our approach are illustrated in the cause-effect diagram in Fig. 1.

Fig. 1. Cause-effect diagram illustrating the various categories of risk factors influencing an accidental event or disruption in a technical system. (The diagram groups the risk factors into internal and external factors.)
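The original diagram is not reproduced here. As a reading aid only, its main headings may be collected into a simple checklist structure; the sketch below (Python) groups them according to the section headings in Sects. 3.1-3.3, which is an assumption on our part and not a reproduction of the published figure. Such a flat list can later serve as prompts for column (a) of worksheet no. 1 (Sect. 4.2.1).

```python
# Illustrative checklist of the main risk factor categories in Fig. 1,
# grouped as in Sects. 3.1-3.3 of this paper (the grouping follows the
# section headings, not the published figure itself).
RISK_FACTOR_CATEGORIES = {
    "internal": [
        "system attributes (interactions and couplings)",
        "technical failures and technical hazards",
        "human and organizational factors",
        "maintenance factors",
        "staff factors",
    ],
    "external": [
        "environmental (natural) factors",
        "societal factors",
        "infrastructure factors",
        "legal and regulatory factors",
        "market factors",
        "financial factors",
    ],
    # Safety functions and mitigation (Sect. 3.3) act on the consequence side
    # rather than as threats, but are reviewed together with the threats above.
    "defenses": ["safety functions", "mitigation"],
}


def threat_prompts():
    """Flatten the taxonomy into prompts for the threat identification step (Sect. 4.2.1)."""
    return [f"{group}: {item}"
            for group, items in RISK_FACTOR_CATEGORIES.items()
            for item in items]
```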

A detailed taxonomy of specific risk factors has been established based on these categories, and tested in a number of case studies. The detailed taxonomy includes the similar, but more limited, hazard taxonomies in the risk assessment standard EN 1050,(21) and in the AIChE "Guidelines for Hazard Evaluation Procedures."(22)

Due to the vast variety of potential risk factors, such a taxonomy can never be complete. Even a non-complete taxonomy is, however, considered to be a valuable tool that may aid to identify vulnerability problems. The risk factors in Fig. 1 are classified as internal and external risk factors respectively. The categories in Fig. 1 are clearly overlapping, and several risk factors may be classified under more than one category.

As mentioned above, the threats may not only influence the occurrence of an accidental event, but may also have influence on the barriers and safety functions that have been installed to stop or mitigate the accident consequences. A negative safety culture in a company may, as an example, be a contributing risk factor to both the onset of a fire, and also to the maintenance and availability of the fire detection and fire extinguishing systems.

3.1. Internal Risk Factors

3.1.1. System Attributes

According to Perrow(23) some systems, like nuclear power plants, are so complex and have so many interactions that they cannot be adequately analyzed. He considers accidents in such systems a natural phenomenon and calls them system accidents or normal accidents.

Perrow(23) classifies a system according to two basic characteristics:

• System interactions (linear vs. complex)
• System couplings (loose vs. tight)

In a linear system the dependencies and interactions between the various system components may be described, and possible consequences foreseen or predicted, and thereby potentially prevented. In a complex system, interactions and the consequences of interactions cannot be predicted. A complex system will therefore in general be more vulnerable than a linear system.

The system complexity is determined by the number and type of interdependencies between units and subsystems. High dependency exists when unit or subsystem functions cannot be performed because a second unit or subsystem does not function, or functions poorly. Complexity tends to increase with system life, since there is often a tendency to add new, or "nice to have," features, and to modify systems without removing superfluous equipment.

Tight coupling means that there is no slack or buffer between subsystems. What happens in one subsystem directly affects what happens in other subsystems. Many companies have deliberately chosen a tight coupling, and thereby a high degree of vulnerability in their production line, in order to reduce costs and increase efficiency and flexibility, e.g., by adopting the "just-in-time" principle.

Loosely coupled systems may incorporate shocks or failures and pressures for change without destabilization, e.g., through adequate buffers. Tightly coupled systems will respond more quickly to perturbations.

Many companies are closely interconnected, e.g., through single supplier agreements. An accidental event at a supplier may in such cases have some of the same effects as if the event happened within the company. A strike at a supplier to a car producer may, for example, lead to a stop in production at the car producer.

3.1.2. Technical Failures and Technical Hazards

The reliability of the technical equipment has a significant influence on a system's vulnerability. Most technical failures will have limited consequences, and may be considered part of the normal operation. Some units are, however, so expensive that it is not economically viable to stock redundant units, or even spare parts. These units are often critical for the system operation, and will bring the system down when they fail.

Failures of less expensive equipment may also have devastating consequences. This is especially relevant for computer systems and networks. A crash of a disk drive may, e.g., have far-reaching consequences if the necessary backups are not in place.

A special problem is connected to so-called common cause failures and cascading failures. Such failures are often unavoidable side effects of tight coupling and complexity. A threat or a minor initial failure may in these cases lead to multiple failures in seemingly independent units.

3.1.3. Human and Organizational Factors

Human factors are generally considered to be the main potential for improvement of the safety performance within most organizations. It is often claimed that more than 80% of all accidents may be attributed to human error.

Reason(24) distinguishes between two basic types of failures, active and latent failures, that may cause system breakdown. Active failures are unsafe acts committed by those at the "sharp end" of the system (pilots, train drivers, control room operators, etc.). Latent failures are usually fallible decisions by management, only becoming evident when they combine with local triggering factors. If no actions are taken to prevent latent failures, most systems will have a wide range of latent failures waiting for a triggering factor to produce an accident. Several major accidents have been caused by such latent failures. Recall, for example, one of the main conclusions from the investigation of the Three Mile Island accident(25): ". . . wherever we looked we found problems with the human beings who operate the plant, with the management that runs the key organization, and with the agency that is charged with assuring the safety of nuclear power plants."

3.1.4. Maintenance Factors

Many major accidents occur either during maintenance or because of inadequate or faulty executed maintenance. The Piper Alpha accident(26) in the North Sea in 1988, where 167 persons were killed, and the Flixborough accident(27) in the U.K. in 1974, are among the most severe accidents that may be connected to maintenance. According to Hale et al.(28) 35-40% of all serious accidents in the process industry in the Netherlands are connected to maintenance. Studies by the British Health and Safety Executive of the fatalities in the chemical industry showed that approximately 30% were linked to maintenance activities, taking place either during maintenance or as a result of faulty maintenance. Human errors and inadequate organization are generally considered to be the two main causes of maintenance-related accidents.

3.1.5. "Staff" Factors

Staff factors are here dealt with within the following items:

• Strikes and other labor conflicts
• Loss of key personnel
• Recruitment of new staff, availability of skilled personnel
• Safety culture, job dedication
• Unfaithful servants (embezzlement, sabotage, etc.)
• Liability/damages claims from staff members (e.g., asbestos)

All these factors relating to the staff may influence the vulnerability of a system. All companies need a competent staff on all levels. In many cases, the companies have to compete to get the "best" personnel. Key personnel may be lost to competing companies. This may have deeper roots in the organization itself and partly be self-inflicted; conflicts between the staff and the owners may, e.g., build the ground for such a situation.

Key personnel may also be lost in accidents, inside or outside the company.

3.2. External Factors

3.2.1. Environmental Factors

The vulnerability of a company may depend on a wide range of environmental, or natural, hazards that are specific for the company's location. A coarse classification of these hazards may include the following:

• Geological threats (earthquakes, landslides, ground subsidence, etc.)
• Meteorological threats (storms, floods, droughts, frost, lightning, etc.)
• Biological hazards (epidemics, viruses, bacteria, grasshopper swarms, etc.)
• Extraterrestrial hazards (falling meteorites or spacecrafts, cosmic radiation, etc.)
• Technological hazards (nuclear radiation, pollution, etc.)

Each company should assess the relevance and magnitude of the environmental hazards in its own region and develop its own disaster preparedness plan. Several guidelines have been developed to aid the companies in this activity. Among these guidelines are the "Emergency Management Guide for Business & Industry" that has been developed for the US Federal Emergency Management Agency (FEMA)(29) and the "Recommended Practice for Disaster Management" issued by the US National Fire Protection Association.(30)

Several universities have set up separate departments dedicated to research into the prediction, crisis management, etc. of environmental/natural disasters. A listing of such departments and institutes may, e.g., be found on the Internet address: http://www.colorado.edu/hazards/centers.html. Several of these departments claim that there is evidence that the vulnerability to natural hazards is increasing worldwide.

3.2.2. Societal Factors

The vulnerability of a system depends on a wide range of societal and political factors. Sabotage and terrorist actions(32) may have their background in the societal and political situation. In some cases such actions may be a revenge to actions or attitudes expressed by the company. In other cases, the "revenge" may be against a whole society. A relevant example is the massive bomb the IRA exploded at Bishopsgate near London City on April 24, 1993. The IRA's aim was in this case to destroy the computer networks serving the City.

The Chernobyl accident on 26 April 1986 must be claimed to be caused by societal factors, or institutional attitudes. Equipment failures and human errors, in the traditional interpretation of the concept, played an insignificant role in the events that led to the disaster. The accident was caused by deliberate violations of operating rules in order to force through an experiment, no matter what.(31,33)

The consequences of an accident also depend on the societal and political situation in which the accident occurs. This dependency may be formed by earlier accidents in the same system, accidents in similar systems, increased knowledge or awareness of possible accident consequences, environmental problems, etc. Societal factors in major accidents are further discussed and exemplified by Hovden et al.(34)

3.2.3. Infrastructure Factors

Almost any business activity relies upon the use of increasingly complex software-based systems. The advantage of using software includes the increase in the system's functionality, flexibility, and performance. The inherent disadvantage, however, lies in its increased likelihood of failures and disruptions.

Many companies are especially vulnerable to accidental events to their computer systems. It has been estimated that 20% of all companies which fall victim to a major computer disaster never recover. Many "hackers" see it as a sport to try to break into computer systems. No network seems to be impenetrable. One out of 20 Swedish enterprises reported, e.g., that they were intruded by hackers in 1995-1996. In the same period, one out of four Swedish enterprises had significant computer virus problems. Many companies are reluctant to admit that they have problems with viruses and hackers.(35) The above figures are therefore considered to represent conservative estimates. Hackers have even been able to get access to highly protected defense networks. Computer systems may also be destroyed by physical impacts, either by deliberate actions or by random accidents; a gas leak may demolish a building, or a careless contractor digging up the road may sever power, telephone, and data lines.

Many companies are also very vulnerable to disruptions of other types of infrastructure, like: transportation, telecommunication, electric power, water supply, and sewage.

3.2.4. Legal and Regulatory Factors

A company may sometimes be vulnerable to new laws and regulations. New laws and more strict regulations often follow after accidents and serious near-accidents. The Seveso directive(36) was, for example, revised after the Sandoz fire in Basle, Switzerland, and the Bhopal accident in India.

New laws and regulations may in some cases lead to required and costly changes of a company's production line or its products. In extreme cases, the survivability of a company may be threatened, e.g., if a financially weak company has to fulfill strict requirements imposed by a new regulation. This may especially be the case for new regulations within the environmental area, where new evidence about pollution often leads to strict requirements.

3.2.5. Market Factors

Most technological systems are vulnerable with respect to a wide range of market factors. These factors may influence the system's products and services, but also the system's financial status. Products or services may become obsolete, because they are outperformed by similar products and services offered by other companies. The prices of the products may fall due to market changes or high competition and may become too low to secure an economical operation of the company. The cost of raw materials may rise above acceptable levels and make it impossible to continue a cost effective production.

The market has both social and political dimensions, and the information submitted to customers and the public during, and immediately after, an accident may therefore be more decisive for the survivability of the system than the physical damages caused by the accident. Rumors and erroneous information may destroy the company's future market position.

3.2.6. Financial Factors

Many companies have had to go through the bitter experience that one badly planned economic decision, that was meant to secure the future of the company, led to bankruptcy. Economic results have a high impact on the organizational climate and on the employees' feeling of security, since people often plan their future with the company. The economy of a company will further have a significant impact on its loss prevention policy.

Accidental events may have extremely different impacts on a company's economy depending on the nature and circumstances under which the event takes place. In some cases an accidental event may have both negative and positive effects. A raw material cost explosion may be very serious for the survivability of a company, but it may also boost its employees' morale and thereby have an indirect positive effect on the company's economy.

A company may also be subject to hostile takeover attempts after suffering an industrial accident. Hostile takeover attempts may further divert the management's attention from the daily operation toward fighting the takeover attempt.

Takeover attempts triggered by financial and stock market changes are often very important for the survivability of the company. This topic is, however, not further pursued in this paper.

3.3. Safety Functions and Mitigation

Safety functions and mitigation have vital roles in protecting a company's production equipment, its property, the environment, and the life and health of employees and third persons. Most safety functions are installed to keep critical process parameters within acceptable ranges, or to secure that protective actions follow immediately if the acceptable ranges are exceeded. Mitigation comes in when process parameters have passed beyond acceptable limits and necessary countermeasures need to be initiated. These may be designed-in consequence-reducing measures on the process equipment itself, or certain operational procedures that need to be followed by the employees. Safety functions address both the technical system, and hybrid systems comprising both technical and organizational aspects.

Successful mitigation requires coordinated decision-making and actions from various levels in the organization. These actions have to be planned and will usually need to be supported by regular training.

Safety systems are sometimes very complex, as is clearly seen by looking at the computerized process and emergency shutdown systems on offshore oil and gas platforms. To introduce changes into a main production line, or the associated safety system, without a proper consequence analysis on system level may be a risky activity. The regular testing and analysis of the performance of safety functions and mitigation is one of the major tasks for keeping the vulnerability of a company at an acceptable level.

Fig. 2. Illustration of the disruption time following an accidental event. (The figure plots the state of the system against time; after the disruption time the system settles into a new stable situation.)

4. VULNERABILITY ANALYSIS

The approach to vulnerability analysis presented in this paper is scenario-based, where a scenario is defined as a sequence of potential events, where the events may be separated in time and space, and where barriers to prevent the sequence are a part of the scenario. The potential scenarios are established by using the taxonomy of risk factors (threats), with the main headings in Fig. 1. The analysis consists of two major steps. In the first step the potential scenarios are identified and briefly evaluated. In the second step, a quantitative analysis of the scenarios provides a criticality ranking of the scenarios.

A consequence of using a scenario-based approach is that we only consider discrete events. Problems associated with continual incremental changes of a system function are therefore not considered as relevant in the analysis, unless the gradual degradation gives rise to a specific event or action.

A first disturbance in the system, an accidental event, may give rise to a chain of consequential events that produce a scenario. The (conditional) probabilities of the various events comprising the scenario will depend on the barriers, the safety functions, and the mitigating systems and procedures.

The objectives of a vulnerability analysis of an industrial system may comprise:

• To identify potential threats to the system
• To verify that the vulnerability of the system is acceptable
• To verify that the system's security actions and installations, and safety functions, are adequate
• To evaluate the cost-effectiveness of a proposed action to improve the vulnerability
• To aid in establishing an emergency preparedness plan
• As a design tool, to design a robust system

A quantitative target of the vulnerability analysis may, for example, be the "survival probability" of the system, or the probability of successful fulfillment of a specified mission.

4.1. Vulnerability Analysis vs. Risk Analysis

A traditional risk analysis(38) is mainly limited to accidental events taking place within the physical boundaries of the system, and the threats studied are often limited to technological hazards within these boundaries. In some risk analyses, environmental threats are partly covered. The majority of the risk factor categories in Fig. 1 are, however, considered to be irrelevant in traditional risk analyses. In a vulnerability analysis we work with open system models, where risk factors both inside and outside the physical boundaries of the system are taken into account. A vulnerability analysis and a risk analysis of the same company will therefore produce quite different sets of accidental events.

The actions to mitigate, restore, and restart the activities after an accident are normally not part of a risk analysis. There is, however, a trend within some industries to include a part of the emergency preparedness into the risk analysis process. The Norwegian oil companies are now, e.g., issuing a standard called "Risk and emergency preparedness analysis."(39)

A vulnerability analysis focuses on the whole disruption period until a new stable situation is obtained, as illustrated in Fig. 2. All activities to restore and restart are therefore included in the analysis.

The main differences between a risk analysis and a vulnerability analysis are illustrated in Fig. 3, where the shaded triangles illustrate the scope of the risk analysis. A vulnerability analysis is from Fig. 3 seen to complement and extend the risk analysis.

Fig. 3. Difference in scope between vulnerability analysis and risk analysis.

The main focus of a risk analysis is the possible event chains following an accidental event. A number of barriers and safety functions are normally designed into the system to prevent escalation of possible accidental events, and to mitigate consequences. The reliability of these barriers and safety functions is studied for relevant accidental loads. Assessments of fire and explosion loads etc. constitute an important part of a risk analysis.

A major part of the accidental events that are relevant for a vulnerability analysis will be caused by external threats, and by deliberate actions. A detailed causal analysis of these events will in many cases not be worthwhile, since we often will not be able to influence these threats. The only defense will often be to install barriers and make the system more robust against the threats.

The focal point of a vulnerability analysis is the (business) survivability of the system. The "barrier" concept must therefore be broader than for a risk analysis. Insurance policies, and strategies for handling of media and informing customers and the public during the various stages of a potential accident, may for example be considered as barriers.

Experience has for example shown that many companies (≥60% in Norway) become bankrupt after a major fire, not because of the direct fire costs, but because they have lost their market when they are up and running again. This may in some cases be avoided by adequate planning of such emergencies.

4.2. A Two-Step Approach to Vulnerability Analysis

The two main steps of the vulnerability analysis are based on two different worksheets.

4.2.1. Identification of Scenarios

The overall objective of the first step of the vulnerability analysis is to increase the awareness of potential disturbances and accidental events that may threaten the survivability of the system, i.e., to fight the attitude that "a disaster is something that happens to someone else." We propose a procedure based on a worksheet as illustrated in Fig. 4. Both the worksheet and the procedure have similarities to a standard FMECA.

In column (a) risk factors, or threats, that may contribute to vulnerability are listed, based on the taxonomy discussed in Sect. 3. A thorough identification of the potential threats will always be the most important step of any vulnerability analysis.

All potential threats should be carefully considered and the relevant single threats and combined threats entered into column (a) of worksheet no. 1 in Fig. 4.

Each threat may initiate one or more disturbances or accidental events in the system, giving rise to a scenario. Some scenarios may follow directly from a specified threat, and hence be easy to predict; while other scenarios may be very difficult to predict. Combinations of several threats may cause complex scenarios. The potential scenarios produced by the various threats should be entered into column (b) of the worksheet in Fig. 4.

A traditional method to develop scenarios is to search for accidents in similar systems, either from accident databanks, available information from other companies, or other sources, like accident reports. Some scenarios may be developed by an approach similar to HAZOP,(40) where the parameters and guidewords are modified to the scope of a vulnerability analysis.

For some scenarios involving emergencies it is relevant to develop them in cooperation with the pertaining governmental bodies or municipal authorities, for instance those that will be involved in rescue operations in case of an emergency.

Fig. 4. Vulnerability analysis worksheet no. 1. Columns: (a) Threat (internal or external); (b) Scenario (emergency); (c) Likely? (yes/no); (d) Potential immediate effects; (e) Internal and (f) external resources/systems/plans for mitigation, restoration, rebuilding, etc.; (g) Remarks.

Fig. 5. Vulnerability analysis worksheet no. 2. Columns: Scenario (emergency), with number and description; (1) Likelihood of scenario (4-0); Consequences of scenario: (2) human impacts, (3) environmental impacts, (4) business impacts, and (5) property impacts (each 4-0); Resources to mitigate, restore, rebuild, etc.: (6) internal and (7) external (each 4-0); (8) Total.
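As an illustration of how a worksheet no. 1 entry could be captured in software, the sketch below defines one record per scenario with fields mirroring columns (a)-(g). The field names, types, and the example entry are our own assumptions for the sketch, not part of the published worksheet; a corresponding sketch for worksheet no. 2 and its scoring is given at the end of Sect. 4.2.2.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Worksheet1Row:
    """One scenario entry in worksheet no. 1 (identification step, Fig. 4)."""
    threat: str                         # (a) single or combined threat
    threat_internal: bool               # True for an internal threat, False for an external one
    scenario: str                       # (b) scenario (emergency) the threat may give rise to
    likely: bool                        # (c) screening: is the scenario likely?
    immediate_effects: List[str] = field(default_factory=list)   # (d) potential immediate effects
    internal_resources: List[str] = field(default_factory=list)  # (e) resources/systems/plans, internal
    external_resources: List[str] = field(default_factory=list)  # (f) resources/systems/plans, external
    remarks: str = ""                   # (g) remarks


# Hypothetical example entry: a fire scenario with a negative safety culture
# as contributing threat (cf. Sects. 3.1 and 5).
example = Worksheet1Row(
    threat="negative safety culture",
    threat_internal=True,
    scenario="fire in storage building",
    likely=True,
    immediate_effects=["loss of stored products", "production stop"],
    internal_resources=["fire detection and extinguishing systems"],
    external_resources=["municipal fire brigade", "insurance coverage"],
    remarks="maintenance status of fire protection should be verified",
)
```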

When a scenario is described it is important to establish whether the scenario is likely or not (c). This is done to limit the number of scenarios for further attention, so that scenarios that are extremely remote may be kept out of the further analysis. Only the really remote scenarios should, however, be excluded during this screening process.

The next step is then to locate the potential immediate effects of the first accidental events in the scenario. These effects should be listed in column (d) of worksheet no. 1. Establishing an oversight over immediate effects is necessary to be able to analyze resources, systems, and plans for mitigation, restoration, rebuilding, etc. These resources, systems, and plans may be internal (e) as well as external (f). Detailed lists of barriers and mitigating systems may be used as checklists for establishing the status of these resources, systems, and plans. Finally, remarks may be made about each scenario in column (g).

To summarize, the objective of the first part of the analysis is to identify and describe scenarios that have consequences above a certain level (scenarios of relevance), and how the system is prepared to cope with these scenarios.

Several case studies have shown that the first step is a very valuable process. Many weak points have been revealed and corrected during the analysis, especially concerning internal and external resources, systems, and plans [columns (e) and (f)].

4.2.2. Assessment of Scenarios

The second step of our approach comprises a quantitative assessment of the various scenarios identified in the first step. The main objective of the second step is to establish a ranking of the scenarios according to their criticality. The analysis is carried out based on worksheet no. 2 in Fig. 5. The worksheet is rather similar to a worksheet developed by the U.S. Federal Emergency Management Agency.(29)

The input to columns (1-5) in Fig. 5 is given a weight from zero to four, where "zero" means negligible while "four" means most critical [or very frequent in column (1)].

Worksheet no. 2 is based on the "likely" scenarios that were identified and described in the first step. These scenarios are numbered consecutively and entered into the first column of worksheet no. 2.

Fig. 6. Consequence-likelihood matrix for the various scenarios (example). The "circles" denote the consequences before mitigating resources are included, and the triangles denote the final consequences including the effect of the mitigating resources. The length of the line between the circle and the triangle illustrates the effect of the resources for mitigation, restoration, and rebuilding.

The likelihood of the scenario is registered on a scale from zero to four and entered into column (1). Columns (2-5) address the consequences of the scenario with respect to human (h), environmental (e), business (b), and property (p) impacts. The consequences are given a rank on a scale from zero to four. The total consequence of scenario no. i may be presented as a (weighted) sum of the four consequence ranks:

$$c_i = k_h \, c_{h,i} + k_e \, c_{e,i} + k_b \, c_{b,i} + k_p \, c_{p,i} \quad \text{for } i = 1, 2, \ldots$$

The weights k_h, k_e, k_b, and k_p may be chosen to reflect the importance of the four consequence groups with respect to the system's mission.

Several companies may have the possibility to build scenarios through cooperation with other companies within the same branch. Today, several companies in high risk industries are exchanging data on accidents and incidents. The Norwegian offshore industry has established a common registration system for accidents called SYNERGI.(41) The participating companies have thereby access to information on accidents that have happened anywhere in the area.

Columns (6) and (7) address the presence and adequacy of internal and external resources for mitigation, restoration, and rebuilding, and give them weights on the scale from zero to four. Rank four is given to a strong (available and adequate) resource, while rank zero is given to a very weak (unavailable or inadequate) resource.

The total criticality ranking of a scenario is entered in column (8). This rank may be given in various ways. An option would be, for each scenario, to multiply the likelihood rank with the consequence rank c_i and then subtract a weighted sum of the ranks for the internal and external resources. We may also form a consequence-likelihood matrix as shown in Fig. 6.

The scenario with the largest risk will be put on top of the action list, as the most critical scenario, and then the scenario with the next highest sum as number two, and so on. It is also possible to use the analysis specifically towards selected areas of interest by focusing only on these or by giving them a higher weight.

The time interval from when an accidental event occurs until a new stable situation is established is obviously very important for the assessment of the final consequences of the accidental event. The new stable situation will sometimes be different from the initial stable situation. The new stable situation will usually be "weaker," as illustrated in Fig. 2, but may also be improved. This is for example the case if we, through an adequate insurance coverage, are able to replace inefficient equipment with more efficient equipment. It may also be the case if we are able to get rid of an unprofitable activity that is politically impossible to remove during normal operation.

The disruption time illustrated in Fig. 2 may apply to all the four consequence categories in worksheet no. 2. If we, for example, lose key operators, it will take some time to get replacements or educate other operators within the company. The disruption times for each consequence category should be considered when entering the consequence ranks into worksheet no. 2. An alternative approach would obviously be to include an extra column in worksheet no. 2 with an estimate of the disruption time illustrated in Fig. 2.
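To make the scoring scheme above concrete, the sketch below implements the weighted consequence sum c_i and one of the options mentioned in the text for the total rank in column (8): likelihood times weighted consequence, minus a weighted sum of the resource ranks. All weight values and the example ranks are invented for illustration; the paper leaves these choices to the analyst, and the consequence-likelihood matrix of Fig. 6 is an alternative way of presenting the result.

```python
from dataclasses import dataclass


@dataclass
class Worksheet2Row:
    """One scenario entry in worksheet no. 2 (assessment step, Fig. 5). All ranks are 0-4."""
    number: int
    description: str
    likelihood: int            # (1) 0 = negligible ... 4 = very frequent
    human: int                 # (2) consequence ranks: 0 = negligible ... 4 = most critical
    environmental: int         # (3)
    business: int              # (4)
    property: int              # (5)
    internal_resources: int    # (6) 4 = strong (available, adequate) ... 0 = very weak
    external_resources: int    # (7)


def total_consequence(row, k_h=1.0, k_e=1.0, k_b=1.0, k_p=1.0):
    """Weighted sum c_i of the four consequence ranks (columns 2-5)."""
    return (k_h * row.human + k_e * row.environmental
            + k_b * row.business + k_p * row.property)


def criticality(row, k_int=0.5, k_ext=0.5, **consequence_weights):
    """One option for the total rank in column (8): likelihood times c_i,
    reduced by a weighted sum of the internal and external resource ranks."""
    c_i = total_consequence(row, **consequence_weights)
    return row.likelihood * c_i - (k_int * row.internal_resources
                                   + k_ext * row.external_resources)


def action_list(rows, **kwargs):
    """Scenarios sorted with the most critical scenario on top of the action list."""
    return sorted(rows, key=lambda r: criticality(r, **kwargs), reverse=True)


# Invented example: a likely storage fire with strong internal but weak external resources.
fire = Worksheet2Row(number=1, description="Fire in storage building",
                     likelihood=3, human=2, environmental=1, business=3, property=4,
                     internal_resources=3, external_resources=1)
print(criticality(fire))  # 3 * (2 + 1 + 3 + 4) - (0.5*3 + 0.5*1) = 28.0
```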

5. CONCLUDING REMARKS

A new approach to vulnerability analysis has been presented with a broader scope than traditional risk analysis (Fig. 3). The main differences between risk analysis and vulnerability analysis have been identified and discussed. In our two-step approach to vulnerability analysis we focus on the "business" survivability based on analysis of scenarios that are evaluated against internal and external defense resources.

The effects of several internal and external risk factors on the stability of a company have been discussed. The identified risk factors are different in nature, and accidental events may originate from a single or a combination of several risk factors. The scenarios identified in worksheet no. 1 will therefore be different in nature and also in their effect on the business stability. Some scenarios are relatively simple and may represent a single event, like a fire in a storage building. In such cases, analysis of the consequences may be rather straightforward and it may also be easy to find temporary solutions.

Other scenarios may require complex remedies and also cooperation with local authorities. In Norway, the municipal authorities are required to carry out so-called risk and vulnerability analyses,(42) identify potential scenarios, and establish emergency plans in cooperation with local companies.

We firmly believe that a thorough identification and analysis of potential threats and scenarios will contribute to an improvement of the company's robustness and "business" survivability. If you know the threats you are exposed to, you will also be likely to take actions to prevent these threats from creating emergencies.

It has been claimed that only a massive emergency is able to verify the ultimate efficiency of a company's emergency preparedness. This may be true, but it is no excuse for not being prepared. Accidents and disruptions should as far as possible be controlled at the source in terms of carefully planned barriers. Emergency preparedness plans should be developed and practiced for all relevant emergencies. Successful risk and vulnerability control will probably only be achieved through a very patient learning process. A vulnerability analysis may, in our opinion, be an important contribution to this process.

REFERENCES

1. J. Rasmussen, "Human Factors in High Risk Technology," in A. E. Green (ed.), High Risk Safety Technology (John Wiley & Sons, Chichester, 1982), Chap. 1.6, pp. 143-215.
2. R. E. Ball, Aircraft Combat Survivability: Susceptibility & Reduction, Lecture notes, Department of Aeronautics, Naval Postgraduate School, Monterey, California (1979).
3. SURVIAC, Survivability/Vulnerability Information Analysis Center, Internet: http://www.surviac.flight.wpafb.af.mil
4. R. Cantor and S. Rayner, "Changing Perceptions of Vulnerability," in R. Socolow, C. Andrews, F. Berkhout, and V. Thomas (eds.), Industrial Ecology and Global Change (Cambridge University Press, Cambridge, 1994), Chap. 5, pp. 69-83.
5. R. S. Chen, "The Human Dimension of Vulnerability," in R. Socolow, C. Andrews, F. Berkhout, and V. Thomas (eds.), Industrial Ecology and Global Change (Cambridge University Press, Cambridge, 1994), Chap. 6, pp. 85-105.
6. R. W. Kates, J. H. Ausubel, and M. Berberian (eds.), Climate Impact Assessment: Studies of the Interaction of Climate and Society (ICSU/SCOPE Report 27, John Wiley & Sons, Chichester, 1985).
7. J. J. Schwarz, "Societal Vulnerability and (Inter)National Stability," in IFAC Workshop: Contributions of Technology to International Conflict Resolution (Cleveland, Ohio, 1986), pp. 127-131.
8. D. M. Liverman, "Vulnerability to Global Environmental Change," in R. E. Kasperson, K. Dow, D. Golding, and J. X. Kasperson (eds.), Understanding Global Environmental Change: The Contributions of Risk Analysis and Management (Clark University, Worcester, Massachusetts, 1989), pp. 27+.
9. N. A. Eisenberg, C. J. Lynch, and R. J. Breeding, Vulnerability Model: A Simulation System for Assessing Damage Resulting from Marine Spills, Environmental Control, Rep. CG-D-136-75, Rockville, Maryland, 1975.
10. J. Berleur, C. Beardon, and R. Laufer (eds.), Facing the Challenge of Risk and Vulnerability in an Information Society (IFIP Transactions A-33, North-Holland, Amsterdam, 1993).
11. D. J. Icove, "Collaring the Cybercrook: An Investigator's View," IEEE Spectrum, 31-36 (1997).
12. N. C. Lind, "A Measure of Vulnerability and Damage Control," Reliab. Eng. Syst. Safety (1995).
13. X. Wu, D. I. Blockley, and N. J. Woodman, "Vulnerability of Structural Systems, Part 1: Rings and Clusters," Civil Eng. Syst. 10, 301-317 (1993).
14. X. Wu, D. I. Blockley, and N. J. Woodman, "Vulnerability of Structural Systems, Part 2: Failure Scenarios," Civil Eng. Syst. 10, 319-333 (1993).
15. H. D. Foster, "Resilience Theory and System Evaluation," in J. A. Wise, V. D. Hopkin, and P. Stager (eds.), Verification and Validation of Complex Systems: Human Factors Issues, NATO Advanced Science Institutes Series F: Computer and Systems Sciences, Vol. 110 (Springer Verlag, Berlin, 1993).
16. T. S. Bott, Application of System Analysis Techniques to Vulnerability Studies of Complex Installations, 9th Advances in Reliability Technology Symposium, Los Alamos (1986).
17. D. Meister, Psychology of System Design (Elsevier, New York, 1991).
18. R. Rosness, Vulnerability in Complex Systems, SINTEF Safety and Reliability, N-7034 Trondheim, Norway (1992).
19. A. Høyland and M. Rausand, System Reliability Theory: Models and Statistical Methods (John Wiley & Sons, New York, 1994).
20. C. B. Powers, "Preparing for the Worst," IEEE Spectrum, 49-54 (December 1996).
21. EN 1050, Safety of Machinery - Risk Assessment (CEN, Comité Européen de Normalisation, rue de Stassart 36, B-1050 Bruxelles, 1994).
22. AIChE, Guidelines for Hazard Evaluation Procedures (Center for Chemical Process Safety, American Institute of Chemical Engineers, 345 East 47th Street, New York, 1992).
23. C. Perrow, Normal Accidents: Living with High-Risk Technologies (Basic Books, New York, 1984).
24. J. Reason, "The Identification of Latent Organizational Failures in Complex Systems," in Verification and Validation of Complex Systems: Human Factors Issues, NATO ASI Series (Berlin/Heidelberg, 1993).
25. J. G. Kemeny et al., The Need for Change: The Legacy of TMI, Report of The President's Commission on The Accident at Three Mile Island (Pergamon Press, New York, 1979).
26. Lord Cullen, The Public Enquiry into the Piper Alpha Disaster (HMSO, London, 1990).

27. T. A. Kletz, "Process Industry Safety," in D. Blockley (ed.), Engineering Safety (McGraw Hill, London, 1992), Chap. 15, pp. 347-368.
28. A. R. Hale, B. H. J. Heming, F. G. T. Rodenburg, and K. Smit, Maintenance and Safety: A Study of the Relation Between Maintenance and Safety in the Process Industry, Report to the Dutch Ministry of Social Affairs and Employment, Safety Science Group, Delft University of Technology (1993).
29. T. Wahle and G. Beatty, Emergency Management Guide for Business and Industry, Prepared for the U.S. Federal Emergency Management Agency (FEMA), Internet: http://www.fema.gov/fema/bizindex.html
30. NFPA 1600, Recommended Practice for Disaster Management (National Fire Protection Association, 1 Batterymarch Park, P.O. Box 9101, Quincy, MA 02269-9101, 1995).
31. D. Mosey, Reactor Accidents (Nuclear Engineering International Special Publications, Quadrant House, Sutton, Surrey, SM2 5AS, U.K., 1990).
32. R. Westrum, "Vulnerable Technologies: Accidents, Crime and Terrorism," Interdisc. Sci. Rev. 11 (4), 336-391 (1986).
33. R. F. Mould, Chernobyl: The Real Story (Pergamon Press, Oxford, 1988).
34. J. Hovden, M. Rausand, and G. Sergeev, Role of Societal Factors in Major Industrial Accidents, TIEMES 1995, 9-12 May, Nice, France (1995).
35. Dagens Nyheter (Swedish newspaper), 17 June (1997).
36. CEC-82, EC Directive 82/501 on the Major Accident Hazards of Certain Industrial Activities (Seveso Directive), Commission of the European Communities, Brussels, Official Journal (OJ L) 230, 8.5.82 (1982).
37. D. Smith and C. Sipika, "Back from the Brink: Post-Crisis Management," Long Range Planning 26 (1), 28-38 (1993).
38. IEC 300-3-9, Dependability Management - Part 3: Application Guide, Section 9: Risk Analysis of Technological Systems (International Electrotechnical Commission, Geneva, 1995).
39. NORSOK Standard, Risk and Emergency Preparedness Analysis, NORSOK Z-013 (draft), Internet: http://www.nts.no/NORSOK
40. T. Kletz, Hazop and Hazan: Identifying and Assessing Process Industry Hazards (The Institution of Chemical Engineers, Davis Building, 165-171 Railway Terrace, Rugby, Warwickshire CV21 3HQ, U.K., 1992).
41. SYNERGI accident database, Internet: http://www.synergi.no
42. Guidelines for Municipal Risk and Vulnerability Analyses (in Norwegian) (Direktoratet for Sivilt Beredskap, Norway, 1994).
