
Problems with Contingency Theory: Testing Assumptions Hidden within the Language of

Contingency "Theory"
Author(s): Claudia Bird Schoonhoven
Source: Administrative Science Quarterly, Vol. 26, No. 3 (Sep., 1981), pp. 349-377
Published by: Sage Publications, Inc. on behalf of the Johnson Graduate School of
Management, Cornell University
Stable URL: https://www.jstor.org/stable/2392512
Accessed: 18-09-2019 22:56 UTC


This content downloaded from 134.153.73.94 on Wed, 18 Sep 2019 22:56:16 UTC
All use subject to https://about.jstor.org/terms
Problems with Contingency Theory: Testing Assumptions Hidden within the Language of Contingency "Theory"

Claudia Bird Schoonhoven

This paper suggests that there are five problems with contingency theory, ranging from a simple lack of clarity in its theoretical statements to more subtle issues such as the embedding of symmetrical and nonmonotonic assumptions in the theoretical arguments. Starting from Galbraith's (1973) contingency theory about organizing for effectiveness, several traditional contingency hypotheses were tested along with more precise hypotheses developed from knowledge of the five problems with contingency theory. Data were drawn from a study of organizational effectiveness in acute care hospital operating room suites. Although traditional contingency notions were not supported by the data, the more precise hypotheses received stronger empirical support. The study data suggest that relationships between technology, structure, and organizational effectiveness are more complicated than contingency theory now assumes. The paper concludes by suggesting formulation of a contingency theory of organizational effectiveness that includes interactive, nonmonotonic, and symmetrical arguments.

© 1981 by Cornell University
0001-8392/81/2603-0349/$00.75

This paper describes five problems with contingency theory that appear to account for much of its mixed empirical support. The problems range from a simple lack of clarity in theoretical statements to more subtle issues, such as the embedding of symmetrical and nonmonotonic properties in theoretical assertions. Once these problems are made explicit, it is possible to make more precise hypotheses about the empirical relationships expected when a contingency process is believed to be operating. The five problems are illustrated by testing hypotheses taken from Galbraith's (1973) contingency theory about organizing for effectiveness under conditions of task uncertainty. The three dimensions of structure focused on are rules and procedures, decentralization of decision making, and professionalization of the work force. Data were drawn from a study of effectiveness in acute care hospital operating suites.

The analytic technique used in this study has not been widely applied to contingency arguments in the past.1 Previous analyses of contingency ideas have ranged from rich empirical descriptions (Burns and Stalker, 1961) to diagrams and data but no tests of significance (Woodward, 1965; Mohr, 1971), advancing to analysis of variance (Pennings, 1975) and correlational and regression techniques (Khandwalla, 1974). In this analysis, we graphed a partial derivative from the complete regression equation for effectiveness, examining the functional form of the interaction between technology and structure and not just the coefficients of the variables involved. Graphed are the direct effects of structure on effectiveness as well as the interaction with technological uncertainty.

Author's note: I would like to thank W. Richard Scott, James G. March, and Michael T. Hannan for their valuable comments on the research reported here. The study was carried out in association with the Stanford Center for Health Care Research as part of a larger project supported by Contract Number PH 42-63-65 from the National Center for Health Services Research, Health Resources Administration, DHEW, through the National Academy of Sciences-National Research Council under subcontract MS 46-72-12, with William H. Forrest, Jr., W. Richard Scott, and Byron Wm. Brown, Jr., as principal investigators. Additional support was obtained from the Organizational Research Training Program at Stanford University under a training grant from the National Institute of Mental Health, DHEW, and from a National Research Service (NRS) post-doctoral fellowship to the author from NIMH, DHEW. I am indebted for assistance to all my colleagues at the Stanford Center for Health Care Research, in particular Ann Barry Flood, Donald E. Comstock, Joan R. Bloom, Byron Wm. Brown, Jr., William H. Forrest, Jr., and Curt Englehard. Their contributions are gratefully acknowledged. In addition, I wish to thank Michael Aiken, Alice A. Young, Anne M. McMahon, Edward Wells, and three anonymous ASQ reviewers for their comments on an earlier version of this paper. Any analysis, interpretation, or conclusion based on CPHA/PAS is solely that of the author, and CPHA specifically disclaims any responsibility for any such analysis, interpretations, or conclusions. An earlier version of this paper was presented at the Annual Meetings of the American Sociological Association, Chicago, September 5, 1977.

1. I am indebted to Michael T. Hannan for suggesting this technique. Also see Donald E. Comstock's (1975b) unpublished work for an additional illustration of this technique, applied to empirical data.

BACKGROUND

Meyer et al. (1978: 18) has recently asserted that contingency theory is widely accepted and, thus, is no longer controversial. His observation is supported by the central position that contingency theory seems to hold in the managerial and applied literature, if judged only by the number of textbooks in the area (e.g., Kast and Rosenzweig, 1974; Tosi and Carroll, 1976;

349/Administrative Science Quarterly, 26 (1981): 349-377

Galbraith, 1977; Hellriegel and Slocum, 1978). We believe that
both Meyer and the textbook authors may have overlooked
some important discrepancies between contingency theory
and the extent of empirical support for it. In the fourteen years
since Thompson's (1967) exhortation that we attend to con-
straints and contingencies residing within and outside the
boundaries of the organization, some have suggested that
contingency theory is not a very useful approach to explaining
differences in the structure and effectiveness of organizations.
Mohr (1971), for example, was among the first to suggest that
there were problems with the contingency ideas. In testing the
consonance version of contingency theory, he found no support for the hypothesis that work groups will be most effective
when autocratic supervision is employed in routine jobs and
democratic supervision in nonroutine jobs. In a study of broker-
age offices, Pennings (1975) questioned the usefulness of
what he termed the "structural contingency model." He did not
find strong support for the argument that organizational effec-
tiveness is a function of the goodness of fit or consistency
between environmental and structural variables. Both studies
have been criticized for other reasons; for example, Pennings's
findings could possibly be attributable to lack of environmental
variation, since all offices were part of a single brokerage firm
(see also Scott [1977: 93] and Lynch [1974: 340]). However, a
larger set of problems flows through most discussions by
contingency theorists and researchers, which is also reflected
in whole or in part in the work of Mohr and Pennings.

PROBLEMS WITH CONTINGENCY THEORY

Lack of Clarity

There are several interrelated problems with contingency theory. First, contingency theory is not a theory at all, in the
conventional sense of theory as a well-developed set of interre-
lated propositions. It is more an orienting strategy or
metatheory, suggesting ways in which a phenomenon ought to
be conceptualized or an approach to the phenomenon ought to
be explained. Drawn primarily from large-scale empirical
studies, contingency theory relies on a few assumptions that
have been explicitly stated, and these guide contingency re-
search. The first explicit assumption is that there is no one best
way to organize; the second is that any way of organizing is not
equally effective under all conditions (Galbraith, 1973: 2). The
"theory" then asserts that, in order to be most effective,
organizational structures should be appropriate to the work
performed and/or to the environmental conditions facing the
organization. Although the overall strategy is reasonably clear,
the substance of the theory is not clear.

The lack of clarity is substantially due to the ambiguous character of the "theoretical" statements. Statements from contingency theorists and researchers suggest that a particular structure should be "appropriate for" a given environment (Thompson, 1967), that organizations are more successful when their structures "conform" to their technologies (Woodward, 1965: 69-71), that an organization's internal states and processes should be "consistent with" external demands (Lawrence and Lorsch, 1969), that organizations should attempt to maximize "congruence" between technology and their structure and adapt their structures to "fit" their technology (Perrow, 1970: 80), that technology and structure need to be properly "aligned" (Khandwalla, 1974: 97), that a "coalignment" should exist between environment and structure (Lawrence, 1975), and that communication structures should "match" the nature of the task (Tushman, 1978). Contingency theory currently requires greater precision than is provided by these richly suggestive but ambiguous statements.

Furthermore, the above statements do not differentiate environment from technology, since neither separate processes nor separate predictions distinguish the two. It is quite possible that environment and technology are related to distinctly different structural variables rather than to the same unspecified set, as is implicitly assumed by the ambiguous "theory."

Contingency Relations as Interactions


The second problem with contingency theory is that lack of
clarity by contingency theorists blurs the fact that an empirical
interaction is being predicted. When contingency theorists
assert that there is a relationship between two variables
(dimensions of technology and structure, for example) which
predicts a third variable (organizational effectiveness), they are
stating that an interaction exists between the first two variables
(Namboodiri, Carter, and Blalock, 1975: 109). This point has
been made by others; Lazarsfeld (1958) used the phrase
"contingent" to describe interactive relationships, and, more
recently, Southwood (1978) introduced the term "multiplar" for
propositions that predict statistical interaction. The status in-
consistency literature demonstrates another theoretical appli-
cation of the general point. However, this observation has not
been explicitly applied to the body of knowledge known to
organizational theorists as "contingency theory." For example,
in describing the relationship between uncertainty and profes-
sionalization, Galbraith (1973: 12-13) implicitly assumed an
interaction between the two, since the impact of professionali-
zation on effectiveness is described as varying over the range of
uncertainty, but this interactive relationship is not acknowl-
edged. Explicit recognition should be given to the fact that
contingency arguments produce interactive propositions.

Functional Forms of Interaction

The third problem is that, because of a lack of clarity, theoretical statements also fail to provide any clues about the specific form
of the interaction intended. The mathematical function of the
implied interaction between structure and technology (or envi-
ronment) is seldom made explicit. One consequence of this lack
of specificity is that the mathematical function implied by the
verbal theory may be represented in practice by a function that
has quite different properties. The function implied by the
verbal theory is actually a statement about how structure is
hypothesized to relate to technology (or environment) for
greater effectiveness.

The phrases quoted earlier, such as that structure should be properly "aligned with" technology, "consonant with" technology, "fit," and "be appropriate for" the technological circumstances, can be interpreted several ways. In one theoretical interpretation, one assumes that effectiveness is most likely when two factors such as technological uncertainty and professionalization are both present but that effectiveness is

less likely when either is absent. A theory interpreted thus
would be multiplicative (Blalock, 1965). It would be expressed
by a corresponding function, such as Y=(X1X2), and would be
stated hypothetically as follows:
The greater the value of variable 1 (technological uncertainty), the
greater the impact of variable 2 (structure) on variable 3
(effectiveness).

A second theoretical interpretation implies that there is a value of structure for each value of technology that will maximize effectiveness. This might be labelled a "matching" or "maximizing" theory, since it presumes that for every X1 there is a unique X2 at which Y is maximized and that deviations in either direction from the unique X2 reduce the value of Y. Functions meeting "matching" theory requirements would be stated as

Y = 1 / |X1 - X2|

(see footnote 2). Its corresponding hypothesis takes the following form:

Given the value of variable 1 (technological uncertainty), there is a matched value for variable 2 (structure) that produces the highest value of variable 3 (effectiveness). Deviations from this relationship in either direction reduce the value of variable 3 (effectiveness).

Depending on one's interpretation of the theorists' ideas, contingency theory is capable of producing precise hypotheses
as well as corresponding functions, as these two examples
illustrate.
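The contrast between these two readings can be made concrete with a small numerical sketch. This is illustrative only: the 0-1 scaling and the eps guard (which anticipates the undefined perfectly matched case discussed in footnote 2) are assumptions, not part of the original argument.

```python
# Two candidate functional forms for the "fit" between technological
# uncertainty (x1) and a structural dimension (x2), both scaled 0-1.

def multiplicative(x1, x2):
    # Effectiveness is highest when BOTH variables are high.
    return x1 * x2

def matching(x1, x2, eps=0.05):
    # Effectiveness is highest when x2 matches x1; deviations in either
    # direction reduce it. eps guards the undefined point where x1 == x2.
    return 1.0 / max(abs(x1 - x2), eps)

# The two forms disagree sharply about a low-low combination:
print(multiplicative(0.25, 0.25))  # 0.0625 -- scores poorly
print(matching(0.25, 0.25))        # 20.0   -- a perfect match scores highest
```

Under the multiplicative reading, a low-uncertainty, low-structure subunit is predicted to be ineffective; under the matching reading, the same subunit is perfectly fitted, which is exactly the kind of divergence an imprecise verbal statement conceals.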

Other substantive interpretations and their corresponding functional interpretations might be made within the maximizing interpretation. Might not there be several maxima, as in a harmonic function? A harmonic function would be especially consistent with the "consonance" terminology and would represent yet another of the matching and maximizing arguments. Or there could be an asymptotic approach to the maximum. Or perhaps even a threshold effect in which, for a given value of X2 (technology), the effects of X1 (structure) on Y (effectiveness) increase as X1 increases up to a point beyond which no increases in Y occur.
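For concreteness, the three additional shapes can be sketched as simple functions. These exact forms are assumptions chosen only to exhibit the named qualitative behavior; the paper does not commit to any of them.

```python
import math

def harmonic(x1, x2):
    # Several maxima: effectiveness peaks whenever x1 and x2 are "in
    # phase," consistent with the "consonance" terminology.
    return math.cos(2 * math.pi * (x1 - x2))

def asymptotic(x1, x2):
    # Effectiveness rises toward a ceiling of 1 as x1 * x2 grows, so the
    # maximum is approached but never reached.
    return 1 - math.exp(-3 * x1 * x2)

def threshold(x1, x2, cutoff=0.5):
    # The effect of x1 grows with x1 only up to a cutoff, beyond which
    # further increases in x1 add nothing to effectiveness.
    return min(x1, cutoff) * x2
```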

Neither the multiplicative, the matching, nor the maximizing variations developed here exhaust the possible interpretations that can be made of the statements currently used by contingency theorists. However, these rather different interpretations should serve as a sufficient illustration of the difficulties encountered when imprecise statements are used to express theoretical ideas.

This third problem has important consequences. The mathematical function used to express an interaction is not a trivial operational decision. It is one that should be grounded in theory, since its form makes assumptions with clear theoretical implications. If this function is reduced to a relatively thoughtless operationalization, then the theory tested may have quite different properties from the one asserted.3

2. The term 1/|X1 - X2| is not defined when a subunit is perfectly matched, since the absolute difference between X1 and X2 = 0. One approach to this is to set such cases to some non-zero minimum value, close to the value obtained in the data for the "nearly" perfectly matched case. This procedure would insure that the empirical range is not dramatically violated and, thus, that results will not be skewed by an unrealistically high number, once the absolute difference has been divided into 1. I thank an anonymous reviewer for the initial observation and Ruth Cronkite for this approach.

3. Pennings (1975) is one of the few researchers in the contingency tradition to check for interaction explicitly. However, the value of this contribution is diluted, since the functional forms are not made explicit, nor are precise hypotheses stated.

The Analytic Model Used

A fourth problem with contingency theory is that the operational and computational procedures that researchers tend to use impose assumptions on an already imprecise conceptual framework. Because of the tendency to rely on the general

linear model and correlational procedures, the relationships studied within the contingency framework are typically assumed to be linear. See, for example, Khandwalla's (1974) correlational and regression analysis of the relationship between mass output orientation and structure, Tushman's (1978) linear analysis of differences in mean amount of communication by task complexity, and Leifer and Huber's (1977) linear analysis of structure and perceived environmental uncertainty. Khandwalla attempts to correct for the limitations of his initial correlational analysis of technology and structure by dichotomizing his sample on relative profitability, a fairly common approach. However, among those using this approach, no explicit attempt is made to test for statistical interaction.
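The limitation noted here, that subgroup analysis hints at interaction but never tests the interaction term itself, can be illustrated with simulated data (all numbers are fabricated for the sketch):

```python
import random

random.seed(0)
# Simulate 200 subunits whose effectiveness depends on an
# uncertainty x structure interaction (true coefficient 2.0).
data = []
for _ in range(200):
    u = random.random()                           # uncertainty
    s = random.random()                           # structure
    y = 1.0 + 2.0 * u * s + random.gauss(0, 0.1)  # effectiveness
    data.append((u, s, y))

def corr(xs, ys):
    # Pearson correlation, written out to stay dependency-free.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# The dichotomizing approach: correlate structure with effectiveness
# separately within low- and high-uncertainty halves of the sample.
low = [(s, y) for u, s, y in data if u < 0.5]
high = [(s, y) for u, s, y in data if u >= 0.5]
r_low = corr([s for s, _ in low], [y for _, y in low])
r_high = corr([s for s, _ in high], [y for _, y in high])

# The gap between the subgroup correlations hints at interaction, but
# no statistic for the u*s term itself is ever computed or tested.
print(r_low, r_high)
```

The subgroup correlations differ, but the procedure yields no coefficient, and no significance test, for the interaction term that the contingency argument actually asserts.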

Some relationships between technology, structure, environment, and effectiveness may indeed be linear. Woodward's (1965) early work, however, should have alerted us at least to check for nonlinear effects, if not to predict or expect them. She found a curvilinear relation between technical complexity and the span of control of first-line supervisors, the proportion of skilled workers, and the number of line and staff specialists.4
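A minimal sketch of such a check, using fabricated data whose true relation is an inverted U (the functional form, sample size, and noise level are all invented, not Woodward's measurements):

```python
import random

random.seed(1)
n = 150
complexity = [random.random() for _ in range(n)]
# Fabricated curvilinear relation: the outcome peaks at mid-range
# complexity, echoing Woodward's inverted-U findings in shape only.
span = [4 * c * (1 - c) + random.gauss(0, 0.05) for c in complexity]

def corr(xs, ys):
    # Pearson correlation without external dependencies.
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# A purely linear check sees almost nothing...
r_linear = corr(complexity, span)
# ...but correlating with distance from the midpoint reveals the curve.
r_curved = corr([abs(c - 0.5) for c in complexity], span)
print(r_linear, r_curved)
```

A researcher who stopped at the near-zero linear correlation would wrongly conclude that complexity is irrelevant, which is exactly the hazard of unquestioned linearity.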

The sometimes misplaced assumption of linearity is important for two reasons. First and obviously, the researcher fails to check for nonlinear relations when linearity is unquestioningly assumed. Equally important is that the assumption of linearity masks another implicit assumption hidden within contingency theory, that contingency relations are symmetrical.

Assumptions about Contingency Relationships

An assumption of symmetrical effects is hidden in the language of contingency theory. Often contingency arguments suggest that lower values of a dimension of structure, when coupled with lower values of technological (or environmental) uncertainty, should produce effective organizations. An implication of this argument is that if, instead, high values of structure are combined with low values of technology, or vice versa, then effectiveness will be impaired because no congruence exists between technology and structure. This symmetrical property of contingency theory arguments is important because it suggests a nonmonotonic effect of structure on effectiveness over the range of uncertainty, rather than the usual assumption that an effect is constant over all values of the independent variable.

For example, in Galbraith's (1973) argument, combinations of a low value of uncertainty and a low value of structure cannot be distinguished from high-high combinations, since both combinations of values should yield equally effective organizations. It is possible, however, to recast the low-low combination equals effectiveness into a combination that is inappropriate according to the theory, and this would be expected to produce less effective outcomes. For example, lower uncertainty combined with increased decentralization should yield negative outcomes. Were we to recast the combination in such a manner, we could then make more specific hypotheses about the relationships between uncertainty, decentralization, and effectiveness. Three hypotheses become immediately apparent: (1) The impact of decentralization on effectiveness is nonmonotonic over the range of uncertainty. (2) In lower uncertainty subunits, increases in decentralization will negatively influence effectiveness. (3) In higher uncertainty subunits, increases in decentralization will positively influence effectiveness. If we press the implied symmetry assumption, then one clearly must develop a set of nonmonotonic hypotheses, as illustrated above.

4. In developing several statistical interaction models, Southwood also discussed curvilinearity, pointing out that it is analogous to interaction and may be confounded with it (1978: 1156).
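Taken together, hypotheses (1)-(3) say that the marginal effect of decentralization on effectiveness changes sign across the uncertainty range. In a regression of the form Y = b0 + b1*U + b2*S + b3*U*S (U for uncertainty, S for decentralization), that marginal effect is dY/dS = b2 + b3*U, negative below the crossover U* = -b2/b3 and positive above it. The coefficients below are invented for the sketch:

```python
# Marginal effect of decentralization (S) on effectiveness (Y) in
# Y = b0 + b1*U + b2*S + b3*U*S, with hypothetical coefficients chosen
# so the effect flips sign across the uncertainty range.
b2, b3 = -0.5, 2.0

def marginal_effect_of_s(u):
    # dY/dS evaluated at uncertainty level u
    return b2 + b3 * u

crossover = -b2 / b3  # uncertainty level where the sign flips
print(crossover)                      # 0.25
print(marginal_effect_of_s(0.1) < 0)  # True: decentralization hurts
print(marginal_effect_of_s(0.9) > 0)  # True: decentralization helps
```

A single sign-flipping line like this is precisely what a constant-effect (monotonic) specification cannot represent.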

GALBRAITH'S CONTINGENCY ARGUMENTS

Although this paper has criticized contingency theorists for insufficient clarity, several of Galbraith's (1973) arguments have been selected to test, since his work has several strengths not shared by the others. He has brought a cohesive analytic framework to the structural design of organizations, and he has been quite clear in specifying to what the structure of the organization is adapting. He argued that structure depends on the amount of information processed among decision makers during task execution and, secondarily, on the relative costs of various structural designs.

One of Galbraith's basic assumptions (1973: 4) is that the greater the uncertainty of the task, the greater the information that must be processed during task execution to achieve a given level of performance. He also assumed, however, that alternative structural arrangements vary in their capacities for processing information. Some are more effective than others for a given level of uncertainty. As a consequence, the specific structures adopted should depend on the amount of uncertainty present in the tasks and workflow. Thus, the degree of task uncertainty is the contingent variable on which turn alternative organizational and subunit structures (1973: 4).

Galbraith's arguments apply to the organizational as well as to the subunit level of analysis. The author states: "All lead to the conclusion that the best way to organize is contingent upon the uncertainty and diversity of the basic task being performed by the organizational unit. . . . This approach would account for task predictability differences which exist between and within organizations. And this . . . confirms again . . . that the predictability of the task is a basic conditioning variable in the choice of organizational forms" (1973: 4). When speaking of task uncertainty, Galbraith was actually referring to the research tradition that takes the more general label of research on technology. Environmental uncertainty is a variable independent of technological uncertainty and is explicitly excluded from the arguments considered here and the measures described later.

Given his assumptions, Galbraith developed an information-processing model to explain how uncertainty and information relate to structure (1973: 8). The information processing model incorporates several conventional dimensions of structure: (1) rules and programs, also referred to as standardization; (2) hierarchical referral, or centralization of decision making; and (3) professionalization, an element of goal setting. The model then moves to what Galbraith calls "new design strategies" (1973: 15) as more uncertain tasks are faced: (4) creation of slack resources, (5) creation of self-contained tasks, (6) creation of vertical information systems, and (7) creation of lateral relations (1973: 15). The focus here is on the first three arguments in his model, since the largest body of research is on these traditional dimensions of structure; the purpose is only to illustrate the
problems with contingency theory, not to test Galbraith's entire model comprehensively.

Implied Multiplicative Hypotheses

Since Galbraith prefers not to formalize relationships as hypotheses or as mathematical functions (1973: viii), the reader must discover the precise relations that he intends to hold between uncertainty and each of the dimensions of structure. We assume that the "true" relationship that Galbraith intends between uncertainty and each dimension of subunit structure is a multiplicative one. When a relationship between two variables is expressed as a multiplicative function, the effect of one variable, X1, is increased by higher values of the other, X2 (Namboodiri, Carter, and Blalock, 1975: 175), and produces the most pronounced impact on the dependent variable when both exhibit high values. This effect is consistent with Galbraith's emphasis on the higher ranges of task uncertainty and structure in his discussions. The following statement is an example of the general form of the contingency hypotheses suggested in Galbraith's works, assuming a multiplicative relationship between uncertainty and structure: the greater the task uncertainty, the greater the impact of a dimension of structure on effectiveness. This paper focuses on three of his specific arguments, first presenting Galbraith's contingency hypothesis and then an elaborated set of contingency hypotheses developed with awareness of the five problems with contingency theory described earlier.
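The paper's analytic strategy of graphing the partial derivative of effectiveness with respect to structure from the full regression equation can be sketched end to end. The data-generating coefficients below are simulated stand-ins, not estimates from the hospital data:

```python
import random

random.seed(2)
# Simulated subunits: Y = 1.0 + 0.5*U - 0.4*S + 1.2*U*S + noise,
# where U is uncertainty, S a structural dimension, Y effectiveness.
rows = []
for _ in range(300):
    u, s = random.random(), random.random()
    y = 1.0 + 0.5 * u - 0.4 * s + 1.2 * u * s + random.gauss(0, 0.05)
    rows.append((u, s, y))

def ols(rows):
    # Fit Y = b0 + b1*U + b2*S + b3*U*S by ordinary least squares,
    # solving the normal equations with Gaussian elimination.
    X = [[1.0, u, s, u * s] for u, s, _ in rows]
    y = [r[2] for r in rows]
    k = 4
    A = [[sum(xi[p] * xi[q] for xi in X) for q in range(k)] for p in range(k)]
    b = [sum(xi[p] * yi for xi, yi in zip(X, y)) for p in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in reversed(range(k)):              # back substitution
        beta[r] = (b[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, k))) / A[r][r]
    return beta

b0, b1, b2, b3 = ols(rows)

# The quantity graphed in the paper's approach is the partial
# derivative dY/dS = b2 + b3*U over the observed range of U: its sign
# change, not the raw coefficients, carries the contingency claim.
for u in (0.0, 0.5, 1.0):
    print(u, round(b2 + b3 * u, 2))  # negative at low U, positive at high U
```

Inspecting b2 alone would suggest that structure hurts effectiveness; the graphed derivative shows its effect reversing as uncertainty rises, which is the interactive pattern the elaborated hypotheses predict.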

Hypothesis 1: Uncertainty, Rules and Programs, and Effectiveness

Galbraith linked uncertainty, rules and programs, and effectiveness in the following way: when task uncertainty is low and work is very predictable, information-processing needs are low. Since organizations seek to increase the predictability of their production tasks through program specification (March and Simon, 1958), task-related rules and procedures will be established to guide execution. "Many standard operating procedures arise in this manner" (Galbraith, 1973: 10). As task uncertainty increases, however, one cannot accurately predict a priori the combinations of possible events and the proper "rules" to apply. To attempt to do so is likely to cause the inflexible application of rules to cases in which standardized procedures are inappropriate and, thus, less effective. Therefore, the general rule becomes, "Decide what to do when you encounter the contingency," and there should be less standardization of rules and procedures as uncertainty increases (Galbraith, 1973: 10-11). "Destandardization" can be substituted for "less standardization" to simplify the wording. The first hypothesis advanced from Galbraith's discussion is as follows:

Hypothesis 1 (H1): The greater the technological uncertainty, the greater the positive impact of destandardization on effectiveness.
Hypothesis 2: Uncertainty, Decentralization of Decision Making, and Effectiveness

Galbraith asserted that centralization of decision making should vary with task uncertainty for greater effectiveness. He argued that when exceptional cases are encountered at lower levels of uncertainty, these are usually referred upward in the managerial
hierarchy for centralized decision making. Centralization is
possible since information processed does not overburden the
hierarchy at lower levels of uncertainty. However, as uncer-
tainty increases, inflating the volume of information required at
the point of task execution, increased participation in decision
making must be exercised by the workforce. "This can be
accomplished by increasing the amount of discretion exercised
by employees at lower levels of the organization" (1973: 12).
Subunits structured to refer all exceptional cases upward to a
centralized decision pointare likelyto suffersignificantdelays in
implementing decisions because of the lengthy referral pro-
cess. This may be an efficiency loss for outcomes in some
organizations; however, if the unit is performing tasks in which
a more rapid decision process is crucial to outcomes, then
decision delays have clear implications for outcome quality
(Galbraith, 1 973: 11 -1 2). Upward referral also results in con-
densation of information as well as some distortion (March and
Simon, 1958); thus, the probability that decisions made far
from the site of task execution will be inappropriate is clearly
increased. The second hypothesis advanced from Galbraith is
as follows:

Hypothesis 2 (H2): The greater the technological uncertainty, the greater the positive impact of decentralization on effectiveness.
Hypothesis 3: Uncertainty, Professionalization, and
Effectiveness

Galbraith argued that, as task uncertainty increases and it becomes more efficient to decentralize decision making down
to the points of action at which the information originates, the
organization faces a potential behavioral control problem. "That
is, how can the organization be sure that the employees will
consistently choose the appropriate response to the job-related
situations which they will face?" (1973: 12). He answered his
question by noting that to increase the probability that
employees will select the appropriate behavior, organizations
substitute craft or professional training of the workforce for
lower levels of training and skill. He referred to this as
professionalization (1973: 12), or selection of responsible
workers who have the appropriate skills and attitudes. As a
consequence, once the skills and training of the workforce have
been adapted to task uncertainty needs, then task-relevant
decisions can be safely delegated to the level of the workforce
itself without sacrificing control over outcome quality (1973:
12-13). The last contingency hypothesis derived from
Galbraith is the following:
Hypothesis 3 (H3): The greater the technological uncertainty, the greater the positive impact of professionalization on effectiveness.
A Set of Elaborated Contingency Hypotheses
The arguments and hypotheses above assume a positive
monotonic effect of structure on effectiveness. Consistent
with his preference for not stating hypotheses, Galbraith does
not explicitly discuss the question of monotonicity in his
arguments. Since nonmonotonic relationships are rare and
seldom appear in organizational sociology, and given the ab-
sence of specific information to the contrary in his work,
monotonic relationships are assumed for Galbraith's argu-
ments. The monotonic assumption is incompatible with the
symmetrical property of contingency arguments identified earlier, however. If lower ranges of uncertainty require standardized procedures, for example, then destandardizing should
have a negative influence on effectiveness. As a consequence,
we should expect a nonmonotonic effect of destandardization
on effectiveness over the range of uncertainty. Assuming a
multiplicative form of interaction between uncertainty and each
dimension of structure, as well as identifying a symmetrical and
thus nonmonotonic effect of structure on effectiveness over
the range of uncertainty, three hypotheses rather than one may
be advanced for each of the original contingency relationships:
Hypothesis 1a (H1a): The impact of destandardization on effectiveness is nonmonotonic over the range of uncertainty.
Hypothesis 1b (H1b): When technological uncertainty is low, increases in destandardization will negatively influence effectiveness.
Hypothesis 1c (H1c): When technological uncertainty is high, increases in destandardization will positively influence effectiveness.
Hypothesis 2a (H2a): The impact of decentralization on effectiveness is nonmonotonic over the range of uncertainty.
Hypothesis 2b (H2b): When technological uncertainty is low, increases in decentralization will negatively influence effectiveness.
Hypothesis 2c (H2c): When technological uncertainty is high, increases in decentralization will positively influence effectiveness.
Hypothesis 3a (H3a): The impact of professionalization on effectiveness is nonmonotonic over the range of uncertainty.
Hypothesis 3b (H3b): When uncertainty is low, increases in professionalization will negatively influence effectiveness.
Hypothesis 3c (H3c): When uncertainty is high, increases in professionalization will positively influence effectiveness.
Hypotheses About the Effects of Uncertainty and Resources
To this point it has been argued contingently that several
dimensions of structure interacting with technological uncer-
tainty will increase organizational effectiveness. A more exact-
ing approach, however, is to determine the explanatory power
of an interaction after controlling for the main effects of
variables comprising the interaction term. We can control for
the direct effects by adding measures for each of the main
variables in the technology and structure interaction. This
means that we will look in the analysis for information on both
X1, the effect of destandardization, for example, as well as for
information on X1X2, the interaction effect of destandardization
and uncertainty. The model being developed controls for the
effects of structure on effectiveness and also hypothesizes the
negative influence of uncertainty on effectiveness. In the latter
hypothesis, uncertainty is presumed to undermine organiza-
tional effectiveness unless it is met by structural features
designed to absorb the information uncertainty. Restated as a hypothesis:
Hypothesis 4 (H4): Uncertainty will be negatively related to
effectiveness.

Galbraith's original hypotheses (H1 through H3) are compared with the elaborated set of contingency hypotheses (H1a through H3c) in a model predicting organizational effectiveness.
To this model one additional variable, resources of the organiza-
tion, has been added as a control. This was done because
resources have been found to influence effectiveness. In their
study of school organization and achievement, Bidwell and

Kasarda (1975: 68) found that ". . . resources did have a
substantial impact" on effectiveness. Essentially, greater re-
sources appear to allow for implementation of the relations
between uncertainty and structure believed to be necessary for
effectiveness or, in Galbraith's terms, to allow for the relative
costs of various structural designs. A fifth hypothesis is devel-
oped by bringing resources explicitly into the model:
Hypothesis 5 (H5): The greater the organizational resources, the
greater the effectiveness of the organization.

The hypotheses stated above can be described in a causal model. Figure 1 represents the system of relations hypothesized; notations on the paths between variables represent the positive or negative relationships expected.

[Figure 1. Causal model of organizational effectiveness predicted from three technology-structure interactions: paths run from X1X2, X1X3, X1X4, X1, X2, X3, X4, and X5 to Y.]

Where: X1 = Workflow uncertainty; X2 = Destandardization; X3 = Decentralization; X4 = Professionalization; X5 = Resources; Y = Effectiveness.

Y = a + X1X2 + X1X3 + X1X4 - X1 + X2 + X3 + X4 + X5 + eY   (1)

METHODS

As part of a larger study of the quality of surgical care in hospitals (see Stanford Center for Health Care Research, Scott, Forrest, and Brown, 1976), data were collected on patients who underwent surgery in the operating room suites (ORs) of 17 acute-care hospitals in the U.S. The sample included only nonfederal, voluntary or community, nonprofit hospitals providing short-term care for a variety of acute medical and surgical patients. The hospitals were selected to maximize variance in size (from 99 to 638 beds), expense ratio
(from $28,000 to $56,000 annual expenses per occupied bed),
teaching status (six hospitals were affiliated with a medical
school or had an approved and active house staff program,
whereas the remainder had no active physician training pro-
grams), and geographical location (10 states and all major
regions of the continental U.S. were represented). Although
every effort was made to produce a representative sample of
U.S. acute-care hospitals, the sample was biased in favor of
larger hospitals (the average number of beds per hospital in the sample was 310, compared to the national average of 153 beds for hospitals of the same type).

This study focused on the surgical transformation process in these hospitals and the patients operated on within that
process. Although the technical core of surgery is the operating
room suite (OR), the entire surgical transformation process
includes the OR suite as well as patient wards regularly caring
for at least 12 surgical patients during the year preceding the
study.

The patients studied underwent at least one of 15 surgical procedures selected as the basis for indicators of organizational
effectiveness: gastric surgery for ulcer; selected surgery of
biliary tract; surgery of large bowel; appendectomy; splenec-
tomy; abdominal hysterectomy; vaginal hysterectomy;
craniotomy; amputation of lower limb (ankle to hip); fractured
hip; arthroplasty of the hip; lumbar laminectomy, with or
without fusion; pulmonary resection; prostatectomy; and
selected surgery of abdominal aorta and/or iliac arteries. These
procedures were selected by several criteria. Since death following surgery is a relatively rare event in modern hospitals, we attempted to maximize the number of deaths observed by choosing high-volume procedures or those with a
high risk of a poor outcome. We also included surgical proce-
dures by which the disease could be staged as to its severity or
progress, procedures on a variety of organ systems, a range of
surgical specialties, and procedures affecting both male and
female patients in a wide range of ages and physical conditions.
These latter data formed one of the major strengths of the
larger study and also of our measures of surgical effectiveness.
We used them to adjust for patient preoperative condition by
including disease stage and the preoperative physical status of
the patient. The number of patients studied per OR varied by
size of hospital, ranged from 39 to 1,256, and accounted for 81
percent of the 10,563 who qualified for inclusion.

Sources of Data

Five principal sources of data were used in this analysis: (1) an interview with each hospital administrator, (2) interviews with
the director of the operating room, (3) questionnaires adminis-
tered to all registered and practical nurses in the OR suite, (4) a
daily schedule of the operations to be performed in each OR,
sampled every four days for eight and a half months, and (5)
information on the physical status, disease stage, age, sex,
economic status, and death or complications following surgery
for individual patients.

All questionnaires and interviews administered to members of the hospital staffs were collected by two teams of inter-
viewers, who spent approximately two weeks in each hospital
administering the instruments. Copies of the original operating
room schedules and modifications of them were collected from
the OR suites by a set of part-time technicians located in each
hospital for the duration of the study. Technicians also gathered
the data on individual patient characteristics, which formed the
basis for computing the outcomes of surgery - death and
complications - that were the measures of surgical effective-
ness. Both the interviewers and the technicians employed
detailed schedules and structured instruments in their data

collection activities and received extensive training before
entering the field.

Measures

Our approach to measuring technology, structure, and effectiveness is presented here, whereas detailed descriptions of
the operationalizations of each variable are presented in Ap-
pendix A. We measured a single technology, the one directly
characteristic of the organizational unit studied, the operating
room. Several separate production technologies may be
present in the same organization, each with different charac-
teristics. Researchers, however, generally assume that organi-
zations possess only one type of technology, failing to differ-
entiate multiple technologies. This is a fairly crucial distinction,
since some researchers have averaged across several
technologies in the same organization (Khandwalla, 1974);
others have dropped from the analysis organizations with more
than one technology (Woodward, 1965); and still others have
failed to note the issue at all (Harvey, 1968; Rushing, 1968;
Hickson, Pugh, and Pheysey, 1969).

Our indicators of technology measured variation in the workflow of the operating room (OR) and, hence, uncertainty. Daily
fluctuations, as well as variations over time in the number and
type of operations, greatly affect the flow of work in the OR.
Our approach assumed that variation in the flows of patients (by
type and expected arrival for specific operations) determines
predictability of the care tasks confronting the OR staff.

We measured structure at the same level of analysis as technology, since our measures of destandardization, decen-
tralization, and professionalization all relate to the structure of
the operating room itself rather than to the larger organization.
In much research in the technology and structure tradition
these two variable sets are measured at different levels of
analysis. Whether consciously or not, some research designs
continue to reflect the expectation that technology at the
workflow level of production should be strongly correlated with
structural variables at the more remote administrative levels of
the organization, despite the findings of Hickson, Pugh, and
Pheysey (1969) to the contrary and the theoretical argument of
Scott et al. (1972). In general, it is important to make explicit the
levels of organization at which we measure both technology
and structure, given the mixed empirical results characteristic
of research on the relationships between the two variables.

We measured effectiveness as severe morbidity: a risk-adjusted postsurgical death and complication rate averaged for
all patients undergoing surgery in the operating room suite of a
given hospital. There are several points to be made regarding
this measure. The first concerns level of analysis. In our critique
of contingency theory, we have been broadly concerned with
how environment, technology, and structure influence organi-
zational effectiveness. Once focused on a specific theoretical
argument, Galbraith's, our interest has narrowed to the influ-
ence of technology and structure on organizational effective-
ness. In this latter tradition, arguments have been made at both
the subunit and organizational level. Our measurement concern
was with organizational and suborganizational level


phenomena, given that it is at those levels that the theoretical arguments about contingency theory are made. We
measured effectiveness at the same level of analysis as that at
which technology and structure were measured, since post-
surgical death and complication rates characterize outcomes of
the surgical transformation process. We selected the subunit
rather than the larger organizational level, because the higher
level (with overall hospital death and complication rates, for
example) would include measures from more than a single
technology; in acute-care hospitals the two main technologies
are medical and surgical care. Thus, the organizational level
would be inappropriately high for this study, mixing outcomes
not treated in the transformation process.

Another point about the effectiveness measure concerns the adjustments made for the patients' preoperative characteris-
tics. A real strength of the larger study design (Stanford Center
for Health Care Research, 1974; Scott, Forrest, and Brown,
1976) as well as of this analysis is that it included six characteristics of patients - sources of input variations to the
transformation process - in the measure of effectiveness.
Throughout the health care literature it is now well known that
postsurgical death and complication rates are related to pa-
tients' preoperative and socioeconomic conditions. Noting this
point in a different organizational setting, Hannan, Freeman, and
Meyer (1976), in a comment on Bidwell and Kasarda's (1975)
work, argued that there are two specification errors to be aware
of when analyzing models of school effectiveness: omission of
input variables and analysis at the wrong level. They pointed out
that ordinary least squares applied to a model that excludes
causal variables correlated with those included gives biased
estimates, confounding organizational effects with the effects
of excluded input variables. Given an error of specification
(excluding student input variables, in this case), analysis with
data aggregated above the theoretically appropriate level can
lead to inflation of the original errors of misspecification (Han-
nan, Freeman, and Meyer, 1976).

Hannan, Freeman, and Meyer noted further that they ". . . would similarly be suspicious of estimates of the effect of
hospital organization properties on treatment effectiveness
which ignored such input variations as patient risk at time of
entry into the hospital" (1976: 137). As the health care literature
attests, their concern is well placed. By including characteristics
of patients in our measure of effectiveness, the problem of
specification bias was avoided. Our measure of effectiveness
was adjusted for the six patient-specific variables of stage of
surgical disease, physical status, age, sex, life stress, and the
likelihood of the patient's seeking medical care. Hannan,
Freeman, and Meyer (1976: 137) also noted that they ". . . hold
no brief for the priority of individual level propositions" applied
to studies and theories of organizational effectiveness. Having
dealt in this study with their initial concern by including input
variables, we have also dealt with their second concern about
the inflation of original errors of misspecification through
aggregation.

Measurement details for all variables are described in Appendix A, including the means, standard deviations, and ranges for each measure used in the analysis.

Analysis of Data

Analysis of the data proceeded in two steps. First, the model of organizational effectiveness was tested using multiple regres-
sion analysis. The equation used was Equation (1) described in
Figure 1. Since two dimensions of professionalization have
been identified in the measurement appendix (Appendix A), the equation was run twice. In the first equation, initial training (B.S. degree) was the dimension of professionalization (X4) employed, whereas in the second equation all measures remained
the same but professional activities were substituted as the
measure of X4. Results are presented as unstandardized re-
gression coefficients. Unstandardized regression coefficients
were used for a number of reasons, the most salient of which
had to do with testing for interaction in multiple regression.
Allison (1977) has shown that the inclusion of a product term in
a multiple regression is a legitimate way to test for interaction
as we have done, using a multiplicative term. He showed, along
with Althauser (1971) before him, that the standardized coeffi-
cient for the product term is affected by changes in the means.
However, the unstandardized coefficient for the product term
is not affected, nor is the t-test for the product term affected by
the addition of arbitrary constants to the variables in the model.
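This invariance property can be illustrated with a small simulation (the data, coefficient values, and helper functions below are hypothetical, not drawn from the study): fitting the same interaction model before and after adding an arbitrary constant to one variable leaves the unstandardized product-term coefficient unchanged, even though the main-effect and intercept coefficients shift.

```python
import random

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved with Gaussian elimination and partial pivoting."""
    n, k = len(X), len(X[0])
    xtx = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)]
           for i in range(k)]
    xty = [sum(X[r][i] * y[r] for r in range(n)) for i in range(k)]
    A = [xtx[i][:] + [xty[i]] for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k + 1):
                A[r][c] -= f * A[col][c]
    b = [0.0] * k
    for i in reversed(range(k)):
        b[i] = (A[i][k] - sum(A[i][j] * b[j] for j in range(i + 1, k))) / A[i][i]
    return b

random.seed(0)
n = 200
uncert = [random.random() for _ in range(n)]    # X1: uncertainty (synthetic)
destand = [random.random() for _ in range(n)]   # X2: destandardization (synthetic)
# A true model with a negative interaction, plus a little noise.
y = [2.0 + 1.5 * u + 0.5 * d - 4.0 * u * d + random.gauss(0, 0.1)
     for u, d in zip(uncert, destand)]

def design(x1, x2):
    # Columns: intercept, X1, X2, and the multiplicative product term X1*X2.
    return [[1.0, a, b, a * b] for a, b in zip(x1, x2)]

b_orig = ols(design(uncert, destand), y)
b_shift = ols(design([u + 5.0 for u in uncert], destand), y)

# The unstandardized coefficient on the product term is the same in both
# fits: adding a constant to a variable does not change it.
print(b_orig[3], b_shift[3])
```

Algebraically, substituting X1 = Z1 - c into Y = b0 + b1X1 + b2X2 + b3X1X2 redistributes c into the intercept and the X2 coefficient but leaves b3 on the product term untouched, which is why the unstandardized interaction coefficient and its t-test survive the shift.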

To determine if a nonmonotonic effect was present, we pushed the analysis to a second stage: the graphing of a partial
derivative from the larger regression equation for effective-
ness. Merely inspecting the signs and magnitudes of regres-
sion coefficients is insufficient analysis for contingency
hypotheses. Graphing a partial derivative from the larger re-
gression equation will reveal nonmonotonic effects not readily
apparent in the tabled coefficients. If a nonmonotonic effect is
present, as revealed by the graph and its calculations, it will add
substantially to our knowledge to know where in the range of
technology a change in the direction of a slope occurs. In the
second phase of the analysis, a method for plotting the joint
effect of the main and interaction terms was introduced,
followed by graphs of each interaction term. Each graph
expresses the change in effectiveness, given a change in a
structural dimension, over the range of uncertainty. In so doing,
we tested for the extent to which dimensions of subunit
structure have a symmetrical and nonmonotonic effect on
effectiveness over the range of uncertainty.5
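The graphing technique can be sketched as follows (the coefficient values and uncertainty range below are illustrative, not the study's): for the partial model Y = b1X1 + b3X1X2, the derivative dY/dX1 = b1 + b3X2 is a straight line in uncertainty X2, and a sign change within the observed range of X2 signals a nonmonotonic effect.

```python
def slope_of_structure(b1, b3, x2):
    """dY/dX1 = b1 + b3*X2: change in severe morbidity per unit
    of structure, evaluated at uncertainty level x2."""
    return b1 + b3 * x2

def crossover(b1, b3):
    """Uncertainty level at which structure has no effect (dY/dX1 = 0)."""
    return -b1 / b3

b1, b3 = 2.0, -5.0     # hypothetical main-effect and interaction coefficients
lo, hi = 0.1, 0.9      # hypothetical observed range of uncertainty

x0 = crossover(b1, b3)   # 0.4
grid = [lo + i * (hi - lo) / 10 for i in range(11)]
slopes = [slope_of_structure(b1, b3, x) for x in grid]

# Nonmonotonic over the observed range: the crossover point lies inside it,
# so the slope is positive at the low end and negative at the high end.
nonmonotonic = lo < x0 < hi
print(x0, nonmonotonic)   # 0.4 True
```

Inspecting the tabled coefficients alone would show only the two signs; evaluating the derivative over the observed range shows where, if anywhere, the effect of structure reverses direction.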

RESULTS

Results of the first phase of the analysis are presented in the Table: unstandardized regression coefficients of severe mor-
bidity on the interactions between technology and structure,
their main effects, and the resources of the organization. In
interpreting these results, readers should note a peculiar fea-
ture of the effectiveness measure: severe morbidity is ex-
pressed in positive numbers, which are interpreted negatively.
5 While Hannan, Freeman, and Meyer (1976) originally suggested this technique, Southwood has also commented recently that "it is possible to determine the range of the effect of X1 over the range . . . of X2, and vice versa. This is not provided by the regression equation but may be deduced from it by substituting values on the control variable at the extremes of its range" (1978: 1168).

If a particular variable is positively related to severe morbidity, then severe morbidity increases. This is interpreted as an undesirable outcome, since lower rates of severe morbidity are desired for greater surgical effectiveness. As a consequence, variables with an inverse relationship to severe morbidity in fact enhance organizational effectiveness.

In this analysis the focus was on variables relevant to Galbraith's and our contingency hypotheses: the interactions

Table

Regression Coefficients of Organizational Effectiveness Measured as Severe Morbidity* (N = 17 Operating Room Suites)

                  UncyDeStd  UncyDeCnt  UncyProflzn  Uncy     DeStand  DeCent  Proflzn  Resour   Constant  F     R2
                  (X1X2)     (X1X3)     (X1X4)       (X1)     (X2)     (X3)    (X4)     (X5)
Equation (1) for  -7.34•     -2.531     -10.3        +24.20   +2.81    +.827   +3.72    -6.140   -7.31     3.96  .81
severe morbidity  (2.14)     (1.04)     (12.6)       (8.09)   (.842)   (.416)  (5.04)   (2.43)
Equation (2) for  -7.72•     -2.70      -1.20        +28.2•   +2.95•   +.928   +.479    -6.78•   -8.92     3.85  .81
severe morbidity  (2.40)     (1.22)     (1.44)       (11.9)   (.939)   (.497)  (.583)   (2.11)

•p < .10; ••p < .05; •••p < .01. Unstandardized regression coefficients; standard errors in parentheses.

*Effectiveness, measured as severe morbidity, is interpreted negatively. The higher the severe morbidity - death or complications following surgery - the lower the effectiveness. If a variable is positively related to severe morbidity, then severe morbidity increases.

between uncertainty and destandardization, decentralization, and professionalization. Comments on findings for resources
were expressed elsewhere (Schoonhoven, 1976) and are not
repeated here. In the Table, the first three coefficients in both
equations express the interaction terms. The uncertainty-
destandardization (X1X2), the uncertainty-decentralization
(X1X3), and the uncertainty-professionalization (X1X4) interac-
tions have negative effects on severe morbidity. This means
that effectiveness is enhanced, since severe morbidity was
decreased by each of the three technology-structure interac-
tions. Although the first two coefficients in each equation were
significant and all three coefficients in each equation were in
the expected direction, these results only partially supported
Galbraith's hypotheses 1, 2, and 3 because we could not yet
determine if the effects were monotonic or not. Nevertheless,
at first glance, of the three contingency hypotheses suggested
by Galbraith, the uncertainty-destandardization and the
uncertainty-decentralization relationships appeared to be
clearly supported in this first phase of the analysis, and the
uncertainty-professionalization relationship appeared to be par-
tially supported.

Examining only the algebraic signs of each interaction term, while controlling for the corresponding main effects, as we
have done in the Table, gives insufficient information for
properly testing a contingency hypothesis. The sign of a coefficient can tell us if an effect is in the hypothesized direction. For example, destandardization in the first equation
has a positive coefficient of 2.81 on severe morbidity. In
interaction with uncertainty, the effect is a negative 7.34. From
these data, we would expect to see a decreasingly positive
slope expressing the change in severe morbidity, given a
change in destandardization over the range of uncertainty. We
could not, however, determine whether a symmetrical and
nonmonotonic effect was present by inspecting the regression
coefficients. We therefore moved to phase two of the analysis,
plotting the joint effect of the main and interaction terms.

Analysis of Interaction Terms

Galbraith's arguments suggest that the greater the uncertainty, the greater the positive impact of destandardization, decen-
tralization, and professionalization on effectiveness. These
hypotheses were tested by elaborating each interaction term
mathematically and then displaying it graphically. In so doing,
any symmetrical and nonmonotonic effects would be revealed, directly testing the adequacy of traditional contingency hypotheses like Galbraith's as compared to the ones elaborated. For
those unfamiliar with the technique of graphing a partial
derivative from the larger regression equation, equations (2)
through (6) in Appendix B provide a more detailed explanation of
the method.

The six graph patterns possible are illustrated in Figure 2. Graph (1) illustrates the pattern that would be observed if Galbraith's
hypotheses were correct using severe morbidity as the mea-
sure of effectiveness. Results like these would indicate that the
greater the uncertainty, the greater the negative impact of a
dimension of structure on severe morbidity, and therefore, the
greater the effectiveness, as hypothesized. Results like these
are possible when both b1 and b3 take negative values and the

[Figure 2. Potential graph patterns: six plots of d(severe morbidity)/d(structure) against uncertainty. Graph 1, monotonic: b1 = - and b3 = -. Graph 2, nonmonotonic: b1 = + and b3 = -. Graph 3, nonmonotonic: b1 = - and b3 = +. Graphs 4 through 6, monotonic: the remaining sign combinations, in which the slope does not change sign within the observed range.]


relationship is monotonic. Graph (2) illustrates the pattern required if the nine elaborated contingency hypotheses (H1a, 1b, 1c through H3a, 3b, 3c) are supported. A nonmonotonic effect is
apparent in this graph, since the plotted line crosses the hori-
zontal axis, thereby changing signs, and b1 would be a positive
and b3 a negative value. Graph (3) illustrates the remaining
nonmonotonic effect, whereas graphs (4) through (6) reveal the
other monotonic patterns possible.
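The rule behind these six patterns can be stated compactly (a sketch in our own notation, not the authors' procedure): with slope dY/dX = b1 + b3X2, the pattern is nonmonotonic exactly when the slope changes sign inside the observed range of uncertainty.

```python
def pattern(b1, b3, lo, hi):
    """Classify dY/dX = b1 + b3*X2 over the observed range [lo, hi]."""
    start, end = b1 + b3 * lo, b1 + b3 * hi
    if start * end < 0:
        return "nonmonotonic"   # slope changes sign inside the range
    return "monotonic"

# Graph 1: b1 and b3 both negative -> slope always negative, monotonic.
assert pattern(-2.0, -5.0, 0.1, 0.9) == "monotonic"
# Graph 2: b1 positive, b3 negative, crossing inside the range.
assert pattern(2.0, -5.0, 0.1, 0.9) == "nonmonotonic"
# Graph 3: b1 negative, b3 positive, crossing inside the range.
assert pattern(-2.0, 5.0, 0.1, 0.9) == "nonmonotonic"
# Same signs as Graph 2, but the zero point (0.4) falls outside the
# observed range -> one of the remaining monotonic patterns.
assert pattern(2.0, -5.0, 0.1, 0.3) == "monotonic"
```

The same sign combination can thus yield either a monotonic or a nonmonotonic pattern, depending on whether the crossover point lies within the range of uncertainty actually observed.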

Interaction of Destandardization and Uncertainty

The Table showed that the uncertainty-destandardization interaction had a significant negative effect on severe morbidity.
Since the results were substantively identical in both equations,
for simplicity we graphed the interactions using coefficients
from the first equation only. To analyze the interaction, we
assumed that the effect of increasing destandardization on
severe morbidity is modified by the level of uncertainty facing
the organizational unit. This interpretation of the interaction
may be expressed as Equation (3) (Appendix B):
Y = b1X1 + b3X1X2.   (3)
This means that the effect of destandardization on severe
morbidity is modified by uncertainty. This partial form of the
larger regression Equation (1) controls for all other terms in the
equation.

First, we had to determine if the effect of destandardization on severe morbidity was monotonic over the range of uncertainty
observed in our sample. We used Equation (6) (Appendix B) to
find the point at which an increase in destandardization has no
effect on severe morbidity:

X2 = -b1/b3,   (6)


where X2 = uncertainty; b1 = main effect of destandardization; and b3 = coefficient of the uncertainty-destandardization interaction. We substituted into the equation regression coefficients from the Table, first equation, finding that

X2 = -b1/b3 = -(2.81)/(-7.34) = .38.

Since the calculated value of uncertainty (X2) was within the range observed in our sample - .191 to .718 (Appendix A) - we
concluded that destandardization has a nonmonotonic effect on
severe morbidity over the range of uncertainty.
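The calculation above can be restated in code (coefficients from the Table, first equation; the uncertainty range is the one reported from Appendix A):

```python
b1 = 2.81     # main effect of destandardization
b3 = -7.34    # uncertainty x destandardization interaction
x2_zero = -b1 / b3
print(round(x2_zero, 2))   # 0.38

lo, hi = 0.191, 0.718      # observed range of uncertainty in the sample
# The zero point falls inside the observed range, so the effect of
# destandardization on severe morbidity changes sign: nonmonotonic.
assert lo < x2_zero < hi
```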

This nonmonotonic effect indicates that destandardization has a negative effect on some operating room suites and a positive
effect on others. The effect depends on the degree of work-
flow uncertainty. To depict where the positive and negative
effects are over the range of uncertainty, we plotted the effect
of destandardization on severe morbidity. We substituted into
Equation (4) values of uncertainty in order to locate the appro-
priate plots.

Figure 3 shows the effect of uncertainty on the relationship between destandardization and effectiveness. Its vertical axis
represents the effect of destandardization on severe morbidity.
The horizontal axis indicates the degree of workflow uncer-
tainty. The plotted line represents the change in severe morbid-
ity, given a change in destandardization over the range of uncer-
tainty. From the plotted line we see that the effect of destan-
dardization on severe morbidity is positive in the uncertainty
range below .38 and negative in its higher range above the value

365/ASQ, September 1981

This content downloaded from 134.153.73.94 on Wed, 18 Sep 2019 22:56:16 UTC
All use subject to https://about.jstor.org/terms
[Figure 3. The effect of technological uncertainty on the relationship between destandardization and effectiveness, measured as severe morbidity. Vertical axis: dY (severe morbidity)/dX1 (destandardization), from -2.50 to 2.00; horizontal axis: uncertainty, from 0 to 1.0. Plotted line: Severe Morbidity = 2.81 (DeStand) - 7.34 (DeStand x Uncertainty); Y = b1X1 + b3X1X2; dY/dX1 = b1 + b3X2.]

of .38. That is, destandardization promotes effectiveness in the
range of uncertainty above .38, but below that point increases in
destandardization decrease effectiveness, since severe mor-
bidity increases where the slope is positive. The plot of the
partial derivative from the original regression equation there-
fore revealed the following symmetrical and nonmonotonic
relationships:
In lower uncertainty subunits, destandardization has a positive effect
on severe morbidity, thus reducing effectiveness. However, the slope
is a decreasingly positive one, so that the disadvantages of destan-
dardization decrease as uncertainty increases.
In greater uncertainty subunits, destandardization has a negative im-
pact on severe morbidity, thus increasing effectiveness. The slope is
an increasingly negative one, so that the advantages of destandardiza-
tion become greater as uncertainty increases.

Galbraith's first hypothesis was not supported by this analysis,
because a nonmonotonic effect is apparent, and the process is
more complicated than his arguments indicate. However, the
alternative hypotheses advanced from problems with con-
tingency theory are supported. The impact of destandardization
on effectiveness is nonmonotonic over the range of uncertainty, as hypothesized (H1a). The interaction between uncertainty and destandardization, expressed as a multiplicative function, is significant. And there was a symmetrical effect in the data, as expected.

Interaction of Uncertainty and Decentralization

The Table showed that the interaction between uncertainty and
decentralization has a significant effect on severe morbidity. To
interpret this interaction, we made the assumption that the

Problems with Contingency Theory

effect of decentralization on effectiveness is modified by the
level of uncertainty present in the organizational unit. The in-
terpretation of this interaction would also be expressed as
Equation (3).

To determine if a nonmonotonic effect was present over the
range of uncertainty, we again substituted coefficients from
the Table into Equation (6) (Appendix B). The value of uncertainty
at which changes in decentralization had no effect on severe
morbidity was .326. Since .326 was within the observed range
of uncertainty in our sample (.191 to .718), the effect of decen-
tralization on severe morbidity was nonmonotonic over the
range of uncertainty. The point of inflection, where the slope
changes signs, was quite near the mean level of uncertainty of
the subunits.
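The sign change can be checked by evaluating the partial derivative dY/dX1 = b1 + b3X2 at the endpoints of the observed uncertainty range (a sketch in Python, using the decentralization coefficients reported in the Table; not the authors' code):

```python
# Slope of severe morbidity with respect to decentralization,
# dY/dX1 = b1 + b3*X2, evaluated across the observed uncertainty range.

b1, b3 = 0.827, -2.53   # decentralization, decentralization*uncertainty

def slope(x2: float) -> float:
    return b1 + b3 * x2

print(round(-b1 / b3, 3))   # 0.327, close to the reported inflection point .326
print(slope(0.191) > 0)     # True: decentralization raises morbidity at low uncertainty
print(slope(0.718) < 0)     # True: decentralization lowers morbidity at high uncertainty
```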

Figure 4, the plot of this interaction, reveals the following. In
lower uncertainty subunits, decentralization had a positive ef-
fect on severe morbidity; that is, it increased severe morbidity,
and effectiveness decreased. However, in greater uncertainty
subunits, decentralization had a negative effect on severe mor-
bidity, thus decreasing severe morbidity and increasing effec-
tiveness. Because of its decreasing negative slope, effective-
ness was even further enhanced as decentralization increased
in the highest uncertainty subunits.

[Figure: the plotted line of dY/dX1 (the effect of decentralization on severe morbidity, vertical axis, -1.0 to .50) against uncertainty (horizontal axis, 0 to 1.0).]

Severe Morbidity = .827(DeCent) - 2.53(DeCent*Uncertainty)
Y = b1X1 + b3X3; dY/dX1 = b1 + b3X2

Figure 4. The effect of technological uncertainty on the relationship between decentralization and effectiveness, measured as severe morbidity.

These results parallel our findings for the uncertainty-
destandardization interaction. Galbraith's second hypothesis
was not supported by these data, because a nonmonotonic
effect was apparent, rendering the interaction more compli-
cated than his arguments would suggest. In contrast, the alter-
native hypotheses advanced from an analysis of problems with
contingency theory were supported by the data. The impact of
decentralization on effectiveness was nonmonotonic over the
range of uncertainty, as hypothesized (H2a), and a symmetrical
effect was apparent, as expected.

Interaction of Uncertainty and Professionalization

The coefficient expressing the uncertainty-professionalization
interaction was also in the expected direction (Table). To inter-
pret the interaction, we made the assumption that the effect of
professionalization on effectiveness is modified by the level of
uncertainty present in the subunit. This interpretation is also
expressed as Equation (3). To determine if a nonmonotonic
effect was present, we once more substituted coefficients
from the Table into Equation (6). The value of uncertainty at
which changes in professionalization had no effect on severe
morbidity was:
X2 = -b1/b3 = -(3.72)/(-10.3) = .36.

Since .36 is within the observed range of uncertainty (.191 to .718), we concluded that the impact of professionalization was nonmonotonic over the range of uncertainty.

[Figure: the plotted line of dY/dX1 (the effect of professionalization on severe morbidity, vertical axis, -4.0 to 2.0) against uncertainty (horizontal axis, 0 to 1.0).]

Severe Morbidity = 3.72(Proflzn) - 10.3(Proflzn*Uncertainty)
Y = b1X1 + b3X3; dY/dX1 = b1 + b3X2

Figure 5. The effect of technological uncertainty on the relationship between professionalization and effectiveness, measured as severe morbidity.


Figure 5 graphically displays the nonmonotonic effect. The
positive slope of professionalization on severe morbidity decreases until it reaches zero at an uncertainty level of .36. It then
turns down and is increasingly negative as uncertainty in-
creases. Thus, in operating room subunits with below the aver-
age level of workflow uncertainty, an increase in professionali-
zation of the workforce resulted in greater rates of severe
morbidity. Although this effect was predicted from the analysis
of problems with contingency theory, in substance it is quite
contrary to conventional theorizing about the benefits of pro-
fessional training. In operating subunits with moderate to high
levels of uncertainty, however, increases in professionalization
resulted in reduced rates of severe morbidity and thus greater
effectiveness. These results were consistent over both dimen-
sions of professionalization: initial training and the extent to
which initial training was supplemented and maintained by cur-
rent professional activities. The findings were duplicated when
the coefficient for the second dimension was used to produce
the graph. The data wholly support hypotheses 3a, 3b, and 3c.
This particular analysis must be interpreted cautiously, how-
ever. Although the coefficients for the uncertainty-
professionalization interaction were in the expected direction,
the results may be unreliable, given the size of the standard
errors associated with the original regression coefficients.

Despite the complicated interaction effects that our analysis
has revealed, the following possibility presents itself. One
might argue that the significant main effects that we have
found are really nonlinear functions rather than linear. If this
were true, then the significant interaction effects that we have
seen are really only artifacts of the main variables' nonlinear
properties. That is, the nonlinear effects are being "forced" into
the interaction terms of each equation, since they essentially
have no other means of expression. As a consequence, one
could make the argument that nonlinear effects of the main
variables, rather than the interaction effects among those vari-
ables, are a better representation of our data. We have tested
for such a possibility using an earlier version of severe morbid-
ity, unstandardized for the number of patients in a hospital. We
have checked for square and square root forms of nonlinearity
in our main variables. Although some nonlinear effects of the
main variables do appear in the data, there is no indication that
the nonlinear model is superior to the interaction model that we
initially developed from contingency arguments. For an elaboration of the search for nonlinearities in our data, the reader is referred to Schoonhoven (1976).
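The logic of that check can be sketched as follows (synthetic data and illustrative model forms only; this is not the original analysis): fit the interaction model and a square/square-root main-effects model to the same outcome and compare fit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 17
uncert = rng.uniform(0.19, 0.72, n)            # synthetic uncertainty scores
destand = rng.uniform(0.0, 1.0, n)             # synthetic destandardization scores
y = 2.81 * destand - 7.34 * destand * uncert   # outcome built from the interaction form

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """OLS R-squared, with an intercept column added."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_interaction = r_squared(np.column_stack([destand, destand * uncert]), y)
r2_nonlinear = r_squared(np.column_stack([destand, destand ** 2, np.sqrt(uncert)]), y)
print(r2_interaction > r2_nonlinear)   # True here, since y was built from the interaction
```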

DISCUSSION

Analysis of data from the operating room suites of 17 hospitals
implied that traditional versions of contingency theory like Galbraith's (1973) underrepresent the complexity of relations between technological uncertainty, structure, and organizational
effectiveness. We found multiplicative forms of interaction
between technology and structure that are symmetrical and
nonmonotonic. Destandardization, decentralization, and pro-
fessionalization had different influences on effectiveness,
which depended on the level of workflow uncertainty. There
are several consequences of these findings. First, if we intend
to continue making contingency arguments, it is now possible
to state our hypotheses with greater clarity and specificity.
Those hypotheses should reflect the underlying assumptions
of nonmonotonic and symmetrical relations among interacting
variables.

Second, the data suggest that the "appropriate for" types of
arguments used in contingency theory be amended from an
isomorphic low-low, medium-medium, and high-high set of
combinations of technological uncertainty and structure, since
the relationships found were not simply linear. For example, in
Figure 4, the graph showed that increasing decentralization
decreases effectiveness when uncertainty is low. This means
that the low uncertainty-low decentralization combination was
not appropriate and that an equally rigid isomorphic
moderate-moderate combination was also not appropriate from
an effectiveness standpoint. Decentralization should not be
increased as workflow uncertainty increases until uncertainty
reaches a value close to the midpoint on the range it takes in an
organization. The mean level of uncertainty in our sample was
.416. The point of inflection of the partial relation between
effectiveness and decentralization was .326, the point at which
the effect of decentralization on effectiveness changed signs
and effectiveness was enhanced. This means that when more
than approximately one-third of an OR's scheduled workflow is
subject to change, the benefits of decentralization are realized,
and decentralization is "appropriate" from an effectiveness
perspective.

Our findings have shown that if the levels of decentralization
and destandardization are increased when uncertainty is low,
effectiveness suffers. These negative effects take into account
the level of professionalization of the workforce. That is, it is not
simply inept or ill-trained workers causing effectiveness to suf-
fer, since we have accounted for characteristics of the work-
force in the equation for effectiveness. A likely explanation of
the decrease in effectiveness is that if even well-trained workers are allowed greater discretion and fewer rules to guide their
behavior when work-flow uncertainty is low, management
cannot be certain of the decisions and actions that will be taken
by the individual workers. Control over worker behavior is re-
duced, when it need not be. Since rules can be devised and
since the hierarchy can absorb the amount of information pro-
cessing required when uncertainty is low, managers should
devise rules, not because it is simply more efficient, but be-
cause greater control over outcome quality is gained. It is more
effective for the organization.

Substantively, the hypotheses and the findings for destan-
dardization and decentralization were not entirely inconsistent
with the usual ways in which we theorize about uncertainty,
effectiveness, and these two dimensions of structure. How-
ever, with some variable combinations, applying the symmetry
assumption produces unconventional and perhaps counterintui-
tive predictions. For example, our data suggest that increasing
the level of professionalization of the workforce is not appropri-
ate along the entire continuum of workflow uncertainty, be-
cause professionalization has an undesirable influence on ef-
fectiveness in the lower ranges of workflow uncertainty. This
suggests a re-thinking of the assumed positive value to an
organization of advanced professional training when work flow

uncertainty is low. Roth (1974) has raised a similar question in a
theoretical article on professionalism.

Why does increasing professionalization when workflow un-
certainty is low cause organizations to be ineffective? Two
possible explanations come to mind. One is a selective percep-
tion argument, and the second is an uncertainty- or variation-
creating argument. Selective perception results in amplification
of the existing uncertainty. Even though workflow uncertainty
is objectively low, as measured by variations in the OR schedule
of cases, perhaps the nurses perceive otherwise. Perhaps
these workers selectively attend to variation because they are
trained to deal with uncertainty and to make judgments. If so,
they may misperceive the overall objective characteristics of
the work setting and make inappropriate choices, with reduced
outcome quality.

A variation-creating argument proposes that nurses may intro-
duce uncertainty and variation into the work setting when it is
lacking. Perhaps professionals prefer variety so strongly that,
when it is lacking in low uncertainty situations, they vary their
approach to the constant set of cases, with inconsistent results
that reduce effectiveness in the long run. It may be that the
routine workflow begets boredom and subsequent errors in
much the same way that errors result when assembly-line
workers seek variety to relieve otherwise repetitive operations.
It has been reported that automotive workers create diversion
on the assembly line, first by not working, and then by working
as fast as they can to complete stockpiled work before it
reaches the next assembly station (Runcie, 1980). In both
cases, the treatment of an otherwise stable set of circum-
stances is varied. The selective perception and the variation-
creating explanations seem equally plausible in the absence of
data to explore them. Both might be pursued in future research.

Our objective has been to identify problems within and to
improve the specificity of existing contingency arguments. Our
data from 17 operating room suites in hospitals have modified
some of the substantive arguments made within earlier con-
tingency arguments. Nevertheless, our results are consistent
with a more enlightened version of the contingency-orienting
strategy in general. The relations that we have found between
technology, structure, and organizational effectiveness support
an approach to organizational design that begins with the
statement that "It all depends...."

REFERENCES

Allison, Paul D.
1977 "Testing for interaction in multiple regression." American Journal of Sociology, 83: 144-153.

Althauser, Robert P.
1971 "Multicollinearity and non-additive regression models." In H. M. Blalock (ed.), Causal Models in the Social Sciences: 453-472. Chicago: Aldine-Atherton.

Blalock, Hubert M., Jr.
1965 "Theory building and the statistical concept of interaction." American Sociological Review, 30: 374-380.
1972 Social Statistics, 2d ed. New York: McGraw-Hill.

Burns, Tom, and G. M. Stalker
1961 The Management of Innovation. London: Tavistock.

Comstock, Donald E.
1975a "The measurement of influence in organizations." Paper presented at Pacific Sociological Association Annual Meetings, Victoria, British Columbia, April 1975.
1975b "Technology and context: A study of hospital patient care units." Unpublished Ph.D. dissertation, Stanford University.

Flood, Ann Barry
1976 "Professionals and organizational performance: A study of medical staff organization and quality of care in short term hospitals." Unpublished Ph.D. dissertation, Stanford University.

Galbraith, Jay
1973 Designing Complex Organizations. Reading, MA: Addison-Wesley.
1977 Organization Design. Reading, MA: Addison-Wesley.

Hannan, Michael T., John H. Freeman, and John W. Meyer
1976 "Specification of models for organizational effectiveness: A comment on Bidwell and Kasarda." American Sociological Review, 41: 136-143.

Harvey, E.
1968 "Technology and the structure of organizations." American Sociological Review, 32: 247-259.

Hellriegel, Don, and John W. Slocum, Jr.
1978 Management: Contingency Approaches, 2d ed. Reading, MA: Addison-Wesley.

Heydebrand, Wolf V.
1973 Hospital Bureaucracy: A Comparative Study of Organizations. New York: Dunellen.

Hickson, David J., D. S. Pugh, and Diana C. Pheysey
1969 "Operations technology and organization structure: An empirical reappraisal." Administrative Science Quarterly, 14: 378-397.

Kast, Fremont E., and James E. Rosenzweig
1974 Organization and Management: A Systems Approach, 2d ed. New York: McGraw-Hill.

Khandwalla, Pradip N.
1974 "Mass output orientation of operations technology and organizational structure." Administrative Science Quarterly, 19: 74-97.

Lawrence, Paul
1975 "Strategy: A new conceptualization." In L. S. Sproull (ed.), Seminars on Organizations at Stanford University, 2: 38-40.

Lawrence, Paul, and Jay Lorsch
1967 "Differentiation and integration in complex organizations." Administrative Science Quarterly, 12: 1-47.
1969 Organization and Environment. Homewood, IL: Richard D. Irwin.

Lazarsfeld, Paul F.
1958 "Evidence and inference in social research." Daedalus, 87: 99-130.

Leifer, Richard, and George P. Huber
1977 "Relations among perceived environmental uncertainty, organization structure, and boundary-spanning behavior." Administrative Science Quarterly, 22: 235-247.

Lynch, Beverly P.
1974 "An empirical assessment of Perrow's technology construct." Administrative Science Quarterly, 19: 338-356.

March, James G., and Herbert Simon
1958 Organizations. New York: Wiley.

Meyer, Marshall W., and Associates
1978 Environments and Organizations. San Francisco: Jossey-Bass.

Mohr, Lawrence B.
1971 "Organizational structure and organizational technology." Administrative Science Quarterly, 16: 444-459.

Namboodiri, N. Krishnan, Lewis R. Carter, and Hubert M. Blalock, Jr.
1975 Applied Multivariate Analysis and Experimental Designs. New York: McGraw-Hill.

Pennings, Johannes
1975 "The relevance of the structural-contingency model for organizational effectiveness." Administrative Science Quarterly, 20: 393-410.

Perrow, Charles
1970 Organizational Analysis: A Sociological View. Belmont, CA: Wadsworth.

Pfeffer, Jeffrey, and Gerald R. Salancik
1978 The External Control of Organizations: A Resource Dependence Perspective. New York: Harper and Row.

Roth, Julius A.
1974 "Professionalism: The sociologist's decoy." Sociology of Work and Occupations, 1: 6-23.

Runcie, John F.
1980 "By days I make the cars." Harvard Business Review, May-June: 106-115.

Rushing, William A.
1968 "Hardness of material as an external constraint on the division of labor in manufacturing industries." Administrative Science Quarterly, 13: 229-245.

Schoonhoven, Claudia Bird
1976 "Organizational effectiveness: A contingency analysis of surgical technology and hospital structure." Unpublished Ph.D. dissertation, Stanford University.

Scott, W. Richard
1977 "Effectiveness of organizational effectiveness studies." In P. S. Goodman, J. M. Pennings, and Associates (eds.), New Perspectives on Organizational Effectiveness. San Francisco: Jossey-Bass.

Scott, W. Richard, Sanford M. Dornbusch, C. J. Evashwick, L. Magnani, and I. Sagatun
1972 "Task conceptions and work arrangements." Research Memorandum No. 97. Stanford, CA: Stanford Center for Research and Development in Teaching.

Scott, W. Richard, William H. Forrest, Jr., and Byron Wm. Brown, Jr.
1976 "Hospital structure and postoperative mortality and morbidity: Preliminary findings from a survey of 17 hospitals." In Stephen M. Shortell and Montague Brown (eds.), Organizational Research in Hospitals: 72-89. Chicago: Blue Cross Association.

Southwood, Kenneth E.
1978 "Substantive theory and statistical interaction: Five models." American Journal of Sociology, 83: 1154-1203.

Stanford Center for Health Care Research
1974 Study of Institutional Differences in Postoperative Mortality: A Report to the National Academy of Sciences, DHEW. Springfield, VA: National Technical Information Service, PB 250 940.

Thompson, James D.
1967 Organizations in Action. New York: McGraw-Hill.

Tosi, Henry L., and Stephen J. Carroll
1976 Management: Contingencies, Structure, and Process. Chicago, IL: St. Clair.

Tushman, Michael L.
1978 "Technical communication in R & D laboratories: The impact of project work characteristics." Academy of Management Journal, 22: 624-645.

Woodward, Joan
1965 Industrial Organization: Theory and Practice. London: Oxford University Press.


APPENDIX A: Operational Measures of Variables

Workflow Uncertainty

The measure of workflow uncertainty is an index developed from a sample of
the operating room schedules compiled by each hospital's OR staff the evening
before surgery and revised the next day to reflect the actual work performed.
Sampled every four days over eight and a half months, the schedules record the
original operations scheduled, the number of surgical cases added during the
day's surgery, and the number of surgeries cancelled from the schedule. From
these schedules two measures were developed, the proportion of additions to
and cancellations from the schedule, which, when combined, express overall
change in the workflow of the operating room. The proportion of operations
cancelled was computed by dividing the number of operations originally
scheduled into the number cancelled. The proportion of operations added was
computed by dividing the number of operations originally scheduled into the
number of operations added to the schedule during the work day. The two
measures are significantly correlated at .39 (p < .05). The actual measure
expressing the amount of overall change in the OR's workflow combines the
two by adding the number of additions to and cancellations from the schedule,
divided by the number of operations originally scheduled.
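A minimal sketch of this index (illustrative Python; the field names and sample counts are hypothetical, not taken from the study's instruments):

```python
# Overall workflow change: (additions + cancellations) / originally scheduled,
# averaged over the sampled daily schedules.

def workflow_uncertainty(schedules):
    """schedules: iterable of (scheduled, added, cancelled) daily counts."""
    ratios = [(added + cancelled) / scheduled
              for scheduled, added, cancelled in schedules]
    return sum(ratios) / len(ratios)

sample = [(20, 3, 2), (25, 4, 1), (18, 2, 2)]    # hypothetical sampled days
print(round(workflow_uncertainty(sample), 3))    # 0.224
```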

Structure of the Operating Room

Standardization of rules and procedures was measured by asking the OR staff
nurses (by questionnaire) how explicit procedures were regarding the condi-
tions under which staff may be requested to work overtime. This item was
taken from a much longer list explored in pilot work and was reverse-scored, so
that a value of "1" indicated low destandardization and a value of "5" indicated
high destandardization. This item was selected because extensive preliminary
field work (using OR nurses and their directors as informants) revealed that the
item related specifically to workflow of the OR and that it governed the conduct
of the workers themselves. Thus, the measure reflects an area of behavior
likely to be governed because of its relationship to variation in the workflow and
is task related to the subunit studied rather than to administrative and policy
items remote from the subunit of interest.

By focusing on participants' perceptions of explicitness, we believe that our
approach provides a more accurate reflection of standardization than the more
commonly used measures based on the organization's written documents. By
relying on official documents, we encounter the risk that published rules are not enforced, are otherwise disregarded, or at best are a poor guide to actual
practice. Consequently, written documents may be of suspect value when
measuring the actual standardization of procedures. By asking for the percep-
tions of staff nurses, we measured the extent to which both documented and
unwritten but collectively understood rules existed.

Decentralization of decision making is the second variable of subunit structure.
We have argued that subunit uncertainty is most likely to interact with decisions
germane to the work in the operating room. Consequently, the measure
developed here relates only to decisions relevant to the operating room's tasks.
To compute decentralization of decision making for the OR, a two-step
procedure was used. First, influence scores for the position of staff nurse and
the position of the director of the OR were calculated; then the centralization of
influence was determined by subtracting the score of the staff nurses from the
score of the director of the OR. The scale was then reversed to reflect
decentralization.

The measure combines two questions asked of both staff nurses and the
director of the OR. They were asked to rate on a five-point scale the relative
influence of several hospital positions with regard to task-relevant decisions: (1)
the decision to determine the appropriate disciplinary action for a staff nurse
who has committed a serious medication error and (2) the decision to change
the rules for scrubbing and gowning for the OR nursing staff. Taken from a
pre-test sample of many different decisions, these two cover task-related
rather than administrative or policy decisions. Only their judgments about the
influence of staff nurses and the OR director were used.

The ratings of influence by staff nurses were averaged and then combined with
those of the OR director, giving equal weight to the two ratings. In general, we
subscribe to the view that more attention needs to be devoted to determining
appropriate weighting schemes for combining data from different organizational positions (Scott et al., 1972: 141), but in this case weighting schemes had little effect on the results because of the high consensus between staff nurses and their head nurses in their judgments of influence (Comstock, 1975a). For
example, the correlation between the staff nurses' and the OR director's
ratings on the serious medical error decision is .97. As a consequence, weights
were not assigned in this situation.

The measure of decentralization for each of the decisions was calculated by
subtracting the influence score of the staff nurses from the influence score of
their director and was reverse-scored. Measured in this manner, the scores
indicated the disparity of influence between the staff nurses and the OR
director. The two task-related decision questions displayed convergent validity
(r = .46, p < .05); therefore, the two questions were combined by taking a mean
to form a single measure of decentralization.
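The two-step computation might be sketched as follows (illustrative Python; the ratings and the reverse-scoring convention are assumptions for illustration, since the text does not spell out the exact reversal formula):

```python
def decentralization(nurse_ratings, director_rating, scale_max=5):
    """Disparity of influence (director minus mean staff-nurse rating),
    reverse-scored so that higher values indicate more decentralization.
    The reversal shown (scale_max - disparity) is one plausible convention."""
    nurse_mean = sum(nurse_ratings) / len(nurse_ratings)
    disparity = director_rating - nurse_mean
    return scale_max - disparity

# one task-related decision; the study averaged scores from two such decisions
print(decentralization([2, 3, 2, 3], 5))   # 2.5
```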

Following Heydebrand (1973), we indicated professionalization of the work-
force by measuring characteristics of the nursing staff of the OR. Heydebrand
argued that there is abundant evidence in the literature that it is the nursing
personnel who constitute "staff" in the traditional industrial sociology terminol-
ogy. He argued further that it would be ". . . misleading and theoretically
indefensible to use proportion of physicians as the crucial measure of profes-
sionalization . . ." (1973: 55). For readers otherwise convinced of the impor-
tance of surgeon qualifications in determining surgical quality of care, see
Flood's (1976) findings that of seven measures of surgeon qualifications, none
reached significance, and most were not in the predicted direction when
regressed on measures of surgical mortality and morbidity.

The two measures of professionalization used in this research were devised to
distinguish between two dimensions of the variable: initial level of training and
the extent to which this training was supplemented and maintained by current
professional activities. The two measures were derived from a larger set of five.
Among the five, two measured initial level of training (B.S. degree and R.N.
ratio), and the remaining three measured the extent of current professional
activity (number of articles in professional journals read per month, staff nurse
membership in professional associations, and number of professional courses
taken during the past year for which certification was received or that
significantly enhanced professional knowledge exclusive of in-service training).

Although the two clusters of measures were expected to be at least modestly
and significantly correlated to indicate convergent validity, there were no
significant correlations within the two clusters. This indicated insufficient
convergent validity to justify combining the measures into indices. However,
one measure from each dimension was significantly correlated. As a conse-
quence, we selected the two that were significantly, but modestly, correlated
(B.S. degree and articles, r=.57), and together they represent both dimensions
of professionalization described above, which we wished to capture. The first
measure was based on the proportion of the OR's nursing staff with a B.S.
degree or higher. These data were obtained during interviews with the directors
of each OR. The extent of current professional activity was measured by the
number of articles in professional journals read per month. Taken from the OR
staff nurse questionnaire, the actual measure was the mean frequency
response.

Resources

The variable organizational resources was included as a control, since a
hospital's resources may affect surgical outcomes. The ability to afford
expensive facilities, equipment, and professional staff are among the more
obvious means by which resources are thought to influence quality of care and,
thus, effectiveness. The measure of resources was expressed as the average
expenditures per patient-day, a standard of comparison used frequently among
hospitals. It combined in a ratio the three variables of total expenses for 1973, the number of acute care beds, and the occupancy rate for 1973, together with the constant 365, since the ratio is computed per patient-day. Using data from the hospital
administrator questionnaire, it was expressed as follows:
Resources = Expenses for 1973 / (No. of acute beds x 1973 occupancy x 365)
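For example (hypothetical figures, not data from the study), the ratio works out as follows:

```python
# Average expenditures per patient-day, as defined above. Values are
# hypothetical; occupancy is expressed as a proportion.
expenses_1973 = 12_000_000
acute_beds = 250
occupancy_1973 = 0.80

resources = expenses_1973 / (acute_beds * occupancy_1973 * 365)
print(round(resources, 2))   # 164.38
```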

Effectiveness

We defined effectiveness as an organization's ability to create acceptable
outcomes and actions (Pfeffer and Salancik, 1978: 11). Consistent with the
definition, we measured the outcomes of surgical care. Although death after
surgery is an enviably clean measure of effectiveness, it suffers the disadvan-
tage of being a relatively rare event in today's modern hospital in the U.S. The
overall death rate for all patients studied was only 2.8 percent in the 17
hospitals. In order to increase the number of cases available for study in which
severely negative consequences followed surgery, the number of deaths
within 40 days of surgery was added to the next most severe outcome, severe

374/ASQ, September 1981

This content downloaded from 134.153.73.94 on Wed, 18 Sep 2019 22:56:16 UTC
All use subject to https://about.jstor.org/terms
Problems with Contingency Theory

complications (morbidity) assessed at the seventh day following surgery.


Although the number of deaths and the number of patients suffering morbidity
were not highly correlated (correlation equals .09), the combination was
thought to improve the reliability of the data sufficiently to justify combining
them into a single category. If the patient was discharged before the seventh
day, the information collected pertained to his or her condition at the time of
discharge.

Severe morbidity is described as an Indirect Standardized Adverse Event Ratio
(ISAER): the ratio of observed adverse outcomes to the expected adverse
outcomes. We describe the indirect standardization method briefly here;
interested readers are referred to Stanford Center for Health Care Research
(1974: 143-150) for greater detail on the statistical treatment of the approach. To
permit valid comparisons of surgical outcomes across hospitals, adjustments
were made for differences in patient mix within each hospital. The approach
was that of indirect standardization, which involves computing the estimated
probability of death for each patient. Estimates were empirically derived by
using logistic equations relating patient condition variables to outcome mea-
sures. The method can be briefly described: first, all study patients of a given
surgical category were analyzed together, without regard to hospital. The
dependence of the outcome on selected prediction variables was estimated by
fitting a logistic function to the data. Prediction variables used were the various
measures of patient condition, including physical status, stage of surgical
disease, age, sex, stress level, and insurance coverage. Conditional on eco-
nomic status, insurance coverage indicates the ability to withstand the financial
impact of major medical treatment and, thus, is an indicator of care-seeking
behavior. Fifteen logistic regressions were computed for severe morbidity, one
for each of the 15 operations. The data showed that preoperative conditions, as
measured by the model, predicted postoperative outcomes quite well. Second,
hospital crude outcome rates were adjusted for differences among patients. An
individual patient's probability of an adverse outcome, independent of hospital
effect, was estimated using the parameters estimated by the logistic regres-
sions. The expected number of adverse outcomes in a given hospital was
simply the sum of its individual patients' probabilities of adverse outcomes. The
statistic of interest is then the Indirect Standardized Adverse Event Ratio
(ISAER), the ratio of observed adverse outcomes to expected adverse out-
comes.
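Assuming each patient's adverse-outcome probability has already been estimated from the logistic regressions, the standardization step itself reduces to a short calculation; the probabilities below are invented for illustration.

```python
# A minimal sketch of indirect standardization. The patient probabilities are
# assumed to come from previously fitted logistic regressions; the values
# here are invented for illustration.
def isaer(observed_adverse, predicted_probs):
    # Expected adverse outcomes = sum of individual patients' probabilities;
    # ISAER = observed / expected.
    expected = sum(predicted_probs)
    return observed_adverse / expected

# 3 observed adverse outcomes among patients whose model-based risks sum to 0.50:
print(round(isaer(3, [0.08, 0.22, 0.15, 0.05]), 2))  # 6.0
```

An ISAER above 1.0 indicates more adverse outcomes than the patient mix would predict; below 1.0, fewer.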

One further adjustment was made. Since the frequency with which study
operations were performed varied considerably among the 17 hospitals, some
estimates were more reliable than others due to variation in sample size. As a
consequence, the standardized ratios were adjusted for their relative reliability
by weighting these estimates back toward the mean, using a Bayes adjustment
of the ISAERs. It was assumed that the prior and posterior distributions are
normally distributed.

We first estimated the amount of variation in hospital outcomes attributable to
hospital effects.

Let the true ISAER for hospital i be

r_i = E(X_i+) / p_i+,

which is estimated by

r̂_i = X_i+ / p_i+,

where X_i+ is the observed number of adverse outcomes in hospital i and p_i+
is the expected number. Now

Var(r̂_i) = Var[(r̂_i - r_i) + r_i].

Here we assumed that the r_i are independent random variables with variance
σ². We assumed further that (r̂_i - r_i) is independent of r_i. Therefore,

Var(r̂_i) = Var(r̂_i - r_i) + Var(r_i)

or

σ² = Var(r̂_i) - Var(r̂_i - r_i).

We estimated Var(r̂_i) with the variance of the observed ISAERs and estimated
Var(r̂_i - r_i) with the mean binomial variation of the ISAERs. Thus, the estimate
of the between-hospital component of variance, σ², is given by

σ̂² = [1/(H-1)] Σ_i (X_i+/p_i+ - r̄)² - (1/H) Σ_i [Σ_j p_ij(1 - p_ij)] / p_i+²,

where H is the number of hospitals, r̄ = (1/H) Σ_i X_i+/p_i+, and p_ij is the
estimated probability of an adverse outcome for patient j in hospital i.

We estimated the mean of the prior distribution with the population mean of
ISAERs, the variance of the prior distribution with the hospital component of
variance estimate, and the conditional variance of the posterior by its binomial
variation. Then, the Bayes-adjusted ISAERs (r̃_i) are given by

r̃_i = (w_i r̂_i + w r̄) / (w_i + w),

where

r̄ = (1/H) Σ_i r̂_i,

w_i = p_i+² / Σ_j p_ij(1 - p_ij),

and

w = 1/σ̂².
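
The Bayes adjustment described above can be sketched as follows, using invented data for three hypothetical hospitals; the function name and figures are illustrative, not from the study.

```python
# A sketch of the Bayes adjustment described above, with invented data.
# x[i] is hospital i's observed number of adverse outcomes; p[i] is the list
# of its patients' estimated adverse-outcome probabilities.
def bayes_adjusted_isaers(x, p):
    H = len(x)
    p_plus = [sum(pi) for pi in p]                    # expected outcomes
    r_hat = [xi / ppi for xi, ppi in zip(x, p_plus)]  # observed ISAERs
    r_bar = sum(r_hat) / H                            # prior mean
    # Binomial variance of each ISAER: sum_j p_ij(1 - p_ij) / p_i+^2
    binom = [sum(q * (1 - q) for q in pi) / ppi ** 2
             for pi, ppi in zip(p, p_plus)]
    # Between-hospital variance: variance of ISAERs minus mean binomial variance
    sigma2 = sum((r - r_bar) ** 2 for r in r_hat) / (H - 1) - sum(binom) / H
    sigma2 = max(sigma2, 1e-6)        # guard against a negative estimate
    w = 1 / sigma2                    # prior precision
    w_i = [1 / b for b in binom]      # per-hospital data precisions
    return [(wi * r + w * r_bar) / (wi + w) for wi, r in zip(w_i, r_hat)]

# Three hypothetical hospitals; each raw ISAER is pulled toward the mean.
adj = bayes_adjusted_isaers([1, 6, 4], [[0.5] * 4, [0.5] * 6, [0.5] * 8])
print([round(a, 2) for a in adj])  # [0.76, 1.76, 1.04]
```

Hospitals with fewer study operations have larger binomial variance, so their estimates are shrunk more strongly toward the overall mean, which is the intended reliability adjustment.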

The zero-order correlations of workflow uncertainty with destandardization,
decentralization, and the two dimensions of professionalization were .29, .12,
.01, and -.02. None were significant at p < .05. The correlations of
destandardization with decentralization and the dimensions of
professionalization were -.12, -.02, and .32; none were significant. The
correlations of decentralization with professionalization were -.06 and -.29;
neither was significant.
With any kind of interaction model that combines elements of independent
variables in the model, one faces the possibility of multicollinearity. Blalock
noted that when " . . . the original variables are themselves highly intercorre-
lated, or belong in blocks, that the cross product terms will be related to these
blocks in peculiar ways" (1972: 464). The low levels of association among
independent variables, none of which were statistically significant, indicated
that subsequent regression analyses would be relatively free of problems of
multicollinearity.
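The screening step described above can be sketched as a simple zero-order correlation check; the unit-level scores below are invented for illustration.

```python
# A sketch of the multicollinearity screen (invented data): compute zero-order
# correlations among the independent variables before forming interaction terms.
def corr(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Hypothetical unit-level scores for two of the independent variables:
uncertainty = [0.19, 0.31, 0.44, 0.52, 0.63, 0.72]
decentralization = [2.1, 3.9, 2.6, 4.3, 2.4, 3.3]
print(round(corr(uncertainty, decentralization), 2))
```

Low, nonsignificant correlations at this stage are what licensed the subsequent regressions with cross-product terms.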

Measures of Technology, Structure, and Effectiveness

                                      Mean     S.D.    Range (Min/Max)
Technology
  Workflow uncertainty within OR      .412     .136     .191- .718
Structure
  Destandardization                  2.349     .719    1.000-4.360
  Decentralization                   2.885     .581    2.000-4.333
Professionalization
  Professional training               .046     .058     .000- .170
  Professional activities            2.548     .373    1.940-3.260
Effectiveness
  Severe morbidity                    .999     .245     .555-1.447
Resources
  Resources 1973                      .113     .025     .077- .154

APPENDIX B: Analytic Technique: Main and Interaction Effects

In the table of regressions, we observed three interaction terms. The
uncertainty-destandardization interaction, the uncertainty-professionalization
interaction, and the uncertainty-decentralization interaction had effects on
severe morbidity in the hypothesized direction. The effect of each of these
interaction terms can be interpreted in two ways, depending upon assumptions
that one is willing to make about the main effects corresponding to each
interaction. Of the two variables contained in each interaction, one must make
explicit which variable is presumed to modify the other independent variable's
effects on a third, dependent variable. For example, we can look at a simple
two-variable regression, which includes one first order interaction:

Y=a+b1X1 +b2X2+b3X1X2+e. (2)


In the above regression, the interaction term, if significant, may be interpreted
as changing the coefficient b1 or b2. If the first were the case and we interpreted
the interaction as altering the effect of X1 on Y, then the following partial of the
regression equation would be analyzed:

Y = b1X1 + b3X1X2
  = X1(b1 + b3X2). (3)


Were this the case, then we would be interested primarily in the behavior of X1
and how its effect is modified by X2.

From a mathematical point of view, we could just as readily interpret the
interaction term as affecting the relationship between X2 and Y. In this second
case, we would analyze the following partial of the regression equation:

Y = b2X2 + b3X1X2
  = X2(b2 + b3X1). (3a)

This would imply an interest in the effect of X2 on Y and how its effect is
modified over the range of X1. The choice between the two partials is
determined by the substantive assumptions that one is willing to make.
Mathematically, the two are equally valid.

When analyzing the interaction terms in this analysis, we have taken the
position that the structure of a unit can be modified more readily than the level
of technological uncertainty that faces a unit. We assumed that the impact of a
dimension of structure on effectiveness will vary over the range of uncertainty.
As a consequence, uncertainty will always appear on the horizontal axis in the
graphs as the variable that alters the impact of the structural variable on severe
morbidity.

The interpretation of each significant interaction was a two-step procedure.
First, we determined if the modified relationship was monotonic or
nonmonotonic. Then we plotted the relationship between the modified
variable and the dependent variable over the range of the modifying variable.
The effect of a modifying variable on the additive relationship will be either
monotonic or nonmonotonic, depending on the relative value of the coefficients
for the interaction and additive terms. If we take the position of being interested
in X1 and how it is modified in interaction with X2, Equation (3) above, where

Y = b1X1 + b3X1X2, (3)

may be rewritten as a partial derivative, where

dY/dX1 = b1 + b3X2. (4)

Equation (4) indicates that the effect of X1 on Y depends upon the relative
values of b1 and b3. The point where

dY/dX1 = b1 + b3X2 = 0, (5)


is also where X2, the modifying variable, is equal to the ratio of the coefficients
of the additive and interaction terms:

X2 = -b1/b3. (6)


This is the point at which X1, the variable that is modified, has no effect on the
dependent variable, Y. That is to say, it is the point of inflection of the partial
relation dY/dX1. If the value for X2 obtained from Equation (6) falls within the
observed range of X2 in our sample, this is the point at which the effect of X1
on Y will change signs. As a consequence, the effect will be nonmonotonic:
negative over a portion of the observed range of the modifying variable, X2,
and positive over the remainder
of its range.

After we calculated Equation (6) to determine if the relationship was
nonmonotonic, we then used Equation (4) to plot the effect of the modified
variable on the dependent variable over the range of the modifying variable,
that is, the relation dY/dX1. Although several patterns are possible, our data
revealed consistently nonmonotonic patterns. That is, the modifying variable,
X2, increased the effect of X1 on Y over a portion of its range and decreased it
over the
remainder.
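The two-step procedure above can be sketched in a few lines; the coefficients and the X2 range below are invented for illustration, not estimates from the study.

```python
# A sketch of the two-step procedure above, with invented coefficients.
# Step 1: the inflection point X2* = -b1/b3 shows whether dY/dX1 changes sign
# within the observed range of the modifying variable X2 (i.e., nonmonotonic).
# Step 2: evaluate dY/dX1 = b1 + b3*X2 over that range.
def interpret_interaction(b1, b3, x2_min, x2_max, steps=5):
    inflection = -b1 / b3
    nonmonotonic = x2_min < inflection < x2_max
    grid = [x2_min + i * (x2_max - x2_min) / (steps - 1) for i in range(steps)]
    slopes = [b1 + b3 * x2 for x2 in grid]  # dY/dX1 at each grid point
    return inflection, nonmonotonic, slopes

# e.g., b1 = -1.2, b3 = 3.0 over an observed X2 range of .19 to .72:
infl, nonmono, slopes = interpret_interaction(-1.2, 3.0, 0.19, 0.72)
print(round(infl, 2), nonmono)  # 0.4 True: the effect of X1 on Y changes sign
```

Because the inflection point falls inside the observed range, the slope dY/dX1 is negative at the low end of X2 and positive at the high end, the nonmonotonic pattern the appendix describes.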
