
Peer-Reviewed Paper, JMM: The Journal of Music and Meaning, vol. 12, 2013/2014

The Force Dynamic Structure of the Phonographic Container: How Sound Engineers Conceptualise the Inside of the Mix

Mads Walther-Hansen, Department of Communication and Psychology, Aalborg University, Denmark

Abstract
The continuous development of new recording technologies and recording
practices has had considerable impact on how popular music recordings are
produced; yet our ability to articulate the impact of these technologies on the
perception of sounds is limited. To describe what has been done to sounds in the
mix often requires sound engineers to draw metaphorical comparisons with
other experiences. Until now few scholars have studied the language of sound
engineers. This article is based on a survey of metaphorical expressions used in
interviews with sound engineers. The survey showed that sounds and sound
effects are often described as forceful objects that act and interact in the mix.
This interaction is characterised through expressions such as: "the sound was pulled back in the mix"; "the compressor was holding down the sound"; and "the vocals were pushed up front". Using cognitive linguistic theory as a guide, this article argues that sound engineers' use of force dynamic metaphors offers a
better understanding of the structure and manifestation of recorded sound and
the impact of record production on the listening experience.


1. Introduction
Recordings of musical performances are clearly aesthetically different from the sounds
of acoustic instruments heard in real-world environments. Recording equipment and
post-production effects, such as reverbs, delays, equalisation and compression, allow
recording engineers to modify recorded sounds in creative ways into auditory
phenomena aesthetically distinct from real-world sounds. Yet, as Jay Hodgson (2010)
notes, the musical effect of recording technologies on the listening experience is often
conspicuously absent from most analytical studies of music.
Musicologists have studied record listening in an impressive number of ways,
obtaining great insight into how listeners attend to and extract meaning from recorded
music. Music listening may, for instance, involve attending to the perceived intentions
of the songwriter, feeling moved by the perceived bodily gestures of musicians or
appreciating the more formal structures of the musical material (e.g., harmony, melody
and rhythm) (Frith 1998). Adding to the findings of such studies I find that further
attention should be given to the activity of sounds within the recorded material itself.
Since the late 1990s musicologists have been increasingly concerned with music
recordings, a field Steven Cottrell (2010) has termed phonomusicology. In recognising
record-making as an art form this field seeks to trace the influence of recording
practices on, for instance, the listening experience. There are several difficulties,
however, with such studies. First, music researchers analysing recorded music have
usually not experienced the performances in the recording studio that were later
spliced together and processed to form the final track. For this reason they do not have
the before-and-after perspective that allows them to judge what actually changed in the
recording process. Second, even researchers who do have knowledge about the
production practices behind a particular recording find that limitations of language often
make it difficult to articulate what happened to the sound during the studio sessions. For
this reason we still know little about how recording practice and audio effects change
our perception of recorded music. The question remains as to which kinds of new layers
of meaning are added in the recording and post-production process and how we should
describe these extra layers. Seeking to answer such questions, this article presents the
results of a study examining how sound engineers represent the sound of recording
technologies in language. The approach seeks to probe the before-and-after perspective of recordings, opening up an alternative view on the variety of ways in which different qualities of sound can change the experience of recorded music.

1.1 Conceptualising Sound


A number of scholars have studied language about music from different perspectives.
Lawrence Zbikowski (2002) presented one of the most comprehensive studies of how
music is understood and conceptualised in his book Conceptualizing Music. Building on
cognitive linguistic theory he argues that the cognitive processes we use to understand
music are not unique capacities for music understanding, but the same capacities
through which we structure all experiences in our everyday life. Zbikowski's book
contributes greatly to the understanding of notation-based music. It is, however, not
concerned with non-notational experiences that may arise more from different qualities
of sound, such as timbral and spatial characteristics. Morten Michelsen (1997) accounts
for these experiences of sound (e.g., timbre and space) in his study of how academics
and music reviewers use metaphors to express their experience of musical sounds.
Michelsen argues that sounds are not necessarily experienced as complex phenomena.
The complexity arises because our common language does not allow us to describe
these phenomena precisely. For this reason metaphors are necessary conditions for all
language about sound. In Michelsen's research and other related studies the language of sound engineers and other music production professionals is only touched upon very briefly, or not at all. One notable exception is the American anthropologist Thomas Porcello's (2004, 2005) studies of dialogue between recording engineers in the recording studio. In his 2004 article, "Speaking of Sound: Language and the Professionalization of Sound-Recording Engineers", Porcello explores the different linguistic resources which such sound engineers make use of in their search for the right sound. Porcello's work offers important suggestions regarding how a focus on speech
about sound could enrich our understanding of sound engineering practice. Whereas
Porcello, however, finds metaphorical descriptions of sound inherently vague, my study
embraces these metaphors as a means to access how sound engineers think and respond
to recorded sounds in the mix process.
Sound engineers are a specific category of specialised listeners. They distinguish
themselves from most other musicians and composers by their primary focus on getting the right sound over other parameters of musical expression. For this reason they are not
just good at deciphering complex sound phenomena. They are also acquainted with the
techniques used to make the sounds. Consequently they may listen more for the
techniques behind the music than to the music itself. We can call this type of listening "recipe listening" (Landy 2007: 97) or "technological listening" (Smalley 1997: 109).
Second, sound engineers are not just specialised listeners. They are also authors of the
mix and have to some extent an idiosyncratic language for conceptualising what they
do. Sound engineers are accustomed to certain ways of talking about sound and thus use
much more elaborate metaphors than most other listeners.

1.2 Scope of the Article


This article explores the use of metaphors in sound engineers' evaluation of their work. I
start by outlining the notion of the phonographic container, which is used to define the
phenomenal frame in which recorded sounds appear. I then proceed to analyse the
results of a survey of sound engineers' language. This survey is based on six textbooks
for sound engineers (Alten 2011; Bartlett & Bartlett 2009; Bregitzer 2009; Gibson
2005; Izhaki 2008; Owsinski 1999), 20 interviews with sound engineers published in
Bobby Owsinski's The Mixing Engineer's Handbook (1999) and 35 interviews with
sound engineers conducted by Paul Tingen and published in Sound on Sound Magazine
(January 2007 to November 2009, one interview in every monthly issue). The
textbooks, as well as the interviews, centre on a variety of different approaches to
recording and mixing. In this article I will particularly focus on how the impact of dynamic range compression is conceptualised. This focus was chosen because I found a
very elaborate use of metaphors in the interviews whenever dynamic range compression
was discussed. It also serves to narrow down an otherwise quite complex field.
The analysis of the interviews is concerned with how sound engineers tend to
describe sounds as entities that act and interact in the phonographic container. These
descriptions point to how sound engineers often use force dynamic metaphors (Talmy
1985) when describing what is going on in their mix. This finding, I claim, will provide
music researchers with new insights into the structure and manifestation of recorded
sounds and offer new ways to understand the impact of record production on the
listening experience.


2. Methodology
My argument rests on cognitive linguistic theory as it has evolved from the work of
George Lakoff and Mark Johnson (1980), who describe how perceptual domains are
structured by projecting patterns of experience from one domain to another. Studying
metaphorical expressions, they sought to explain human meaning and the embodied
origins of imaginative structures. The latter is described further under the heading of image schemas, introduced simultaneously in Johnson's The Body in the Mind (1987) and Lakoff's Women, Fire and Dangerous Things (1987).
Inspired by the Kantian notion of imagination, Johnson (1987) describes image
schemas as gestalt structures that consist of parts that are organised into unified wholes.
Kant suggested that concepts of understanding and intuitions were connected through a
transcendental schema. This schema is what structures our awareness of objects, by
sketching out possible applications of the concept. Likewise image schemas are
characterised as abstract structures of recurring patterns of embodied experience that are
activated through experience. These patterns may then organise more abstract
understanding. We should acknowledge, however, the possible bias towards visual
perception implied by the word "image". Image schemas are here understood in a
broader sense as a function of all sensory experiences. These schemas emerge from our
bodily experiences in everyday life and are thus closely tied to our perceptual capacities
and bodily motor skills. For this reason we can see image schemas as embodied
schemas that form the basis for perception, thought and language. Since language is
based on the same conceptual system as that governing how we both think and act, we
can gain access to the workings of this system by studying how we speak about certain
phenomena.

3. The Phonographic Container


CONTAINMENT (Johnson 1987; Lakoff & Johnson 1999) is a central schema that
structures our conceptualisation of experience in everyday life as well as in music. The
schema (Figure 1) is activated when we experience events where something is located
within another thing. Such events usually have an inside and an outside, as well as a
boundary between them.


Figure 1: CONTAINMENT schema

When talking about recorded sounds, CONTAINMENT is a prevalent and established metaphor. We say that sounds and sound sources are in the recording, although no
substantial entities reside in the medium but only different kinds of audio
representations (e.g., grooves in records or ones and zeroes on CDs) that can reproduce
auditory phenomena. We simply impose a CONTAINMENT schema on the recording.
In the studied interviews with sound engineers the CONTAINMENT schema was often
activated in their description of their mix, e.g., when they were talking about sounds in
the mix, in the track or in the recording. But what does it mean that a sound is in the
track, in the recording or in the mix?
On closer examination of the interviews it became apparent that sounds and
sound effects are often described in terms of how they act in, and in relation to, the
phonographic container (my italics in all):

There were a lot of things playing but it made the track too full. (Renaud
Letang in Tingen 2008, Apr.)
If you use 96k you have all these frequencies above our hearing range that just eat
up headroom. (Jacquire King in Tingen 2008, Dec.)
I needed a longer reverb to fill in spaces. (Jason Goldstein in Tingen 2007, Apr.)
You have these moments in the track where it is open and soaring and where the
big reverbs open all the floodgates. (Chris Lord-Alge in Tingen 2007, May)
[The sound] jumps out of the track too much. (Joe Chiccarelli in Tingen 2007,
Oct.)
Every time the kick hits [the compressor] ducks the bass track 2-3 dB to give
space for the kick. (Fraser T. Smith in Tingen 2009, Nov.)

I really start searching out the frequencies that are clashing or rubbing against
each other. (Jon Gass in Owsinski 1999: 31)
Then I'll do some frequency juggling so that everybody is out of everybody
else's way. (Ed Seay in Owsinski 1999: 164)
It was one of these tracks that could easily have sounded way too crowded.
(Manny Marroquin in Tingen 2007, Dec.)
Instead of occupying a small spot in the middle of the mix, I could fill the
whole spectrum. (David Pensado in Tingen 2007, Jan.)

As we can see from these quotes, sound engineers often conceptualise the inner
workings of the mix by mapping agency onto sound and sound effects, e.g., "jump out", "eat up", "rubbing against each other" and "Every time the kick hits [the compressor] ducks the bass". Also we can see how the mix is conceptualised as a spatial container
with dimensions that have relative and absolute positions. Sounds take up space within
the recording, and sounds can potentially get in the way of each other. Each of the
quotes describes different states of the phonographic container and its content, for
instance, the absolute position of sounds (e.g., in the middle of the mix), the relative
position of sounds (e.g., rubbing against each other) or the internal state (e.g.,
crowded).

4. From Static to Force Dynamic Metaphors


We think of spatial language in terms of our bodily perspective rather than as a
geometrical structure. Spaces are expressive in several ways. Likewise, hearing is not
a static phenomenon. When we say that a sound is in the mix we categorise a sound
phenomenon. But meaning does not arise from this categorisation. It is conveyed by the
sound of the physical signal through the perceptual process of listening (cf. Griffith
2002). Thus, when we use metaphors we risk objectifying sound phenomena, reducing
them to static phenomena and thereby failing to represent their meaning (cf. Freeman
2004). Hence, meaning does not stem from the fact that we can describe sounds as
inside or outside a container, but from our involvement with the musical flow of events
that may incorporate aspects of the sound's "in-ness".


In academic literature recording techniques and post-production effects are often described as passive devices, i.e., devices through which sound is mediated. The virtual
space of a recording is often described as a spatially neutral equilibrium at a given point
in time. But this view does not give us the full picture of what these effects do, and what
sound is for the listener. This survey suggests that sound engineers often articulate the
inner workings of the mix in terms of force dynamic metaphors. In this sense they
appear to think of recording technologies as interactive devices that may cause different
kinds of action in the phonographic container.

4.1 FORCE Gestalts


Force is a prevalent category in our understanding of the world, although we may only
notice it when it acts unexpectedly. Leonard Talmy (1981, 1985) argues that force is an
important aspect of all language structures. These force structures Talmy calls force
dynamics since they refer to how entities interact forcefully with each other. Forces
emerge as an elaborate system with different outcomes: e.g., forces may be resisted,
obeyed, overcome, blocked or absorbed. The dynamic field of forces determines the
outcome. Let us take as an example this expression: John cannot go out of the house.
The outcome of this situation is that John is still in the house. Yet according to Talmy it
is a barrier that causes the outcome (in this situation an unknown barrier), and prevents
John from going out, although he has a tendency to do so (Talmy 1985).
The idea that we ascribe an intrinsic force tendency (action or rest, strong or
weak) to entities in language and thought is central to the present study. As we shall see,
the relation between sounds and container is often characterised by a force dynamic
relation emerging from the force tendencies of the sound and the container. As Mark
Johnson (1987) notes, these dynamic relations have a schematic quality. Johnson
extends Talmy's findings to image-schematic FORCE gestalts, by asserting that there is an overlap between the meanings of verbs as applied in rational argument and as applied to the physical world. He then links the modals must, may and can to the image-schematic FORCE gestalts COMPULSION, REMOVAL OF RESTRAINT and ENABLEMENT respectively (Johnson 1987).
Even though force dynamics was originally applied to describe verbs of motion, it
is easy to see how this notion can describe the production and experience of sound. As I will show (section 5), FORCE schemas make it possible to account for structures of
recorded sounds that are often neglected in other sound analytical approaches. I will
argue for a broader view on sound experience that acknowledges what Talmy (2003)
calls causative situations, i.e., the view that experience consists simultaneously of the
caused and the causing event. In the following, I will discuss a few of the schemas that I
find most pertinent to the present discussion, although many more influence how we
reason about recorded sound.

4.2 Out-Orientation
As mentioned above we may think of sounds as dynamic objects acting within a three-
dimensional phonographic container. Different characteristics of the container allow
sounds to act in different ways, and different characteristics of the sound itself may
provide for certain kinds of actions. Individual sounds are usually thought of as
bounded objects constrained by other sounds in the mix. Sounds that are tucked in too
much can thus be brought out, making the sound more accessible.

Figure 2: OUT schema

Whereas in and out can relate to physical orientation in space, the spatial orientation
may be more abstract in other cases. In the following quotes sounds are described as
moving entities with an out-orientation (Figure 2).

I did ride a couple of notes that didn't come out clearly. (Robert Carranza in
Tingen 2008, May)
When I put [the sound] through Linear Phase Equalizer it suddenly jumped out.
(David Pensado in Tingen 2007, Jan.)


A sampled handclap was made to stand out in the track by application of heavy
low-end boost, shelving cut above 12 kHz and stereo widening. (Joe Zook in
Tingen 2008, Jun.)

In these cases the out-movement describes the sound's orientation from a bounded
position to a more accessible position. If a sound engineer takes a sound out of the mix,
it means that the sound is no longer there. He has simply removed the sound from the
mix. Bringing a sound out or making it stand out, however, means bringing it into
prominence, e.g., into the auditory space available to the listener. Coming out is thus a
metaphor that sound engineers use to describe how sounds are made accessible to the
listener in the recording.
We can even think of positions that are neither fully in nor fully out. It seems that
recording engineers often try to achieve a balance between these two positions. We can
therefore consider availability and unavailability as endpoints on a continuum. The
following quotes highlight this feature:

The only thing I did on the bass was manually ride a couple of notes that didn't
come out clearly. (Robert Carranza in Tingen 2008, May)
The Space Designer sounds like a very high-end reverb that brings the vocals
out a little more. (Greg Kurstin in Tingen 2009, May)
I applied quite a bit of L1 on track 48, to bring the vocals out slightly. (Fraser T.
Smith in Tingen 2009, Nov)

"Clearly", "slightly" and "a little more" designate the level of "out" in each of these sentences. In quote two the expression "brings the vocals out a little more" describes how much the vocal is available to the listener; in this case, a little more than before
the Space Designer effect was used. Saying that a sound source is more or less available
must mean that some elements of the sound source are not available (like pouring
more of the soup into the cup, but not all of it). It seems that sound sources are never
characterised as fully in or fully out. They always reside somewhere in between.
Therefore sounds are characterised as having both available and unavailable parts.


4.3 Open and Closed Sounds


Several meanings are attached to the idea of sounds coming out, and technical
explanations that link the use of recording techniques with this metaphorical understanding are not always clear-cut. For instance, in some cases a sound engineer may
bring a compressor into the signal chain to bring certain sounds out, whereas in other
cases compression helps to keep a sound inside.1 Much depends on the treated sound
sources, how the effect is used and what the auditory context is.
As mentioned earlier, we may think of sounds as bounded entities constrained by
other sounds in the mix. We may, however, also think of the sound itself as a container
with a core quality. Sometimes sound engineers describe sounds as fully enclosed,
hiding their inner details, and sometimes sounds are described as more available (open)
to us. Sound engineers thus seem to connect the idea of open and closed with the way in
which a sounds core quality (its details) are afforded to us. Stanley R. Alten (2011:
463) links the openness of a sound with characteristics such as airy, transparent,
natural, or detailed, whereas openness for Bruce and Jenny Bartlett (2009: 42) is
described in terms of gentleness and letting the instrument breathe. Both of these
producers appear to connect openness with unrestricted sounds. Open sounds are given
space to propagate, and are brought through to the listener in a transparent manner.
Such experiences involve notions of force relations (Talmy 1981, 1985) in which
sounds have a tendency to come out unless constrained by the stronger force of the
container.
One way of generating a closed sound is to cut out high frequencies, whereas
more openness is often achieved by boosting high frequencies to bring out more details.
Filtration is thus an effect closely linked to the experience of open and closed. A sound
that we are accustomed to may be perceived as closed when the high frequencies are cut
out, whereas a sound with lots of high frequency content is described as open. These
experiences of open and closed appear to be grounded in the acoustics of real-life
situations, such as: (1) when we hear a sound emanating from within a closed container;
or (2) when we hear a sound that reaches our ears without any obstruction between the
sound source and the listener. In the first example some part of the high frequencies is absorbed (filtered out) by the container, whereas in the second example we hear the sound unmediated. For the same reason, the experience of open sounds is related to accessibility and closed sounds to exclusion.

1 Quiet sounds are usually brought out when the overall mix is compressed, whereas louder sounds that stick out too much may be compressed in order to keep them in place.
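To make the technical side of this open/closed distinction concrete, the sketch below uses a simple one-pole low-pass filter to "close" a bright test signal by removing its high-frequency content. It is only an illustration of the filtering described above, written in Python; the function name, the 1 kHz cutoff and the 8 kHz partial are my own choices, not a reconstruction of any engineer's actual processing.

```python
import numpy as np

def one_pole_lowpass(x, cutoff_hz, fs=44100):
    """Very simple one-pole low-pass filter: attenuating content above the
    cutoff 'closes' the sound by hiding its high-frequency detail."""
    alpha = 1.0 - np.exp(-2.0 * np.pi * cutoff_hz / fs)  # smoothing coefficient
    y = np.zeros_like(x)
    state = 0.0
    for n in range(len(x)):
        state += alpha * (x[n] - state)
        y[n] = state
    return y

fs = 44100
t = np.arange(fs) / fs
# A "bright" test signal: a low fundamental plus a strong high-frequency partial.
x = np.sin(2 * np.pi * 220 * t) + 0.5 * np.sin(2 * np.pi * 8000 * t)
closed = one_pole_lowpass(x, cutoff_hz=1000)  # darker, more "closed" version
print(np.std(x), np.std(closed))  # lower energy reflects the removed high end
```

Boosting rather than cutting the high end would, by the same logic, move the sound towards the "open" end of the continuum.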

5. Case: Dynamic Range Compression


The aim of the following section is to explore the experiential effect of dynamic range
compression and see how sound engineers make sense of the auditory outcome of this
effect. Timothy Warner rightly notes that in the academic world dynamic compression
is perhaps the least well explored or understood of all recording processes (2009: 134).
There may be many reasons for this inattention to dynamic compression. It is likely,
however, that musicologists avoid the subject because of the lack of terminology to
articulate the experiential effect of compression.
In physical terms, a dynamic range compressor is a processor that turns down signals by a certain ratio when they rise above a certain threshold. But what
happens to the experience of the sound, when a mix, or individual tracks within the mix,
is processed with a compressor? There is no single answer to this question. Most
listeners notice that a track appears louder (increased RMS) after being compressed and
regained. If this experience were the sole effect of compression, however, its effect
would be similar to shaping the volume with dynamic faders. The effect of the
compressor is often a neglected aspect in musicological analyses of recorded sound.
This is somewhat odd when we consider that almost all recordings have been
dynamically compressed to some extent, and quite extensively in many pop music
genres. Yet it may be precisely because of the conventionality of heavy signal
compression in modern recordings that we rarely pay much attention to it anymore.2
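As a rough illustration of this physical definition, the following Python sketch applies a static compression curve (threshold, ratio, makeup gain) to a signal. It is a minimal, sample-by-sample model with no attack or release behaviour, and the function and parameter names are mine, not taken from any of the cited textbooks.

```python
import numpy as np

def compress(signal, threshold_db=-20.0, ratio=4.0, makeup_db=6.0):
    """Static compression curve: level above the threshold is turned down
    by the ratio, then makeup gain is applied (no attack/release smoothing)."""
    eps = 1e-12
    level_db = 20.0 * np.log10(np.abs(signal) + eps)       # instantaneous level in dB
    over_db = np.maximum(level_db - threshold_db, 0.0)     # overshoot above the threshold
    gain_db = -over_db * (1.0 - 1.0 / ratio) + makeup_db   # reduce overshoot, add makeup gain
    return signal * 10.0 ** (gain_db / 20.0)

# A loud burst followed by a quiet tail: the burst is turned down, while the
# quiet tail is lifted by the makeup gain, so the whole signal reads as louder.
x = np.concatenate([np.full(10, 0.9), np.full(10, 0.05)])
y = compress(x)
print(x.max(), y.max())   # the peak is reduced
print(x.min(), y.min())   # the quiet part is raised
```

Running the sketch shows the familiar outcome described above: the loud peak is turned down while the quiet material is lifted by the makeup gain, so the track as a whole appears louder (increased RMS).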

5.1 The Impact of Compression on Auditory Experience


A significant finding in the study of sound engineers' use of metaphors was that they
often articulate the effect of compression in terms of force dynamics. In fact, the term
dynamic compression is in itself force dynamic.

2 The increasing focus on loudness in modern popular music recordings has caused recording engineers to apply still greater levels of compression. This tendency has led critics to talk about a "loudness war" (Milner 2009).

Figure 3: COMPULSION (Johnson 1987: 45)

COMPULSION denotes the force exerted on an object which causes it to move in a given direction. This force is always headed in a certain direction along a path. Sound
engineers use several expressions that relate to the COMPULSION schema (Figure 3),
when describing the effect of sound editing:

When the drummer hits the snare, [the compressor] sucks down and you get a
good crest on it. (Lee DeCarlo in Owsinski 1999: 5)
If one side gets significantly louder the compressor will grab it and pull it down a
little. (Jason Goldstein in Tingen 2007, Apr.)

These are cases of caused motion in which objects are moved by external forces. The
forces are in both cases specified by the compressor setting. We can also see how the
COMPULSION schema in both cases is dependent upon the PATH schema. The force
moves along a vertical path going downward, whether it is sucking down or pulling
down. In both examples the force is exerted on the sound from beneath it.

5.2 REMOVAL OF RESTRAINT Schema


The force may also follow a path that transcends the boundaries of the container.
Consider the expression the compression just helps [the sound] to cut through a little
better (Serge Tsai in Tingen 2007, Jun.). To say that the sound is cutting through
something implies that music is moving from one container to another. We can say that
the sound follows a path with a starting-point in the phonographic container and an end-
point in listening space. In this way the sound penetrates the boundaries separating these
two spaces. This event activates the REMOVAL OF RESTRAINT schema (Figure 4)
that connects experiences of overcoming a boundary or obstacle that hinders an objects
movement from one point to another:


Figure 4: REMOVAL OF RESTRAINT (Johnson 1987: 47)

The REMOVAL OF RESTRAINT schema describes an obstacle that is removed by some entity that follows a path in a certain direction. Tsai's quote does not specify what the removed obstacle might be. Doubtless he is referring to the mix as such, but exactly how the mix constitutes an obstacle to the specific sound remains unresolved. Since the
compressor is the tool needed to fix the problem, we can assume that the obstacle is
mainly related to volume. Cutting through implies making the sound audible by either
making it louder (in the entire bandwidth or only in specific frequencies) or turning
other sounds down.

5.3 EXPANSION and CONTRACTION Schemas


Squeeze is another common way to express the force exerted upon sounds by the
compressor. For instance:

There's also a compressor, which is working pretty hard, squashing the sound as
hard as possible. (Greg Kurstin in Tingen 2009, May)
What I'll do is put the drums in a limiter and just crush the hell out of it. (Lee
DeCarlo in Owsinski 1999: 55)

The forceful nature of squeeze is not exerted from below the sound, but rather from all
directions. Consequently squeeze has a different image-schematic structure from pull
and suck. Squeeze is also connected to the CONTAINMENT schema. We can
understand squeeze as a process of either making the container smaller or making the
contained object bigger. When a contained object is squeezed, it has less room in which
to move. Accordingly, the image-schematic structure of squeeze is related to the size of
the contained object and/or the capacity of the container. CONTRACTION and
EXPANSION schemas (Figure 5) come to mind here.

Figure 5: CONTRACTION OF CONTAINER and EXPANSION OF CONTENT schemas (inspired by Brower 2000: 353)

The EXPANSION schema is also argued for in Candace Brower's article "A Cognitive Theory of Musical Meaning" (2000). Differently from the present study, however,
Brower focuses on harmonic and melodic progression in music. She describes how the
EXPANSION schema is activated when, for instance, a rising melodic line and a
descending bass line occur at the same time. She thus connects CONTRACTION and
EXPANSION with the changing boundaries of the pitch register.
In this article I show how CONTRACTION and EXPANSION are connected to
the interaction between compressor and sounds in the phonographic container: for
instance, by limiting the capacity of the sound container. If the overall mix is
compressed, the boundaries of the sound container come to the fore, since the sound exerts force on the boundaries. The capacity of the sound container is then brought to the fore when the contained sound reaches the maximum volume, or even goes above this level. This also implies that we must see recorded sounds as squeezable objects, because compression can lower the capacity of the sound container below the amount of sound it holds. In this way the experiential effect of compression is represented as a
contraction of the sound container.

5.4 Sounds as Living Organisms


This experience of CONTRACTION and EXPANSION is bodily embedded. Think
about the heart and lungs that constantly oscillate between contraction and expansion.
The metaphorical connections to bodily organs are articulated by sound engineers when, for instance, talking about making the compressor "breathe in time with the song" (Owsinski 1999: 55) or making a sound "pump" in sync with the music. In fact, sound
engineers often conceptualise sounds and sound sources in terms of living organisms.
This is especially so when the conversation revolves around compression: e.g.,
techniques to make the compressor "breathe" (Owsinski 1999: 62); making the [sound] "come alive" (Ed Seay in Owsinski 1999: 231); and over-using compression so that the sounds are "squeezed to death" (George Massenburg in Owsinski 1999: 199).
These expressions all circle around the conceptual metaphor THE MIX IS A LIVING
ORGANISM.
As we have seen, sound sources are not static entities. They act and interact, not
just in the phonographic container, but also through, with and against it. When a
dynamic compressor is applied to the signal chain, it will not just alter the signal
independently of the characteristics of the sound routed through it. A compressor reacts
to the level and the spectrum of sound and often there is a strong sense of the
involvement of interaction, causal connections and energy. When sound engineers
make alterations to a sound they do not think of these alterations as something that
happens in the sound source, but consider that something else interferes and causes the
alterations. Therefore, rather than being a stable frame, the phonographic container is,
so to speak, immersed in the dynamic flow of sounds that balance and unbalance each
other, creating different forms of tension.

5.5 Active Containment


As argued, sound engineers appear to understand sound events as causal sequences that
are structured by bodily force dynamics. This claim has implications for how we
understand sound editing on a more general level. We have seen how the compressor
pulls, pushes or ducks the sound, which causes it to come out more clearly, sit well in
the mix and so forth. These actions, caused by the compressor, do not merely describe a cause-effect relationship. They essentially express the compressor's control over the sound sources.
Physical control is a common way to express the more abstract control exerted by
effect units on sounds:


When you turn the ratio right up and lower the threshold it kind of grabs the
sound in a way that no other compressor does, giving it a really sharp-sounding
front end. (Robert Orton in Tingen 2009, Mar.)

In this example Robert Orton describes how the compressor grabs the sound to
manipulate it in a certain way. Grabbing describes the compressor's control over the
sound. In this sense the event of grabbing constitutes an interesting instance of
containment. A common occurrence of grasping is when we reach out to grab an object
with our hands. This event causes the object to be in our hands. The event includes the
act of enclosing our hands around the object. Our hands then constitute an active
container that forces its constraints upon the object. This event corresponds to how
sound engineers often describe the compressor as an active container. In his study of
literary thinking Mark Turner (1996) explains how such action-stories are often
projected onto other events:

It is common to project action-stories of grasping and controlling physical objects onto other event-stories. Conditions we control and enjoy correspond parabolically to physical objects we grasp, possess, and control … Within this logic of objects and grasping, something reliably within our grasp is subject to our control. When we project an action story of grasping, we project this logic. (Turner 1996: 34)

Accordingly, a compressor is conceptualised as a device that allows sound engineers to control sounds in different ways. This becomes even clearer in the following quote by producer Jason Goldstein:

If one side gets significantly louder, the compressor will grab it and pull it down a
little. (Jason Goldstein in Tingen 2007, Apr.)

The event described in this quote includes the act of enclosing, but we also see a
combination of events that precedes and follows the enclosure. The sequence has a
three-part structure: (1) the sound gets louder; (2) the compressor grabs the sound; (3)
the compressor pulls it down. Looking at sequence 1 -> 2, we notice that the compressor
grabs the sound only when the sound is getting louder.

Such experiences correspond to findings by Robert B. Dewell (2005), who argues that most of our experiences of containment involve both ENTRY (an object going into
the container) and ENCLOSING. This finding has implications for the understanding of
the CONTAINMENT schema as presented by Mark Johnson (1987). In Johnson's view
a container is generally a passive element, and objects actively move in and out of it. In
Dewells (2005) account both the container and the contained object can act as active
elements. Hence, containing is something the compressor actively does by grabbing the
sound, i.e., enclosing the sound, and exposing force upon it. In this context the idea of
CONTAINMENT as ENTRY CLOSING can be seen as broadening the
CONTAINMENT schema, which adds to the understanding of the phonographic
container by accounting for the dynamic processes that restructure and activate its
internal structure.

6. A Functional Geometric Framework


In this article I have argued for a move from viewing containment in geometrical terms,
as objects located within something else, towards a view of containment as a force
dynamic structure. Geometrical containment is about physically locating an object
within a container. This notion, however, fails to acknowledge two aspects of
containment: (1) that objects and containers interact with each other; and (2) that both
container and objects have specific functional features that affect our perceptions of
containment.
Enclosure can take different forms. For an object to be fully enclosed, it is
normally required to be fully surrounded by something else, e.g., canned beans. If we
pour the beans into a bowl, they are no longer topologically enclosed. The bowl
provides in many ways a weaker form of enclosure than the can, since it only partially
encloses the beans. Thus the can and the bowl reflect two different degrees of
containment (Coventry & Garrod 2004). To be characterised as a container, however, the object must function as a container.
For an object to be positioned within a container the container needs to constrain the object in some way: there must be a functional relation between the container and
the content. If the container moves, the object will move with it. This idea is presented
by Kenny C. Coventry and Simon C. Garrod (2004), who argue that the preposition in involves both a geometric relation (enclosure) and an extra-geometric relation (location control). This idea is developed further in the following section.

6.1 Location Control


The constraints of different sorts of containers are associated with varying degrees of
location control. For instance, we think of a ball being in a bowl, even though the bowl
does not fully enclose the ball. Nevertheless, the bowl keeps the ball in the same
position, even when the bowl is moved. For this reason the bowl provides some degree
of location control. Coventry and Garrod, however, found that the degree of location
control diminishes gradually if the bowl is tilted. The perception of location control is
thus related to the specific features of the container and the specific event that takes
place. A container may fully enclose an object, providing a strong degree of location
control, or only enclose it partially and provide a weak enclosure.
The specific features of the reference object (e.g., a ball) also contribute to the
perception and representation of containment. Feist and Gentner (1998) demonstrated
that animate objects (e.g., a fly) were less likely to be represented as in something than
inanimate objects (e.g., a coin). Again, this finding is related to the idea of containment
as location control. Since a bowl does not control a fly in the same way as it controls a
ball, test subjects found it less appropriate to use the preposition in for the location of a
fly than for the location of other inanimate objects. Feist and Gentner found that similar
variations existed for different features of the container, e.g., the difference between
something located in a hand (animate container) or a bowl (inanimate container). Thus
the animacy of the container may also have an influence on the perceived degree of
location control.

6.2 The Relation between the Phonographic Container and its Content
Both geometrical space and extra-geometrical features are represented in the way in
which we talk about containment. To be precise, what the contained object and the
container are determines, to some degree, how we put into words where the object is
(Carlson-Radvansky et al. 1999). Spatial relations are not only represented through
geometric routines but also through how objects act and interact.


Figure 6: Functional Geometric Framework (Coventry & Garrod 2004: 55)

Spatial language is grounded in both geometric routines and extra-geometric information (Figure 6). Each of these elements may have more or less influence on the
prepositions used to describe a scene. In some instances the actual geometry of a scene,
i.e., the position of objects in Euclidean space, may determine the particular preposition,
whereas extra-geometric information may have more influence in other situations.
Coventry and Garrod divide extra-geometric information into two branches. The first
branch, dynamic-kinematic routines, describes the perceived potential or actual
dynamics in a scene, e.g., how objects act and interact and how the action and
interaction evolve over time. In this context, dynamic-kinematic routines involve the
perceived location control of a scene, i.e., the potential action of the contained object
and its interaction with the container. The second branch, object knowledge, involves
knowledge about the typical function of the object in a specific situation.
In the following quotes we can see how sound engineers use different, though
metaphorically related, expressions to describe the enclosure of the container and the
features of the contained object:3

(1) Container features


If you add around 10k it opens everything up. (Marcella Araica in Tingen 2008,
Feb.)
Open up the bandwidth until you get the snare to jump out. (Owsinski 1999: 33)

The compression on all three of them was just to make sure nothing jumped out at you. (Demacio "Demo" Castellon in Tingen 2008, Jul.)

3 One of the problems related to applying the principles of the functional geometric framework to the auditory domain is that the distinction between the features of the reference object and the features of the container is not as clear-cut as in the visual domain. In other words, what counts as features belonging to the sound source (the contained object) and what counts as features belonging to the phonographic container may in many cases be fluid.

(2) Contained object features


If you make the attack harder, something will sound louder. It will cut through the
mix without having to add additional volume. (Jason Goldstein in Tingen, 2007,
Apr.)
I also have to keep the kick and snare really punchy to kind of cut through. (Jerry Finn in Owsinski 1999: 112)

These descriptions not only capture features of the container and the contained object
respectively, they also add to the understanding of the mutual spatial relations between
them. The first quote by Marcella Araica points to the understanding of the potential
transformation of the container: if the frequency spectrum around 10 kHz is boosted, the
container will change from a more closed state to a more open state, providing less
location control for the sound sources in it. Likewise, Jason Goldstein describes how
sound sources should be altered in order to penetrate the container.
Often compression activates a whole series of causally related events. Producer
Tom Elmhirst articulates some of the complexities related to compression in his description of the tune "Rehab" (Back to Black, 2006) by Amy Winehouse:

The Urei [compressor 1] will have been set with a very fast attack and a super-fast
release, doing perhaps 10 dB of compression, while the Fairchild [compressor 2]
will have had a very slow release. I can't quite explain what this does, but in my
head the Urei will catch anything that jumps out, while the Fairchild will pick up
the slack and keep a more constant hold of the vocal. (Tom Elmhirst in Tingen
2007, Aug.)

Although Elmhirst claims that he cannot explain what compressors do, he actually
provides a fairly comprehensive description. At least four expressions of forceful action are detected in this quote: "catch", "jump out", "pick up" and "hold". "Jump out" describes the
sounds as forceful objects that act, moving from the inside to the outside of the container. This is counter-weighted by compressor 1 (the Urei) that catches the sound,
preventing it from jumping out. A second compressor picks up the slack and keeps a
hold on the vocal, confining it to a fixed position. The forces of the vocal sound are
restricted by the compressors, which on the one hand cause the voice to stay in the
container and on the other hand keep it in a fixed position within the container.
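Elmhirst's two-compressor chain can be read as two stages of level control acting on different time scales. The sketch below is a toy Python model of that idea, assuming simple peak-envelope followers and made-up thresholds, ratios and time constants; it is not a model of the actual Urei or Fairchild units, only of the "fast catcher followed by slow leveller" pattern he describes.

```python
import numpy as np

def envelope(x, attack, release, fs=44100):
    """Peak envelope follower with separate attack and release times (in seconds)."""
    a_att = np.exp(-1.0 / (attack * fs))
    a_rel = np.exp(-1.0 / (release * fs))
    env = np.zeros_like(x)
    level = 0.0
    for n, v in enumerate(np.abs(x)):
        coeff = a_att if v > level else a_rel
        level = coeff * level + (1.0 - coeff) * v
        env[n] = level
    return env

def compress(x, threshold, ratio, attack, release, fs=44100):
    """Downward compression: envelope levels above the threshold are reduced by the ratio."""
    env = envelope(x, attack, release, fs)
    gain = np.ones_like(x)
    over = env > threshold
    gain[over] = (threshold + (env[over] - threshold) / ratio) / env[over]
    return x * gain

fs = 44100
t = np.arange(fs // 2) / fs
vocal = 0.2 * np.sin(2 * np.pi * 220 * t)   # steady "vocal" at a moderate level
vocal[5000:5200] *= 4.0                      # a transient that jumps out of the track

# Stage 1 (fast attack and release): catches anything that jumps out.
stage1 = compress(vocal, threshold=0.25, ratio=8.0, attack=0.001, release=0.05, fs=fs)
# Stage 2 (slow release): picks up the slack and keeps a more constant hold on the level.
stage2 = compress(stage1, threshold=0.15, ratio=2.0, attack=0.01, release=0.5, fs=fs)

print(np.abs(vocal).max(), np.abs(stage1).max(), np.abs(stage2).max())
```

In this toy model the fast first stage reacts mainly to the transient that "jumps out", while the slower second stage evens out the overall level, mirroring the division of labour Elmhirst describes.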
In summary, I have presented two elements in the experience and description of
sound sources in the phonographic container based on the linguistic corpus of
interviews and sound engineering textbooks: (1) a purely geometric component defined
in terms of physical localisation; and (2) a functional component that suggests the
interactional and functional relation between the container and the contained object.
Accordingly, the phonographic container does not constrain sounds in a predetermined
way. It can take different forms and provide various degrees of spatial constraint in
different tracks.

7. Discussion
We have seen how embodied image schemas connect experience and conceptualisation
and thereby represent particular experiences of auditory events. It was shown how
schematic structures foreground the kinaesthetic components of the interaction between
the sound and compression, and bring awareness of the tensions that are central to the
experience of recorded sounds. The bodily response to active sounds presented in this
article, however, is of course only one of several ways in which recorded music makes
sense to us. I have pointed to potential, yet undefined, meanings that musical sound may
evoke in listeners. Consider, for instance, the variety of ways in which the perceived
bodily gestures of musicians can enhance or change the emotional response to music
(Frith 1998). These potential meanings point to an indexical layer of musical
experience, grounded in the agency of actual sound sources (actual events) found
outside the music itself. This study, however, has pointed to the agency of sounds-in-
themselves within the sound structure of recorded music (virtual events), events we
make sense of through bodily embedded experiences.
When we talk about sound phenomena in music we tend to objectify sounds,
reducing them to static phenomena. Musical meaning, however, is not a response to
something static but stems from our involvement with the musical flow of changing events. Consider, for instance, how, at the formal level of musical structures, we talk
about the movement of a melody, harmonic progression or the tension of a dominant
seventh chord before it resolves to the tonic (Zbikowski 2002). Such expressions remind
us that force dynamic structures are found on many levels of musical experience, and
constitute one of the essential ways in which sounds make sense to us as music
(Hjortkjær 2011). Recorded sounds, in fact, make sense to us in terms of how they
behave within the phonographic container and succeed each other to be perceived as
musical motion. In this sense sound (the flow of active sound events) and music (in the
sense of formal structure) have mutually related meanings.

8. Conclusions
The metaphorical domain is well established in the study of music, yet there is still
much to be said about the connection between language and the experience of musical
sound. This article has sought to account for how sound engineers conceptualise
recorded sounds. The study revealed that sound engineers often think in force dynamics
when describing the inner workings of an audio mix. Believing with Lakoff and
Johnson that these metaphors are not randomly picked, but form an essential structure
of our musical understanding, I suggest that the identified expressions of force offer
important clues as to the experiential qualities of recording practice and post-production
effects. Sounds act and are acted upon by effects in the phonographic container, e.g., we
may perceive the potential for a sound to move forward if it were not held back by some other effect. Such experiences were accounted for by referring to Leonard Talmy's
conception of force dynamics.
Although we know a lot about the techniques of compression, the experiential
effects of compression have previously been neglected in musicological writings,
possibly because of the lack of an adequate vocabulary. I have suggested that the focus
on FORCE metaphors makes a central contribution to the description of this effect.
CONTAINMENT is the central image schema discussed in this article. Using
Coventry and Garrod's notion of location control I pointed to the idea that sounds
interact with the phonographic container. They engage in what we may call a functional
relation that reflects different degrees of containment. I argued that we should think of
the phonographic container as an active container that interacts with the content. The phonographic container simply functions as a container in different ways, described in terms of the container's ability to constrain the sounds. For instance, sound engineers
described some tracks as being full or as having empty regions, some appeared closed or
more open, and in some tracks sounds came out clearly, whereas they were more tucked
in in others. Further, sounds have different sizes that take up more or less space in the container. Generally, low-frequency or loud sounds are characterised as larger than
high-frequency or low-level sounds (Gibson 2005: 34-35). Sound engineers may also
refer to other characteristics than the content volume. The boundaries of the
phonographic container may have different characteristics and the container may
enclose the sounds in different ways, providing a more closed or open structure, e.g., in
some tracks the sounds may seem fixed and constrained, whereas other tracks have
sounds that are more loosely constrained.
The finding suggests that we should focus more on the active shaping forces of the
phonographic container. Not only are the static characteristics of the sound source and
the position of the sound source felt, but also its potential force, i.e., its tendency to act.
Consequently I suggest that the language of sound engineers yields further insight into
the impact of recording technology on the listening experience and the potential
meaning of recorded music.

References
Alten, S. R. (2011). Audio in Media. Boston, MA: Wadsworth.

Bartlett, B. & Bartlett, J. (2009). Practical Recording Techniques: The Step-by-Step Approach to Professional Audio Recording. Amsterdam and Boston, MA: Focal Press.

Bregitzer, L. (2009). Secrets of Recording: Professional Tips, Tools and Techniques. Amsterdam and Boston, MA: Focal Press.

Brower, C. (2000). "A Cognitive Theory of Musical Meaning". Journal of Music Theory, 44(2), 323-379.

Carlson-Radvansky, L. A., Covey, E. S. & Lattanzi, K. M. (1999). "'What' Effects on 'Where': Functional Influences on Spatial Relations". Psychological Science, 10(6), 516-521.

Cottrell, S. (2010). "The Rise and Rise of Phonomusicology". In Bayley, A., Recorded Music: Performance, Culture and Technology. Cambridge: Cambridge University Press, 15-36.

Coventry, K. R. & Garrod, S. C. (2004). Saying, Seeing, and Acting: The Psychological Semantics of Spatial Prepositions. Hove and New York: Psychology Press.

Dewell, R. B. (2005). "Dynamic Patterns of CONTAINMENT". In Hampe, B., From Perception to Meaning: Image Schemas in Cognitive Linguistics. Berlin and New York: Mouton de Gruyter, 369-393.

Feist, M. I. & Gentner, D. (1998). "On Plates, Bowls and Dishes: Factors in the Use of English IN and ON". Proceedings of the Twentieth Annual Cognitive Science Society, Mahwah, NJ, 345-349.

Freeman, M. H. (2004). "Crossing the Boundaries of Time: Merleau-Ponty's Phenomenology and Cognitive Linguistic Theories". Linguagem, Cultura e Cognição: Estudos de Linguística Cognitiva, 2, 643-665.

Frith, S. (1998). Performing Rites: Evaluating Popular Music. Oxford and New York: Oxford University Press.

Gibson, D. (2005). The Art of Mixing: A Visual Guide to Recording, Engineering, and Production. Boston, MA: Thomson Course Technology.

Griffith, N. (2002). "Music and Language: Metaphor and Causation". In McKevitt, P., Ó Nualláin, S. & Mulvihill, C., Language, Vision, and Music. Philadelphia, PA: John Benjamins, 191-203.

Hjortkjær, J. (2011). A Cognitive Theory of Musical Tension. PhD Thesis. University of Copenhagen, Department of Arts and Cultural Studies.

Hodgson, J. (2010). "A Field Guide to Equalisation and Dynamics Processing on Rock and Electronica Records". Popular Music, 29(2), 283-297.

Izhaki, R. (2008). Mixing Audio: Concepts, Practices and Tools. Boston, MA: Focal Press.

Johnson, M. (1987). The Body in the Mind: The Bodily Basis of Meaning, Imagination, and Reason. Chicago and London: The University of Chicago Press.

Lakoff, G. (1987). Women, Fire and Dangerous Things: What Categories Reveal about the Mind. Chicago and London: The University of Chicago Press.

Lakoff, G. & Johnson, M. (1980). "Conceptual Metaphor in Everyday Language". The Journal of Philosophy, 77(8), 453-486.

Lakoff, G. & Johnson, M. (1999). Philosophy in the Flesh: The Embodied Mind and its Challenge to Western Thought. New York: Basic Books.

Landy, L. (2007). Understanding the Art of Sound Organization. Cambridge, MA and London: MIT Press.

Michelsen, M. (1997). Sprog og lyd i analysen af rockmusik. PhD Thesis. University of Copenhagen, Department of Musicology.

Milner, G. (2009). Perfecting Sound Forever: The Story of Recorded Music. London: Granta.

Owsinski, B. (1999). The Mixing Engineer's Handbook. Vallejo, CA: MixBooks.

Porcello, T. (2004). "Speaking of Sound: Language and the Professionalization of Sound-Recording Engineers". Social Studies of Science, 34(5), 733-758.

Porcello, T. (2005). "Music Mediated as Live in Austin: Sound Technology and Recording Practice". In Greene, P. D. & Porcello, T., Wired for Sound: Engineering and Technologies in Sonic Cultures. Middletown, CT: Wesleyan University Press, 103-117.

Smalley, D. (1997). "Spectromorphology: Explaining Sound-Shapes". Organised Sound, 2(2), 107-126.

Talmy, L. (1981). "Force Dynamics". Conference on Language and Mental Imagery, University of California.

Talmy, L. (1985). "Force Dynamics in Language and Thought". The Parasession on Causatives and Agentivity, Twenty-First Regional Meeting, Chicago.

Talmy, L. (2003). Toward a Cognitive Semantics (Vol. 1): Concept Structuring Systems. Cambridge, MA: MIT Press.

Tingen, P. (2007-2009). "Secrets of the Mix Engineers" (monthly article series, January 2007 to November 2009). Sound on Sound Magazine.

Turner, M. (1996). The Literary Mind. New York and Oxford: Oxford University Press.

Warner, T. (2009). "Approaches to Analysing Recordings of Music". In Scott, D. B., The Ashgate Research Companion to Popular Musicology. Farnham and Burlington: Ashgate, 131-146.

Zbikowski, L. M. (2002). Conceptualizing Music. Oxford: Oxford University Press.
