Explosions in the Mind
Composing Psychedelic Sounds and Visualisations
Jonathan Weinel
Palgrave Studies in Sound
Series Editor
Mark Grimshaw-Aagaard, Musik, Aalborg University,
Aalborg, Denmark
Palgrave Studies in Sound is an interdisciplinary series devoted to the
topic of sound with each volume framing and focusing on sound as it is
conceptualized in a specific context or field. In its broad reach, Studies
in Sound aims to illuminate not only the diversity and complexity of
our understanding and experience of sound but also the myriad ways in
which sound is conceptualized and utilized in diverse domains. The series
is edited by Mark Grimshaw-Aagaard, The Obel Professor of Music at
Aalborg University, and is curated by members of the university’s Music
and Sound Knowledge Group.
Editorial Board
Mark Grimshaw-Aagaard (series editor)
Martin Knakkergaard
Mads Walther-Hansen
Editorial Committee
Michael Bull
Barry Truax
Trevor Cox
Karen Collins
Jonathan Weinel
University of Greenwich
London, UK
© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature
Singapore Pte Ltd. 2021
Acknowledgements
Thanks are due first and foremost to Rajmil Fischman, for his vision
of what the future of electronic music and audio-visual composition
might hold, and his openness and support for my ideas. I would like to
acknowledge Keele University for supporting me with a grant to study
my MRes from 2005–2006, and the AHRC for providing the funding
that allowed me to study my Ph.D. at Keele from 2007–2010. Thanks
are also due to the other friends and colleagues that have helped shape
various aspects of my work over the years, especially Stuart Cunningham,
Mark Grimshaw-Aagaard, and the teams at University of Greenwich
and EVA London (Electronic Visualisation and the Arts). My friends
Sol Nte and Lyall Williams deserve a special mention for their various
recommendations of music, reading, and video games that I might have
missed. Thanks also to those artists, musicians, programmers, and theorists
whose inspirational work I mention in this book. Lastly, thanks to
my family, especially Jen for the green dots.
Praise for Explosions in the Mind
“Weinel’s book draws on his original and brilliant work as both a musi-
cian and visual artist of exceptional linguistic talent, which he uses
to create trailblazing narratives worthy of defining the emerging field
integrating digital sounds and images—his music and visualizations are
wrought from expressive interpretations of psychedelic technicolor states
of the inner mind. The reader’s journey is illuminating, exciting and
scholarly, making this book a must-have for musicians and visual
artists.”
—Tula Giannini, Professor, Pratt Institute, NYC, Museums and Digital
Culture: New Perspectives and Research by Tula Giannini and Jonathan
Bowen, Springer, 2019, Great Flute Makers of France: The Lot and
Godfroy Families, 1650–1900 by Tula Giannini, Tony Bingham,
London, 1993
“To the uninitiated, the term ‘psychedelic art’ may simply recall sixties
tie-dye, or hippy nostalgia, yet Explosions in the Mind shows a way in
which the topic can remain relevant. Through its use as the subject of
‘practice-based research’, Weinel’s output expertly straddles the divide
“Weinel has been a leading light in the annual EVA London conferences
on Electronic Visualisation and the Arts over recent years, especially in
the musical aspects of these events. This book represents his thoughts
regarding altered states of consciousness in relation to music. The
book’s contents are founded on practice-based research, and demonstrate
significant insights concerning music and digital culture.”
—Jonathan Bowen, Emeritus Professor of Computing, London South
Bank University, The Turing Guide (Oxford University Press, 2017),
Bowen, J.P., Keene, S., and Ng, K., editors, Electronic Visualisation in
Arts and Culture. Springer Series on Cultural Computing, Springer,
2013. ISBN 978-1-4471-5406-8
Contents
1 Introduction 1
2 Psychedelic Journeys in Sound: Electroacoustic
Compositions 27
3 Melting in the Mind’s Eye: Real-Time Performances 55
4 Tune in, Turn Up, and Trip Out: Audio-Visual
Compositions 81
5 Sensorial Apparatus: Interactive Projects 107
6 Optical Geometry: VJ Performances 131
7 Future Sound Dream: Virtual Reality Experiences 171
8 Conclusion: Design Frameworks 191
References 203
Index 225
Psychedelic Experiences
Psychedelic experiences, such as those elicited by hallucinogenic mush-
rooms, can be understood as a specific form of ‘altered state of conscious-
ness’ (ASC). The term ASC was coined in the 1960s and describes a
range of perceptual or experiential states such as dreams, hallucinations,
meditations, trances, or hypnotic states (Ludwig 1969). It can be diffi-
cult to define precisely what qualifies as an ASC, since we may presume
that our conscious experience fluctuates constantly throughout the day,
and also varies between individuals. Accepting these limitations, the ASC
term is nonetheless useful as a general description that allows us to
talk about points of significant divergence from a commonly accepted
‘normal-waking consciousness’. The caffeine in your morning cup of
coffee probably induces something we might consider a very mild ASC,
but generally speaking, when using this term we are talking about some-
thing a bit more out of the ordinary, which is unlikely to go as well with
croissants and reading the newspaper.
These extraordinary states can be induced through various means. For
example, synthetic drugs or intoxicating plants found in nature may
induce perceptual changes such as hallucinations. However, hallucina-
tions may also occur without the use of drugs, as in cases of sensory
deprivation, which can be elicited with sensory isolation tanks that
suspend the body in complete darkness. While reducing the senses in this
way seems to elicit hallucinations, overloading them may also provide an
alternative route for inducing ASCs. We see this in the various indige-
nous trance ceremonies found across the globe, where music, dance, and
the spraying of liquids induce states of sensory overload in which partici-
pants believe they are possessed by spirits (Rouget 1985). Hallucinations
can even occur due to extreme forms of hunger or fasting, such as may
be experienced by explorers low on supplies in perilous environments in
the extremities of the desert or Antarctic. While these are roads seldom
travelled by most people, perhaps more familiar to the reader will be
experiences of dreaming, which can also be viewed as a form of ASC anal-
ogous to a hallucination that occurs during sleep (Hobson 2003). Lucid
dreams, in which one is aware that they are dreaming, can have partic-
ularly intense, hallucinatory qualities; while on the threshold of sleep it
there are clear parallels to be found in this genre via the experimental use
of tape, echo, and reverb.
The approaches of psychedelic rock and dub were among those that
were influential on the electronic dance music culture of the 1980s and
1990s (Collin 1998; Reynolds 2008), and a myriad of other associated
genres such as drum & bass, trip-hop, ambient techno, and psy-trance.
In various ways, these electronic dance music genres wrap forms of
psychedelic sound design and sampling around energetic rhythms and
beats, providing rave music that is tinged with acidic and hallucinogenic
qualities. Electronic dance music producers do not always take drugs,
and neither do the audiences of this music, but there is a proximity to
drug use in these genres that means psychedelic themes are often close
at hand. As St. John (2009) discusses, electronic dance music may well
be used in combination with drugs, where these sounds are likely to
be complementary, but above all, it is the music that appeals to audi-
ences, and these sounds may even have the potential to elicit collective
trance-like experiences of dance.
Of special significance for this book are also those works of
electroacoustic music,8 which connect with ideas of dreams, hallucinations, or
unreality. Electroacoustic music is not usually seen as part of psychedelic
culture; however, there are examples that relate to various concepts of
ASCs, by using electronic manipulations of sound to elicit dreamlike
aural experiences. A notable work that achieves this with striking success
is Michael McNabb’s Dreamsong (1978), which transitions between
recorded sounds that suggest a real-world location, and synthesiser
sounds that indicate a dream world. Barry Truax has also created several
significant works that traverse similar perceptual boundaries, such as
Pendlerdrøm (commuter dream) (1997), which describes a travel expe-
rience in which a commuter lapses into a dream. In this case, the work
was realised through various field recordings and computer-manipulated
sounds, which allow Truax to move the listener between representa-
tions of normal-waking consciousness and dreaming, and thereby tran-
sitioning between ‘external’ and ‘internal’ points of Hobson’s (2003)
‘input axis’. Along similar lines, Truax’s piece The Shaman Ascending
(2004–2005) takes the concept of an Inuit shamanic ritual, and uses
this to inform the organisation of sonic materials within the piece, in
which droning vocal sounds rapidly circle around the listener within
the spatial field. Several other composers have also explored approaches
such as these. For example, Gary Kendall’s Ikaro (2009–2010) is one of
several compositions based on Peruvian shamanism, which incorporate
soundscape materials to construct sonic experiences that are analogous
to shamanic journeys. Mining similar territories, Adrian Moore's
Dreamarena (1996) and Åke Parmerud's Dreaming in Darkness (2005) both
use concepts of dreaming as points of creative departure. There is then a
significant strand of electroacoustic work that explores notions of ‘reality’
and ‘unreality’ as a basis for musical composition. This area is impor-
tant to highlight, because the compositional approaches that I initially
explore in Explosions in the Mind emerge from the field of electroacoustic
composition, and so these works have special contextual relevance.
Psychedelic Visualisations
Of special importance for Explosions in the Mind are also those existing
psychedelic visualisations, which can be found in various contexts
ranging from experimental films and VJ performances to video games
and VR experiences. In this section, I will outline some of the main
examples from these areas that are particularly relevant for Explosions
in the Mind, while acknowledging this is by no means an exhaustive
account of audio-visual practices, which reflect an increasingly broad
spectrum.
‘Visual music’ is an area of visual art and experimental film in which
works are designed based on the form and structure of music (Brougher
and Mattis 2005). Early examples of visual music include ‘colour organs’,
which display lights in correspondence with sound (Moritz 1997), and
the paintings of Wassily Kandinsky, which interpret music through
abstract symbols and shapes. However, the term is now more strongly
associated with the films of artists such as Len Lye, Norman McLaren,
Oskar Fischinger, Harry Smith, John Whitney, James Whitney, and
Jordan Belson. In the mid-twentieth century, these artists created striking
short films in which shapes, patterns, and textures seem to move and
dance around the screen in a way that reflects the rhythms, timbres,
The Chapters
Through the course of this book, I will discuss the compositional
methodologies used to realise various creative projects that represent
ASCs and provide psychedelic sounds and visualisations. This work
has been undertaken primarily as ‘practice-led research’ in an academic
context. The works discussed in earlier chapters of the book were
completed as part of my Ph.D. in music at Keele University, while many
of the other projects covered later on were undertaken as postdoctoral
research elsewhere. For readers unfamiliar with this term, ‘practice-led
research’ is based on the premise that the production of the creative works
themselves constitutes a contribution to knowledge, leading to innova-
tions that could not otherwise be obtained through alternative means
(Smith and Dean 2009). Practice feeds into the generation of theory,
which in turn, informs practice.14 This methodology is often appropriate
for academic research in areas such as sound design and music composi-
tion, because it allows researchers to gain new insights by developing new
tools or compositional strategies that expand the repertoire. Practice-led
research is also sometimes used in industry, and may be combined with
interdisciplinary approaches, for example, by using empirical methods to
test and evaluate outcomes (Weinel and Cunningham 2021). Practice-led
research is sometimes described as ‘research through design’, emphasising
the potential for innovation through the process of designing and making
new things in relation to specific objectives. For our purposes here, those
‘new things’ are electronic music and audio-visual compositions, which
explore possible strategies for representing ASCs and psychedelic visual-
isations of sound. These projects approach this area from various angles,
and the continuity between works represents a journey through this
subject across different media technologies. Chapters group the compo-
sitions thematically, and with a few exceptions made to accommodate
the logical grouping of works, the discussion is also chronological.
Chapter 2 begins by discussing fixed-media compositions of electroa-
coustic music composed between 2007 and 2011, which seek to repre-
sent ASCs. The initial works discussed in this chapter were composed
by taking typical features of ASCs, such as visual patterns of hallucina-
tion, or distortions to time perception, and translating them into sound.
Extending this idea, later works use the typical form of hallucinations
to inform the structural organisation of materials, so that the composi-
tion as a whole becomes analogous to what one might see or hear in a
psychedelic hallucination. The elaboration of this approach is explored
through Nausea (2011), a long-form multichannel composition, which
exhibits several distinct musical movements.
Chapter 3 takes the discussion into the realm of real-time perfor-
mances of electronic music. Several of my electroacoustic compositions
were realised with a specially designed software tool: the Atomizer Live
Patch, which facilitates the creation of sonic materials based on halluci-
nations, and can also be used for live performances. In this chapter, both
the design of this tool and its use for a live performance in New York City
are discussed. Following this, I also examine another piece of software:
Bass Drum, Saxophone and Laptop, which provides a real-time perfor-
mance system for live instrumentation and electronics, in which DSP is
automated in order to suggest the shifting perceptual changes that one
may experience during hallucinations.
Chapter 4 moves into the area of audio-visual composition. First,
I discuss Tiny Jungle (2010), a fixed-media piece, in which various
materials were designed based on the concept of visual patterns of hallu-
cination. This piece was also created with a bespoke software tool, the
Atomizer Visual. Following this, I discuss a trio of fixed-media visual
music compositions: Mezcal Animations (2013), Cenote Zaci (2014), and
∗ ∗ ∗
Notes
1. Possession of ‘magic mushrooms’ and many of the other drugs mentioned
in this chapter is illegal in most countries. Many of the ASCs discussed
in this chapter are also potentially dangerous, and are not recommended
to the reader. In contrast, psychedelic visualisations of sound are generally
safe, although some may use stroboscopic visual elements that should be
avoided by anyone with photosensitive epilepsy.
2. Related topics are also explored in my book Inner Sound: Altered States of
Consciousness in Electronic Music and Audio-Visual Media (Weinel 2018a).
While the focus of Explosions in the Mind is on my own practice-led
research in electronic music and audio-visual composition, Inner Sound
provides a wide-ranging analysis of existing works related to ASCs. For an
expanded discussion of the wider area, readers may also wish to refer to
Inner Sound, which is complementary, and could be read either before or
after Explosions in the Mind.
3. ‘Psychonaut’ is a colloquial term for a person who takes a particular interest
in exploring the nether reaches of the human psyche by means of
psychedelic drugs and other ASCs.
4. Accounts of ASC experiences are available in various books, scientific
studies, and websites. For example, Hayes (2000) provides a collection of
experience reports; Strassman (2001) documents participant studies with
DMT; while the website erowid.org has a vast database of self-reports
covering the effects of almost every known intoxicating plant or substance.
5. Rouget’s (1985) use of the term ‘ecstasy’ describes ASCs that occur
in shamanic rituals characterised by quiet stillness, and should not be
confused with those states produced by the euphoric stimulant MDMA,
commonly known as ‘ecstasy’.
6. For a further discussion of auditory hallucinations, see also Weinel et al.
(2014).
Put on the headphones and lower the needle on to the record, which
crackles faintly in anticipation as it makes its way into the side-A groove.
It begins with a flickering high-frequency tone. Waves of noise gradu-
ally fade into the mix with sweeping filters, above a layer of throbbing
percussive pulses, which gradually rise in amplitude. Now a brief phase of
sonic reconfiguration, with crunchy 8-bit noise, echoing electronics, and
a deep pounding bass that cuts through the mix, as fragmented hissing
sounds splash over the top like purifying electronic waves breaking onto
a shoreline of e-waste. The rhythmic pulsing shifts into a new gear, as the
intensity ramps up and icy sounds shatter over the top. If you close your
eyes, you can almost see them exploding in your mind’s eye, going off
like electric-blue fireworks rendered in sharp fragments of crisp digital
audio. Now an oscillator cycles up and down in frequency, as layer upon
layer of tight rhythmic pulses are pressed on top of each other with the
cold precision of technological automation. Sonic lasers fan out across
the mix, as you are drawn deeper into a dense black sonic mass. Just as
this audio system seems on the verge of collapse, the motors power down
and the acoustic energy dissipates into the surrounding atmosphere.
Sometimes episodes of listening can have a particular potency that
stays with you. I can distinctly remember listening to ‘Il Etait…
“Magnetique” / Une Possibilite’, by La Peste (Laurent Mialon) (2005) on
headphones one afternoon in my university halls of residence. With vivid
detail, I can recall the room I was in, the feeling of awe that came from
the composition, and having almost visual or synaesthetic sensations that
emerged from the sonic experience. Mialon described his music in terms
of ‘sonic atoms’, where each individual unit of sound had capabilities
for triggering neural impulses in the mind of the listener, giving rise to
micro-responses at any given point in time (Weinel 2007). Individual
pulses, encoded as analogue audio signals on the groove of the record,
trigger tiny fluctuations in the magnetic field of the needle. Translated
into a varying electrical voltage, these pulses flow through the amplifica-
tion system, driving moving coil headphone transducers that turn the
analogue signal into acoustic sound waves. These sound waves prop-
agate the electronic pulsing into the ears, activating vibrations on the
eardrums, setting the hammer, anvil and stirrup in motion, causing the
basilar membrane to quiver inside the cochlea. Different areas of this
membrane trigger nerve impulses that stimulate the auditory cortex of
the brain. From here, the brain makes sense of the stimuli, as differ-
ences in the incoming signals in the left and right ears give rise to spatial
impressions of sound. As the brain interprets ambiguous shard-like sonic
materials, it draws upon past recollections of familiar sounds and associ-
ated events, latching on to memories of shattering glass or ceramics. This
leads to visual associations and emotions, as one imagines what the sonic
event might look or feel like. In an instant, the dark plastic of the record
transmits vivid impressions of sound, emotions, and associative images
that can almost be seen, lending the music a synaesthetic, psychedelic
potency.
Electronic music has unique capabilities for realising almost any sound
imaginable. Composers working with synthesis, sampling, and digital
When we hear a sound, this may trigger episodic memories if it causes
us to think of a specific past event. For example, if we heard a recording
of shattering glass, this might allow us to remember a specific time when
we accidentally dropped a glass. More usually, however, hearing
such a sound would prompt us to access our semantic memory, which
allows us to comprehend its associations in a more general sense. Hence,
hearing shattered glass might not make us think of any specific episode,
but rather we would grasp the significance of the sound in a more
general, semantic way from past patterns of experience.2 These recollec-
tions are often ‘multimodal’, in that they may include not only sound,
but also other modes of sensory information such as visual impressions,
or emotions. By manipulating the spectromorphological characteristics
of sound, composers can play with our experience of source bonding,
and this in turn opens up many possibilities for electroacoustic music
to trigger multimodal associations and visual images in ‘the mind’s eye’
(Taruffi and Küssner 2019; Trickett 2020). These associations might be
used to conjure impressions of real-world environments; for instance, one
could compose a soundscape3 with the sounds of buzzing insects, tall
rustling grass and distant birdcalls, and this might paint a vivid mental
image of a meadow in the height of summer. However, the illusory prop-
erties of sound may also allow the representation of imaginary or unreal
places, as in compositions such as Barry Truax’s Chalice Well (2009),
which takes the listener on a sonic journey from a well in Glastonbury
into the spaces of a mythical underworld.4
Our journey in Explosions in the Mind begins within the field of
electroacoustic music, understood from this perspective as an illusory
medium that is capable of conjuring associative multimodal journeys that
represent real or unreal places and spaces. In 2007, I began composing
works of electroacoustic music that explored ways in which to represent
altered states of consciousness (ASCs) through sound. In this chapter, I
will discuss the design of these works and the compositional method-
ologies that were used to realise them. At first, the approach that I used
was one of ‘adaptation’, in which the compositional form was modified
by designing sonic materials based on typical features of ASCs. This was
initially explored through three compositions: Night Breed (2008), Surfer
Stem (2008), and Night Dream (2008). Later, the development of this led
An Adaptive Approach
The first compositions that I created utilise an ‘adaptive’ approach,
whereby the form is modified to incorporate features that are related to
ASCs. As with many of the works discussed in this book, this approach
emerged through the process of composition as a ‘bottom-up’ process.
That is, the approach was not entirely pre-conceived at the beginning,
but arose through iterative processes of composition and reflection.5
These works can be understood as ‘adaptive’, because they utilise
existing approaches for composing electroacoustic music and electronic
dance music, but modify the form of the composition to incorporate
materials that relate to aspects of ASC experiences. This process of
‘adaptation’ can be explained by considering how one might render a
watercolour painting that represents a hallway as one might see it during
an episode of hallucination. To paint a hallway as it may appear during a
state of normal-waking consciousness, we could use watercolour paints to
render the scene, indicating the contours of the architecture, the colours
and textures of the walls, floor, and ceiling. We could attempt to render
God’s Made Love’ (1968) also uses tape recordings with variable speeds,
perhaps suggesting the distortions to time perception that may occur
during an ASC (Ludwig 1969, pp. 13–14). Across these various exam-
ples, we find that familiar musical forms are modified towards various
concepts of psychedelia. Furthermore, if we look more broadly at other
‘psychedelic’ genres of music—psychedelic folk, psy-trance, or stoner
rock, for example—we would also often find similar adaptations, where
the core approach owes a debt to established genre forms, but aspects of
production and sound design modify these and orientate them towards
tangible notions of psychedelia in cultural circulation.
The electroacoustic compositions that I discuss in this section use a
similar process of adaptation, but do so within the domain of electroa-
coustic music. These works are composed using the familiar toolkit of
synthesis, sampling, and DSP techniques, including the use of processes
such as granular synthesis, which are often used in this field of compo-
sition. However, the form of the work is adapted by designing various
sonic materials based on features of ASCs. This process can also be
understood in terms of Emmerson’s (1986) ‘mimetic discourse’, which
describes the signifying potential of sound that results from referential or
extrinsic qualities. By taking features of ASCs, such as concepts of visual
or auditory hallucinations, these can be used to inform the design of
mimetic sonic materials, which are then used to adapt the compositional
form. This is the main approach that is used in the three fixed-media
electroacoustic compositions that I discuss below.
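The granular synthesis process mentioned above can be illustrated in outline. The following is a minimal sketch in Python (not the actual tools used for these compositions, which were built in environments such as Max/MSP); the function name and parameters are hypothetical. It shows the basic operation: short, windowed grains are taken from arbitrary positions in a source buffer and scattered across an output buffer, producing the kind of granular texture this field of composition often employs.

```python
import numpy as np

def granulate(source, sr=44100, grain_ms=40, density=200, dur=1.0, seed=0):
    """Scatter short windowed grains from a source buffer across an output buffer."""
    rng = np.random.default_rng(seed)
    grain_len = int(sr * grain_ms / 1000)
    window = np.hanning(grain_len)            # smooth each grain's edges
    out = np.zeros(int(sr * dur))
    n_grains = int(density * dur)
    for _ in range(n_grains):
        # random read position in the source, random write position in the output
        src_pos = rng.integers(0, len(source) - grain_len)
        dst_pos = rng.integers(0, len(out) - grain_len)
        out[dst_pos:dst_pos + grain_len] += source[src_pos:src_pos + grain_len] * window
    peak = np.max(np.abs(out))
    return out / peak if peak > 0 else out
```

Varying the grain length, density, or the distribution of read positions yields very different textures, which is one reason the technique lends itself to the mimetic sound design described here.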
Night Breed
Breed, the piece uses various filter, EQ, reverb, convolution, and
amplitude envelopes to create an effect where sounds move in and out of
the listener’s awareness, or morph into different versions of themselves.
These shifting transitions can be heard from 1:50 to 2:10. In the section
from 4:50 to 5:50, these morphing transformations were designed using
organic contours, giving an aural impression of gradual submersion,
almost as if the listener is moving underwater.
In summary, Night Breed adapts typical approaches of electroacoustic
music and electronic dance music, by using various techniques that are
based on features of psychedelic ASCs. While the main form of the piece
is derived from electroacoustic music and electronic dance music, the
piece also incorporates various techniques that interpret features of hallu-
cinations, focusing particularly on organic approaches to sound design,
which are related to a cellular concept of hallucinations.
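The kind of morphing transition described above, in which envelopes and filters gradually transform one sound into another, can be sketched in simplified form. The Python below is an illustrative outline only, not the DSP chain actually used in Night Breed (which also involves EQ, convolution, and reverb); the function and its contour are hypothetical. A smooth 'organic' contour drives both a crossfade between two sounds and the coefficient of a one-pole low-pass filter, so the second sound emerges gradually from a muffled, submerged state.

```python
import numpy as np

def morph(a, b):
    """Crossfade between two equal-length signals with a smooth contour,
    while a one-pole low-pass filter opens up as the second sound takes over."""
    n = min(len(a), len(b))
    t = np.linspace(0, 1, n)
    contour = 0.5 - 0.5 * np.cos(np.pi * t)   # smooth S-curve from 0 to 1
    mixed = a[:n] * (1 - contour) + b[:n] * contour
    out = np.zeros(n)
    state = 0.0
    for i in range(n):
        # filter coefficient eases from heavy smoothing to nearly open
        coeff = 0.05 + 0.9 * contour[i]
        state += coeff * (mixed[i] - state)   # one-pole low-pass
        out[i] = state
    return out
```

In practice such contours would be drawn or automated per parameter; the point is only that organic, curved trajectories (rather than linear ramps) give the transitions their gradual, underwater quality.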
Surfer Stem
Night Dream
In summary, Night Breed, Surfer Stem, and Night Dream form a trio
of works based upon Leary’s ‘seven levels of energy consciousness’.
These pieces develop a variety of techniques that are used to adapt
typical forms of electroacoustic composition, while also incorporating
ideas from electronic dance music genres such as dubstep and flashcore.
These forms were adapted by designing mimetic sonic materials, which
are related to features of ASC experiences. Leary’s concepts inform the
general approach that is taken for each piece: Night Breed interprets
‘cellular consciousness’ through organic sounds; Surfer Stem translates
‘atomic consciousness’ through a digital aesthetic; and Night Dream
renders ‘sensory consciousness’ through low-frequency drones. Within
these approaches, specific types of sonic materials are designed, which are
considered to correspond with features of hallucinations. For instance,
micro-rhythmic sounds are designed based on visual patterns of hallu-
cination; while drone sounds are used to represent distortions to time
perception. Various echo and reverb effects are also used to provide
Entoptic Phenomena
Swamp Process
Nausea
[Figure: diagram contrasting ‘light’ and ‘dark’ sounds; original image not recoverable]
∗ ∗ ∗
Notes
1. The call for music made from noise-based sounds was made in the Futurist
manifesto The Art of Noises (Russolo 1913), which is widely regarded
as an important precursor to electroacoustic music. In this manifesto,
Luigi Russolo famously proposed that “we must break at all cost from
this restrictive circle of pure sounds and conquer the infinite variety of
noise-sounds”.
Fig. 2.3 Painting representing visual patterns of hallucination, as used for the
label artwork of the Entoptic Phenomena in Audio vinyl EP
1965–1968 (1972). For a further discussion see also Hicks (2000) and
Weinel (2018).
7. Hybrid combinations of electroacoustic music with various popular music
and electronic dance music forms were also explored by several other
composers working in the Keele University music studios at this time,
for example see also Shave (2008, 2013) and Ratcliffe (2012).
8. Cellular and organic forms are explored elsewhere in various forms of
computer art. For example, see the morphogenetic 3D sculptures of Andy
Lomas (2020); or the evolutionary computer graphics art of Stephen Todd
and William Latham (1992).
9. For an expanded discussion of electronic dance music genres such as jungle
and techno, see Reynolds (2008). ‘Dubstep’ here refers to the genre of elec-
tronic dance music popularised by South London artists such as Skream,
Loefah, Burial, and Digital Mystikz in the 2000s; see Walmsley (2009).
10. In contrast with the ‘organic sounds’ of Night Breed, with Surfer Stem
I wanted to achieve a more futuristic, digital sounding composition to
express Leary’s atomic electronic level of energy consciousness. ‘Digital
sounds’ can be achieved through various means such as emphasising the
use of linear or stepped envelopes (as is possible with synthetic sound
sources), and quantisation.
11. ‘Speedcore techno’ is a form of techno music that uses fast tempos, typi-
cally above 200 bpm. ‘Flashcore’ is the term used by Laurent Mialon to
describe work released on his label Hangars Liquides, particularly his own
music as La Peste, which uses fast and irregular tempos, and dense waves
of percussive electronic sounds. Flashcore extends the approaches of speed-
core techno, while also taking influences from electroacoustic music. The
term has since been adopted by other underground techno artists who
have taken inspiration from these approaches; for instance, a search of
the term on Toolbox Records (http://www.toolboxrecords.com/) will yield
many results. For a further discussion of flashcore and Mialon’s music, see
also Weinel (2007) and Migliorati (2016).
12. Max/MSP is a visual programming language for sound and music.
13. Surf rock is a specific genre of (often instrumental) rock n’ roll music that
was popular in the 1960s; for a further discussion see Crowley (2011).
14. This use of time-stretching can be understood in terms of Smalley’s (1986)
discussion of continuant phases of sound, which can achieve dissociation
from temporal notions of onset and termination.
15. As discussed in Chapter 1 with reference to Veal (2007, pp. 209–210), dub
reggae can be understood as a form of ‘psychedelic Caribbean’ music. In
some ways drawing parallels with electroacoustic music, dub uses exper-
imentation with tape, audio effects, and other music technologies. As
discussed by Jones (2017), reggae music is interwoven with multi-cultural
British music culture, and signature traits from genres such as dub are
often found in electronic dance music genres such as jungle/drum & bass
(Belle-Fortune 2005) and dubstep.
16. In the 2000s, dubstep music was often associated with the idea of ‘bass
meditation’. For example, the DMZ club night was advertised with the
phrase “come meditate on bass weight”, and the ‘bass meditation’ trope
was used by various MCs and producers at this time.
17. Sensory isolation tanks were used in combination with psychedelic drugs
as a means to elicit hallucinations in the work of John Lilly (1972). This
provided the inspiration for the movie Altered States (Russell 1980), and
more recently, the character of Dr. Martin Brenner in the Netflix series
Stranger Things (Duffer and Duffer 2016–present), which depicts various
hallucinations in sensory isolation tanks.
18. Various ‘breakthrough’ experiences are described by participants in
Strassman’s (2001) DMT studies (e.g. pp. 179, 213), in which the halluci-
nations move beyond visual patterns and various encounters with entities
may be experienced.
19. ‘Hollow earth theory’ presumes that the earth contains substantial inte-
rior space. The theory has been used as a source of inspiration for many
fictional novels, most famously Jules Verne, Journey to the Centre of the
Earth (1864); and also films such as The Core (Amiel 2003) and Journey
to the Center of the Earth (Brevig 2008).
20. One of the main points of inspiration here is The Bug’s Pressure (2003)
album, which uses syncopated dancehall rhythms throughout, and can also
be linked with the emerging form of dubstep music in the early 2000s.
21. For example, hallucinogens such as the peyote cactus or yagé (the hallu-
cinogenic brew containing DMT, which is used in shamanic ceremonies of
the Amazon rainforest) are known to cause physical discomfort and nausea.
Bodily sensations in general may also be heightened during episodes of
hallucination.
22. Although psychotic experiences and LSD hallucinations are distinct forms
of ASC, it is possible to draw some comparisons between the form of these
experiences. For further discussion see Adams (1994), Hobson (2003,
pp. 5–6), and Blackmore (2003, pp. 307–308).
3
Melting in the Mind's Eye: Real-Time Performances
Atomizer
As seen in the top-left corner of the user interface (Fig. 3.2), the ‘atom-
iser’ is the main sound generation module of the software, which creates
streams of micro-rhythmic sounds based on visual patterns of hallucina-
tion. The module provides sample banks of 10 short percussive sounds,
which can be selected from a list of presets, or by loading other samples
using the ‘custom’ button, which opens a sub-panel of the interface.
These sample banks store variations of electronic pulses, organic sounds,
metallic percussion, recordings of pieces of wood being struck, and cactus
spines being plucked. When the module is triggered, either a specific or
randomly selected sound is played. Sounds can be played as polyphonic
‘one-shots’, or as monophonic loops.
The module can be triggered at random intervals by using the ‘random
speed’ slider from the bank of controls directly beneath the module.
Increasing this slider from the zero position activates triggers at random
intervals with increasing regularity. This enables the streams of percussive
sounds to be organised irregularly in time.
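The behaviour of the 'random speed' slider might be sketched as a probabilistic clock: on each tick, the slider value sets the chance of a trigger, so raising it makes the stream more regular. The following Python sketch is purely illustrative (the original is a Max/MSP patch; all names and the tick size are invented):

```python
import random

def random_triggers(speed, steps=16, tick_ms=50, rng=random.Random(42)):
    """Sketch of a 'random speed' control: at each clock tick, the
    probability of emitting a trigger scales with the slider value
    (0.0 = silent, 1.0 = a trigger on almost every tick)."""
    events = []
    for step in range(steps):
        if rng.random() < speed:
            events.append(step * tick_ms)  # trigger time in ms
    return events

# A low slider value yields sparse, irregular triggers;
# raising it increases the regularity of the stream.
sparse = random_triggers(0.1)
dense = random_triggers(0.9)
```

Because each tick is decided independently, the resulting stream is irregular in time rather than metrically quantised.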
Alternatively, the module can be triggered with a matrix sequencer.
This is activated using the ‘sequence’ switch on the module, while
The panel of controls beneath the atomiser module on the user inter-
face provides various dials and sliders for manipulating the sound. A
‘loop point’ dial adjusts the end point of the loop, when the looping
mode is switched on. ‘BPM’ changes the speed of the sequencer when
active. ‘L/R’ pans the signal from left to right, while ‘F/B’ moves it from
front to back in 5.1. These are applied additively with a Doppler shift
based on a design by Rajmil Fischman, which provides a rotating effect.
Using the ‘Doppler width’ and ‘Doppler pitch’ controls, the width and
speed of rotation can be adjusted to elicit various circular and spiral
spatialisation patterns that can be applied to the streams of percus-
sive sound, thereby generating patterns that correspond with the form
constants. A ‘deform’ control transforms the sound with a semi-random
pitch bend effect based on the random interpolation of two internal
breakpoint graphs. The ‘deform’ control adjusts the amount by which
this value affects the pitch of the sample being played. The ‘speed’ control
also affects the overall speed at which the sample is played, changing the
pitch.
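The 'deform' idea of interpolating between two breakpoint graphs can be illustrated as follows. This is a speculative Python sketch, not the original patch: the graph shapes, the `mix` parameter, and the mapping to a playback-rate multiplier are all invented for illustration.

```python
def interp_breakpoints(points, t):
    """Piecewise-linear lookup in a breakpoint graph: points is a list
    of (time, value) pairs sorted by time; t lies in [0, 1]."""
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)
            return v0 + frac * (v1 - v0)
    return points[-1][1]

def deform_pitch(t, amount, mix):
    """Sketch of a 'deform' control: two internal breakpoint graphs are
    interpolated by a random 'mix' value, and 'amount' scales how
    strongly the result bends the playback pitch (1.0 = no bend)."""
    graph_a = [(0.0, -1.0), (0.5, 1.0), (1.0, 0.0)]   # illustrative shapes
    graph_b = [(0.0, 0.5), (0.25, -0.5), (1.0, 1.0)]
    bend = ((1 - mix) * interp_breakpoints(graph_a, t)
            + mix * interp_breakpoints(graph_b, t))
    return 1.0 + amount * bend  # playback-rate multiplier

rate = deform_pitch(0.5, amount=0.2, mix=0.5)
```

Randomising `mix` per trigger would yield the semi-random pitch bends described in the text.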
The ‘atoms volume’ slider changes the amplitude of the samples, while
the remaining dials and sliders provide controls for various effects: distor-
tion, ring modulator, delay, reverb, and filter, which allow the rhythmic
streams to be coloured in various ways. ‘A’ and ‘B’ buttons are also
provided to quickly switch between preset effects configurations.
Atomizer Joystick/Ribbon
Drone Machine
The ‘drone machine’ module (seen at the centre-top of Fig. 3.2) provides
facilities for making drones, which correspond with distortions to time-
perception (as discussed in Chapter 2). The module provides a granular
synthesiser, which is configured specifically for making drones by means
of granular time-stretching. A source sound can be loaded, from which
the grains are extracted. The time value changes the point in the source
sound that the grains are taken from, and moving this value slowly back
and forth when using a continuous source such as a vocal sample, creates
droning sounds that retain the sonic characteristics of the source. The
‘scrub’ slider on the control panel beneath the ‘drone machine’ allows
this value to be controlled via MIDI, while the ‘scrub speed’ dial directly
above modifies the rate at which this parameter changes. The ‘grain size’
dial changes the size of the grains, while ‘width’ adjusts their left/right
spatial distribution. The ‘grain volume’ slider changes the overall ampli-
tude of the drone, and the dials above provide spatialisation controls
to pan the sounds between left/right and front/back positions. A reverb
control is used to apply a plate reverb effect.
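The granular time-stretching principle behind the 'drone machine' — sustaining one moment of a source sound by overlapping short windowed grains taken from around a 'scrub' position — might be sketched as follows. This is a heavily simplified illustration, not the original patch; grain scheduling, spatialisation, and the moving scrub value are omitted.

```python
import math

def granular_drone(source, scrub, grain_size, n_grains, sr=44100):
    """Minimal granular time-stretch sketch: copy Hann-windowed grains
    from around the 'scrub' position (0.0-1.0) in the source, and
    overlap-add them at a fixed hop so that moment of the sound is
    sustained as a drone."""
    hop = grain_size // 2
    out = [0.0] * (hop * (n_grains - 1) + grain_size)
    start = int(scrub * (len(source) - grain_size))
    for g in range(n_grains):
        for i in range(grain_size):
            # Hann window avoids clicks at grain boundaries.
            w = 0.5 - 0.5 * math.cos(2 * math.pi * i / grain_size)
            out[g * hop + i] += w * source[start + i]
    return out

# One second of a 220 Hz test tone; scrubbing slowly back and forth
# would move 'start' over time, as described for the 'scrub' slider.
src = [math.sin(2 * math.pi * 220 * n / 44100) for n in range(44100)]
drone = granular_drone(src, scrub=0.25, grain_size=1024, n_grains=8)
```

Because the grains keep reading from the same region, the drone retains the sonic character of the source without advancing through it.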
Lastly, as discussed in the previous chapter, several of my composi-
tions use low-frequency bass tones. These sounds can be generated with
the sine-wave oscillator on this module, using the ‘bass’ (amplitude) and
‘frequency’ dials. This feature was partly inspired by Z’EV’s manipulation
of low-frequency sounds with respect to the acoustic and resonant prop-
erties of the concert venue, as described in the opening of this chapter. By
providing a sine-wave oscillator, it is possible to create droning sub-bass
sounds that can be tuned to the performance space in live situations.
DJ Mixer
The ‘DJ mixer’ at the bottom-right corner of the user interface (Fig. 3.2)
allows pre-planned sound materials to be triggered and mixed sponta-
neously, in the style of a continuous DJ mix. This provides two playback
modules, ‘deck A’ and ‘deck B’, which are analogous to the two turnta-
bles of a typical DJ setup. Each ‘deck’ provides 3 sound file players,
which allow several audio files to be triggered simultaneously. These are
intended for use with either long pre-composed sections of music or
short gestural sounds. Using the crossfader it is possible to fade between
the sounds from ‘deck A’ and ‘deck B’. A ‘fade angle’ is provided, which
provides different crossfade envelopes.8 In summary, drawing on my
background as a DJ,9 the ‘DJ mixer’ module allows for the spontaneous
live mixing of pre-composed sections of music, while various comple-
mentary drones and streams of rhythmic sounds can be improvised
alongside these using the other modules.
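One common way to realise variable crossfade envelopes is to blend between a linear fade law and an equal-power fade law. The sketch below is illustrative only: the `angle` parameter is hypothetical, and the actual 'fade angle' control may be implemented differently.

```python
import math

def crossfade_gains(position, angle=0.5):
    """Crossfader sketch: position 0.0 = deck A only, 1.0 = deck B only.
    'angle' blends between a linear law (0.0) and an equal-power
    law (1.0), giving a family of crossfade envelopes."""
    lin_a, lin_b = 1.0 - position, position
    eq_a = math.cos(position * math.pi / 2)
    eq_b = math.sin(position * math.pi / 2)
    gain_a = (1 - angle) * lin_a + angle * eq_a
    gain_b = (1 - angle) * lin_b + angle * eq_b
    return gain_a, gain_b

# At the midpoint, an equal-power law keeps the mix louder than linear,
# which suits beat-matched blends of full sections of music.
mid_linear = crossfade_gains(0.5, angle=0.0)
mid_equal = crossfade_gains(0.5, angle=1.0)
```

A linear law suits quick cuts between short gestures, while an equal-power law avoids the perceived dip in loudness during long mixes.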
Audio Output
Finally, the ‘audio output’ module (visible in the top-right of Fig. 3.2)
provides essential functions for selecting the soundcard, digital signal
processing (DSP) status, and output levels. Amplitude meters are
provided for 5.1 configurations. A ‘record’ module opens a sub-panel of
the user interface, allowing the output to be recorded in real-time. This
can be used to record whole performances; or improvised ‘jam sessions’,
which may be used to generate sounds that are later edited for further use
in fixed-media compositions. The former approach was used to make the
recording discussed in the next section of this chapter, while the latter
was used for many of the compositions discussed in Chapter 2.
[Figure: performance timeline with 'DJ Mixer' cues (Ent01, Ent02, Tiny01, Tiny03, Ent03) and 'Atomizer' cues (Ent, Sax, Wood, Metal (Loop), Blip); sections include Swamp Breed]
Fig. 3.3 Performance notes indicating the structure of Entoptic Phenomena in Audio
This is the ‘eye of the storm’; a calm phase during the central ‘plateau’
of the hallucination. In terms of the performance techniques, here the
drone sound is created by ‘scrubbing’ back and forth slowly with the
1960s surf rock vocal source sound from Surfer Stem, in such a way that
the drone alternates between two notes, so that this single moment seems
to trail on indefinitely. Beneath the drone, a throbbing low-frequency
sub-bass from Tiny Jungle can be heard.
At around 15:00 the beach vision begins to tumble away, as rapid
pulsing sounds begin to overtake the spatial auditory field. An intense
wave of sounds suggestive of visual patterns of hallucination circles and spirals around the listener, created using the various 'atomiser' modules of
the Atomizer Live Patch, while drones persist in the background. This
section is essentially a reconfiguration of the closing section of Entoptic
Phenomena, and also includes pre-recorded source materials from this
composition. As in the fixed-media version of this composition, from
19:20 onwards, the hallucinations melt away, returning the listener from
the ‘internal’ world of hallucinations to the familiar ‘external’ world of
the sensory isolation tank as the experience ‘terminates’.
At the top-left corner of the user interface we can see the main patch
input controls and presets panel. This provides options for selecting the
audio device, a bank of presets, and an ‘info’ button which opens an
instruction manual. Much as one would find on any good keyboard,
there is a ‘demo song’ button, which triggers pre-made recordings of a
bass drum and saxophone, which are then processed in real-time, thereby
demonstrating the patch in action for testing purposes. This panel also
features a red light to indicate when the output is being recorded.
Fig. 3.4 User interface for the Bass Drum, Saxophone & Laptop Max/MSP application
Moving right from the top-corner, the top two rows provide various
modules related to the saxophone and bass drum instruments, respec-
tively. The ‘sax input and trigger’, and ‘drum input and trigger’ modules
receive microphone inputs from the respective instruments, which the
patch will process with various DSP effects, and use to generate trigger
messages. These modules allow the correct input channels of the sound-
card to be selected and the amplitude of the incoming signals from the
microphones to be adjusted. These modules activate triggers when the input signal reaches a certain level, which can be set with the 'trigger level' control. Once a trigger has been activated, another cannot occur until the 'retrigger delay' period in milliseconds has elapsed.
These modules also allow pre-recorded sound files to be loaded and used,
instead of live microphone signals.
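The trigger logic described here — a level threshold combined with a retrigger delay — can be sketched as follows. This is an illustrative Python approximation; the frame size and all names are assumptions, not the patch's internals.

```python
def detect_triggers(amplitudes, trigger_level, retrigger_delay_ms, frame_ms=10):
    """Threshold triggering with a retrigger delay: emit a trigger when
    the input level reaches the threshold, then suppress further
    triggers until 'retrigger_delay_ms' has elapsed."""
    triggers = []
    last = -retrigger_delay_ms  # allow an immediate first trigger
    for i, amp in enumerate(amplitudes):
        t = i * frame_ms
        if amp >= trigger_level and t - last >= retrigger_delay_ms:
            triggers.append(t)
            last = t
    return triggers

# A burst of loud frames in quick succession yields a single trigger;
# a later hit after the delay has elapsed yields another.
levels = [0.1, 0.9, 0.8, 0.95, 0.2, 0.1, 0.9]
hits = detect_triggers(levels, trigger_level=0.7, retrigger_delay_ms=50)
```

The retrigger delay prevents one sustained drum stroke from firing a machine-gun burst of triggers.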
The ‘sax effects rack’ and ‘drum effects rack’ modules provide various
DSP effects, which are similar to those used in the Atomizer Live Patch.
These include a rotating Doppler effect, distortion, ring modulator,
delay, and a plate reverb unit. Various parameters and bypass options are
available for each effect. These parameters can be set manually; however, the patch also provides the option to automate them using control data from various other modules, received via the 'patch bay'.
The ‘sax trigger envelope’ and ‘drum trigger envelope’ modules receive
trigger messages from the ‘sax input and trigger’ and ‘drum input and
trigger’ modules. Here, these messages are used to trigger three different
envelope generators. When a trigger message is received, three new
envelopes are generated with different attack and decay properties. These
can then be used as control envelopes, as selected with the 'patch bay' module.
Along similar lines, the ‘sax scatter envelope’ and ‘drum scatter envelope’
modules provide another means of generating control envelopes that can
be used to automate DSP parameters elsewhere in the patch, via the
‘patch bay’. When this module receives a trigger, it generates an envelope
using an array of 50 slider values that can be manually adjusted. These
slider values set where peaks of the envelope will occur in time; so each
non-zero value will generate a new peak, thereby providing a complex
envelope with multiple ‘scattered’ peaks each time a trigger is received.
The overall time distribution of these peaks in milliseconds is determined by the 'total scatter time' parameter. These modules include options for
‘attack’ and ‘decay’ of the scattered envelopes; a ‘combine’ option that
generates the scattered envelopes additively; and an ‘invert’ option which
allows the peaks to be subtracted from a maximum value.
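The mapping from the 50-slider array to a set of scattered envelope peaks might be sketched like this. This is a simplified Python illustration; the per-peak attack/decay shaping, 'combine', and 'invert' options are omitted, and the names are invented.

```python
def scatter_envelope(sliders, total_time_ms):
    """Sketch of a 'scatter envelope': each non-zero slider value
    schedules an envelope peak, with slider positions spread evenly
    across 'total_time_ms'. Returns (time_ms, peak_level) pairs."""
    step = total_time_ms / len(sliders)
    return [(round(i * step), level)
            for i, level in enumerate(sliders) if level > 0]

# 50 sliders, mostly zero: only the non-zero ones produce peaks,
# giving a complex envelope with multiple 'scattered' maxima.
sliders = [0.0] * 50
sliders[0], sliders[10], sliders[25] = 1.0, 0.5, 0.8
peaks = scatter_envelope(sliders, total_time_ms=1000)
```

A single trigger thus fans out into an irregular rhythm of envelope peaks distributed over the chosen time span.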
Moving along to the right on the user interface panel, we find the
‘sax sustained playing envelope’ and ‘drum sustained playing envelope’
modules. These modules trigger envelopes only when sustained playing
on the corresponding instrument is detected. This detection system
works by receiving triggers from the input trigger modules. If the target
number of triggers (x) is reached within a specified period of time (y), then a 'sustained playing envelope' is activated. This sustained playing envelope remains active until a specified period of time (z) passes without
any triggers being heard. These modules provide an effective means of
differentiating shorter bursts of sound from sustained playing, generating
control envelopes that can be routed via the ‘patch bay’ module.
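The x/y/z detection logic can be sketched as follows, using the parameter names from the text. The windowing details here are assumptions made for the sake of a runnable illustration.

```python
def sustained_playing(trigger_times, x, y, z):
    """Sustained-playing detection sketch: activate when x triggers fall
    within a window of y ms, deactivate once z ms pass without any
    trigger. Returns a list of (start_ms, end_ms) active spans."""
    spans = []
    active_start = None
    for i, t in enumerate(trigger_times):
        if active_start is None:
            # Activate if the last x triggers span at most y ms.
            if i + 1 >= x and t - trigger_times[i - x + 1] <= y:
                active_start = t
        elif t - trigger_times[i - 1] > z:
            # Gap longer than z: close the active span at last trigger + z.
            spans.append((active_start, trigger_times[i - 1] + z))
            active_start = None
            if i + 1 >= x and t - trigger_times[i - x + 1] <= y:
                active_start = t
    if active_start is not None:
        spans.append((active_start, trigger_times[-1] + z))
    return spans

# Rapid triggers activate the envelope; a long gap ends it.
times = [0, 100, 200, 300, 2000]
spans = sustained_playing(times, x=3, y=500, z=400)
```

This distinguishes a run of sustained playing from isolated bursts, as the text describes.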
While the various saxophone and drum envelope modules allow control
envelopes to be generated based on the input signals from the corre-
sponding instruments, the ‘LFO bank’ and ‘LFO bank 2’ modules each
provide two low-frequency oscillators (LFOs), which operate indepen-
dently. Different waveforms, frequency, amplitude, and offset properties
can be selected for each LFO. The LFOs can also be routed via the ‘patch
bay’ to control other parts of the patch.
The ‘drunk bank’ and ‘drunk bank 2’ modules each provide further
means of generating two envelopes independently of the input signals.
These modules generate randomised envelopes that wander ‘drunkenly’
between values. Parameters for ‘speed’, ‘range’, ‘set’, and ‘offset’ affect
how these envelopes are generated. As before, each control envelope can
be routed via the ‘patch bay’ module.
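The 'drunk' wandering behaviour resembles a clamped random walk, which might be sketched as follows (illustrative only; the parameter names approximate, but do not reproduce, the module's 'speed', 'range', 'set', and 'offset' controls):

```python
import random

def drunk_envelope(steps, speed=0.05, low=0.0, high=1.0, start=0.5,
                   rng=random.Random(7)):
    """Randomised envelope that wanders 'drunkenly' between values:
    each tick takes a random step of at most 'speed', clamped to the
    [low, high] range, so the value drifts rather than jumps."""
    value = start
    out = []
    for _ in range(steps):
        value += rng.uniform(-speed, speed)
        value = min(high, max(low, value))  # clamp to range
        out.append(value)
    return out

env = drunk_envelope(200)
```

Because each step is small, routing such an envelope to a DSP parameter produces gradual, organic drift rather than abrupt changes.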
Atomiser
Located at the bottom left of the user interface, the ‘atomiser’ module
is an adaptation of the same module described earlier in the Atomizer
Live Patch. This module triggers streams of sound in correspondence
with visual patterns of hallucination. As before, this module can be used
with various selectable sound banks, each of which contains 10 percus-
sive samples. The module receives a trigger, either from the saxophone
or the bass drum, which then triggers one of the percussive samples
from the bank at random. Whereas the Atomizer Live Patch provided
a ‘random speed’ slider, which triggered the module at random intervals,
Bass Drum, Saxophone & Laptop replaces this with a ‘scatter’ func-
tion, which initiates a series of retriggers according to a pattern selected
by the user (using a similar interface to the ‘sax/drum scatter envelope’
modules). When this option is selected, the module initiates a series of
‘scattered’ percussive sounds from a single trigger.
Patch Bay
Moving rightwards along the bottom of the user interface, the ‘patch
bay’ module provides a modular system, which allows complex routing
of data between modules. Input control envelopes can be selected from
the various modules of the patch, and assigned to an output such as a
DSP parameter on the saxophone, bass drum, or atomiser effects racks.
This allows for many possible configurations of the patch, which produce
different results during performances, whereby sounds of the saxophone
can transform the bass drum and vice versa; or the various LFO and
drunk modules can be used to change parameters independently of either
instrument. This opens up many possibilities for organic, gradual trans-
formations of sound, as effects morph between different settings which
colour the sound, while spatial properties of reverb open and close on
an on-going basis, thereby reflecting concepts of shifting perception
in psychedelic experiences. Once effective patching configurations have
been found, these can be saved using the ‘presets’ module.
Lastly, in the bottom-right corner of the user interface, the ‘mixer and
audio output’ module provides a mixing desk, allowing levels to be
balanced between the wet and dry signals from the saxophone, bass
drum, and atomiser modules. An overall master output fader is provided,
as are options for recording the output of the software in stereo or 5.1
multi-channel.
At its core, the Bass Drum, Saxophone & Laptop application is a highly
specialised multi-effects unit, providing various options for processing
the sounds of live instrumental performances with DSP effects, while
also triggering additional streams of percussive sound. In this way, the
software continues to develop the electroacoustic approaches discussed
earlier, adapting these for use in real-time performances. Using the patch,
streams of percussive sound related to visual patterns of hallucination can
be triggered by live instruments. Where before drones and sensory bass
sounds were provided by the ‘drone machine’ module of the Atomizer
Live Patch, these could now be performed using the bass drum and saxo-
phone instruments. In particular, hanging the bass drum from a gong
stand increases the acoustic resonance of the instrument, so that thun-
derous bass drones can be performed live; while the long notes performed
on the saxophone can be extended with the delay, reverb, and feedback
effects. The patching capabilities of the software allow various proper-
ties to be routed between instruments, so that the system seems to take
on a life of its own. The laptop becomes an autonomous agent15 that
initiates gradual shifting frequencies and spatial properties, in correspon-
dence with experiences of shifting and morphing perception that may
occur during psychedelic ASCs. Overall then, the software facilitates the
psychedelic adaptation of live instrumental performances, and though
conceived for bass drum and saxophone, it can potentially be used with
one or two instruments of any type.16
Media 3.4 Bass Drum, Saxophone & Laptop (23 February 2010, Session
1), instrumental performance with live electronics, 11 minutes 5 seconds
The audio example (Media 3.4) provides a general illustration of the
performance system in action, and is the best available recording from
this period. In the opening of the recording we hear the bass drum,
processed with a ring modulator sound, while the saxophone signal fluc-
tuates in pitch via use of the Doppler effect. The ‘atomiser’ module is
heard being triggered at 0:47 and 1:15, as the drum hits the target level,
activating the scattered rhythmic sounds. For most of this section the
drum is deliberately played beneath the trigger level, so activation of
these sounds can be controlled, introducing the rhythmic sounds only
when the performer intends. As before, these sounds are conceptualised
in relation to visual patterns of hallucination, so as the performance level
gradually increases, waves of rhythmic sounds engulf the auditory field,
in correspondence with the ‘onset’ of a hallucination.
∗ ∗ ∗
Through the course of this chapter we have seen how the psychedelic
approaches to electroacoustic composition discussed in the previous
Notes
1. Z’EV kindly provided me with constructive feedback on the composi-
tion Bass Drum, Saxophone, and Laptop discussed in this chapter, and
shared some of his writings on music and animism. Animism is the belief, often held in shamanic societies, that forces and entities within nature have a soul (for example, see Eliade 1964; Vitebsky 1995). Z'EV's writings discussed his practice of making music and constructing percussive instruments using various ritualistic, animist approaches.
2. As discussed by Tanner (2016), vaporwave is an Internet-based music genre
and visual art style originating in the early 2010s, which remixes sound
sources such as 1980s pop, elevator music and late night TV commercials, looping and slowing down these sounds ad infinitum. James Ferraro
is often credited as one of the originators of the style; for example, see
Kalev (2018).
3. The extent to which electronic music using pre-recorded sounds can be
considered ‘live’ has been a subject of much debate; for example, see
Sanden (2013). For our purposes here, it will be sufficient to acknowledge
that ‘liveness’ in electronic music performances is often distinct from that
of instrumental performances, but offers its own valid set of approaches.
4. For further visual representations and discussion of visual patterns of
hallucination, see also Bressloff et al. (2001).
At first you see a flickering noise pattern, which crackles with bril-
liant shades of red and green. Shapes begin to form and emerge from
the chaos with increasing rapidity. First cell-like structures with nuclei
that multiply, coalesce, and melt away before your eyes; then strobo-
scopic rectangles that engulf the visual field and dilating circular patterns.
Nothing is static, everything is shifting and evolving, as colours and
shapes distort, fade into each other, and fall apart. What you hear is
jazz music. Hot saxophones flutter through the mix, dripping in sweat
as punchy drum sounds bounce along amicably with a throbbing bass
and percussive hi-hats that mark out a quickening groove. Everything
is playing and rolling together in sync. A cluster of notes explodes before
you and suddenly there are circles everywhere; now angular rectangles animate and move to the music. Smith created his own visual music paintings and films, and his Early Abstractions (1939–1957) series clearly owes
a debt to Fischinger’s work. Yet as discussed in the film American Magus
(Igliori 2002), Smith’s work also drew inspiration from other sources
such as the occult, and American folk traditions.1 In discussing his films,
he interestingly claims they were inspired by experiences of sleep depri-
vation, intoxicated hallucinations (Sitney 1979, p. 233), and a Dizzy
Gillespie concert, where he ‘had gone… very high, and… experienced all
kinds of colored flashes’ in response to music (Sitney 1965, p. 270). His
films can therefore be understood as psychedelic visualisations of sound,
and were projected with jazz music2 as a soundtrack.
The idea of creating films that represent the synaesthetic visuals
that one might see during a hallucination is the main focus of this
chapter. My work in this area falls under the category of ‘audio-visual
composition’, a branch of electroacoustic composition closely related
to visual music, where sonic artworks are designed with corresponding
images or video elements.3 In what follows, I will explore strategies
for composing fixed-media audio-visual works based on altered states
of consciousness (ASCs) and psychedelic hallucinations. First, I discuss
the Atomizer Visual, a Max/MSP/Jitter4 application designed for gener-
ating stroboscopic visual materials. I will describe how this was used to
create Tiny Jungle (2010), in which electronic music is used in combi-
nation with hand-drawn materials and computer graphics to paint a
psychedelic journey in sound and image. Following this, I will examine
a series of three audio-visual compositions that were inspired by a trip
to Mexico: Mezcal Animations (2013), Cenote Zaci (2014), and Cenote
Sagrado (2014). Using direct animation, these compositions are visual
music works with electroacoustic sound, which were also developed
in relation to concepts of hallucination. These pieces were composed
using various combinations of analogue and digital materials, including
some elements made using Processing, a programming language for
visual design. Through the discussion of these fixed-media audio-visual
works, this chapter will explore possible methodologies for composing
psychedelic sounds and visualisations.
Tropical Hallucinations
Extending the approaches developed through my electroacoustic compo-
sitions and transferring them into the audio-visual domain, Tiny Jungle
was based on the idea of a psychedelic journey realised in sound and
image. This audio-visual composition was performed at NoiseFloor
festival (Staffordshire University, 20–22 September 2011), where the
sound was spatialised via live diffusion on a multi-channel system. Soni-
cally, the piece explores a similar ‘adaptive’ approach as was described in
Chapter 2, where electroacoustic music and electronic dance music forms
are modified to incorporate approaches based on features of ASCs. Visu-
ally, the work uses hand-painted materials in order to elicit organic and
atomic forms and a journey through a forest-like environment, thereby
representing visual hallucinations according to the concepts explored in
the previous chapters. As with the electroacoustic compositions and real-
time performances discussed earlier, the artistic process is interwoven
with coding, and in this case some materials were designed using a
bespoke software application: the Atomizer Visual.
Fig. 4.1 User interface and example visual noise output of the Atomizer Visual
Max/MSP/Jitter application
the ‘drop file’ area of the patch. The four channels are mixed together
to form a single video output. Each channel triggers a strobe effect by
modifying the opacity of the source materials, which flicker rapidly at a
tempo set by using the metronome controls. Images fade out when they
disappear, according to specified time durations in milliseconds, which
can be controlled using the ‘decay’ values. The blending of channels is
defined using various mathematical expressions (e.g. addition, subtrac-
tion, multiplication, and division), which are selectable using the ‘exp’
menus beneath the channels. Depending on the source materials and
combination settings, ‘visual noise’ effects can be produced (see Fig. 4.1).
The video output of the software can be recorded using the controls at
the bottom-right of the patch, and instructions are provided on the main
user interface panel.
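The strobing and channel-blending behaviour described above might be sketched as follows. This is a Python approximation of per-channel opacity flicker and expression-based blending; the actual Jitter patch operates on video matrices, and the parameter names here are invented.

```python
def strobe_opacity(t_ms, bpm, decay_ms):
    """One Atomizer Visual-style channel: a metronome at 'bpm'
    retriggers the layer's opacity, which then fades out over
    'decay_ms', producing a rapid flicker."""
    interval = 60000.0 / bpm
    since_flash = t_ms % interval
    return max(0.0, 1.0 - since_flash / decay_ms)

def blend(a, b, op):
    """Combine two channel intensities with a selectable expression,
    as with the 'exp' menus (division guarded against zero)."""
    ops = {'add': a + b, 'sub': abs(a - b), 'mul': a * b,
           'div': a / b if b else 0.0}
    return min(1.0, max(0.0, ops[op]))

# Two channels strobing at different rates, multiplied together:
# interference between the rates yields 'visual noise'-like flicker.
frame = [blend(strobe_opacity(t, 600, 40), strobe_opacity(t, 450, 60), 'mul')
         for t in range(0, 200, 10)]
```

Because the two metronome rates drift in and out of phase, the combined output flickers in complex, non-repeating patterns rather than a steady pulse.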
The Atomizer Visual is a relatively simple Max/MSP/Jitter patch that
was conceived as a tool for generating stroboscopic visual materials in
real-time. While working in real-time comes with some constraints with
regard to image quality,5 it also provides a more improvisational, perfor-
mative workflow, and allows organic spontaneity to be incorporated in
the design of the work.
Tiny Jungle
While these forms are borrowed from jungle/drum & bass, Tiny
Jungle adapts their presentation based on the ASC concepts discussed
in Chapter 2. For example, the long introduction section from 0:10
to 1:30 would usually be designed using synthesiser pads, but in this
case it was realised using granular synthesis techniques that correspond
with distortions to time perception. Along similar lines, the rhythms
heard from 1:33 to 2:05 were constructed from organic source materials
such as wood sounds, rather than using the sampled breakbeat7 loops
that would be more common in jungle/drum & bass. The composition
also uses sounds reminiscent of birdcalls, which emphasise the tropical
theme. These birdcall sounds can be heard at 0:02–0:10, 1:21–1:31, and
throughout. They were made by experimenting with pitch transforma-
tions (they are not actually recordings of birds), and were arranged into
rhythm patterns, as heard from 2:00 to 2:20, where they fade into a wave
of entoptic rhythmic sounds based on visual patterns of hallucination.
These entoptic sounds are also heard later in the piece at 4:45–5:00.
Structurally the music of Tiny Jungle is based on concepts of energy
levels, which relate to Fischer’s (1971) ‘cartography of ecstatic and medi-
tative states’. As noted in Chapter 1 (p.1), Fischer describes ‘ergotropic’
states of energy expenditure and ‘trophotropic’ states of energy conser-
vation, which in Rouget’s (1985) discussion correspond with states
of trance and meditation, respectively. As indicated in Fig. 4.2, these
concepts inform two distinct phases in the composition. The section
from 1:33 to 3:47 uses fast, syncopated rhythms, which relate to
ergotropic states of trance, while the section from 3:48 to 6:10 slows the
tempo of the music to a throbbing bass groove, reflecting a trophotropic
state of ‘bass meditation’. In this way the structure corresponds with
the concept of an ASC that moves between ergotropic and trophotropic
states, and this idea is also developed through the visual materials of the
piece.
We may now turn to consider the visual design of Tiny Jungle. The
opening of the piece ‘onsets’ with a ‘landscape flight’ over rocky hills
(0:00–1:00). During this section, the sky is depicted using digitally
transformed footage of ink droplets falling into water, creating organic
turbulence patterns. In the sky we see orbiting spheres reminiscent
of Harry Smith’s Early Abstractions (1939–1957), suggesting planetary
88 J. Weinel
motion (0:25). During this section there are also faint traces of visual dot
patterns suggestive of ‘entoptic phenomena’, which were made by digi-
tally scanning hand-drawn still images, and then processing them with
the Atomizer Visual software.
At 1:00 these effects begin to intensify, until 1:34 where a ‘break-
through’ occurs and we move into a phase that is characterised by many
dots representing visual patterns of hallucination. These were similarly
created by processing hand-drawn images with the Atomizer Visual and
combining multiple layers to form composites. This section includes
mysterious hallucinatory forms such as an animated head, which appears
from 1:33 to 1:55; three phallic or mushroom-like ‘weird sticks’, which
emerge at 2:00; lizard-like creatures which crawl across the screen from
2:02; and flickering green triangle patterns which scroll vertically across
the screen from 2:14.
At 2:25 another transition occurs and a gyrating 3D atom appears.
At 2:36, we fly through tunnels of spheres suggestive of Klüver’s (1971)
form constants, this time rendered in 3D. We then see more atomic
patterns (2:51) and various flickering visual strobe effects created with
the Atomizer Visual, including fleeting impressions of a landscape with a
forest in the distance (3:04). From 3:08 to 3:30, we then move into the
‘forest flight’ section, as various trees and a disorienting mass of branches
whizz past us. Further hallucinatory patterns and forms are then seen,
until a ‘breakthrough’ occurs at 3:47.
With a clicking sound, at 3:47 we pop out of the previous wave
of hallucination and the pace of the music changes, signalling a brief
‘plateau’ and the beginning of the ‘trophotropic’ section of the composi-
tion. In this section we see more ‘bizarre/mysterious forms’ suggestive of
hallucinations. First, there is a strange rotating shape covered in spikes,
which has four bone-like spokes. Then at 4:04 we move inside one of
the spokes, entering an interior space with melting oily columns, before
moving out again at 4:36. At 4:54 we drop through another of the
spokes, where we see yet more visual patterns of hallucination (made with
the Atomizer Visual) and 3D spheres (Fig. 4.3). A throbbing, trance-like
bass sound is heard as a capsule floats past (5:12); discs pulse around a
sun (5:21); and the orbiting spheres we saw at the beginning move across
the screen (5:36).
4 Tune in, Turn Up, and Trip Out: Audio-Visual Compositions 89
Fig. 4.3 Still image from Tiny Jungle representing visual patterns of hallucina-
tion
Realised with the aid of the Atomizer Visual software, Tiny Jungle
combines hand-drawn materials, computer graphics, and an organic
electroacoustic soundtrack with jungle/drum & bass elements. These
materials are structurally organised to provide an audio-visual experi-
ence analogous to a mystical flight through a hallucinatory forest. The
theme of this composition relates to descriptions of visionary experi-
ences that include extreme macro and micro perspectives. For example,
one might have an experience of flight above a forest (macro perspec-
tive), or visualise cells or atoms (micro perspective).8 Extending the idea
of psychedelic journeys in sound, Tiny Jungle provides a hallucinatory
audio-visual journey that traverses these macro and micro perspectives.
Synaesthetic Underworlds
Continuing my explorations in the use of hand-drawn materials, I subse-
quently began making my own direct animation films in order to access
the unique organic visual qualities that this technique provides. Direct
animation was used not only by Harry Smith, but also by various other
visual music filmmakers. An early work of this type was Len Lye’s A
Colour Box (1935), which was commissioned by the General Post Office
Film Unit (Russet and Starr 1976, p. 65). On this film Lye used
Dufaycolor (a film dyeing process) to apply various coloured patterns on to
sections of film, which were then matched to ‘La Belle Creole’ by Don
Baretto and his Cuban Orchestra (Horrocks 2001, p. 137), a dance
piece popular in Paris at that time. Other films by Lye such as Kalei-
doscope (1935), Trade Tattoo (1937), and Musical Poster #1 (1940) also
utilise camera-less animation techniques. In a later work, Free Radicals
(1958, 1979), rather than dyeing the film, Lye scratched images on to
black 16 mm leader film, producing abstract moving figures that dance
to tribal drums in a manner perhaps suggestive of trance rituals (Lux
2020). Free Radicals was screened at the 1958 Brussels World’s Fair (Expo
58) where it won second prize in the International Experimental Film
Competition (Len Lye Foundation 2020).
In the 1940s, Norman McLaren also utilised direct animation to
compose films such as Dots (1940) and Begone Dull Care (McLaren and
Lambart 1949) (Russet and Starr 1976, pp. 116–128). These films both
provide striking, close synchronisation between sound and image, so that
visual animations express the timing of music, while different patterns,
shapes, and colours also correspond with pitch and timbre. In the case of
Dots, McLaren achieves these results in audio by drawing on to an optical
sound strip (Peters 1951), thereby translating visual images directly into
sound.
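McLaren's drawn-sound technique rests on a simple principle: the projector's photocell converts the ink density of the optical track into an electrical signal, so marks on film become amplitude values. The Python sketch below is an illustrative reconstruction rather than McLaren's actual process; the triangle-wave 'drawing' and the linear light-to-amplitude mapping are assumptions.

```python
# Each row of "ink" density on the optical track becomes one audio
# sample: the photocell output is proportional to transmitted light.
def optical_to_audio(ink_rows):
    """Convert ink densities (0.0 = clear film, 1.0 = opaque) into
    bipolar audio samples in the range [-1.0, 1.0]."""
    return [1.0 - 2.0 * d for d in ink_rows]

# A hand-drawn triangle wave: density ramps up and down the strip.
strip = [i / 10 for i in range(11)] + [i / 10 for i in range(9, -1, -1)]
audio = optical_to_audio(strip)
```

Repeating such a drawn pattern thousands of times per second of film produces an audible pitched tone, which is how drawn shapes translate directly into sound.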
Stan Brakhage is also well known for his direct animation work. His
film Mothlight (1963) was made by attaching moths and various other
debris gathered from a moth trap to 16 mm film stock. Though this piece
is silent, Brakhage considered it to have various ‘musical’ aspects
to its composition (Ganguly 1994), and thus it is an example of visual
music where sound and musical qualities inform visual design, but are
Mezcal Animations
Fig. 4.4 Various frames of animation showing visual artefacts and textures
produced by direct animation on 8 mm film
are unique in each performance and run the risk of failure at any
moment. Even the speed of projection is unpredictable, as the film moves
more slowly or quickly through the projector at different points in time.
The results produced by projecting these reels can be incredibly exciting.
Rapid bursts of colour, visual noise, and textures flash upon the screen
around 2:34 were made using synthetic bass drum sounds processed with
a guitar distortion pedal. As heard at the beginning (0:15), the visual
noise arising from dust particles on the film is also matched with sounds
of surface noise recorded from the run-in groove of a vinyl record.
The audio-visual composition of Mezcal Animations organises the
sonic materials in relation to the digitised direct animation footage. The
video has three distinct movements, as follows: {I. Mezcal Reposado
/ Pensamiento; II. Mezcal Tobala / El Golpe; III. Sal de Gusano}. In
the first movement (0:13–2:30) we hear the gradual onset of percussive
patterns and drones, while corresponding visual textures and patterns
unfold. The second movement (2:30–3:44) increases the intensity of
sounds and visuals, and here various figurative images emerge from the
visual noise, including mysterious symbols, geometric shapes, an eye, and
a skull (3:14, Fig. 4.6). The final movement ‘Sal de Gusano’ (3:44–3:53),
features short, shard-like audio-visual forms that relate to the sharp, spicy
taste of worm salt, which is often consumed with a piece of citrus fruit
after drinking mezcal.
Cenote Zaci
echoes of splashing and excited children grow louder, until you turn a
corner and see a vast hole in the rock filled with deep turquoise waters.
A path around the outside meanders its way down towards the basin, the
moist stone glistening in a deep brownish grey colour. Queues of chil-
dren line up to jump in from a great height, dropping from the ledge one
after the other, pausing only occasionally when a more tentative child has
second thoughts. The line momentarily stalls, until finally, egged on by
his friends, he too leaps into the deep cool waters below, with a loud
splash that sends water cascading in all directions amidst cries of jubila-
tion. For everyone else, dipping into the water happens at the bottom of
the path, where one can slip gently from the edge of the rocks into the
refreshing pool that is shared with the ‘lub’ (eyeless, charcoal-coloured
fish).
Taking this cenote as a source of inspiration, Cenote Zaci (Media 4.4)
is an audio-visual composition based on the idea of an aquatic dream
or hallucination of the cenote. Visually, the piece was made using a
similar method of direct animation as that described for Mezcal Anima-
tions, where 8 mm film was first bleached and then painted with various
inks. Rather than the orange tones of the former film, this piece uses
predominantly green and turquoise shades, reflecting the deep waters of
the cenote.
From 0:27 and throughout the piece various fish are seen, rendered
in a bright red colour. These were created by cutting out shapes of
fish, which were then photographed in various positions over a lightbox
to make stop-motion animation clips. These animation clips were then
superimposed over the digitised direct animation footage. This method
preserves the organic, textural qualities of the stop-motion animation,
while adding other moving shapes that would be difficult to draw directly
on 8 mm film with the same level of control.
Along similar lines, Cenote Zaci also uses computer-generated anima-
tions, which were programmed in Processing. As first seen at 0:29, waves
of triangle patterns scroll across the screen, which are reminiscent of the
scrolling triangles used in Tiny Jungle, and similarly reflect visual hallu-
cinations. Elsewhere (e.g. 0:40), a hypnotic diamond tunnel effect is
produced in a similar way, which once again relates to Klüver’s (1971)
form constants. In Processing, the code used to draw these shapes utilises
that rise and fall in pitch, as formations of fish circle and dive on and off
screen. Lastly at 3:56, the piece ends abruptly with a sweeping noise.
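Although the original Processing sketches are not reproduced here, the geometry they describe can be approximated as follows. This Python sketch (with assumed screen dimensions and speeds) computes, for each animation frame, the scrolling row of triangle apexes and the concentric diamonds that expand and wrap to suggest a tunnel.

```python
def scrolling_triangles(frame, width=320, spacing=40, speed=2):
    """x positions of triangle apexes for one frame: the whole row
    shifts by `speed` pixels per frame and wraps every `spacing`
    pixels, producing an endless horizontal scroll."""
    offset = (frame * speed) % spacing
    return [x + offset for x in range(0, width, spacing)]

def diamond_tunnel(frame, n_rings=8, max_size=160, speed=4):
    """Half-diagonals of concentric diamonds: each ring expands by
    `speed` pixels per frame and wraps back to the centre, giving
    the illusion of flying through a tunnel."""
    step = max_size / n_rings
    return sorted(((i * step + frame * speed) % max_size)
                  for i in range(n_rings))
```

In a real Processing sketch these positions would be fed to drawing calls inside the draw loop; the wrap-around arithmetic is what makes the patterns loop hypnotically.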
In summary, Cenote Zaci is a direct animation composition that imag-
ines a dream or hallucination of a cenote inhabited by various fish.
This idea provides a point of creative departure leading to an abstract
audio-visual composition that combines hand-painted materials, with
colours characteristic of the cenote; stop-motion animations of fish;
oscillating geometric patterns; and a pulsing electronic noise soundtrack
that provides peaks and troughs in sonic intensity, lending the piece its
structure.
Cenote Sagrado
the dead souls of the Mayan underworld, who rest in the dark waters at
the bottom of Cenote Sagrado.
∗ ∗ ∗
resemble the type of synaesthetic imagery that one might see exploding
in the mind’s eye, as a psychedelic visualisation of sound.
Notes
1. For more information on Harry Smith including his visual music artworks
and involvement with folk music, see also Perchuk (2010).
2. Harry Smith’s films were screened with various soundtracks (Singh 2010),
and seem to reflect mutable relationships with music. In this regard his
films might perhaps be understood as exploiting what Chion (1994)
describes as ‘synchresis’, the natural tendency of the brain to integrate
sounds and cinematic images.
3. For a further discussion of connections between electroacoustic music and
visual music, see Garro (2012).
4. Max/MSP/Jitter is a visual programming language for sound and audio-
visual design.
5. The output resolution of the Atomizer Visual software is 320 × 240 pixels.
This is very low by modern standards, but reflects the capabilities of real-
time video processing using Max/MSP/Jitter on a laptop with average
specification in 2010.
6. For indicative examples of late 1990s drum & bass, see the Metalheadz
compilation Platinum Breakz (Various Artists 1996), Arcon 2’s self-titled
album (1997), and Alpha Omega’s Journey to the 9th Level (1999).
7. Jungle/drum & bass music typically makes use of sampled drum breaks
taken from funk tracks by artists such as James Brown, Lyn Collins, or
The Winstons, whose track ‘Amen Brother’ (1969) is the source of the
famous ‘amen’ drum break used on countless tracks (Harrison 2004). The
tempo of these breakbeats is usually increased to 160 bpm or higher, and
they are often sliced, rearranged, and treated with various digital effects in
order to construct complex rhythmic syncopations.
8. For example, in Jeremy Narby’s (1999) The Cosmic Serpent: DNA and the
Origins of Knowledge the author describes visualisations of DNA structures
while under the influence of ayahuasca.
9. More information about Internacional del Mezcal (International Mezcal
Festival) is available online (https://www.oaxaca-mio.com/fiestas/feriadelm
ezcal.htm). For a colourful, descriptive account see also Lizotte (2014).
Imagine a device that allows you to play, record, and share sensory expe-
riences. Placing a cap of wires and sensors on your head, suddenly your
visual and auditory field changes to the sights and sounds of a deep
canyon that you once visited. However, this is not like looking at a
photograph, or watching a video recording—rather, it is a digital hallu-
cination that makes you feel as though you are actually there. You can
look and see the deep turquoise waters around you; hear birdcalls above
and water lapping onto the rocks; and sense a gentle breeze and the sun
beating down on your skin. Not only are your senses engulfed, but you
can also feel the emotions you once felt on that day, a sense of quiet
ecstasy and calm. Or perhaps, instead of the canyon trip, you could
download something you’ve never experienced before, like taking a walk
on the moon, or diving a sunken wreck forty fathoms deep. This device
would not merely replicate the patterns of light and sound that would
enter the eye during these episodes, but would also be capable of repro-
ducing subjective sensory experiences and emotions. So if you fancied
something really exotic, how about a shamanic peyote trip on a desert
cliff-top? You could watch the mountains melt into black and white
chessboards before your eyes, while the night sky above reorganises into
a vortex of stars from which animals rendered in luminescent, pin-point
dots emerge and whisper the secrets of the universe.
Devices like this are depicted in various works of science fiction from
the cyberpunk genre. In the movie Brainstorm (Trumbull 1983), scien-
tists have invented a brain-computer interface (BCI) that can play and
record experiences, which they demonstrate with a first-person perspec-
tive recording of someone rushing down a water slide. Elsewhere, in
William Gibson’s Neuromancer (1984), the ‘SimStim’ is a sensorial appa-
ratus that records and broadcasts a person’s sensory input, much like a
video recorder, but for experiences. A similar device, the SQUID, was
also depicted in the movie Strange Days (Bigelow 1995). Consisting of a
network of sensors that fits on the head, the SQUID allows the wearer
to play and record sensory experiences and physical sensations stored on
minidisc. In the film, the device is used to do ‘playback’: street slang
for illicit, addictive, virtual thrills, which leave people strung out with
their brains fried from sensory overload. In the story, an underground
economy has sprung up around the SQUID, in which crimes are being
committed to record rushes of adrenaline, which can then be sold to
addicts looking for their next vicarious fix. While the SQUID is used
to capture normal-waking consciousness, elsewhere in the anime film
Paprika (Kon 2006), a BCI called the DC Mini allows dreams to be
recorded and played back on a laptop computer, providing a different
take on a similar idea.
These cyberpunk works highlight the ethical risks that such specula-
tive sensorial technologies could bring. Yet fictional sensory recorders
like the SimStim or the SQUID are logical extensions of what was
already becoming possible with video technologies in the late twentieth
century. For McLuhan (1964), television and video were extensions of
the human nervous system, which allowed us to tap into hallucina-
tory global networks of sensory experience. Fictional sensory recording
5 Sensorial Apparatus: Interactive Projects 109
Delirious Hellscapes
In the 1990s, video games were rapidly advancing. In a short period
of time, the pseudo 3D graphics of first-person shooter (FPS) games
like id Software’s Doom (1993) were quickly replaced by the true 3D of
Quake (1996). Graphics accelerators like the 3dfx Voodoo Graphics card
became essential hardware for anyone using a PC to play video games.
3D hellscapes could be rendered in exquisite detail like never before.
Slotting a Voodoo Graphics—or better yet, two Voodoo2 cards in an
SLI (Scan-Line Interleave) configuration—would allow the performance
of games to skyrocket, while also unlocking a variety of enhance-
ments including higher resolutions and OpenGL effects such as texture
smoothing, transparent water, fog, and coloured lighting. When the first
Unreal (Epic MegaGames and Digital Extremes 1998) game hit the
shelves, it harnessed many of these possibilities in exciting new ways. I
distinctly remember playing this game for the first time. In the opening
sequence, your character (prisoner 849) must make their way out of a
crash-landed prison-spacecraft. On the way out, you crawl through an
air duct, which is filled with smoke and bathed in green light. What
struck me most was the way in which you could almost smell and taste
the smoke in this scene. Somehow the graphical effects, together with the
immersive audio in this sequence, seemed to activate the multimodality
of your senses, invoking past experiences of smoke. For me, the taste
was synthetic smoke of the kind produced by fog machines, which I
had probably encountered before at school discos or playing Quasar (a
form of laser-tag) in Bournemouth. This is the real power of video game
engines, and a big part of what makes 3D games appealing. With just
a few textures, polygons, sounds, and visual effects, designers can paint
rich, immersive worlds that activate the senses. Our multimodal interpre-
tation of these environments completes the picture, filling in the blanks
as we make sense of them in relation to our past interactions and memo-
ries of the real world. The environments may be virtual, but in some
ways, we can experience forms of presence and embodiment that make
us feel as though we are really there. Today, titles like Half-Life: Alyx
extend these ideas in VR, providing uncanny hyper-realistic audio-visual
environments that one can almost taste, touch, smell, and feel.
Quake Delirium
[Diagram: keyboard and mouse control and a MIDI input controller feed the game, altering its audio and graphics]
Fig. 5.2 Quake Delirium in operation, with various graphical distortions representing a hallucinatory perceptual state
Hellblade: Senua’s Sacrifice (Ninja Theory 2017) are just two examples of
games that represent hallucinations through morphing graphical effects
and corresponding sounds. Both these games use similar ideas to those
outlined here, but do so with more sophisticated game engines, providing
more polished results that readers interested in these techniques should
experience. Certainly, there is utility to these approaches, which can
allow game designers to enrich storylines through depictions of
hallucinations and ASCs. In the case of Quake Delirium EEG, the use
of biofeedback may also indicate exciting new ways to experience these
psychedelic simulations interactively, and perhaps we will see developers
exploring these ideas in the future too.
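As a rough sketch of how a single control signal (whether from a MIDI fader or an EEG reading) might warp several engine parameters at once, the Python fragment below maps a normalised intensity value to a set of graphics and audio settings. The parameter names and ranges are illustrative assumptions, not those used in Quake Delirium itself.

```python
def delirium_params(intensity):
    """Map a hallucination 'intensity' (0.0-1.0) to a bundle of
    hypothetical engine parameters, in the spirit of warping
    several graphics and audio settings from one control value."""
    i = max(0.0, min(1.0, intensity))  # clamp to the valid range
    return {
        "fov": 90 + 60 * i,                   # widen the field of view
        "color_shift": 0.5 * i,               # hue rotation amount
        "time_scale": 1.0 - 0.3 * i,          # slow the game clock
        "filter_cutoff_hz": 8000 - 6000 * i,  # muffle the audio
    }
```

Driving `intensity` with a slowly rising envelope would reproduce the gradual onset of the hallucinatory state described earlier, while biofeedback input would let the player's own physiological state steer the distortions.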
Psychedelic Apparatus
Go to bed wearing an elegant headband, and in the morning wake
up and access a video recording of your dreams, playing back any
section you wish with transport facilities for play, pause, rewind, and fast
forward. This is the alluring possibility suggested by the DC Mini in the
anime movie Paprika (Kon 2006), a fictional BCI that records a person’s
dreams. The concept of capturing such ephemeral experiences is an inter-
esting one, since dreams are otherwise hard to recollect and explain or
share with others. Yet it is not inconceivable that such a technology could
one day allow this by using some form of neuroimaging, generating a
visualisation from the data. With such a device, perhaps we could also
record the visual and auditory components of sensory experience during
ASCs, thereby generating visualisations that begin to resemble what one
might see or hear during hallucinations. Taking inspiration from the
DC Mini, Psych Dome is an interactive audio-visual project that uses a
consumer-grade EEG headset to control a psychedelic visualisation with
a corresponding soundtrack. The project was first presented in a mobile
fulldome at Wrexham Glyndŵr University (16 October 2013).
Psych Dome
[Diagram: EEG signals pass from the headset via BrainWave OSC, which sends OSC messages to Processing (visuals) and Max/MSP (sound)]
device, while video outputs to a projector via the display adapter. For
this project, an inflatable mobile fulldome was used, in which a single
projector was directed at a convex mirror that reflected light onto a
hemispherical projection screen inside the dome.13
The sounds and visuals generated by the respective Max/MSP and
Processing applications follow a pre-determined structure. Each time the
project runs, sounds and images are produced for 1 minute 40 seconds
using a generative system, where the resulting audio-visual materials
change based on the real-time input data from the EEG headset. Media
5.3 provides an example of the audio-visual materials that the project
generates, and visual evidence of the project running in the mobile full-
dome. Table 5.1 shows the compositional structure of the piece. From
0:00 to 0:24,14 we see the onset of spiral patterns based on Klüver’s
form constants, while high-frequency rhythmic sounds are heard. Next,
from 0:24 to 0:31, a tunnel of lines is seen, while we hear a bass tone
and drone. From 0:31 to 0:41, the visualisation provides a different
spiral pattern, accompanied by high-frequency percussive sounds. This
is followed from 0:41 to 0:58 by another spiral variation and mandala,
while rhythmic sounds and filtered noises are generated (Fig. 5.4). At
0:58–1:15, a tunnel of triangles is accompanied by a bass tone and
drone; then at 1:15–1:40, there is a further spiral, and high-frequency
rhythmic sounds, before the piece terminates. While the timing of each
of these sections remains consistent between performances of the piece,
EEG modulates aspects of the sounds and visualisations that are gener-
ated. Hence, the EEG signals affect oscillator frequencies and various
DSP properties, while also transforming the visualisations by causing
temporal shifts in colour, transparency, form, and size attributes of shapes
Fig. 5.4 Spiral animations based on visual patterns of hallucination and EEG
signals in Psych Dome
and lines. As before, these gradual changes reflect the shifting perception
that one may experience during an ASC, and so Psych Dome provides
an interactive, biofeedback-driven audio-visual experience that visualises
psychedelic hallucinations with a corresponding synaesthetic audio track.
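The fixed section structure with EEG modulation described above might be sketched as follows. This Python illustration derives its section boundaries from the timings given in the text; the scene names and the 50–150% size mapping are assumptions made for the example.

```python
# Section boundaries in seconds, following the timings in the text
# (1:15 = 75 s, 1:40 = 100 s); the EEG value then modulates
# parameters within this fixed structure.
SECTIONS = [(0, 24, "spiral"), (24, 31, "line tunnel"),
            (31, 41, "spiral variation"), (41, 58, "spiral + mandala"),
            (58, 75, "triangle tunnel"), (75, 100, "spiral")]

def scene_at(t, eeg):
    """Return the scene name and an EEG-modulated size for time t.

    The structure is pre-determined, but the EEG value (0.0-1.0)
    scales the drawn shapes between 50% and 150% of base size.
    """
    for start, end, name in SECTIONS:
        if start <= t < end:
            return name, 0.5 + eeg
    return "end", 0.0
```

In the actual project, comparable mappings drive oscillator frequencies and DSP properties in Max/MSP and colour, transparency, form, and size in Processing.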
Altered Simulations
The final interactive project that will be discussed in this chapter is ASC
Sim, a prototype made with the Unity game engine, which explores
approaches for representing auditory hallucinations through sound. The
project was informed by the outcomes of an earlier empirical study
carried out in the context of the Affective Audio research group at
Wrexham Glyndŵr University, in which nearly 2000 qualitative self-
reports of intoxication were analysed in order to collect experiential
accounts of auditory hallucinations (Weinel et al. 2014b; Weinel and
Cunningham 2017). Using this dataset, ASC Sim takes three features
of auditory hallucination described in these experience reports, and
provides interactive designs based on these. In doing so, the project
shows how collaborative, interdisciplinary work involving empirical
studies may feed into the design of interactive systems that simulate
psychedelic ASCs.
ASCs can vary in their intensity, and the concept of structural dynamics
that move through phases of onset, plateau, and termination (see
Chapter 2) was utilised in many of the projects discussed so far. The
Quake Delirium and Psych Dome projects utilised this idea as an under-
lying design principle, so that representations of hallucination onset
gradually, increasing in intensity over time; reach some form of plateau
Fig. 5.5 Screenshot from the ASC Sim project. Coloured boxes in a simple game
scene provide sound sources located in 3D space for testing purposes. Meters
for ‘attention’ and ‘enhancement’ (top-left) indicate the current values of these
properties
Fig. 5.6 Diagram showing the ‘selective auditory attention’ mechanism. When
the player attends an object, all unattended sound sources fade out
value, which ranges from 0.0 to 1.0. ‘Enhance level’ is also represented
with a meter on the user interface (Fig. 5.5). As indicated in Fig. 5.7,
increasing the ‘enhance level’ causes the sound to crossfade between three
different pre-recorded versions of the source sound. These provide ‘dull’,
‘medium’, and ‘bright’ variations, for which graphic equaliser DSP effects
have been used to reduce or enhance the frequency content of the orig-
inal source sound. Increasing the ‘enhance level’ crossfades between these,
so the same sound source is heard, but it becomes much brighter, with
more high-frequency content as the value approaches 1.0. This mech-
anism simulates the subjective experience of sounds being enhanced
through a more detailed frequency spectrum.
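The crossfade between the 'dull', 'medium', and 'bright' versions can be sketched as a simple weighting function. This is a Python illustration assuming linear fades; ASC Sim itself implements the mechanism with C# scripts in Unity.

```python
def enhance_weights(level):
    """Crossfade weights for the 'dull', 'medium' and 'bright'
    versions of a sound, given an enhance level in 0.0-1.0.

    0.0 plays only the dull version, 0.5 only the medium one,
    and 1.0 only the bright one, with linear fades in between.
    """
    level = max(0.0, min(1.0, level))
    if level <= 0.5:
        t = level / 0.5
        return (1.0 - t, t, 0.0)   # dull -> medium
    t = (level - 0.5) / 0.5
    return (0.0, 1.0 - t, t)       # medium -> bright

def mix(dull, medium, bright, level):
    """Sum the three pre-recorded versions, sample by sample."""
    wd, wm, wb = enhance_weights(level)
    return [wd * d + wm * m + wb * b
            for d, m, b in zip(dull, medium, bright)]
```

Because the three versions are the same recording with different equalisation, the crossfade is heard as the one sound growing brighter rather than as a change of source.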
‘Spatial disruption of sound’ is reflected in the virtual environment by
manipulating the spatial location of sounds. This is achieved by using
Fig. 5.8 Diagram showing the ‘spatial disruption of sound’ mechanism. Each
sound source moves in oscillating spatial patterns around the object with which
it is associated
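The oscillating spatial patterns of Fig. 5.8 might be sketched as follows. This is a Python illustration of the movement pattern only; ASC Sim animates a child object with a C# script, and the radius and rate values here are assumptions.

```python
import math

def disrupted_position(base, t, radius=2.0, rate=0.5):
    """Offset a sound source from its object's position `base`
    (x, y, z) along a slow Lissajous-style orbit at time t, so the
    sound seems to detach from the thing that produces it."""
    x, y, z = base
    return (x + radius * math.sin(2 * math.pi * rate * t),
            y + radius * math.sin(2 * math.pi * rate * 0.5 * t),
            z + radius * math.cos(2 * math.pi * rate * t))
```

Evaluated once per frame, the function keeps each source within `radius` units of its object while its apparent location continually drifts, simulating the spatial disruption of sound described in the experience reports.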
∗ ∗ ∗
Through the course of this chapter, we have seen how three distinct
projects, realised through various forms of creative coding and sound
design, represent psychedelic hallucinations through interactive systems.
These projects draw together the approaches in sound, interactivity, and
audio-visual design that were outlined in the previous chapters. In doing
so they provide interactive audio-visual forms that reflect shifting distor-
tions to perception, and a variety of other features that resemble what
one might see or hear during a hallucination. Of course, these are all
very much prototypes, and perhaps what matters most here is not the
end result, but the idea they all point towards: the possibility of repre-
senting subjective experiences, including ASCs, through specific uses of
game-engine simulations. This idea can be described as ‘avatar-centred
subjectivity’, since it modifies the way graphics and sounds are used to
communicate the virtual, subjective experience of a game avatar from
Notes
1. For a further discussion, see also the Adam Curtis documentary HyperNor-
malisation (2016).
2. The potential of psychedelic ASCs for treating depression has been
suggested (e.g. Carhart-Harris et al. 2016a), and it is possible that immer-
sive technologies capable of inducing ASCs might also have therapeutic
applications.
3. For further discussions regarding representations of subjectivity in VR,
see also Weinel et al. (2018) and Weinel and Cunningham (2019).
Of wider relevance to these discussions are the debates surrounding pres-
ence and immersion (Slater and Wilbur 1997), which address the ways in
which users feel a sense of embodiment in VR (Sanchez-Vives and Slater
2005; Slater 2009; Landau et al. 2020); and the role of sound in providing
this (Grimshaw-Aagaard 2019).
4. While this may not yet be possible with available technologies, there
is early research that explores the generation of images based on
neuroimaging techniques using fMRI; see Nishimoto et al. (2011).
5. Pure Data is a visual programming language similar to Max/MSP (both
languages were developed by Miller Puckette).
6. For a further discussion of these systems and the idea of remixing video
games, see the original article on Quake Delirium (Weinel 2011).
19. Dissociation between object and sound source is achieved in the Unity
object hierarchy by creating a separate child object to the cube, which the
sound source is then attached to. The child is animated with a C# script,
while the parent object remains static.
20. While ASC Sim uses C# scripts and the Unity audio engine, similar mech-
anisms could alternatively be devised using combinations of scripting and
game audio middleware such as FMOD or Wwise.
21. Informing sound design through the use of empirical studies may provide
a route towards more accurate representations of experiences such as audi-
tory hallucinations; see also Weinel, Cunningham, and Griffiths (2014b);
Weinel and Cunningham (2017).
22. The concept of ‘avatar-centred subjectivity’ was first proposed in Weinel
and Cunningham (2019).
23. For example, Weinel et al. (2018) discuss representations of autism
through specific uses of sound design and graphics in VR, in order to
raise public awareness.
6
Optical Geometry: VJ Performances
Selected Examples
Live in London
Building on my library of VJ loops and initial experiments, I decided to
create an audio-visual performance by mixing existing music in the form
of a DJ mix, while simultaneously providing my own original visuals
through a VJ mix. First, I recorded new visuals for existing pieces of
music, essentially creating original music videos for each track. For the
final performance, I then combined these videos together to make a
DJ/VJ mix, while also triggering other audio-visual elements. This DJ/VJ
mix was first performed under the alias Soundcat as part of a concert
organised by VJ London15 held at New River Studios (London, 12 July
[Diagram: a Korg MIDI controller and an Akai MPC Studio MIDI controller send MIDI control data to a laptop running VDMX, which provides audio-visual outputs and audio-visual monitoring]
An inverted colour effect used in this video produces a result that I find
reminiscent of the visual effects used in the hallucination sequences of
the movie Altered States (Russell 1980), while the moving figures recall
Hex’s Global Chaos (1993).
‘A London Sumtin’ by Code 071 (1992) uses scrolling patterns of acid
house smiley faces, and a détournement of the London underground
logo which I modified to read ‘Soundcat’. The mix cuts between this
visual and the purple ‘boxworld’ tunnel, with the ‘cycler’ graphic super-
imposed over it. Here an oscilloscope effect is seen, which was generated
using audio analysis in VDMX. Later in the clip, a fractal sequence
designed in the software Mandelbulb 3D is also used.
‘Wipe the Needle’ by the Ragga Twins (1991) begins with a view
of a Photoshop user interface, within which the ‘dreamscape’ visual
is seen rotating, superimposed over the ‘boxworld’ tunnel (Fig. 6.6).
Behind this, a scrolling graphic displays Discordian symbols including
an animated apple with the number 23 written on it.16 The back-
ground cuts to scanning tropical palm trees (photographed from a trip to
Rapa Nui, and modified with a variation of the ‘scanshroom’ Processing
sketch discussed earlier), and the animated ‘purple interference’ pattern.
We then see an explosion of pop-up windows (‘they live’), which are
synchronised with the beat of the music using VDMX’s beat detection
functionality, and terminate with the ‘blue screen of death’ image of a PC
crashing with a fatal system error. Taking inspiration from approaches
used in the vaporwave genre, the various visuals of user interfaces, trop-
ical palms, and failing computer systems expose the illusory mechanisms
behind media fantasias (see also Chapter 7, p. 191).
‘Mil Vidas’ by Bixiga 70 (2015) also uses a 3D dancing figure. In the
background, various red and yellow patterns move to the beat. These
visuals were made using stop-motion animation of jagged pieces of paper
over a light-box, resulting in short ‘one-shot’17 clips, which were trig-
gered in time to the beat using the Akai MPC Studio. These were also
processed with a mirror effect in VDMX. This background layer of the
video was first recorded using the Blackmagic Hyperdeck Shuttle, and
the footage was then recombined with the dancing character and other
visuals, adding video feedback and other effects.18
‘Animal’ by Jaguar (1998) uses a ‘cryptic messages’ effect similar to
the one used in ‘Yes to Satan’, where horizontal patterns are triggered
rhythmically in time with the beat. To produce more accurate time-
synchronisation between the breakbeats and the visuals, I loaded the
‘Animal’ track into a music tracker (Renoise) and programmed a MIDI
sequence that matches key percussive elements in the beat. This MIDI
sequence was then used to trigger the one-shot ‘cryptic messages’ visuals
in time with the beat, in real-time. Behind these, the VJ mix cuts between
the ‘trancecore’ and ‘ghosty’ visuals.
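The mapping from a programmed MIDI sequence to one-shot visual triggers can be illustrated with a minimal sketch (in Python rather than the actual Renoise/VDMX setup; the note numbers and clip names here are hypothetical):

```python
# Sketch: mapping a programmed MIDI sequence to one-shot visual triggers.
# In the actual workflow the sequence was programmed in Renoise and routed
# to VDMX; the values below are purely illustrative.

# MIDI events as (time_in_beats, note_number) pairs, e.g. matching kick and
# snare hits in the breakbeat.
midi_sequence = [(0.0, 36), (1.0, 38), (1.5, 36), (2.0, 38), (3.5, 36)]

# Each note number triggers a different one-shot 'cryptic messages' clip.
note_to_clip = {36: "cryptic_a.mov", 38: "cryptic_b.mov"}

def schedule_one_shots(events, mapping):
    """Return (time, clip) pairs for every note that has a clip assigned."""
    return [(t, mapping[n]) for t, n in events if n in mapping]

triggers = schedule_one_shots(midi_sequence, note_to_clip)
```

The point of the approach is that the triggers are derived from a sequence aligned to the music, rather than from real-time beat detection, giving tighter synchronisation with the breakbeats.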
‘Ape Shall Never Kill Ape (Twin Tower Mix)’ by U.N.K.L.E featuring
Nigo & Scratch Perverts (1998) takes a different approach, using a ‘video
mashup’ technique.19 The music track ‘Ape Shall Never Kill Ape’ uses
various audio samples from trailers for the Planet of the Apes films (1968–
1973). The main visual for this track was made by lining up footage from
the original Planet of the Apes trailers with the samples in the music,
thereby providing images one might associate with these sounds. This
video track was then mixed in VDMX, where it was overlaid with oscilloscopes, the ‘plasma’ visual, scrolling patterns of animated ape heads, and the statue of liberty smoking a spliff (the latter an irreverent reference to the ending sequence of the first Planet of the Apes [Schaffner 1968]).
for ‘Acid Rain VIP (Breakage Final Chapter Mix)’ (which itself is a ‘dub
version’ of the original Equinox track) was produced using a different set
of visual filters to give an electric pink version of the video, which I can
choose to use in some performances instead of the blue version.
Fig. 6.8 Diagram showing the live setup used for the VJ London performance
Fig. 6.9 Performing as Soundcat at VJ London (New River Studios, 12 July 2018)
(Photo credit: Laurie Bender [L’Aubaine])
first cut through the mix at 11:08, merging with the rhythms of Bixiga
70, while also creating a rich composite of rhythmic visuals from the
two videos. Punctuated by another audio-visual one-shot at 12:58, the
two tracks are allowed to ride together until ‘Mil Vidas’ runs out at
13:13. ‘Animal’ ends with an abrupt ‘deck stop’ slow-down effect, which
digitally simulates the effect of stopping a turntable.
The second half of the mix drops down to a slower tempo, begin-
ning at 14:06 with the sounds of U.N.K.L.E featuring Nigo & Scratch
Perverts’ ‘Ape Shall Never Kill Ape’, with the new Planet of the Apes
visuals. During this track, various audio-visual one-shots are used, and
from 15:31–15:49 the mix cuts in with beats from ‘Bed Jam Session’.22
At around 18:18, a filter effect is applied together with a deck stop, as
the mix transitions to Tipper’s ‘LED Down’, accompanied by video game
visuals and motorway footage.
At 19:52 the mix brings the tempo back up by bringing in Aphrodite’s
‘Siren Bass’, in double time relative to ‘LED Down’.23 From 21:48–
21:57, the drum & bass rhythms of the Aphrodite track temporarily
drop out, which occurs due to a technical issue with the software, and
is masked using an audio-visual one-shot. ‘Acid Rain VIP (Breakage
Final Chapter Mix)’ drops into the mix at 22:38, superimposing the
electric-blue 3D sculpture over the flickering video game visuals of ‘Siren
Bass’. The two tracks ride together until 22:53, with emphasis gradually
shifting on to ‘Acid Rain VIP (Breakage Final Chapter Mix)’ via a gradual
crossfade and various rhythmic cutting effects on the mixer. During this
track, we see various deformations of the 3D sculpture as it squashes and
flies apart in correspondence with relentlessly deconstructed breakbeats,
until 27:31 when another abrupt deck stop is used to transition to the
final track. ‘Message in a Bottle’ provides an irreverent finale, combining
raucous punk rock with various collisions of psychedelic visual noise and
sensory overload. As the track runs out, an audio-visual one-shot adds
the ‘soundcat’ stamp and brings the DJ/VJ mix to a close.
Visual Artworks
My DJ/VJ performance as Soundcat developed from the work described
earlier in this book, while also exploring some new directions. Alongside
working on this project, I continued to compose various other works
of electronic music and visual art. I have often worked on visual art
in parallel with music, sometimes incorporating this into audio-visual
works or using it as record sleeve artwork, as noted earlier for the Entoptic
Phenomena in Audio vinyl (Weinel 2014; Chapter 2, Fig. 2.3, p. 56); or
the Flood City (2015) 10 dubplate (acetate) records by Teknoshaman
(2015), each of which featured unique hand-produced artwork. In order
to further consider how work can traverse the boundaries between the
sonic and the visual, in this section I will discuss some visual artworks
I produced during this period, thereby illustrating the broader context
from which the VJ work emerges. These synaesthetic visual artworks can
be considered as complementary pieces, which inform the development
of, and respond to, my VJ work, providing space to experiment with
different symbolic forms and associations that can be formed between
sounds and images.
Synaesthetic Paintings
31 Seconds (2017, Fig. 6.13) is one of several works that explore the
use of typographic references to music. The airbrushed text ‘31 seconds’
references a sample used in the drum & bass track ‘Valley of the Shadows’
by Origin Unknown (1996). This was placed over an acrylic flow24
Fig. 6.14 Bug Powder Dust, acrylic and collage on canvas, 25.4 × 30.5 cm
referencing a track of the same title by Bomb the Bass featuring Justin
Warfield (1995). In this case, acrylic flow techniques and airbrushed
stencilling were used to provide a cityscape with skeletons flying above.
The gothic impression of flying skeletal figures draws influence from
the ‘x-ray’ figures of William Burroughs’s ‘shotgun paintings’ (Riflemaker
2005), in correspondence with the Bomb the Bass song, which is based
on the Burroughs (1959) novel The Naked Lunch.
Seasons in the Abyss (2017, Fig. 6.15) is based on the album of the same
name by the thrash metal band Slayer. Acrylic flow techniques were used
to create the main background for this painting, providing abstract white
and dark reddish-brown shapes, from which it is possible to perceive
forms in the manner one might do with a Rorschach inkblot. In this
sense the painting may unlock the unconscious in the manner of surre-
alism, perhaps allowing one to see various screaming faces that reflect
the hellish themes of Slayer’s Seasons in the Abyss (1990). On top of
the abstract images, various red designs are rendered, which may suggest
‘cryptic messages’ of an occult nature.
Holo Point Break (2018, Fig. 6.16) uses a multilayered approach that
bears comparison with my VJ productions. The background layer is
created with red and purple acrylic flow painting techniques, over the
top of which we see wireframe geometric shapes, audio waveforms, and
other fragmented elements. This provides a visual structure similar to
my VJ productions, where shapes, waveforms, and other fragments are
superimposed over abstract psychedelic textures. ‘Holo Point Break’ is
one of several paintings that take inspiration from street art, such as the
work of New York graffiti and hip-hop artist Rammellzee,26 whose work
crossed boundaries between the sonic and the visual. The breakbeat hard-
core discussed in this chapter is interwoven with hip-hop culture through
the producers’ use of breakbeats and hip-hop samples, and so it is logical
to explore visual connections of street art when visualising these musical
forms.
Many of my paintings explore visual motifs that were later developed into VJ materials, or vice versa. In three of my paintings, Enter Soundcat (2017), Soundcat 2000 (2017), and Soundcat S-101 (2017),
I drew connections with VJing by integrating VJ loops, which can be
Fig. 6.16 Holo Point Break, acrylic and collage on canvas, 50.8 × 76.2 cm
6 Optical Geometry: VJ Performances 163
activated using AR, when viewing the paintings through a mobile phone
(Media 6.3).
Enter Soundcat (Fig. 6.17) uses a background with yellow and red
acrylic flow painting and torn newspapers. Hardcore punk imagery is
placed over this background, including ‘soundcat’ written as a détourne-
ment of the Suicidal Tendencies band logo, and a mannequin head,
which references the album artwork for The Joke’s on You by Excel
(1989), which the song ‘Message in a Bottle’, discussed earlier, was taken
from. The piece also includes typographic ‘cryptic messages’ related to
visual hallucinations. Three rectangular prints of stills from my VJ loops
are also used as collage elements, which are brought to life when viewed
through AR, thereby integrating moving images into the painting.
Soundcat 2000 (Fig. 6.18) takes its name from the Rainbow 2000 elec-
tronic dance music festival in Japan (1996, see Masaaki Kobari 2012),
and the logo is a détournement of a design created for that festival by
The Designers Republic.27 This element associates the painting with
1990s rave culture in Japan, and still images of VJ loops are included
as collage elements, which are animated when viewed through an AR
device. As before, these are placed over an acrylic flow background, and
various other designs related to visual hallucinations are laid on top of
this, including waves of triangles programmed in Processing, which were
rendered in orange neon using digitally cut stencils and an airbrush.
Soundcat S-101 (Fig. 6.19) also uses an acrylic flow painting back-
ground, over which the title of the painting is rendered in text. This
text uses the same typography as the titles for the film Terminator 2:
Judgement Day (Cameron 1991), and references the Cyberdyne Systems
Model 101 terminator from the movie. The cinematic reference used
in this painting is also a musical one, since science-fiction movies were
widely sampled in 1990s hardcore rave tracks, such as Terminator by
Metal Heads (1993). These visual elements can therefore be understood
as a form of ‘visual sampling’ analogous to the use of audio sampling of
film quotations in drum & bass tracks, like The Terminator. The film
is also referenced through the use of airbrushed bullet holes across the
painting, and a red wireframe animation of a Neural Net CPU (the
Terminator’s CPU in the film) exploding. Beneath this, another image
shows green ‘cryptic messages’, which were taken from an animated
Fig. 6.17 Enter Soundcat, acrylic and collage on canvas, 30.5 × 40.6 cm
Fig. 6.18 Soundcat 2000, acrylic and collage on canvas, 30.5 × 40.6 cm
Fig. 6.19 Soundcat S-101, acrylic and collage on canvas, 30.5 × 40.6 cm
∗ ∗ ∗
have in response to music. Yet just as languages evolve over time as they
are continually used and revised, the language of VJing may not only
restate existing visual associations, but also redefine them. In doing so,
the VJ channels music, to reform and renew our multimodal visual inter-
pretations of it, thereby eliciting the shape of synaesthetic hallucinations
to come.
Notes
1. For a further discussion of VJ projections at Mo:Dem festival, see also
Weinel (2018d, p. 131).
2. The audio-visual performing arts festival Splice is also discussed in Weinel
(2018).
3. For a further discussion of the affective and representational functions
of VJ performances in relation to music, see Weinel (2018d). Symbolic
correspondences are also considered in Weinel (2020).
4. Breakbeat hardcore music (also ‘hardcore rave’, ‘old skool rave’, or ‘UK
hardcore’) is a form of rave music popularised in the 1990s by artists such
as The Prodigy and others, which is based around sped-up drum breaks
sampled from funk and hip-hop tracks. For a further discussion see Weinel
(2018c, pp. 86–87).
5. Many demoscene videos are available online, for a classic example see
Future Crew’s ‘Second Reality’ (DemosceneVids 2015).
6. The VJ Loops demonstration video is provided for educational purposes
only; please do not use these video clips in your own VJ performances
without permission.
7. In computer graphics, a sprite is a two-dimensional bitmap image, as
commonly used in video games to represent moving characters.
8. Pixel art is a style of video game art in which graphics are designed and
edited at the pixel level.
9. This slightly unusual setup was used partly for convenience, since at the
time I was regularly flying back and forth from Denmark, and the porta-
bility of this setup allowed me to work on animations while on the
move.
10. For a technical description of how to program plasma effects, see Vande-
venne (2004).
11. HAP is a video codec for Mac OS X, which performs image decompres-
sion on the computer’s video card, thereby reducing the CPU usage when
playing back the videos. At the time this was the preferred video codec to
use for VJ performance in VDMX.
12. For examples and discussion of 1990s rave flyers and other imagery, see
Savage (1996), Berlin (2018), and Tomlin (2020).
13. In this section, ‘representational properties’ refers to features of sound
and audio-visual media that represent spatial locations, places, events,
or concepts; while ‘affective properties’ communicate mood and emotion
(see Weinel 2018c). For a discussion regarding the affective properties of
motion graphics, see Bartram and Nakatani (2010).
14. For an example of one of my practice mixes, see my unofficial VJ mix for
Paul Oakenfold’s ‘Goa Mix’ radio DJ set from 1994 (Soundcat VJ 2018).
15. VJ London (http://vjlondon.com/) are a London-based VJ collective.
16. Discordianism is a philosophical movement sometimes considered a
parody religion, which is based on the worship of Eris, the goddess of
chaos in ancient Greek mythology (see Hill and Thornley 1994). Both the
apple and the number 23 are discordian symbols, and the latter has been
used in rave culture by the Spiral Tribe collective. In an interview with
Mark Harrison of Spiral Tribe, he attributes use of the number 23 to its
significance as an ‘anti-icon icon’ (Transpontine and Harrison 2013).
17. The term ‘one-shot’ is borrowed from electronic music production, where
‘one-shot’ samples are short sampled sounds that are not looped. In the
context of this chapter, ‘one-shot’ is used to describe short non-looping
video samples.
18. The approach described here, whereby improvisational video mixing is
undertaken in the studio, and the materials are iteratively reprocessed, was
partly inspired by the studio techniques used by dub reggae artists. For
example, Lee ‘Scratch’ Perry recorded his dub mixes by repeatedly mixing
down (or ‘bouncing’) the tracks on four-track and two-track tape recorders
in order to add more elements (Katz 2006, pp. 175, 330; see also Weinel,
2018c, pp. 78–79).
19. Video mashup is a style of audio-visual performance in which video music
is constructed through rhythmic collaging of audio-visual samples. For
example, see the music videos Timber by Coldcut and Hextatic (1997) or
‘The Wolf of Wall Street (Eclectic Method Chest Thump Mix)’ by Eclectic
Method (2014).
20. The approach used in this video draws influence from the Autechre
Gantz Graf video by Alex Rutterford (Autechre and Rutterford 2002).
Rutterford says his work on this video was partly inspired by geometric
hallucinations seen on LSD (Kilroy and Rutterford 2010).
21. This timestretching effect was made with the Akaizer (http://the-akaizer-project.blogspot.com/) application, which simulates the timestretching
features of popular Akai samplers such as the S950/S1000/S2000/S3000
series, which were widely used in hardcore rave music during the 1990s.
22. This section attempts to replicate scratch-DJ techniques, but this is diffi-
cult with the Akai AMX mixer and could perhaps be improved by
exploring the use of alternative controllers such as turntables with control
vinyl.
23. DJs often move between tempos by mixing tracks that are half or double
the tempo of each other. For an example, see DJ Food & DK’s (2001)
transition between ‘Mirror in the Bathroom’ by The Beat and ‘Square Off ’
by Mask, on the Solid Steel—Now Listen mix.
24. Acrylic flow painting (or ‘acrylic pouring’) is a technique where additives
are mixed with acrylic paint to improve the flow properties of the paint.
Multiple colours of paint can then be poured on to a surface resulting
in interesting colourful patterns similar to those produced by marbling
techniques.
25. For example, see rave flyers for Spiral Tribe events and related sound
systems, as documented in Seana Gavin’s visual diary Spiralled (2020).
26. The work of Rammellzee was exhibited at Rammellzee: A Roll of the Dice
(Laz Inc. 2018).
27. The Designers Republic (https://www.thedesignersrepublic.com/)
produced various graphic designs for electronic dance music artists,
events, and record labels such as Warp.
28. These designs draw upon approaches for designing abstract algorithmic
computer text and music notation that are explored in Manfred Mohr’s
computer artworks such as P-021 (1970–1976). This and other related
works are included in the V&A’s collection of computer art.
29. Of course, it should be acknowledged that this chapter reflects my own
personal journey, and other VJs may use entirely different approaches and
workflows that are equally valid. The discussion in this chapter should in
no way be taken as a definitive ‘guide’ to VJing.
7 Future Sound Dream: Virtual Reality Experiences
we have now are extended. For example, the cybernetic music hallu-
cination sequence in the story is actually based on various real-world
equipment, such as keyboards, modular synthesisers, electronic reed
instruments, computer sequencers, biofeedback technologies, and music
visualisations. Although these technologies haven’t yet been combined to
generate synaesthetic hallucinations in quite the way that the episode
suggests, they soon could be. Indeed, in Explosions in the Mind we
have already looked at psychedelic electronic music; computer music
systems that augment acoustic instruments; synaesthetic music visuali-
sations; biofeedback-driven simulations; and VJ performances. It is not
so difficult to imagine these forms being extended to provide immersive
audio-visual experiences of music that surround and engulf individuals,
giving them a feeling of presence as they drift through glistening virtual
landscapes of sound and image.
Moving towards the design of such immersive music visualisations,
in this chapter I will discuss Cyberdream (2019–2020), a VR experience
that extends the concepts of VJ performance described in the previous
chapter, placing these compositional forms in an immersive, interactive
context, which allows users to fly through synaesthetic virtual worlds of
electronic music. First, we shall look at the initial iteration of this project
for the Oculus GearVR headset, which provided a series of symbolic
virtual environments accompanied by fragments of hardcore rave and
vaporwave music. Following this, we will examine a later iteration of
the project for the Oculus Quest, which provides various improvements,
allowing a more seamless journey in which the user can also create parts
of the experience by using the controllers to ‘paint with sound’. Through
the discussion of Cyberdream, in this chapter, we shall see how elec-
tronic music, creative coding, VJing, and VR can be brought together
to compose psychedelic visualisations of sound.
Dreaming in Cyberspace
Media 7.1 Cyberdream GearVR demonstration video (4 minutes 57 seconds) and software
Cyberdream extends the approaches of VJ performance, bringing them
into the domain of immersive technologies. While VJing allows audi-
ences to watch visualisations of music via rectangular projections,
VR holds the potential for immersive experiences that surround and
completely engulf the user. The feeling of ‘being there’ in a virtual space
is known as ‘presence’ (Slater and Wilbur 1997). Where it is provided,
presence in VR may allow users to feel as though they are actually there
inside psychedelic visualisations of music. Exploring this idea practically,
Cyberdream aims to provide an immersive rave music and vaporwave
experience for the Oculus GearVR.
Conceptually Cyberdream incorporates many of the ideas discussed
in the previous chapter, emerging as a visual synthesis of hardcore rave
music, vaporwave, and 1990s-style VJ visuals. As discussed, the visual
images associated with hardcore rave music in the 1990s were surrealistic
techno-utopian and dystopian visions. We see these images in the rave
flyers and VJ mixes of the era.4 These images were often developed
using various combinations of airbrushed art and/or computer graphics,
which though cutting-edge at the time, may now seem primitive relative
to newer forms of 3D rendering.
The aesthetics of these visual images have more recently been revis-
ited by vaporwave, an Internet-borne music genre emerging in the
early 2010s. Sonically vaporwave uses loops of 1980s and 1990s corpo-
rate lounge music, advertisements, and banal pop as source materials,5
creating a soundtrack that is deeply nostalgic for the capitalist optimism
of this period. Yet the loops are slowed down, repeating ad infinitum,
and warped as if playing from an old cassette player in a broken-down
hotel elevator. According to Tanner’s (2016) discussion, these flawed
representations reflect a critical view of the capitalist excesses of this
period. Drawing on Mark Fisher’s (2014) concept of ‘hauntology’, which
suggests that the present day is haunted by lost futures that were once
imagined, Tanner argues that vaporwave exposes the broken mechanisms
construction of the image; it shows us that the exotic illusion is not real,
it is synthetic and computer generated. Vaporwave is a utopian vision of
cyberspace rendered in low-polygon 3D, where an error in drawing gives
rise to an infinite trail of replicated desktop cursors.
Drawing these ideas together, Cyberdream is a virtual hallucination
through the broken techno-utopias of cyberspace, set to a soundtrack
of hardcore rave and vaporwave music. The project provides a journey
through a series of symbolic VR scenes that draw upon the visual
languages of these music genres in order to provide synaesthetic 3D
spaces that correspond with the music. The scenes are dystopian in that
they represent broken techno-utopian vistas in cyberspace, but they can
also be read as euphoric deconstructions, where these fragments once
liberated from their formal constraints become a playground of new
possibilities. In what follows, I will discuss the design of the music
and these 3D environments as they appear on the original version of
Cyberdream for the Oculus GearVR.
Symbolic Environments
The first version of Cyberdream was designed for the Oculus GearVR,
an untethered VR solution that runs on a mobile phone placed inside a
headset. As this device provides only limited controller facilities, Cyber-
dream was designed to provide an automated journey, in which the user
flies through a series of synaesthetic environments based around the rave
and vaporwave concepts described (Fig. 7.1). The project was created in
the Unity video game engine.
The menu screen of Cyberdream is based on the Fantazia New Years
Eve 1991/1992 rave flyer described earlier. A giant face hovers above a
wireframe 3D landscape beneath a pink sky. This landscape was techni-
cally constructed in Cinema4D and rendered as a skybox.9 At the top
of the screen, titles and instructions tell the player that they can begin
Fig. 7.1 Various still images showing scenes from Cyberdream GearVR
In this scene, the user flies across a bridge of oscillating tiles, which
were created with a C# script that modifies the size of the tiles using
a sine-wave equation. The location of the tiles changes the phase of the
waveform, thereby producing a wave effect. Statues of mysterious enti-
ties are situated at either side of the bridge on top of checkerboards. The
music in this scene is an acid techno/hard house track.
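The wave effect described above can be sketched as follows (a Python illustration of the underlying maths, rather than the project's actual C# script; the parameter names and constants are hypothetical):

```python
import math

def tile_scale(x, z, t, speed=2.0, wavelength=4.0, base=1.0, depth=0.5):
    """Scale factor for a tile at grid position (x, z) at time t.

    Each tile follows the same sine-wave equation, but its location offsets
    the phase, so neighbouring tiles peak at slightly different times and a
    wave appears to travel across the bridge."""
    phase = (x + z) / wavelength
    return base + depth * math.sin(speed * t - 2 * math.pi * phase)
```

In Unity such a function would typically be evaluated per tile in an `Update()` loop, writing the result to each tile's local scale every frame.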
As the camera flies into the pyramid, the scene transitions to an
infinity pool hovering in the sky (Fig. 7.1b). Giant mannequin heads
float in the pool, staring blankly into space. These symbols reference
the vaporwave tropes of exotic capitalist lifestyles and high fashionistas.
As we fly over the pool we hear a vaporwave track, before the camera
disappears into the eye of one of the heads.
The next scene revisits ideas developed through my VJ work discussed
in the previous chapter (Fig. 7.1c). The scene is based on a hand-
painted background, which was made by digitally scanning a painting
and manipulating it to produce a skybox. In this scene, a C# script is
used to animate objects in spiral patterns that draw transparent trails
behind them. This provides psychedelic patterns that relate to Klüver’s
(1971) visual patterns of hallucination. Musically the scene is accom-
panied by a track that draws on instrumental styles of UK garage and
grime.10
The scene fades into a field of pop-up computer windows suspended
in a clear blue sky (Fig. 7.1d). This scene develops the same idea used
in the ‘they live’ VJ visual described in Chapter 6, which was based
on the capitalist advertisements depicted in the science-fiction movie
They Live (Carpenter 1988). Pop-ups in the style of Windows 95 read
‘work’, ‘buy’, ‘watch TV’, ‘obey’, ‘consume’, ‘no thought’, ‘update status’,
and ‘take selfies’. These windows open and close, an effect technically
accomplished with a C# script, which modulates the size of the windows
using float values that change following a sawtooth waveform. An easter
egg11 in this scene is a window with text that references the cyberpunk
movie Johnny Mnemonic (Longo 1995). The scene includes a techno
soundtrack.
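The sawtooth modulation of the window sizes can be sketched as follows (a minimal Python illustration of the maths, not the project's C# script; the period and offset values are hypothetical):

```python
def sawtooth(t, period=2.0):
    """Ramp from 0.0 to 1.0 over each period, then snap back to 0.0,
    so a window scaled by this value grows open and then abruptly closes."""
    return (t % period) / period

def window_scale(t, index, period=2.0, offset_step=0.25):
    """Give each pop-up a phase offset so the windows open and close in turn
    rather than all at once."""
    return sawtooth(t + index * offset_step, period)
```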
Following this, we fly across a landscape of purple checkerboard
mountains and Grecian statues (Fig. 7.1e). Giant 3D cursors rain from
the sky, flickering black and white. These are based on the pixel art
default cursor of the Atari ST operating system, and the colour oscilla-
tions are generated with square wave values. The cursors bounce around
the scene chaotically, falling into the sea, where broken statues lie with
their heads bowed mournfully. The music in this scene is a breakbeat
hardcore track using ‘hoover’12 synthesiser sounds.
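The black-and-white flicker of the cursors can be sketched with a square wave (again a Python illustration of the maths, not the project's C# script; the frequency is hypothetical):

```python
import math

def square_wave(t, freq=4.0):
    """Return 1.0 or 0.0, alternating at the given frequency: the sign of a
    sine wave is quantised to two states, giving a square wave."""
    return 1.0 if math.sin(2 * math.pi * freq * t) >= 0 else 0.0

def cursor_colour(t, freq=4.0):
    """Greyscale RGB for a cursor at time t: white on 1.0, black on 0.0."""
    v = square_wave(t, freq)
    return (v, v, v)
```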
The next scene finds us suspended in an artificial blue sky once
again, surrounded by waves of brightly coloured cubes which flow across
the screen. Symbolically this scene references the Windows 95 artwork,
and is based on the idea of being inside a computer screen, where the
individual red, green, and blue (RGB) pixels become enlarged. The
soundtrack is one of the vaporwave tracks. This is one of the most
striking scenes in Cyberdream, which users have often remarked upon
during demonstrations of the piece. An aspect that seems to be partic-
ularly effective is the way in which the cubes move through where the
body would be, giving the feeling of being waist-deep in digital waves.
This seems to create a subtle physical sensation and slightly disorientating
vestibular effect, which triggers traces of the sensations that one might
experience standing in an actual sea at the beach. Technically the oscil-
lating movement of the cubes is once again achieved with a C# script
that generates sine-wave values, where phase is offset based on the 3D
coordinates of each cube. This script uses combinations of sine-waves to
change the size of the cubes, while also modifying the RGB colour values
of them, creating a plasma effect.
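This combination of coordinate-offset sine waves can be sketched as follows (a Python illustration of the plasma maths, not the project's C# script; the frequency multipliers and depths are hypothetical):

```python
import math

def cube_state(x, y, z, t):
    """Oscillate a cube's size and RGB colour using sine waves whose phase
    is offset by the cube's 3D coordinates, so the oscillations travel
    through the field of cubes as a plasma-like pattern."""
    size = 1.0 + 0.3 * math.sin(t + 0.5 * (x + z))
    # Each colour channel uses a differently scaled time and position offset,
    # which keeps the channels out of phase and produces shifting hues.
    r = 0.5 + 0.5 * math.sin(t + x)
    g = 0.5 + 0.5 * math.sin(1.3 * t + y)
    b = 0.5 + 0.5 * math.sin(1.7 * t + x + z)
    return size, (r, g, b)
```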
The scene after this is a variation of the previous one, in which the user
is suspended in a room in the sky made of oscillating cubes (Fig. 7.1f ).
As in the previous scene, the user does not move, but can look around
and observe the wave patterns. The soundtrack consists of a techno beat
that incorporates vaporwave samples via a kitsch synthesiser bell and a
pitched-down vocal sample.
With another transition, the scene then fades into a reddish room with
a hand-painted background (Fig. 7.1g). As before, this was created by
digitally scanning hand-drawn artwork. In this scene, metallic spheres
move following Lissajous curves, which are generated using combina-
tions of sine-wave values that move the spheres. Trail effects allow the
spheres to draw arcs of electric blue across the room, providing a similar
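The motion of the spheres along Lissajous curves can be sketched as follows (a Python illustration of the maths, not the project's C# script; the frequency ratio and phase are hypothetical):

```python
import math

def lissajous(t, a=3, b=2, delta=math.pi / 2, scale=1.0):
    """Point on a Lissajous curve at time t: sine waves of different
    frequencies (a and b) drive each axis, tracing the looping arcs
    the spheres follow."""
    x = scale * math.sin(a * t + delta)
    y = scale * math.sin(b * t)
    return x, y
```

Sampling this function each frame and leaving a trail behind the moving sphere yields the arcs of electric blue described above.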
One of the original ideas behind Cyberdream was the concept of creating
a continuous audio-visual experience analogous to a DJ/VJ set in VR.
The idea was to create an experience where each audio-visual scene would
have its own musical soundtrack and be analogous to a record that a
DJ would mix. Yet each record would have a synaesthetic visual compo-
nent, and be rendered as a 3D space in VR, allowing users to feel as
though they were actually ‘inside the music’. Following this metaphor,
the project would be like a VR equivalent of a rave ‘tape pack’ such as the
packs produced by Fantazia or Dreamscape. I was particularly interested
in capturing the chaotic energy of Carl Cox’s DJ mixes (e.g. Fantazia:
The Big Bang, 1993), where he rapidly blends tracks in quick succession
across three turntables, making abrupt cuts between them.
Although the original Cyberdream was partly successful in this regard,
in the transitions between scenes the music simply fades out before the
next track fades in, rather than cross-fading tracks in sync, in the manner
of a DJ mix. To improve the sonic continuity between scenes, the revised
Fig. 7.2 Diagram showing the structure of Cyberdream, which resembles the
form of a DJ/VJ mix with crossfades
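The DJ-style crossfade between scenes can be sketched with an equal-power gain curve (an illustrative formula, not the project's implementation):

```python
import math

def equal_power_gains(position):
    """Gains for the outgoing and incoming tracks at crossfade position
    0.0 (fully on the outgoing track) to 1.0 (fully on the incoming one).

    Cosine/sine curves keep the summed signal power constant through the
    fade, avoiding the dip in loudness that a pair of linear fades causes."""
    theta = position * math.pi / 2
    return math.cos(theta), math.sin(theta)
```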
‘sound toy’ generates micro visuals that correspond with the macro
visuals provided through the sequence of symbolic environments in
Cyberdream, allowing the user to ‘paint with sound’ sonically and visually.
In the current iteration of the project, there are three audio-visual
sound toys. Each sound toy is manipulated using the left and right
Oculus Touch controllers, as shown in Fig. 7.3. The ‘index trigger’ on
each controller activates the sound toy currently selected. The ‘hand
trigger’ modifies the sound toy, initiating a secondary function if one
is available. On the top of the controller, moving the position of the
‘thumbstick’ applies various effects to the sound toy, changing the way
it sounds and looks. Lastly, the A/B and X/Y buttons cycle up and
down through a list of available sound toys, thereby allowing different
combinations of toys to be used with the left and right hands.
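The per-hand selection logic described above might be sketched as follows. This is hypothetical Python with illustrative names, not the project's actual code; only the two sound toys named in this chapter are listed:

```python
class HandController:
    """Hypothetical model of one Oculus Touch controller's
    sound-toy state, as described in the text."""

    def __init__(self, toys):
        self.toys = list(toys)   # sound toys available to this hand
        self.index = 0           # currently selected toy

    def cycle(self, step):
        """A/B or X/Y buttons cycle up (+1) or down (-1) through
        the list of toys, wrapping at either end."""
        self.index = (self.index + step) % len(self.toys)
        return self.toys[self.index]

    def activate(self, hand_trigger=False):
        """The index trigger fires the selected toy; the hand
        trigger requests its secondary function, if available."""
        mode = "secondary" if hand_trigger else "primary"
        return (self.toys[self.index], mode)

# Each hand keeps its own selection, so different combinations of
# toys can be used with the left and right controllers.
left = HandController(["ZigZagToy", "StreamerToy"])
right = HandController(["ZigZagToy", "StreamerToy"])
```

Keeping the selection state per controller is what allows the left and right hands to paint with different toy combinations simultaneously.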
‘ZigZagToy’ (Fig. 7.4, right) emits an orange laser beam that twists
and produces percussive pulses in synchronisation with the music. The
rhythmic patterns that the toy can generate are stored in 16-bit binary
sequences, as shown in Fig. 7.5. Much like a drum machine, each
Fig. 7.3 Oculus Touch controller mapping (labels: ‘Reserved (Oculus)’; ‘Secondary sound toy’; ‘Trigger the sound toy’)
Fig. 7.4 Still showing the ‘ZigZagToy’ (right) and ‘StreamerToy’ (left) sound toys
in operation
Pattern A:
1000100010001000 = 34952
Pattern B:
0010001000100010 = 8738
Pattern C:
1111111111111111 = 65535
Pattern D:
1001001010010010 = 37522
Pattern E:
1010001010000101 = 41605
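Read most-significant-bit first, each 16-character pattern maps directly to its stored decimal form, so ‘1000100010001000’ — a four-to-the-floor pulse — encodes as 34952. A minimal sketch of this encoding, in illustrative Python rather than the project's actual code:

```python
def pattern_to_steps(value: int) -> list:
    """Unpack a 16-bit integer into 16 drum-machine steps,
    most significant bit first (step 0 = bit 15)."""
    return [(value >> (15 - i)) & 1 for i in range(16)]

def steps_to_pattern(steps) -> int:
    """Pack 16 steps (0/1) back into the stored integer form."""
    value = 0
    for bit in steps:
        value = (value << 1) | (bit & 1)
    return value

def triggers(value: int) -> list:
    """Step numbers on which a percussive pulse is emitted."""
    return [i for i, bit in enumerate(pattern_to_steps(value)) if bit]
```

For example, `triggers(34952)` yields steps 0, 4, 8, and 12 — one pulse on each beat of a four-beat bar divided into sixteenth-note steps.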
∗ ∗ ∗
‘music in the holodeck’. However, like the holodeck in Star Trek, immer-
sive technologies like VR include both sonic and visual components, and
we must therefore consider what ‘music in the holodeck’ might look like.
It is my hope that Cyberdream may point towards some possible solutions; however, there is more work to be done. While I plan to extend
this project, I also hope that others will forge new pathways through the
frontiers of this exciting field.17 Towards this end, the final chapter will
consolidate the approaches explored in Explosions in the Mind, with a
view to supporting further advances in this field, drawing us ever closer
towards the future sound dream.
Notes
1. The gabber nightclub in Culturesport: Rotterdam 1995 (Boling 2019)
appears to be a simulacrum of the Club Parkzicht techno nightclub, which
existed in Rotterdam during the 1990s (see Housenation 1992).
2. For example, ‘Terminator’ by Metal Heads (on Terminator, 1992) samples
The Terminator (Cameron 1984); ‘Underworld’ by SP 23 (1993) samples
Alien (Scott 1979); and ‘The Angels Fell’ by Dillinja (1995) samples Blade
Runner (Scott 1982).
3. Toffler’s work has also been credited as inspiring Detroit techno artists, for
a further discussion see Sicko (2010).
4. As discussed in Chapter 6, example VJ mixes include Dance in Cyberspace
(Dr. Devious and the Wiseman 1992); Global Chaos (Hex 1993); Future
Shock (Frost et al. 1993); and the X-Mix series (Studio !K7, 1993–1998).
5. The predominant use of sampled materials in vaporwave can be considered
in terms of Oswald’s (2004) concept of plunderphonics.
6. Toto’s ‘Africa’ (1982) playing in an empty shopping mall was portrayed in
a YouTube video by Cecil Robert (2017). For a further discussion see also
Tolentino (2018).
7. For a related discussion of sonic symbolism see also Tagg’s (2016) ‘Intel
Inside Jingle Analysis’.
8. For example, Eccojams Vol. 1 by Chuck Person (2010) references the video
game Ecco the Dolphin (Appaloosa Interactive, 1992), and dolphins are also
featured on GATEWAY 2000 by MindSpring Memories (2018), Saccharine
Synergy by Vaperror (2020), and others. For a further discussion of the
Blood red lasers punctuate the darkness of a vast arena, drawing flickering
patterns in the smoke. Thousands of bodies move to a pounding 4/4
techno beat, as a monotonous vocal sample repeats ‘we could go higher…
higher… higher…’. The DJ is bathed in red light, and moves her body
to the beat before an array of VJ projections showing sound waves and
geometric patterns. A bubbling acid bass line weaves its way through the
mix, the shifting filters of the Roland TB-303 synthesiser increasing the
intensity in cascading waves of sound, as cycling spotlights creep across
the ceiling of the arena.
This could be a description of almost any acid techno rave from the
past few decades, but the year is 2020, and the event is entirely virtual,
existing only as a live stream designed for viewers to watch at home on
TV. The DJ, Amelie Lens, is superimposed over a computer-generated
dance tent, complete with a laser light show and VJ projections. As video
cameras pan around the scene, it resembles what one might expect to
see if a real music festival were televised. Yet the bodies dancing to the
music are not real; instead, the DJ performs to a crowd of 3D avatars—
automatons who dance monotonously with blank expressions.
This was the scene at Tomorrowland 2020, one of the many virtual
music festivals designed in 2020 in response to the COVID-19
pandemic.1 The social distancing measures needed to suppress the spread
of the coronavirus meant that music festivals could not take place as
in-person events. As a result, some promoters cancelled or postponed
their festivals, while others moved to online alternatives in the hope that
audiences could gain some enjoyment by partying in their living rooms.
These virtual concerts typically worked by recording performances and
presenting them as live streams. In some instances, this was achieved by
broadcasting from a studio or empty theatre, as in the case of Alexis
and VJ L’Aubaine’s performance (Resolution 2020), which was filmed in
Venue MOT in London without an audience. In other cases, the artists
were placed in computer-generated environments made using game-
engine technologies. For example, Prospa (2020) presented their music
in a virtual warehouse complete with smoke machines and strobe lights;
rap megastar Travis Scott performed as an avatar in the multiplayer video
game Fortnite (Epic Games 2017), allowing players to attend a concert
with psychedelic visualisations in the game world (Epic Games 2020;
Webster 2020); while Lost Horizon allowed audiences to use a player
avatar to walk around a virtual festival featuring performances by DJs
and VJs, and could also be experienced with a virtual reality (VR)
headset (Kocay 2020).
As Hogan (2020) notes, these concerts were not without precedent—
before the pandemic, Gorillaz, Björk, and Hatsune Miku were among
those artists who had already been exploring virtual modes of
performance. MelodyVR had also been providing 360-degree concerts for
audiences to watch in VR, and Mbryonic had created Amplify VR
(2018), an interactive VR music experience that allowed audiences to
manipulate the music with game controllers. New ways to experience
music in immersive spaces did not depend on computers alone, but were
also being forged physically through Unkle and Punchdrunk’s Beyond the
Road exhibition (Saatchi Gallery, 12 June–8 September 2019), which
turned the band’s music into a multisensory experience, representing
different tracks on the album through surrealistic, aromatic,
smoke-filled, neon-lit rooms. In different ways,
these projects had already been pushing music into new immersive,
the frameworks will undoubtedly treat them as starting points for their
own projects, making adaptations to further evolve their practices and
find new directions.
[Figure: generalised approach — onset, plateau, and breakthrough phases; materials including light, whispering voices, dark drones, and sensory bass; data inputs: headtracking, gamepad, biofeedback; dynamic envelopes]
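The generalised approach sketched in the diagram — data inputs such as headtracking, gamepad, and biofeedback shaping dynamic envelopes across onset, plateau, and breakthrough phases — could be expressed as follows. This is a speculative illustration of the framework only, not code from the book's projects; the thresholds and names are my own:

```python
def envelope(t: float, onset_end: float = 0.25,
             release_start: float = 0.85) -> float:
    """Hypothetical dynamic envelope over normalised session time
    t in [0, 1]: ramp up through the onset, hold at the plateau,
    then release towards the end of the experience."""
    t = min(max(t, 0.0), 1.0)
    if t < onset_end:
        return t / onset_end          # onset: rising intensity
    if t < release_start:
        return 1.0                    # plateau: sustained peak
    return 1.0 - (t - release_start) / (1.0 - release_start)

def intensity(t: float, biofeedback: float = 1.0) -> float:
    """Scale the envelope by a normalised data input (e.g. a
    biofeedback signal), clamped to [0, 1]."""
    return envelope(t) * min(max(biofeedback, 0.0), 1.0)
```

The same modulated intensity value could then be routed to both the sonic materials (drones, sensory bass) and the visual ones (light), keeping the audio-visual experience coupled to a single envelope.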
∗ ∗ ∗
Notes
1. At the time of writing, videos of Tomorrowland 2020 were available online
via https://www.tomorrowland.com on a time-limited basis. A list of virtual
concerts and music festivals is available online; see Stubhub (2020) and
Billboard (2020). These events have also been discussed in the press; for
example, see Pollard (2020) and Kocay (2020).
2. The framework in this section continues the discussion of ‘ASC simulations’
in Weinel (2018b).
3. As noted previously, the concept of representing the subjective perceptual
experiences of avatars can be described as ‘avatar-centred subjectivity’; see
also Weinel and Cunningham (2019).
4. ‘Cyberdelics’ is a portmanteau of cyberculture and psychedelics, and
describes immersive technologies that seek to provide forms of consciousness
expansion. For a further discussion see Weinel (2018a); Valentish (2019);
and Filimowicz and Weinel (2020).
References
© The Editor(s) (if applicable) and The Author(s), under exclusive
license to Springer Nature Singapore Pte Ltd. 2021
J. Weinel, Explosions in the Mind, Palgrave Studies in Sound,
https://doi.org/10.1007/978-981-16-4055-1
Barik, K., Daimi, S.N., Jones, R., Bhattacharya, J., and Saha, G. (2019) ‘A
Machine Learning Approach to Predict Perceptual Decisions: An Insight
into Face Pareidolia’, Brain Informatics, 6(2). https://doi.org/10.1186/s40
708-019-0094-5
Bartram, L. and Nakatani, A. (2010) ‘What Makes Motion Meaningful? Affec-
tive Properties of Abstract Motion’, Fourth Pacific-Rim Symposium on Image
and Video Technology (PSIVT), Singapore, 14–17 November 2010. https://
doi.org/10.1109/PSIVT.2010.85.
Batchelor, S.F. (2019) ‘A Framework for Future Paintings’, EVA London 2019
(Electronic Visualisation and the Arts), pp. 310–317. https://doi.org/10.
14236/ewic/EVA2019.59.
Beat Games (2019) Beat Saber [VR experience]. Oculus Quest, Beat Games.
The Beatles (1966) ‘Tomorrow Never Knows’, Revolver. LP, Capitol Records.
Becker, J. (2004) Deep Listeners: Music, Emotion and Trancing. Bloomington:
University of Indiana Press.
Bell, E. (2019) ‘Hacking Jeff Minter’s Virtual Light Machine: Unpacking the
Code and Community Behind an Early Software-Based Music Visualizer’,
Volume! 16(1). https://doi.org/10.4000/volume.7254.
Belle-Fortune, B. (2005) All Crews: Journeys Through Jungle/Drum & Bass
Culture. Vision Publishing.
Belson, J. (1962) LSD [unfinished artist film]. Screened at: ‘Found: New
Restorations and Discoveries from Center for Visual Music’, Tate Modern,
London, 26 September 2013.
Berezan, D. (2010) Lecture and Concert at Keele University Music Depart-
ment, 5 November 2010.
Berlin, C.L. (2018) Rave Art: Flyers, Invitations, and Membership Cards from the
Birth of Acid House Clubs and Raves. London: Carlton Books.
Berrin, K. (ed.) (1978) Art of the Huichol Indians. New York: Harry N. Abrams.
Bigelow, K. (1995) Strange Days [feature film]. Lightstorm Entertainment.
Billboard (2020) ‘Here Are All the Livestreams & Virtual Concerts to Watch
During Coronavirus Crisis’, Billboard, 30 July 2020. Online: https://
www.billboard.com/articles/columns/pop/9335531/coronavirus-quarantine-
music-events-online-streams (Accessed: 6 August 2020).
Bixiga 70 (2015) ‘Mil Vidas’, III. LP, Glitterbeat.
Blackmore, S. (2003) Consciousness: An Introduction. London: Hodder Educa-
tion.
Bliss, E.L. and Clark, L.D. (1962) ‘Visual Hallucinations’, in West, L.J. (ed.)
Hallucinations. New York: Grune & Stratton, pp. 92–107.
Calvert, G.A., Bullmore, E.T., Brammer, M.J., Campbell, R., Mcguire, P.K.,
Iversen, S.D., and David, A.S. (1997) ‘Activation of Auditory Cortex During
Silent Lipreading’, Science, 276(5312), pp. 593–596.
Cameron, J. (1984) The Terminator [feature film]. Orion Pictures.
Cameron, J. (1991) Terminator 2: Judgment Day [feature film]. TriStar
Pictures.
Capcom (1992) Street Fighter II [video game]. Super Nintendo Entertainment
System, Capcom.
Carhart-Harris, R.L., Bolstridge, M., Rucker, J., Day, C.M.J, Erritzoe, D.,
Kaelen, M., Bloomfield, M., Rickard, J.A., Forbes, B., Feilding, A., Taylor,
D., Pilling, S., Curran, V.H., and Nutt, D.J. (2016a) ‘Psilocybin with
Psychological Support for Treatment-Resistant Depression: An Open-Label
Feasibility Study’, The Lancet Psychiatry, 3(7), pp. 619–627.
Carhart-Harris, R.L., Muthukumaraswamy, S., Roseman, L., Kaelen, M.,
Droog, W., Murphy, K., Tagliazucchi, E., Schenberg, E.E., Nest, T., Orban,
C., Leech, R., Williams, L.T., Williams, T.M., Bolstridge, M., Sessa, B.,
McGonigle, J., Sereno, M.I., Nichols, D., Hellyer, P.J., Hobden, P., Evans,
J., Singh, K.D., Wise, R.G., Curran, H.V., Feilding, A., and Nutt, D.J.
(2016b) ‘Neural Correlates of the LSD Experience Revealed By Multimodal
Neuroimaging’, Proceedings of the National Academy of Sciences of the United
States of America, 113, pp. 4853–4858. https://doi.org/10.1073/pnas.151
8377113.
Carpenter, J. (1988) They Live [feature film]. Universal Pictures.
Carroll, L. (1865–1871) Alice’s Adventures in Wonderland & Through the
Looking-Glass. Reprint, Ware: Wordsworth Editions, 1993.
Castaneda, C. (1968) The Teaching of Don Juan: A Yaqui Way of Knowledge.
Reprint, London: Penguin Books, 2004.
Cecil Robert (2017) ‘Toto- Africa (Playing in an Empty Shopping Centre)’,
YouTube. Online: https://youtu.be/D__6hwqjZAs (Accessed: 3 September
2020).
Chandler, S. (2016) ‘Escaping Reality: The Iconography of Vaporwave’, Band-
camp Features, 16 September 2016. Online: https://daily.bandcamp.com/fea
tures/vaporwave-iconography-column (Accessed: 4 September 2020).
Chion, M. (1994) Audio-Vision: Sound on Screen. New York: Columbia
University Press.
Chuck Person (2010) Eccojams Vol. 1. Digital audio, The Curatorial Club.
Clayton, R. (2018) Welcome to Venice RXCX. Los Angeles: Ginko Press/KYI.
Dr. Devious and the Wiseman (1992) Dance in Cyberspace. VHS, Parade
Video.
Drool (2016) Thumper [video game]. PC, Drool.
Duffer, M. and Duffer, R. (2016–) Stranger Things [TV series], Netflix.
Eclectic Method (2014) ‘The Wolf of Wall Street (Eclectic Method Chest
Thump Mix)’, YouTube. Online: https://youtu.be/XR9-y4u9mew (Accessed:
12 August 2020).
Eigenfeldt, A. (2017) ‘Musebots: Collaborative Composition with Creative
Systems’, eContact! 20(2). Online: https://econtact.ca/20_2/eigenfeldt_crea
tivesystems.html (Accessed: 16 April 2020).
The Electric Prunes (1966) I Had Too Much To Dream (Last Night). 7”, Reprise
Records.
Eliade, M. (1964) Shamanism: Archaic Techniques of Ecstasy. Reprint, Princeton,
NJ: Princeton University Press, 2004.
Eliade, M. (1978) ‘The Eleusinian Mysteries’, in A History of Religious Ideas:
Vol.1 From the Stone Age to the Eleusinian Mysteries. Chicago: University of
Chicago Press, pp. 290–301.
Emmerson, S. (1986) ‘The Relation of Language to Materials’, in Emmerson,
S. (ed.) The Language of Electroacoustic Music. Basingstoke: Macmillan Press,
pp. 17–39.
Epic Games (2017) Fortnite [video game]. Playstation 4, Epic Games.
Epic Games (2020) ‘Travis Scott’s Astronomical’. Online: https://www.epicga
mes.com/fortnite/en-US/news/astronomical (Accessed: 6 August 2020).
Epic MegaGames and Digital Extremes (1998) Unreal. Windows, GT Interac-
tive.
Equinox (2006) ‘Acid Rain V.I.P. (Breakage Final Chapter Mix)’, Acid Rain
V.I.P. Digital audio, Planet Mu.
Excel (1989) ‘Message in a Bottle’, The Joke’s On You. CD, Caroline Records.
Exit EEE (1993) ‘Dreaming of a Better World’, Atrax. 12”, No Respect
Records.
Fachner, J.C. (2011) ‘Time Is the Key: Music and Altered States of Conscious-
ness’, in Cardeña, E. and Winkelman, M. (eds) Altering Consciousness:
Multidisciplinary Perspectives: Volume 1: History, Culture and Humanities.
Santa Barbara: Praeger, pp. 355–376.
Faulkner, M. (D-Fuse) (2006) VJ: Audio-Visual Art and Vj Culture. London:
Laurence King.
Filimowicz, M. and Weinel, J. (2020) ‘Altered States of Consciousness in
Electronic Music and Audio-Visual Media: An Interview with Dr. Jonathan
Weinel’, Medium. Online: https://medium.com/sound-and-design/altered-
states-of-consciousness-in-electronic-music-and-audio-visual-media-112d30
034ee (Accessed: 26 October 2020).
Fischer, R. (1971) ‘A Cartography of the Ecstatic and Meditative States’,
Science, 174(4012), pp. 897–904.
Fischinger, O. (1938) An Optical Poem [artist film].
Fischman, R. (2011) ‘Back to the Parlour’, Sonic Ideas/Ideas Sónicas, 3(2),
pp. 53–66.
Fischman, R. (2012) ‘Ruraq Maki (Hand Made, Hecho a Mano) for
Digital Glove’, Vimeo. Online: https://vimeo.com/55093629 (Accessed: 4
September 2020).
Fisher, M. (2014) Ghosts of My Life: Writings on Depression, Hauntology and
Lost Futures. Alresford: Zero Books.
Fitterer, D. (2008) Audio Surf. PC, Dylan Fitterer.
Flattery, D.S. and Schwartz, M. (1989) Haoma and Harmaline: The Botanical
Identity of the Indo-Iranian Sacred Hallucinogen ‘Soma’ and Its Legacy in Reli-
gion, Language and Middle Eastern Folklore. Berkley: University of California
Press.
Freud, S. (1899) The Interpretation of Dreams. Reprint, Ware: Wordsworth
Editions, 1997.
Frost, M., Irwin, C., Jamieson, D., and Simpson, P. (1993) Future Shock. VHS,
Prism Leisure Video.
Gabrielsson, A. and Lindstrom, E. (2012) ‘The Role of Structure in the Musical
Expression of Emotions’, in Juslin, P.N. and Sloboda, J.A. (eds) Hand-
book of Music and Emotion: Theory, Research, Applications. Oxford: Oxford
University Press, pp. 367–400.
Ganguly, S. (1994) ‘Stan Brakhage: The 60th Birthday Interview’, Film Culture
78 (Summer 1994).
Garro, D. (2012) ‘From Sonic Art to Visual Music: Divergences, Convergences,
Intersections’, Organised Sound, 17(2), pp. 103–113.
Gavin, S. (2020) Spiralled. London: Idea Books.
Gebhart-Sayer, A. (1985) ‘The Geometric Designs of the Shipibo-Conibo in
Ritual Context’, Journal of Latin American Lore, 11(2), pp. 143–175.
Gibson, W. (1984) Neuromancer. Reprint, London: Harper Voyager, 1995.
Gong (1974) You. LP, Virgin.
Grimshaw-Aagaard, M. (2019) ‘Presence and Biofeedback in First-Person
Perspective Computer Games’, in Filimowicz, M. (ed.) Foundations in Sound
Design for Interactive Media. Abingdon: Routledge, pp. 78–94.
Grunenberg, C. (2005) Summer of Love: Art of the Psychedelic Era. London:
Tate Publishing.
Kocay, L. (2020) ‘Inside Lost Horizon: A Virtual Reality Music Festival’, Forbes,
1 June 2020. Online: https://www.forbes.com/sites/lisakocay/2020/06/
30/lost-horizon-virtual-reality-music-festival/-cec5e7d39003 (Accessed: 6
August 2020).
Konami (1999) Silent Hill [video game]. Sony Playstation, Konami.
Kon, S. (2006) Paprika [feature film]. Madhouse.
Kounen, J. (director) (2019) Ayahuasca: Kosmik Journey [VR application]. PC,
Atlas V.
Landau, D.H., Hasler, B.S., and Friedman, D. (2020) ‘Virtual Embodiment
Using 180° Stereoscopic Video’, Frontiers in Psychology, 11, p. 1229.
https://doi.org/10.3389/fpsyg.2020.01229
La Peste (2005) Il Etait... ‘Magnetique’/Une Possibilite. 12”, Hangars Liquides.
Laserdisc Visions (2011) New Dreams Ltd. Digital audio, Beer On The Rug.
Lavrov, D. (2010) Polynomial [video game]. PC, Dmytry Lavrov.
Laz Inc. (2018) Rammellzee: A Roll of the Dice, 2 October–8 December
2018. Information available online: https://www.lazinc.com/exhibitions/
1394/ (Accessed: 14 August 2020).
Leary, T. (1968) ‘The Seven Tongues of God’, in The Politics of Ecstasy. Reprint,
Oakland: Ronin, 1998, pp. 13–58.
Len Lye Foundation (2020) ‘Free Radicals, 1958 (revised 1979)’. Online:
http://www.lenlyefoundation.com/films/free-radicals/33/ (Accessed: 14 July
2020).
Lewis-Williams, J.D. (1996) Discovering Southern African Rock Art. Claremont:
David Philip.
Lewis-Williams, J.D. (2007) ‘Shamanism: A Contested Concept in Archae-
ology’, Before Farming, 4, pp. 223–261.
Lewis-Williams, J.D. and Dowson, T.A. (1988) ‘The Signs of All Times:
Entoptic Phenomena in Upper Paleolithic Art’, Current Anthropology, 29(2),
pp. 201–245.
Liggett S. (2020) ‘Positioning the Arts in the Research Process: Perspec-
tives from Higher Education’, in Earnshaw R., Liggett S., Excell P., and
Thalmann, D. (eds) Technology, Design and the Arts—Opportunities and
Challenges. Cham: Springer.
Lilly, J.C. (1972) Center of the Cyclone: Looking into Inner Space. Reprint,
Oakland: Ronin.
Lizotte, C. (2014) ‘Oaxaca’s All-You-Can-Drink Mezcal Festival Is as Crazy as
You’d Expect’, Vice, 6 August 2014. Online: https://www.vice.com/en_us/
article/jpawdx/oaxacas-all-you-can-drink-mezcal-festival-is-as-crazy-as-youd-
expect (Accessed: 14 July 2020).
Lloyd, J.U. (1895) Etidorhpa: Or the End of the Earth—The Strange History of
a Mysterious Being and the Account of a Remarkable Journey, Revised edition.
Whitefish: Kessinger, 1992.
Lomas, A. (2020) ‘Enhancing Perception of Complex Sculptural Forms Using
Interactive Real-Time Ray Tracing’, EVA London 2020 (Electronic Visuali-
sation and the Arts), pp. 206–211. https://doi.org/10.14236/ewic/EVA202
0.37.
Longo, R. (1995) Johnny Mnemonic [feature film]. TriStar Pictures.
Lowry, M. (1947) Under the Volcano. London: Penguin Classics.
Lubman, D. (1998) ‘An Archaeological Study of Chirped Echo from the Mayan
Pyramid of Kukulkan at Chichen Itza’, Acoustic Society of America, Norfolk,
VA, 12–16 October 1998. Summary online: http://www.ocasa.org/MayanP
yramid.htm (Accessed: 15 July 2020).
Ludwig, A.M. (1969) ‘Altered States of Consciousness’, in Tart, C.T. (ed.)
Altered States of Consciousness: A Book of Readings. New York: Wiley,
pp. 9–22.
Luke, D. (2010) ‘Rock Art or Rorschach: Is There More to Entoptics Than
Meets the Eye?’, Time and Mind: The Journal of Archaeology, Consciousness
and Culture, 3(1), pp. 9–28.
Lux (2020) ‘Free Radicals, Len Lye’. Online: https://lux.org.uk/work/free-rad
icals1 (Accessed: 14 July 2020).
Lye, L. (1935) A Colour Box [artist film]. Available on: Len Lye: Rhythms, DVD,
Revoir.
Lye, L. (1935) Kaleidoscope [artist film]. Available on: Len Lye: Rhythms, DVD,
Revoir.
Lye, L. (1937) Trade Tattoo [artist film]. Available on: Len Lye: Rhythms, DVD,
Revoir.
Lye, L. (1940) Musical Poster [artist film]. Available on: Len Lye: Rhythms,
DVD, Revoir.
Lye, L. (1958, 1979) Free Radicals [artist film]. Available on: Len Lye: Rhythms,
DVD, Revoir.
Magnetic Fields (1992) Lotus Turbo Challenge II [video game]. Atari ST,
Gremlin Interactive.
Manning, P. (2004) Electronic & Computer Music, 2nd edn. Oxford: Oxford
University Press.
Masaaki Kobari (2012) ‘Rainbow 2000 富士1996’ [parts 1–5], YouTube.
Online: https://youtu.be/C7RXIVZpqJM (Accessed: 14 August 2020).
Maturana, H.R. and Varela, F.J. (1998) The Tree of Knowledge: The Biological
Roots of Human Understanding. Boston: Shambhala.
Rubin, D.S. (ed.) (2010) Psychedelic: Optical and Visionary Art Since the 1960s.
Cambridge, MA: MIT Press.
Rush, R. (1968) Psych-Out [feature film]. American International Pictures
(AIP).
Russell, J. (1980) ‘A Circumplex Model of Affect’, Journal of Personality and
Social Psychology, 39(6), pp. 1161–1178.
Russell, K. (1980) Altered States [feature film]. Warner Bros.
Russet, R., and Starr, C. (1976) Experimental Animation: Origins of a New Art.
New York: Da Capo Press.
Russolo, L. (1913) ‘The Art of Noises: The Futurist Manifesto’, in Cox, C.
and Warner, D. (eds) Audio Culture: Readings in Modern Music. New York:
Continuum, pp. 10–14, 2004.
Sabina, M. (1957) Mushroom Ceremony of the Mazatec Indians of Mexico. LP,
Folkways.
Sanchez-Vives, M.V. and Slater, M. (2005) ‘From Presence to Consciousness
Through Virtual Reality’, Nature Reviews Neuroscience, 6, pp. 332–339.
https://doi.org/10.1038/nrn1651.
Sanden, P. (2013) Liveness in Modern Music: Musicians, Technology, and the
Perception of Performance. New York: Routledge.
Sanei, S. and Chambers, J.A. (2007) EEG Signal Processing. Chichester: Wiley.
Sartre, J.P. (1938) Nausea. Reprint, London: Penguin Classics, 2000.
Savage, J. (1996) Highflyers: Club Rave Party Art. London: Booth-Clibborn.
Schacter, D.L. and Tulving, E. (1994) ‘What Are the Memory Systems of
1994?’ in Schacter, D.L. and Tulving, E. (eds) Memory Systems. Cambridge,
MA: MIT Press, pp. 1–38.
Schafer, R.M. (1994) The Soundscape: Our Sonic Environment and the Tuning
of the World. Rochester, VT: Destiny Books.
Schaffner, F.J. (1968) Planet of the Apes [feature film]. Twentieth Century Fox.
Schankin, C.J., Maniyar, F.H., Digre, K.B., and Goadsby, P.J. (2014) ‘“Visual
Snow”—A Disorder Distinct from Persistent Migraine Aura’, Brain, 137,
pp. 1419–1428. https://doi.org/10.1093/brain/awu050.
Schultes, R.E., Hofmann, A., and Rätsch, C. (1996) Plants of the Gods: Their
Sacred, Healing and Hallucinogenic Powers, 2nd edn. Rochester, VT: Healing
Arts Press.
Scott, M., Nuttgens, S., Upton, A., and Platts, M. (1996) Success & Achieve-
ment. De Wolfe Music.
Scott, R. (1979) Alien [feature film]. Twentieth Century Fox.
Scott, R. (1982) Blade Runner [feature film]. Warner Bros.
Second Phase (1991) Mentasm. 12”, R&S Records.
Serafin, S., Erkut, C., Kojs, J., Nilsson, N.C., and Nordahl, R. (2016) ‘Virtual
Reality Musical Instruments: State of the Art, Design Principles, and Future
Directions’, Computer Music Journal, 40(3), pp. 22–40.
Shave, T. (2008) ‘Communicative Contract Analysis: An Approach to Popular
Music Analysis’, Organised Sound, 13(1), pp. 41–50.
Shave, T. (2013) ‘Hybridism; a Practice-Led Investigation’ [unpublished PhD
thesis], Keele University.
Sherratt, A. (1995) ‘Alcohol and Its Alternatives: Symbol and Substances in Pre-
Industrial Cultures’, in Goodman, J., Lovejoy, P.E., and Sherratt, A. (eds),
Consuming Habits: Drugs in History and Anthropology. London: Routledge.
Sicko, D. (2010) Techno Rebels: The Renegades of Electronic Funk, 2nd edn.
Detroit: Painted Turtle.
Signore, J.D. (2007) ‘Joshua White: The Joshua Light Show’, Gothamist, 2
April 2007. Online: http://gothamist.com/2007/04/02/interview_joshu.php
(Accessed: 25 October 2015).
Sinclair, M. (1994) Neutral Atmospheres. CD, Chappell Recorded Music
Library.
Singh, R. (2010) ‘Harry Smith, an Ethnographic Modernist in America’, in
Perchuk, A. and Singh, R. (eds) Harry Smith: The Avant-Garde in the
American Vernacular. Los Angeles: Getty Publications, pp. 15–62.
Sitney, P.A. (1965) ‘Harry Smith Interview’, in Sitney, P.A. (ed.) Film Culture
Reader. Reprint, New York: Cooper Square Press, 2000, pp. 260–276.
Sitney, P.A. (1979) Visionary Film: The American Avant-Garde 1943–1978, 2nd
edn. Oxford: Oxford University Press.
Slater, M. (2009) ‘Place Illusion and Plausibility Can Lead to Realistic
Behaviour in Immersive Virtual Environments’, Philosophical Transactions of
the Royal Society B, 364, pp. 3549–3557. https://doi.org/10.1098/rstb.2009.
0138.
Slater M. and Wilbur, S. (1997) ‘A Framework for Immersive Virtual Envi-
ronments (FIVE): Speculations on the Role of Presence in Virtual Environ-
ments’, Presence Teleoperators and Virtual Environments, 6(6), pp. 603–616.
https://doi.org/10.1098/rstb.2009.0138.
Slayer (1990) Seasons in the Abyss. CD, Def American Records.
Smalley, D. (1986) ‘Spectro-Morphology and Structuring Processes’, in
Emmerson, S. (ed.) The Language of Electroacoustic Music. Basingstoke:
Macmillan Press, pp. 61–93.
Smalley, D. (1997) ‘Spectromorphology: Explaining Sound-Shapes’, Organised
Sound, 2(2), pp. 107–126.
Weinel, J., Cunningham, S., Roberts, N., Griffiths, D., and Roberts, S. (2015a)
‘Quake Delirium EEG: A Pilot Study Regarding Biofeedback-Driven Visual
Effects in a Computer Game’, IEEE Internet Technologies and Applications
2015, Wrexham Glyndŵr University, Wales. https://doi.org/10.1109/ITe
chA.2015.7317420.
Weinel, J., Cunningham, S., Roberts, N., Roberts, S., and Griffiths, D. (2014b)
‘EEG as a Controller for Psychedelic Visual Music in an Immersive Dome
Environment’ [extended abstract], EVA London 2014 (Electronic Visualisa-
tion and the Arts), pp. 188–189. https://doi.org/10.14236/ewic/EVA201759
4.45.
Weinel, J., Cunningham, S., Roberts, N., Roberts, S., and Griffiths, D. (2015b)
‘EEG as a Controller for Psychedelic Visual Music in an Immersive Dome
Environment’, Sonic Ideas/Ideas Sónicas, 7(14), pp. 81–92.
Wilson, P. and Ransom, G. (1996) Cyberscience. CD, Bruton Music.
The Winstons (1969) Color Him Father/Amen Brother. 7”, Metromedia
Records.
Yalkut, J. (1968) Turn, Turn, Turn [artist film]. Screened at: ‘Head Trips:
Abstraction and Infinity—Visual Music Films’, Barbican, London, 4
October 2015.
Yamashirogumi, G. (1993) ‘Dolls’ Polyphony’, Akira—Original Motion Picture
Soundtrack. CD, Demon Soundtracks.
Zander, T.O., Kothe, C., Jatzev, S., and Gaertner, M. (2010) ‘Enhancing
Human-Computer Interaction with Input from Active and Passive Brain-
Computer Interfaces’, in Tan, D., Nijholt, A. (eds) Brain-Computer Inter-
faces. Human-Computer Interaction Series. London: Springer, pp. 181–199.
Zinman, G. (2008) ‘The Joshua Light Show: Concrete Practices and Ephemeral
Effects’, American Art, 22(2), pp. 17–21.
Index
D
Demo effect(s) 17, 134, 141, 181
Demoscene 17, 134, 139
Direct animation 82, 90, 91, 93, 95–98, 101, 134, 138, 148, 154
DMT 8, 10, 41–43
Dreaming 4, 5, 7, 14, 15, 32
Dreams 4, 12, 14, 38, 108, 117
Drum and bass 14, 34, 86, 87, 153, 158, 159, 163
Dub reggae 13, 37, 44, 150
Dubstep 34, 37–39, 44

E
Electroacoustic music 14, 29, 63, 86, 97, 114, 194
Electroencephalograph (EEG) 116, 118–120
Enter Soundcat 163
Entoptic Phenomena 40–43
Entoptic Phenomena in Audio 63, 64

F
Films 13
Fischman, Rajmil 60, 95, 188
Flashcore 37, 39
Form constants 8, 10, 16, 41, 42, 44, 45, 50, 57, 60, 88, 95, 98, 118, 119, 156
Fulldome(s) 16, 18, 118, 119

G
Garage rock 13, 32
Granular synthesis 33, 37, 61, 87

H
Hauntology 174, 176
Hobson, J.A. 4, 7, 14, 29, 122
Holodeck 188, 189
Holo Point Break 160
Hypnogogic hallucinations 5

I
ISSUE Project Room 63, 67

K
Kendall, Gary 15
Klüver, H. 8, 10, 16, 40, 42, 44, 45, 50, 57, 88, 95, 98, 118, 119, 156, 179

L
La Peste 28, 57
Leary, Timothy 8, 33, 34, 36–39
Literary works 13
LSD 8, 9, 11, 12, 36, 45
Lye, Len 90

M
Max/MSP 37, 42, 57, 68, 112–114, 118, 119, 197
Max/MSP/Jitter 83, 84, 112
McLaren, Norman 90
MDMA 5, 11
Meditation 16, 87, 116, 118
Mescaline 8, 12, 40, 45, 57
Mezcal Animations 93, 95, 96
Multimodal(ity) 9, 30, 111, 167, 168, 200
Music visualisers 16
Virtual Reality (VR) 19, 20, 36, 38, 109, 111, 172, 174, 176–178, 181–183, 188, 189, 192, 197–199
Visual music 15, 16, 18, 19, 82, 83, 90, 93, 132
Visual patterns of hallucination(s) 32, 35, 39–43, 46, 57, 58, 66, 67, 69, 73, 76, 87, 88, 95, 118, 179, 196, 198
VJ culture 17
VJ London 143, 144, 151, 154
VJ loop(s) 135, 138, 139, 141
VJ mix(es) 17, 142–144, 148, 154, 174
VJ mixing 150
VJ performance(s) 18, 84, 132, 133, 142, 174, 199
Vortex 156

W
Wwise 183, 185, 187