
A Journey in (Interpolated) Sound: Impact of Different Visualizations in Graphical Interpolators

Darrell Gibson
Faculty of Science & Technology, Bournemouth University, Dorset, UK
[email protected]

Richard Polfreman
Faculty of Arts and Humanities, University of Southampton, Hampshire, UK
[email protected]

ABSTRACT
Graphical interpolation systems provide a simple mechanism for the control of sound synthesis systems by providing a level of abstraction above the parameters of the synthesis engine, allowing users to explore different sounds without awareness of the synthesis details. While a number of graphical interpolator systems have been developed over many years, with a variety of user-interface designs, few have been subject to user evaluations. We present the testing and evaluation of alternative visualizations for a graphical interpolator, in order to establish whether the visual feedback provided through the interface aids the navigation and identification of sounds with the system. The testing took the form of comparing the users' mouse traces, showing the journey they made through the interpolated sound space when different visual interfaces were used. Sixteen participants took part and a summary of the results is presented, showing that the visuals provide users with additional cues that lead to better interaction with the interpolators.

CCS CONCEPTS
• Human-centered computing~Visualization design and evaluation methods • Human-centered computing~Usability testing • Human-centered computing~Empirical studies in HCI

KEYWORDS
Sound, synthesizer, interpolation, visualization, interface, sound design

ACM Reference format:
Darrell Gibson and Richard Polfreman. 2019. A Journey in (Interpolated) Sound: Evaluation of Different Visualizations for Graphical Interpolators. In Proceedings of AM'19, September 18–20, 2019, Nottingham, United Kingdom. https://doi.org/10.1145/3356590.3356622

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].
AM'19, September 18–20, 2019, Nottingham, United Kingdom
© 2019 Association for Computing Machinery.
ACM ISBN 978-1-4503-7297-8/19/09…$15.00
https://doi.org/10.1145/3356590.3356622

1 Introduction
The fundamental problem when designing sounds with a synthesizer is how to configure the often large number of synthesizer parameters to create a certain audio output, i.e. how to translate sonic intent into parameter values. Although having direct access to every parameter (one-to-one mapping) gives fine control over the sound, it can also result in a very complex sound design process. Alternatively, it is possible to map a smaller number of control parameters to a larger number of synthesizer parameters (few-to-many mapping) in order to simplify the process. Particular states of the synthesis parameters ("presets") are associated with different control values and then, as these control values are changed, new synthesizer parameter values are generated by interpolating between the values of the presets. This provides a mechanism for exploring a defined sound space, constrained by the preset sounds and the changes of the control parameters.
A number of such interpolation systems have been proposed previously and these can be categorized based on whether the control mechanism is via some form of graphical interface or some other medium. Those of interest here are those that offer a graphical representation that allows the control of a visual model.

1.1 Graphical Interpolation Mapping
Graphical interpolation systems typically provide a two-dimensional graphical pane where markers that represent presets can be positioned. Interpolation can then be used to generate new parameter values in between the specified locations by moving an interpolation cursor. Interpolating between presets of parameters can facilitate smooth sonic transitions and the discovery of new settings that blend the characteristics of two or more existing sounds. The sonic outputs are a function of the presets, their locations within the interpolation space, the relative position of the interpolation point and the interpolation model used to calculate the influence of each preset [1].
A variety of distinct graphical models have been used for parameter interpolation [1 - 8], which present the user with different levels of visual feedback. A detailed review of these has been undertaken [9]; however, from this it is difficult to gauge whether the visual information provided actually aids the user in the identification of desirable sounds, given that the goal is to obtain a sonic output, or whether the visual elements distract from this intention. Moreover, if the visuals do aid the process, it is unclear how much they help and which visual cues will best serve the user when using the interface for sound design tasks.

Figure 1: Graphical Interpolator Models: (a) SYTER, (b) Interpolator, (c) Gaussian Kernels, (d) Metasurface, (e) INT.LIB, (f) Nodes, (g) Delaunay Triangulation with Spikes and (h) Intersecting N-Spheres

1.2 Nodes
Andrew Benson created the nodes object for Max in 2009 and it proved so popular that it has been included in subsequent distributions [6]. Here each preset is represented as a circular node within the interpolation space. The size of each node defines the extent of its influence within the interpolation space (Figure 2). The interpolation weightings are calculated for each node currently under the cursor from the distance between the cursor and the node's centre, normalized with respect to the node size. Interpolation is therefore performed where nodes intersect; when the interpolation point is in an area occupied by only a single node, just that node's preset will be active. For example, in Figure 2 the cursor position shown corresponds to the overlap of nodes 4 and 7, with relative weights 0.355 and 0.180, giving 66.36% of preset 4 and 33.64% of preset 7. The node weightings are updated in real time as the interpolation point is moved or as the nodes are resized or repositioned within the space.

2 Graphical Interpolation Framework
In order to evaluate different visualizations, an interpolation system was needed that permitted the visual representation of the interpolation model to be modified, whilst leaving all other aspects the same. The previously created graphical interpolation framework [9] was used to facilitate comparative user testing, where only the interpolator's visual representation was changed.

2.1 Nodes Reimplementation
Although the nodes object is freely available in Max, it needed to be reimplemented in order to customise the visual representation for testing. The nodes object within the framework structure [9] was replaced with an interactive user interface created using OpenGL for the visual representation and JavaScript for the control mechanism and the generation of the preset weightings. This allowed the influence of different visualizations using the same nodes interpolation model to be evaluated, while also facilitating the future implementation of other interpolator models. The reimplementation of the nodes model was functionally tested by undertaking back-to-back tests between it and the original nodes object, ensuring that both implementations gave the same results.

3 Experiment Design
Using the reimplemented nodes interpolator, an experiment was designed to evaluate user behaviour when the interpolation system is used with different levels of visual feedback. To assess the impact that different visualizations had on the usability of the interpolator, three interfaces were created, based on the dimensions of a unit square. These were:

1. Interface 1 – no visualizations (i.e. an empty 2D display).
2. Interface 2 – only preset locations displayed.
3. Interface 3 – the original nodes interface.

These are shown in Figure 3, Interface 1–3, left to right.
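The nodes weighting scheme described in Section 1.2 can be sketched in a few lines. This is a minimal illustration, not the Cycling '74 implementation: the raw-weight formula `1 - d/r` is an assumption (the paper quotes only the example raw weights 0.355 and 0.180, not the formula), and the function names are hypothetical.

```python
import math

def node_weights(cursor, nodes):
    """Relative preset weights for a nodes-style interpolator.

    `nodes` is a list of (x, y, radius) circles; a node influences the
    cursor only while the cursor lies inside its circle. The raw weight
    is assumed here to be 1 - d/r (cursor-to-centre distance, normalised
    by node size), and raw weights are then normalised to sum to 1.
    """
    raw = []
    for (x, y, r) in nodes:
        d = math.hypot(cursor[0] - x, cursor[1] - y)
        raw.append(max(0.0, 1.0 - d / r))  # zero influence outside the node
    total = sum(raw)
    if total == 0.0:
        return [0.0] * len(nodes)  # cursor outside every node
    return [w / total for w in raw]

def interpolate_params(cursor, nodes, presets):
    """Blend preset parameter vectors using the node weights."""
    ws = node_weights(cursor, nodes)
    n_params = len(presets[0])
    return [sum(w * p[i] for w, p in zip(ws, presets)) for i in range(n_params)]
```

Normalising the paper's example raw weights, 0.355 and 0.180, by their sum reproduces the quoted blend of 66.36% and 33.64% of the two overlapping presets.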

Figure 2: Interpolation Space Created with the Max nodes Object

Figure 3: Different Visualizations for the Nodes Interpolator

The user testing took the form of a sound design task with a subtractive synthesizer, where the participants were asked to match a given sound which had a fixed target location on the interpolator, unknown to the participants. Each interpolator interface was populated with different preset sounds, with all of the presets being created from the same base patch, generating some sonic commonalities between them. However, the starting sounds for each interpolation space interface were different. The layout of the nodes and the target location within the interpolation space were the same

in each case but, so that this was not obvious to the participants, the interface was rotated through 90° clockwise for each interpolator. To simulate a real sound design scenario, the participants were given only three opportunities to hear the target sound before commencing the test, and none after that. Participants therefore had to retain an idea of the required sound in their "mind's ear". All participants completed the same sound design task with each interface, but each interface produced different sonic outputs from different presets. To minimise bias through learning, the order in which the interfaces were presented was randomised. Each test lasted a maximum of ten minutes, with participants able to stop the test early, if they felt the task had been completed, by pressing a button to register their estimated target location. All of the users' interactions with the interfaces were recorded for analysis.

4 Analysis and Results
The experiment was undertaken with sixteen participants, all with some degree of sound design experience. For each participant, their mouse movements were recorded, allowing comparison of their navigation behaviour with each interface - the journey that each user made through each interpolation space. An example is shown in Figure 4 for participant 1, who had the interface order 1, 3 & 2, although the traces are shown here as Interface 1–3, left to right.

Figure 4: Mouse Traces for Participant 1

It was found that at the start of the test users tended to make large, fast movements. In the middle of the test the movements tended to slow and become more localised, although a few larger, moderately fast movements were often made. Towards the end of the test, movements tended to slow and become even more localised towards the intended target location. To visualize these aspects, in Figure 4 the first third of the trace is shown in red, the middle third in blue and the final third in green. This was also corroborated when the mouse movement speed and the distance to target were plotted on a graph using the same colour coding. Figure 5 shows an example for participant 2, with Interface 3.

Figure 5: Mouse Distance to Target & Speed for Participant with Interface 3

Broadly, these trends were seen in fifteen of the sixteen participants, although, as might be expected, the behaviour did not always divide evenly into thirds of the test time. Nonetheless, it appears to indicate that there are three distinct phases during the use of a visual interpolation interface:

1. Fast space exploration to identify areas of sonic interest
2. Localisation on regions of interest, with occasional checks that other areas do not produce sonically better results
3. Refinement and fine tuning in a localised area to find the ideal result

These three phases may be summarised as exploration, localisation and refinement. The phases were present regardless of the interface being used, showing that they are associated with exploration of the space and not the detail of the interface used.
From the traces, it was also observed that as the detail of the visual interface increased, so did the area of the interpolation space that tended to be explored. This was despite the fact that the participants were given no information about what the visuals represented. It seems that giving the participants additional visual cues encouraged them to explore those locations. To demonstrate this effect, the mean location for each trace was calculated, along with its deviation in the form of the Standard Distance Deviation (SDD) [10], based on the unit-square dimensions of the interpolator interfaces. These were then plotted back onto the traces to give a visual representation for each interface. Figure 6 shows this for participant 6, who took the test with interface order 3, 2 & 1, although the traces are shown as Interface 1–3, left to right.

Figure 6: Mouse Trace, Mean Location and Standard Distance Deviation for Participant 6

Thirteen of the sixteen participants showed an increase in the SDD when more visual cues were provided by the interface. The mean SDD was 0.131 units (σ = 0.23) for Interface 1, 0.146 units (σ = 0.19) for Interface 2 and 0.180 units (σ = 0.21) for Interface 3.
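The spread statistic used above can be computed directly from a trace of cursor positions. The sketch below (the helper name is hypothetical; the formula is the standard Standard Distance definition from spatial statistics [10]) returns the mean location and the SDD for points in unit-square coordinates:

```python
import math

def mean_location_and_sdd(trace):
    """Mean centre and Standard Distance Deviation (SDD) of a mouse
    trace given as (x, y) points in unit-square coordinates.

    SDD is the root-mean-square distance of the points from their mean
    centre: a single figure for how widely a trace spreads over the
    interpolation space.
    """
    n = len(trace)
    cx = sum(x for x, _ in trace) / n
    cy = sum(y for _, y in trace) / n
    sdd = math.sqrt(sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in trace) / n)
    return (cx, cy), sdd
```

For instance, a trace that visits all four corners of the unit square has mean centre (0.5, 0.5) and an SDD of √0.5 ≈ 0.707 units, which puts the mean SDDs reported above (0.131–0.180 units) in context as fairly localised traces.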

Significance was confirmed (F(1, 15) = 3.132, p = 0.05) with a repeated-measures ANOVA. This indicates that additional visual cues on the interface encourage wider exploration of the interpolation space, even though the output and goal of the test were sonically (not visually) based.
The locations actually selected by participants as their target sounds were also plotted, to see if there were any trends resulting from the different interfaces. Figure 7 shows the selected target locations for all the participants, for Interface 1–3, left to right.

Figure 7: Participants' Selected Target Locations by Interface

The results in Figure 7 show that for Interface 1 (no visualization) there is a fairly wide dispersal of the locations selected as the target. From the correct target location, the SDD was calculated as 0.300 units for Interface 1. For Interface 2 (locations only), there was a tighter distribution of target locations, with an SDD of 0.267 units. Finally, for Interface 3 (full nodes) there is an even stronger localisation, with the SDD reducing still further to 0.187 units. This indicates that as the interface provides more visual detail, it improves the user's ability to identify the intended target. Comparing by ear the user-selected targets with the actual target sounds showed that in all cases there were sonic differences, but as the selected locations got closer to the true location on the interpolator, as expected, these became less distinguishable.

5 Discussion
The testing undertaken indicates that users tend to follow three phases when finding a sound with a graphical interpolator system (exploration, localisation and refinement). In the first phase the users make large, fast moves as they explore the space. During the second phase the speed tends to reduce as they localise on specific areas of interest. In this phase, though, confirmatory moves have been observed, when the user quickly checks that there are no other areas that may produce better results. These tend to be made at a moderate speed, often in multiple directions. In the final phase the user refines the sound with small, slow movements as they hone in on a desired location. These phases appear to be present regardless of the visual display that is presented, with similar phases being observed with all three of the interfaces tested.
From examination of the mouse traces, the visual feedback presented by the different interfaces does appear to affect how users interact with the systems. When no visualization is provided, the users were effectively moving "blind" and initially tended to just make random movements within the space. When the preset locations were provided, although the users were not aware of where or how the interpolation was being performed, the provided visual locations encouraged the users to investigate these points and so explore the defining locations. The full interface not only shows the locations of the defining sounds, but also indicates regions of interest to the users (node intersections for this interpolation model), where there may be interesting sounds. This seems to focus users' exploration on these areas of interest and results in the users getting closer to the target location and so obtaining "better" sonic results.

6 Conclusions
The identification of three distinct phases of use during the testing of the graphical interpolators is of significant interest, as it suggests that users interact with the interfaces differently at different stages during their journey through the interpolation space. A better understanding of user behaviour with these systems will, in future work, allow the design of new interfaces that provide users with visuals that further facilitate the different phases of a sound design task.
A number of different visual models have been previously presented for graphical interpolators [1 - 8], each using very different visualizations. Given the suggested importance of the visual feedback provided by each interface, it will be important in future work to evaluate the suitability and relative merits of each through further user testing.

ACKNOWLEDGMENTS
We would like to thank all the participants from Bournemouth University and the University of Southampton who took part in the user testing of the systems presented.

REFERENCES
[1] J.J. van Wijk & C.W. van Overveld (2003). Preset based interaction with high dimensional parameter spaces. In Data Visualization 2003 (pp. 391-406). Springer US. DOI: https://doi.org/10.1007/978-1-4615-1177-9_27
[2] J.F. Allouis (1982). The SYTER project: Sound processor design and software overview. In Proceedings of the 1982 International Computer Music Conference (ICMC). Ann Arbor, MI: Michigan Publishing, University of Michigan Library.
[3] M. Spain & R. Polfreman (2001). Interpolator: a two-dimensional graphical interpolation system for the simultaneous control of digital signal processing parameters. Organised Sound, 6(2), 147-151. DOI: https://doi.org/10.1017/S1355771801002114
[4] R. Bencina (2005). The metasurface: applying natural neighbour interpolation to two-to-many mapping. In Proceedings of the 2005 Conference on New Interfaces for Musical Expression (NIME 05) (pp. 101-104). National University of Singapore.
[5] O. Larkin (2007). INT.LIB - A Graphical Preset Interpolator for Max MSP. In Proceedings of the International Computer Music Conference (ICMC'07), 2007.
[6] nodes. Max Reference, Cycling '74, 2019.
[7] C. Drioli, P. Polotti, D. Rocchesso, S. Delle Monache, K. Adiloglu, R. Annies and K. Obermayer (2009). Auditory representations as landmarks in the sound design space. In Proceedings of the Sound and Music Computing Conference.
[8] M. Marier (2012). Designing Mappings for Musical Interfaces Using Preset Interpolation. In Proceedings of the Conference on New Interfaces for Musical Expression (NIME 12).
[9] D. Gibson & R. Polfreman (2019). A Framework for the Development and Evaluation of Graphical Interpolation for Synthesizer Parameter Mappings. In Proceedings of the Sound and Music Computing Conference, 2019. DOI: https://doi.org/10.5281/zenodo.3249366
[10] A. Mitchell (2005). The ESRI Guide to GIS Analysis, Volume 2: Spatial Measurements & Statistics. ESRI Press.
