Blendnik: A Real-Time Performance System Using Blender and Pure Data

ABSTRACT
The authors, a graphic designer/painter and a musician/software developer, have developed a performance system for re-interpreting their live free-improvisational jam sessions. The open source systems Pure Data, Open Sound Control, Blender, and Python were used to provide bidirectional mapping of aural and visual parameters and real-time interaction via MIDI.

Keywords
Painting, improvisation, Python, Blender, Pure Data.

1. INTRODUCTION
Over the last several years, we have collaborated on a series of short pieces, usually lasting a few minutes, comprising freely improvised piano along with live painting, usually pastel/acrylic/ink on vellum.

At some point we wanted to take our collaboration to the next level, so we researched ways to digitize the music and images for purposes of live interaction. We first experimented with various common commercial applications, but found them unsuitable for our sensibilities and budget. Then we discovered Blender [5] and Pure Data (Pd) [17].

We experimented with mapping scans of the paintings onto 3D objects, and used the built-in Blender Game Engine (BGE) [21], [22] for real-time rendering. A Python [18] script running in the BGE was written using Open Sound Control (OSC) [16] to provide two-way communication with a Pd patch.

A looping sampler implemented in the Pd patch can be configured by creating properties in a Blender scene which specify audio samples and the mappings between aural and visual parameters. For instance, X location could map to left-right balance, Z location to pitch, Y location to reverb decay, X rotation to flange, Y rotation to tempo, Z rotation to dry/wet mix, and transparency to volume. Shape morphing and camera control are also supported.

These mappings can be changed in real time from a computer anywhere on the Internet. The computer could also be connected to a MIDI controller or sensor. Live music could be performed along with the loops, perhaps using a physically modeled guitar we are developing.

We plan to use this system to explore relationships between visual art and music in the context of live performances and gallery installations.

2. BLENDER AS A FRONT-END DESIGN ENVIRONMENT
Blender is an excellent open source 3D animation/modeling system. It can be used to create complex textured objects (or meshes) which can be animated in real time in the integrated BGE. Our process starts by designing a mesh inspired by a painting, or a series of paintings. This can be done in a variety of ways. A very basic technique would be to create a cube and then edit it by moving vertices, extruding faces, and so forth [13]. More advanced techniques such as multiresolution sculpting can also be employed [15].

Complex Hollywood-style animations have been done in Blender [7], [4], but for real-time applications there is a limit of about 10,000 faces per scene. This number can be reduced further depending on factors such as the number of lights and two-sided alpha-enabled faces used. The graphics processor is also a major factor: our MacBook Pro with a recent graphics processor far out-performs our old PowerBook G4.

Despite these limitations, there has been impressive real-time work using soft bodies [9] and GLSL filters [1].

Once the mesh has been designed, a texture can be applied from an image file. Procedural textures are also supported. To apply the texture in a predictable way, UV mapping [20] is used. UV maps can be exported to an image file, which can then be used as a background layer in an image manipulation program. Imagery can then be drawn over this background layer, which will appear on the mesh when the image file is reloaded into Blender. The background layer can be discarded if desired.

Figure 1 shows a mesh for a section of an abstract tunnel scene we are developing, Figure 2 shows the UV map, Figure 3 shows imagery drawn over the map, and Figure 4 shows a screen shot from the BGE.
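The kind of mapping table described above can be sketched in a few lines of Python. This is an illustrative sketch only, not the actual Blendnik code: the parameter names and the assumed coordinate range are our assumptions.

```python
# Hypothetical sketch of mapping Blender IPO channels to sound parameters.
# Channel names follow Blender conventions; parameter names are assumptions.
CHANNEL_TO_PARAM = {
    "LocX": "balance",       # X location  -> left-right balance
    "LocZ": "pitch",         # Z location  -> pitch
    "LocY": "reverb_decay",  # Y location  -> reverb decay
    "RotX": "flange",        # X rotation  -> flange
    "RotY": "tempo",         # Y rotation  -> tempo
    "RotZ": "dry_wet",       # Z rotation  -> dry/wet mix
    "ColA": "volume",        # transparency (alpha) -> volume
}

def map_channels(channels):
    """Translate raw channel values into normalized sound-parameter values.

    Each channel is normalized into 0..1 over an assumed working range of
    the scene's coordinates, then clamped. Unmapped channels are ignored.
    """
    lo, hi = -10.0, 10.0  # assumed coordinate range (an assumption)
    params = {}
    for channel, value in channels.items():
        param = CHANNEL_TO_PARAM.get(channel)
        if param is None:
            continue
        normalized = (value - lo) / (hi - lo)
        params[param] = min(1.0, max(0.0, normalized))
    return params
```

For example, `map_channels({"LocX": 0.0})` yields `{"balance": 0.5}`: an object centered on the X axis sits at the middle of the stereo field. Because the table is just data, swapping mappings in real time amounts to replacing dictionary entries.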
Figure 1. Abstract tunnel mesh.
Figure 3. Imagery drawn over map.
Figure 4. Screen shot of abstract tunnel in the BGE.

Every blendnik-enabled object in the scene must specify a Python Controller Logic Brick, calling the function blendnik.process(), which is defined in blendnik.py. When the BGE is running, blendnik.process() is called on every frame and provides Pd with the most recent ordinate values of the LocX, LocY, LocZ, RotX, RotY, RotZ, and ColA IPO channels.

Depending on how the BGE mapping properties are defined, each of these values affects the corresponding sound parameter. Values come out of blendnik.py normalized; a scale and offset are then applied in blendnik.pd.

For example, say the animation is running at frame N, and a string BGE mapping property named LocX with value balance is defined for an object named s1 in the scene. The normalized value of the LocX IPO channel at frame N is sent via blendnik.process() to the balance parameter in the sampler instance for s1 in blendnik.pd. In this particular example, the minimum value of LocX would correspond to a fully right-panned balance and the maximum value of LocX would correspond to a fully left-panned balance.

Once this is done, blendnik.py and blendnik.pd are ready to communicate via OSC, and proceed as follows:

4.1 PD TO BLENDER
When a slider is moved in Pd, or when a note or controller comes in from a MIDI device, Pd sends an OSC message to blendnik.py, which can make objects move or modify their appearance. Sliders can be ganged together to send multiple messages at the same time, and additional logic can be developed for more complex interactions. For example, you could have a slider that sends rotations which are opposites of each other and simultaneously sends a transparency value run through a lookup table. We call this idea "meta controllers".

4.2 BLENDER TO PD
On every animation frame, blendnik.py sends the values of the RotX, RotY, RotZ, LocX, LocY, LocZ, and ColA channels to blendnik.pd, which can modify the sound currently playing.

The blendnik.pd patch comprises the following subpatches:

s1 ... s12:
Subpatches for each blendnik-enabled Blender object. These subpatches contain a modified version of the looping sampler example B14.sampler.rockafella.pd that comes with Pd, with extra features for pitch bend, reverb, and flanging. Up to 30 instances can exist under the current scheme, corresponding to 30 separate audio output channels.

control:
Sends messages to selected BGE objects via the s subpatches, via "master control" subpatches, which correspond to the various types of motion and sound parameters.

sound:
Provides for loading different sounds during a performance, turning audio processing on and off, and sound output.

init:
Initialization; deals with default values and other initial-condition handling.

comm:
Communication with Blender via OSC.

axiom49:
Simulation of a MIDI keyboard controller. Presents a scheme for choreographing a performance.

Figure 5. Screen shot of system running a scene.
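The Blender-to-Pd leg of this exchange can be sketched in plain Python. The OSC address scheme and the scale/offset convention below are illustrative assumptions, not the actual Blendnik protocol.

```python
def to_osc_messages(obj_name, normalized_channels):
    """Build (address, value) pairs such as blendnik.process() might send
    on each frame: one message per IPO channel, e.g. ("/s1/LocX", 0.25).

    The "/<object>/<channel>" address layout is an assumption for this sketch.
    """
    return [(f"/{obj_name}/{channel}", value)
            for channel, value in sorted(normalized_channels.items())]

def apply_scale_offset(normalized, scale, offset):
    """Pd-side step: turn a 0..1 normalized channel value into a usable
    parameter value via a scale and offset (e.g. pitch as a MIDI note number).
    """
    return normalized * scale + offset
```

For instance, a normalized LocZ of 0.5 with a scale of 24 and an offset of 48 lands on MIDI note 60 (middle C), covering a two-octave pitch range across the object's Z travel.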
Figure 6. Part of blendnik.pd patch.
Figure 7. Part of axiom49.pd subpatch.

7. FUTURE WORK
• Create a performable multi-sound-channel MIDI-controlled installation and performance scenario.
• Allow for creation of new objects on the fly and dynamic allocation/deallocation of looping sampler instances.
• Make use of BGE physics.
• Better camera control to support 3D input devices.
• Optimize the sampler, perhaps as a new external written in C++ based on the STK [19].
• Integration of a one-shot sampler and a physically modeled electric guitar.
• Fix bugs in Blender related to alpha sorting.
• Make Blendnik generally available when it becomes more stable. See [6] for the latest progress.

8. ACKNOWLEDGMENTS
Thanks to Ton Roosendaal and the Blender Foundation for developing Blender, and to Miller Puckette for developing Pure Data. Also thanks to the UC Berkeley Center for New Music and Audio Technology (CNMAT) for developing OSC. A special thanks goes to Julius Smith, my mentor from the Center for Computer Research in Music and Acoustics (CCRMA) at Stanford University, who has egged on my experimentation over the years. Finally, a big thanks goes to my late father, Carmine Porcaro, for his overall support and inspiration.

9. REFERENCES
[1] Advanced GLSL filter demo: http://blenderartists.org/forum/showthread.php?t=152343
[2] BGE Python API: http://www.blender.org/documentation/248PythonDoc/GE/class-tree.html
[3] Basic animation: http://en.wikibooks.org/wiki/Blender_3D:_Noob_to_Pro/Basic_Animation
[4] Big Buck Bunny: http://www.bigbuckbunny.org
[5] Blender: http://www.blender.org
[6] Blendnik: http://www.porcaro.org/blendnik.html
[7] Elephants Dream: http://www.elephantsdream.org
[8] Embedded Python: http://www.python.org/doc/2.5.2/ext/embedding.html
[9] GLSL bathroom demo: http://blenderartists.org/forum/showthread.php?t=137038
[10] Holth, D., and McChesney, C. Open Sound Control for Python: http://www.ixi-software.net/content/body_backyard_osc.html
[11] IPO curves and key frames: http://wiki.blender.org/index.php/Doc:Manual/Animation/Basic/Tools/Ipo_Curves_and_Keyframes
[12] IXI Audio: http://www.ixi-software.net
[13] Learn to Model: http://en.wikibooks.org/wiki/Blender_3D:_Noob_to_Pro/Learn_to_Model
[14] Logic Bricks: http://wiki.blender.org/index.php/Doc:Manual/Game_Engine/Logic_Bricks
[15] Multiresolution Modeling: http://wiki.blender.org/index.php/Doc:Manual/Modelling/Meshes/Multiresolution_Mesh
[16] Open Sound Control: http://opensoundcontrol.org
[17] Pure Data: http://puredata.info
[18] Python: http://www.python.org
[19] Synthesis Tool Kit: http://ccrma.stanford.edu/software/stk
[20] UV Mapping: http://wiki.blender.org/index.php/Doc:Manual/Textures/UV
[21] Wartmann, C., and Kauppi, M. The Blender Game Kit, 2nd Edition. Blender Foundation, Amsterdam, the Netherlands, 2008.
[22] YoFrankie! - Apricot Open Game Project: http://www.yofrankie.org