CLASS 005 W/ JORDAN BAREISS & MIKE ‘BAHBAH’ NGUDLE: Particles & Frequencies — Audio-Driven Design in TouchDesigner
JORDAN AND BAHBAH ARE MULTIMEDIA ARTISTS WHOSE PRACTICES MEET AT THE INTERSECTION OF DESIGN, SOUND, AND INTERACTIVITY. FOR THIS PROJECT, THEY COLLABORATE CLOSELY, COMBINING THEIR EXPERIENCE IN TOUCHDESIGNER AND VCV RACK TO DEVELOP AN INTERACTIVE AUDIO-VISUAL SYSTEM.
In this session, we explored the creative potential of gestural control in audio-visual design. Using MediaPipe in TouchDesigner to track hand movements, participants could manipulate a VCV Rack patch in real time — shaping sound through motion alone. These gestures directly influenced parameters within the modular patch, allowing sound to become a truly physical, interactive experience.
The generated audio was then routed back into TouchDesigner, where it drove a point cloud system built from a 3D model of a banana. Each movement and sonic shift translated visually, creating a dynamic link between gesture, sound, and image. This class demonstrated how community-built tools like MediaPipe and VCV Rack can expand TouchDesigner’s possibilities, bridging performance, interactivity, and play. Downloadable project files, point cloud files and demo footage from the session are available for further exploration at the end of this class.
1. Introduction to TouchDesigner
What is TouchDesigner?
TouchDesigner is a node-based visual programming environment used to create real-time, interactive visuals and experiences. It’s widely used in live performance, interactive art installations, projection mapping, stage design, and generative video effects.
The platform has a large, active community of artists and developers, with countless tutorials, example projects, and forums where users share knowledge and support one another.
While no coding is required to use TouchDesigner, it includes deep Python integration, allowing for advanced automation, data handling, and customization. Beginners can create without any programming background, but more technical users can push the software further through scripting and external integrations.
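For a taste of that Python layer, here is a minimal sketch of the kind of one-line automation the textport or a Text DAT allows. The operator names ('lfo1', 'noise1') are placeholders, not part of any shipped example:

    # Minimal sketch of TouchDesigner's Python layer (operator names are placeholders).
    # Reads the current value of a CHOP channel and writes it into a TOP parameter.
    lfo = op('lfo1')            # an LFO CHOP somewhere in the network
    noise = op('noise1')        # a Noise TOP we want to animate

    amount = lfo['chan1'].eval()           # current float value of the channel
    noise.par.period = 0.5 + amount * 4.0  # map the 0-1 LFO value onto the noise period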
Node-Based Workspace
In TouchDesigner, projects are built using nodes. Each node represents a specific operation, such as generating a shape, processing an image, reacting to sound, or controlling movement. By connecting nodes together, you define the flow of data and build complex visual systems step by step.
The node-based workspace provides a visual map of your project. Instead of writing lines of code, you arrange and connect nodes in a network that makes the logic of your project easy to see and adjust. This modular approach allows you to experiment quickly, trace how visuals are generated, and make changes in real time.
Operators
Every node is called an Operator. Operators are the building blocks of your project, each handling a specific type of data or process. They are grouped into families, with each family designed for a particular purpose:
• TOPs (Texture Operators): Work with images and video. They handle everything from simple image adjustments and compositing to advanced real-time effects, rendering, and generative textures.
• SOPs (Surface Operators): Create and manipulate 3D geometry. SOPs are used to model shapes, apply transformations, and build procedural 3D structures.
• CHOPs (Channel Operators): Work with numbers and motion over time. They’re often used for animation, audio analysis, controlling parameters, or linking external devices like sensors and MIDI controllers.
• DATs (Data Operators): Handle structured data, such as tables, scripts, or text. They’re powerful for logic, data processing, and Python scripting inside TouchDesigner.
• COMPs (Component Operators): Act like containers that group nodes together into reusable systems. COMPs can hold interfaces, control logic, or entire custom tools, making them essential for organizing and scaling projects.
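One quick way to get a feel for the families is to create one operator from each of them in the Python textport. The sketch below assumes an empty Container COMP named 'demo' inside /project1; all names are placeholders:

    # Build one operator from each family via Python (a sketch, not a required step).
    box = op('/project1/demo')                  # the container itself is a COMP

    tex    = box.create(noiseTOP,  'noise1')    # TOP: generative texture
    geo    = box.create(sphereSOP, 'sphere1')   # SOP: 3D geometry
    motion = box.create(lfoCHOP,   'lfo1')      # CHOP: a value changing over time
    data   = box.create(tableDAT,  'table1')    # DAT: structured data

    # Connecting operators in Python is the same as dragging a wire in the editor:
    level = box.create(levelTOP, 'level1')
    tex.outputConnectors[0].connect(level)      # noise1 now feeds level1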
What is VCV Rack?
As covered in depth in Kayleb Sass’s class “UNDERSTANDING SYNTHESIS—FROM WAVE TO FORM”, VCV Rack is an open-source virtual modular synthesizer that emulates the look and behavior of physical modular synths. It lets you build custom sound systems by connecting virtual modules with patch cables — oscillators, filters, sequencers, and effects — to explore sound design, synthesis, and signal flow in a hands-on, visual way.
MediaPipe Hand Tracking
MediaPipe is Google’s open-source machine learning framework for real-time perception tasks such as hand, face, and pose tracking. Inside TouchDesigner, we’ll use a hand-tracking controller built with MediaPipe to detect hand positions and movements in real time. These values are packaged into channels (CHOPs), which we then route out to VCV Rack.
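To get a sense of what the tracker produces under the hood, here is a standalone Python sketch (outside TouchDesigner, using the mediapipe and opencv-python packages) that measures the distance between the thumb and index fingertips. It is roughly the kind of pinch value the TouchDesigner component exposes as a channel:

    import cv2
    import mediapipe as mp

    # Standalone pinch-distance sketch; landmark values are normalized to the camera frame.
    hands = mp.solutions.hands.Hands(max_num_hands=1)
    cap = cv2.VideoCapture(0)

    for _ in range(300):                        # sample a few hundred frames, then stop
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            thumb, index = lm[4], lm[8]         # thumb tip and index fingertip
            pinch = ((thumb.x - index.x) ** 2 + (thumb.y - index.y) ** 2) ** 0.5
            print(round(pinch, 3))              # 0 = closed pinch, larger = open hand

    cap.release()
    hands.close()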
2. The Workflow
Step 1: Hand Tracking → Control Data
A webcam feeds into TouchDesigner’s MediaPipe component.
The system tracks hand landmarks (distances, pinches, rotations).
Output is cleaned and normalised into CHOP channels (null operators).
These values are sent to VCV Rack via OSC Out or MIDI Out.
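In practice the OSC Out CHOP handles this routing without any scripting, but if you want explicit control over message addresses, a CHOP Execute DAT watching the cleaned null can send them itself. The sketch below assumes an OSC Out DAT named 'oscout1' pointed at the port VCV Rack listens on:

    # CHOP Execute DAT callback (sketch). Fires whenever a tracked hand value changes
    # and forwards it to VCV Rack as an OSC message via an OSC Out DAT.
    def onValueChange(channel, sampleIndex, val, prev):
        address = '/hand/' + channel.name       # e.g. /hand/pinch, /hand/rotation
        op('oscout1').sendOSC(address, [float(val)])
        return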
Step 2: Gestures → Sound (VCV Rack)
In VCV Rack, the gesture-data channels are received with an OSC-to-CV or MIDI-CC module.
They are patched into oscillators, filters, delays, or other sound processors.
The sound is both generative and gestural: you literally “play” the instrument by moving your hands.
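The patching itself happens in VCV Rack's interface rather than in code, but the underlying idea, a normalized gesture value steering an oscillator's pitch the way a CV signal steers a VCO, can be sketched in a few lines of Python (an illustrative analogue, not the actual patch):

    import numpy as np

    SAMPLE_RATE = 44100

    def gesture_to_tone(pinch, seconds=0.5):
        """Map a 0-1 pinch value across three octaves (110-880 Hz) and render
        a short sine burst, the way a CV input would sweep an oscillator."""
        freq = 110.0 * (2.0 ** (pinch * 3.0))
        t = np.arange(int(SAMPLE_RATE * seconds)) / SAMPLE_RATE
        return np.sin(2 * np.pi * freq * t)

    tone = gesture_to_tone(0.5)    # a half-open pinch lands mid-range (about 311 Hz)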
Step 3: Sound → Back Into TouchDesigner
The audio output from VCV Rack is routed into TouchDesigner via Dipper (a virtual audio device).
An Audio Device In CHOP receives the sound inside TouchDesigner.
An Analyze CHOP extracts amplitude, spectrum, or RMS data.
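As a rough picture of what the Analyze CHOP is doing to each block of incoming audio, here is a numpy sketch of the same measurements (an RMS level and a crude spectral centroid):

    import numpy as np

    def analyze_block(samples, sample_rate=44100):
        """Mimic the per-block values an Analyze CHOP would hand downstream."""
        rms = float(np.sqrt(np.mean(samples ** 2)))               # overall loudness
        spectrum = np.abs(np.fft.rfft(samples))
        freqs = np.fft.rfftfreq(len(samples), 1.0 / sample_rate)
        centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9))
        return rms, centroid        # loudness and a rough "brightness" of the sound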
Step 4: Sound → Visuals (Point Cloud)
A scanned 3D fruit model (from Polycam) is imported as a PLY/OBJ into TouchDesigner.
The audio analysis drives parameters such as:
Point size (loudness = bigger).
Colour hue (frequency shifts = colour shifts).
Rotation or turbulence (beat = movement).
The hand gestures themselves can also directly modulate visual parameters (e.g. pinch → colour, hand height → rotation speed).
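A minimal version of these mappings, assuming placeholder operator names (Analyze CHOPs 'analyze_rms' and 'analyze_centroid', a Geometry COMP 'pointcloud' holding the banana points, and a Constant MAT 'pointMat'), could live in an Execute DAT's onFrameStart callback:

    # Execute DAT callback (sketch). All operator names below are placeholders.
    def onFrameStart(frame):
        rms = op('analyze_rms')['chan1'].eval()         # loudness from the Analyze CHOP
        centroid = op('analyze_centroid')['chan1'].eval()

        geo = op('pointcloud')                          # Geometry COMP with the point cloud
        mat = op('pointMat')                            # its material

        geo.par.scale = 1.0 + rms * 2.0                 # louder -> bigger
        geo.par.ry = geo.par.ry.eval() + rms * 5.0      # loudness nudges the rotation
        mat.par.colorr = min(1.0, centroid / 4000.0)    # brighter spectrum -> warmer colour
        return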
3. Use Cases & Performance
Visual Design & Motion Graphics
Build real-time visuals using 3D geometry, particles, shaders, and video.
Create generative art that reacts to sound, movement, or data inputs.
Ideal for music visuals, installations, and interactive projections.
Audio-Reactive Systems
Connect audio inputs or MIDI controllers to drive visuals.
Link to tools like VCV Rack, Ableton Live, or Pure Data for cross-platform audio interaction.
Map sound frequencies or amplitudes to parameters like color, scale, or motion for responsive experiences.
Interactivity & Sensors
Integrate external data sources such as webcams, depth cameras, and sensors (Kinect, Leap Motion, MediaPipe, OSC devices, etc.).
Build gesture-controlled interfaces, body tracking installations, or interactive exhibits.
Ideal for experimental performance, responsive architecture, or public art.
Data Visualization & Systems Design
Use TouchDesigner for real-time data visualizations — from environmental sensors to live social data.
Import data streams (CSV, JSON, API feeds) and visualize them as interactive, moving graphics.
Often used in live events, dashboards, or immersive storytelling.
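A minimal example of pulling a feed into the network: the sketch below fetches a JSON endpoint with Python and writes it into a Table DAT, which any other operator can then read. The URL, field names, and the table name 'table_data' are placeholders:

    import json
    import urllib.request

    # Sketch: pull a JSON feed and drop it into a Table DAT for visualization.
    def refresh_table():
        raw = urllib.request.urlopen('https://example.com/feed.json').read()
        rows = json.loads(raw)

        table = op('table_data')                # a Table DAT in your network
        table.clear()
        table.appendRow(['name', 'value'])
        for item in rows:
            table.appendRow([item['name'], item['value']])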
Projection Mapping & Installation Art
Precisely map visuals onto 3D surfaces, sculptures, or architecture.
Sync lighting, sound, and motion for immersive environments.
Common in large-scale events, exhibitions, and theatre productions.
Live Performance Mode
Performance Mode hides the TouchDesigner interface, allowing your final network to run cleanly and smoothly in full-screen.
Used in live visuals, concerts, VJing, and exhibitions where real-time control is needed.
You can assign MIDI/OSC controllers, gestures, or key inputs to parameters and perform your visuals like an instrument.
It’s the “presentation” state of your project — minimal interface, maximum responsiveness.
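Performance Mode can also be toggled from a script, which is handy when a controller button or gesture should flip the project into its presentation state. A one-line sketch:

    # Toggle Performance Mode from Python, e.g. bound to a MIDI button or a key callback.
    ui.performMode = not ui.performMode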
Integration & External Tools
Seamlessly connects to software like Unreal Engine, Blender, Ableton, Unity, and VCV Rack.
Supports Python scripting, Open Sound Control (OSC), MIDI, NDI, and web-based APIs for extended control and automation.
Experimental & Research Applications
Used for prototyping new interaction models, AI-driven installations, and experimental performances.
Because of its modular, visual workflow, it’s ideal for quickly testing and visualizing creative or technical concepts.
In this case, our system integrates an audio-reactive system created in 2024 by Taipei, Taiwan-based audio-visual artist pepepepebrick, with the added functionality of manipulating point clouds using the audio output from VCV Rack.
This isn’t about building every tool from scratch — it’s about assembling open-source modules into a creative system. By crediting the developers and patching resources together, we create a unique workflow while showing how to remix existing tools.
4. Applications & Takeaways
Interactive performance: Performers can use gestures to play sound & visuals simultaneously.
Installation art: Visitors’ movements could generate a unique soundscape and visuals in real time.
Workshops/education: A hands-on way to teach modular sound, real-time graphics, and interaction design.
Big Idea:
With TouchDesigner as the central hub, almost any kind of data (gestures, sound, sensors, networks) can be turned into creative input. By learning how to bridge these systems, you can invent your own instruments and interactive experiences.
Collect and download our project files for free and have fun with this gestural synth setup.