You See What You Hear
live coding - TouchDesigner - visualisation

You See What You Hear is a live-coded audiovisual instrument that turns music into motion in real time. I challenged myself to bridge Tidal Cycles (sound) and TouchDesigner (visuals) using SuperCollider and OSC, mapping rhythms, notes, gain, and space into visual behavior. The result is a responsive feedback loop where code, sound, and image evolve together on stage.
Research and Inspiration
This project grew out of my fascination with live-coded performance as both a musical and visual medium. I was especially inspired by the open-source ecosystem around Tidal Cycles, a language for algorithmic pattern generation, and TouchDesigner, a real-time visual programming environment used in interactive installations and performance art.
While both tools are powerful in their own domains, they are rarely used together in a tightly coupled way. Most live-coded visuals are pre-scripted or manually triggered alongside audio, rather than dynamically driven by the same musical data stream. I wanted to blur that boundary — to make visuals that emerge as a direct reflection of the music’s structure, timing, and emotion.
I researched how SuperCollider (the sound engine behind Tidal Cycles) handles OSC (Open Sound Control) messages, and how TouchDesigner can receive and react to external data in real time. I studied existing examples of OSC bridges, data sonification, and audiovisual performance setups, but found few that created a seamless feedback loop where music itself generates the visuals frame by frame.
This inspired me to build my own cross-software bridge — a system where algorithmic music drives visuals, creating a live audiovisual instrument that treats sound and image as two sides of the same real-time pattern.
Process
| Composing Patterns in Tidal Cycles
I started by writing generative musical patterns in Tidal Cycles, live-coding rhythms, melodies, and parameter changes. Each sound event emitted detailed OSC data: note, gain, cps, delta, room, delay, etc. This gave me a rich, constantly shifting stream of information to work with.
| Capturing OSC Data in SuperCollider
I wrote OSCdef functions to listen for /dirt/play messages, extract key values from them, and print them to the Post Window for debugging. This was my way of understanding how each musical gesture is encoded as data.
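To make this step concrete, here is a minimal sketch of such a listener. It assumes the default Tidal-to-SuperDirt setup, where events arrive on /dirt/play as alternating key/value pairs; the exact parameter names depend on what the running patterns actually set.

    (
    OSCdef(\dirtDebug, { |msg, time, addr, recvPort|
        var params = ();
        // msg[0] is the OSC address ('/dirt/play'); the rest is an
        // alternating list of keys and values, so read it pairwise.
        msg.drop(1).pairsDo { |key, val| params[key.asSymbol] = val };
        // Print the values of interest to the Post Window.
        [\s, \note, \gain, \room, \delta].do { |k|
            "%: %".format(k, params[k]).postln;
        };
    }, '/dirt/play');
    )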
| Building a Bridge to TouchDesigner
Once I could reliably parse the data, I used NetAddr.sendMsg() in SuperCollider to forward specific values (like note, gain, room, etc.) to TouchDesigner via OSC. This step turned SuperCollider into a real-time data router between audio and visuals.
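As an illustration of that routing, here is a small sketch of the forwarding. The host, the port (7000), and the /td/... addresses are placeholders for this example and have to match whatever the OSC In DAT is set to listen for.

    (
    ~td = NetAddr("127.0.0.1", 7000); // where TouchDesigner listens (assumed port)

    OSCdef(\dirtToTD, { |msg|
        var params = ();
        msg.drop(1).pairsDo { |key, val| params[key.asSymbol] = val };
        // Forward only the values the visuals need, each on its own address,
        // so TouchDesigner can split them into separate channels.
        ~td.sendMsg('/td/note',  (params[\note]  ? 0).asFloat);
        ~td.sendMsg('/td/gain',  (params[\gain]  ? 1).asFloat);
        ~td.sendMsg('/td/room',  (params[\room]  ? 0).asFloat);
        ~td.sendMsg('/td/delta', (params[\delta] ? 0).asFloat);
    }, '/dirt/play');
    )

Keeping one address per parameter makes the receiving side easy to wire up: each incoming value can become its own channel with no extra parsing.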
| Designing Visual Responses in TouchDesigner
Inside TouchDesigner, I set up an OSC In DAT to receive the data and converted it to CHOP channels. I mapped each parameter to visual properties: note affected color palettes, gain drove brightness, room influenced depth/blur, and rhythmic pulses from delta controlled animation triggers. This created a responsive visual system that reacts instantly to the music.
Final Thoughts
| challenges
- Protocol Complexity: Understanding the structure of OSC messages from Tidal Cycles was a hurdle, since they arrive as long lists of alternating keys and values. I had to parse them carefully to isolate the parameters I wanted.
- Timing + Sync: Ensuring that visuals reacted on beat was tricky. Even slight OSC delays made the connection feel off, so I experimented with lightweight message structures to keep everything responsive (see the sketch after this list).
- Cross-Software Communication: Each environment speaks its own language. It took time to learn how SuperCollider handles networking and how TouchDesigner expects OSC data.
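One concrete reading of "lightweight message structures" (an assumption about the format, not a record of my final code): replace the four separate sendMsg calls in the bridge sketch above with a single message per event, so each sound costs one packet instead of four.

    // Variation on the bridge sketch above: one compact message per event,
    // so the receiving end only has to handle a single packet per sound.
    ~td.sendMsg('/td/event',
        (params[\note]  ? 0).asFloat,
        (params[\gain]  ? 1).asFloat,
        (params[\room]  ? 0).asFloat,
        (params[\delta] ? 0).asFloat
    );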
This process forced me to work at the intersection of music programming, network protocols, and visual design — a space I had never combined before.
| TUTORIAL VIDEO
| FINAL PROTOTYPE