I am passionate about creating compelling experiences for immersive media and XR. Currently I am interested in joining a team of like-minded creatives pushing the boundaries of what is possible in innovative interaction and design.


Constellation
Tools Used:

Unity, C#, Steam Audio, Csound, ChucK, PD

  • Solo development of a complete Unity project from scratch
  • Developed a unique 3D generative music sequencer in VR using Unity and C# for the HTC Vive (production version planned for Quest 2/3)
  • Cube of 8 layers, each 8 x 8 (production version to be 16 layers of 16 x 16)
  • VR interface manipulates the constellation's rotation, solos individual layers, and randomizes events
  • Each layer can be assigned virtual synths and samplers, with configurable beat division, transposition, patch number, and more
  • Timing, synth, and sample engines are handled by communicating with Csound through the CsoundUnity package; the production version may switch to ChucK
  • All audio output is spatialized using Steam Audio
  • Many more expansions planned for the release version
Constellation is a 3D music sequencer project I have been involved with on and off for several years, built in the Unity game engine. It is a unique music application/experience based on the idea of rotating a musical idea along a 3D axis while viewing it on a 2D screen. The name comes from the phenomenon that the stars in any given constellation are not actually equidistant from the viewer; each star lies at a different distance. Rotating the constellation around a center point would therefore produce vastly different patterns than what was originally perceived.
The basic idea builds on the fairly common grid-based sequencer approach that many software and hardware sequencers take, where pitch is controlled by the vertical (Y) axis and time by the horizontal (X) axis. In this case, however, there is a third axis, making the design an 8 x 8 x 8 cube. This last axis (Z) controls timbre, with different instruments playing. Rotating the cube then not only changes which notes or pitches play at various times, it also changes the instruments used to play them. Because the sequencer rotations are somewhat constrained in their outcomes, the resulting transformations of the sequence can be heard as more or less distantly related to the original idea, but in a slightly different manner than typical motivic transformation via inversion, retrograde, and so on. This prototype uses Csound as the sound engine running within Unity, along with rudimentary cross modulation between layers. Future expansions will revise the interaction to be more intuitive and expand the grid to a 16 x 16 x 16 cube, with many more cross-modulation and processing capabilities.
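As a rough sketch of the index math behind such a rotation (the type and method names here are illustrative assumptions, not the project's actual code), a 90-degree rotation about the Z axis moves a cell at (x, y, z) to (y, size-1-x, z): time (X) and pitch (Y) trade roles while the event stays on its timbre layer (Z):

```csharp
using System.Collections.Generic;

// Illustrative sketch: an 8x8x8 step grid where X = time, Y = pitch, Z = timbre layer.
// A 90-degree rotation about Z remaps (x, y) -> (y, Size-1-x), so an event's time
// and pitch trade roles while it keeps its timbre layer.
public static class ConstellationGrid
{
    public const int Size = 8;

    public static bool[,,] RotateAboutZ(bool[,,] grid)
    {
        var rotated = new bool[Size, Size, Size];
        for (int x = 0; x < Size; x++)
            for (int y = 0; y < Size; y++)
                for (int z = 0; z < Size; z++)
                    if (grid[x, y, z])
                        rotated[y, Size - 1 - x, z] = true;
        return rotated;
    }

    // Collect the (pitch, timbre) pairs active at one time step,
    // i.e. what the playhead would trigger at column `step`.
    public static List<(int pitch, int timbre)> EventsAtStep(bool[,,] grid, int step)
    {
        var events = new List<(int pitch, int timbre)>();
        for (int y = 0; y < Size; y++)
            for (int z = 0; z < Size; z++)
                if (grid[step, y, z])
                    events.Add((y, z));
        return events;
    }
}
```

Applying the rotation four times returns the grid to its original state, which is what makes the transformations audibly related to the source idea.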

Coding Notes:

The project required UI design and management, with custom data structures for each layer and JSON export to save and recall settings. Sequencer messages were translated and sent to the Csound scripts driving the audio engine.
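A minimal sketch of what such a save/recall structure could look like with Unity's JsonUtility (the field names and layout here are assumptions for illustration, not the project's actual schema):

```csharp
using System;
using UnityEngine;

// Illustrative layer settings; field names are assumptions, not the project's schema.
[Serializable]
public class LayerSettings
{
    public int beatDivision = 4;
    public int transposition = 0;
    public int patchNumber = 1;
    public bool[] steps = new bool[64];   // one 8x8 layer, flattened row-major
}

[Serializable]
public class SequencerState
{
    public LayerSettings[] layers = new LayerSettings[8];
}

public static class PresetIO
{
    // JsonUtility serializes [Serializable] classes with public fields,
    // giving a plain JSON string that can be written to disk and reloaded.
    public static string Save(SequencerState state) =>
        JsonUtility.ToJson(state, prettyPrint: true);

    public static SequencerState Load(string json) =>
        JsonUtility.FromJson<SequencerState>(json);
}
```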

Hyperpiano DSP

Tools Used:

Unity, C#, extOSC, PureData, PHP, MAMP

  • Latest version of an ongoing project to create a performance interface for experimental electronic music
  • Sole developer
    • Created a UI and custom touch-based control interface layout in Unity/C#, communicating over OSC to a computer running PureData
    • Designed a PureData patch to receive, filter, and route control data to virtual synths, samplers, modulators, and effects
    • Created a waveform overview management system allowing custom overviews to be added and loaded via PHP and MAMP
    • Designed a preset saving and recalling system
This is the second incarnation of the ongoing Hyperpiano DSP project, and I’m quite happy with the results. It uses a large touchscreen with a custom UI I designed in Unity, remotely controlling synths, samplers, effects, and complex modulators. You can assign controls to nearly every parameter, and there’s also a custom mix routing setup I devised.

There’s a lot of data management going on in this setup. Settings can be saved and recalled from JSON files, and the sampler module relies on a web server running under MAMP on the audio machine: a simple PHP script serves overview images of samples, and web requests populate the sample list. OSC data is formatted from a struct containing all the parameter data and sent to the computer running the audio, where the commands are parsed and routed to the audio engine.
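The struct-to-OSC path might look roughly like this sketch using extOSC (the address pattern, struct fields, and component wiring are illustrative assumptions, not the project's actual layout):

```csharp
using extOSC;
using UnityEngine;

// Illustrative sketch: flatten a parameter struct into an OSC message with extOSC.
// Address pattern and field names are assumptions, not the project's actual layout.
[System.Serializable]
public struct SynthParams
{
    public int voice;
    public float cutoff;
    public float resonance;
}

public class ParamSender : MonoBehaviour
{
    public OSCTransmitter transmitter;   // configured with the audio machine's IP and port

    public void Send(SynthParams p)
    {
        var message = new OSCMessage("/hyperpiano/synth");
        message.AddValue(OSCValue.Int(p.voice));
        message.AddValue(OSCValue.Float(p.cutoff));
        message.AddValue(OSCValue.Float(p.resonance));
        transmitter.Send(message);
        // On the other machine, the PureData patch receives this message,
        // filters by address, and routes the values to the matching module.
    }
}
```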

If the super-accelerated GIF on the side piques your interest, click the nearby video for the full explanation.