
music tech
Coding the music technology for the coming age
Starchild music
Starchild Music is an LA-based startup developing an AI-powered music player in which the user actively interacts with the music and adds live effects. LUMINECHO is one of the founding engineers and the current Chief Interactive Audio Officer, developing the code that runs the player and its DSP.
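To give a flavor of the kind of DSP a live-effects player runs, here is a minimal sketch of one effect in the style of such a chain: a feedback delay processed sample by sample. This is an illustrative example only; the class and parameter names are invented and do not come from Starchild's codebase.

```cpp
#include <cstddef>
#include <vector>

// Illustrative feedback delay: one building block a live-effects
// player might apply to the audio stream in real time.
class FeedbackDelay {
public:
    FeedbackDelay(std::size_t delaySamples, float feedback)
        : buffer_(delaySamples, 0.0f), feedback_(feedback) {}

    // Process one input sample; returns dry signal plus the delayed echo.
    float process(float in) {
        const float delayed = buffer_[pos_];
        const float out = in + delayed;
        // Write back input plus attenuated echo so repeats decay over time.
        buffer_[pos_] = in + delayed * feedback_;
        pos_ = (pos_ + 1) % buffer_.size();
        return out;
    }

private:
    std::vector<float> buffer_;
    float feedback_;
    std::size_t pos_ = 0;
};
```

Feeding an impulse through a two-sample delay returns the impulse immediately and again two samples later, attenuated on each repeat by the feedback amount.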
Granu
Granu is a first-of-its-kind granular synthesizer in virtual reality, created and developed by LUMINECHO with help from other sound designers and UI developers. Granular synthesis is a technique in which small fragments of audio are played one after another, creating all kinds of inspiring textures that can evoke the original sound while disintegrating it, almost literally, into grains. The app has been released for free on the Meta Quest Store.
The synth was developed from the ground up in C# and integrated into a Unity application for Meta Quest 2. Link to the code.
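The core idea described above can be sketched in a few lines: short windowed slices ("grains") of a source buffer are copied one after another into the output, and textures emerge from the overlapping fragments. This is a simplified illustration, not Granu's actual C# implementation, and all parameter names are invented.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Illustrative granular synthesis: place windowed grains of `source`
// into the output at regular hops. grainSize must be > 1.
std::vector<float> granulate(const std::vector<float>& source,
                             std::size_t grainSize,
                             std::size_t hop,        // spacing between grain starts
                             std::size_t numGrains,
                             std::size_t sourcePos)  // read position in the source
{
    const float kPi = 3.14159265f;
    std::vector<float> out(hop * numGrains + grainSize, 0.0f);
    for (std::size_t g = 0; g < numGrains; ++g) {
        for (std::size_t i = 0; i < grainSize; ++i) {
            const std::size_t readIdx = (sourcePos + i) % source.size();
            // Hann window: fades each grain's edges to zero to avoid clicks.
            const float w =
                0.5f * (1.0f - std::cos(2.0f * kPi * i / (grainSize - 1)));
            out[g * hop + i] += source[readIdx] * w;
        }
    }
    return out;
}
```

Short grains and small hops blur the source into a continuous texture; longer grains keep it recognizable.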

Galaxyharp
Distance-sensor-based MIDI controller, built under the supervision of MIT Media Lab researcher Akito van Troyer. The distance sensors use sound reflection times to estimate the position of the hands, while a Teensy 4.0 drives LED-strip animations and sends MIDI messages over USB. The open-source code can be found here.
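The sensing-to-MIDI mapping described above works roughly like this: the sensor reports a round-trip echo time, half of which (times the speed of sound) gives the hand's distance, and that distance is quantized to a MIDI note number. The sketch below is a hypothetical illustration; the actual Teensy firmware is in the linked repository, and all names and ranges here are invented.

```cpp
#include <algorithm>
#include <cstdint>

// Convert a round-trip ultrasonic echo time (microseconds) to distance.
float echoToDistanceCm(float echoMicros) {
    const float kSoundCmPerMicro = 0.0343f;  // ~343 m/s at room temperature
    return echoMicros * kSoundCmPerMicro / 2.0f;  // halve: out and back
}

// Map a hand distance onto a MIDI note range (illustrative values:
// two octaves upward from C3 over the sensor's first 60 cm).
std::uint8_t distanceToNote(float distanceCm,
                            float maxCm = 60.0f,
                            std::uint8_t lowNote = 48,
                            std::uint8_t range = 24) {
    const float clamped = std::min(std::max(distanceCm, 0.0f), maxCm);
    return static_cast<std::uint8_t>(lowNote + clamped / maxCm * range);
}
```

On the real device, the resulting note would be wrapped in a USB-MIDI note-on message and sent to the host each time the quantized note changes.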

Firefly VST3, AU
A set of two plugins: an effect that simulates the random fluctuations of analog systems to bring sounds to life, and a sampler built around a similar approach. Both are open source, and the code can be found here.
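The "random fluctuations" idea can be sketched as a slow, band-limited random drift (lowpass-filtered noise) that gently nudges a parameter such as pitch or gain, imitating the instability of analog circuitry. This is a hedged illustration of the general technique, not Firefly's actual code; the class and parameters are invented.

```cpp
#include <random>

// Illustrative analog-style drift: white noise through a one-pole
// lowpass yields a slow wander, scaled into a multiplier near 1.0.
class AnalogDrift {
public:
    AnalogDrift(float depth, float smoothing, unsigned seed = 0)
        : depth_(depth), smoothing_(smoothing),
          rng_(seed), dist_(-1.0f, 1.0f) {}

    // Call once per sample (or per block); apply the result to a
    // parameter, e.g. playbackRate *= drift.next().
    float next() {
        // One-pole lowpass over uniform noise keeps state_ in [-1, 1].
        state_ += smoothing_ * (dist_(rng_) - state_);
        return 1.0f + depth_ * state_;
    }

private:
    float depth_;
    float smoothing_;
    float state_ = 0.0f;
    std::mt19937 rng_;
    std::uniform_real_distribution<float> dist_;
};
```

Small depth values (around 1%) and heavy smoothing give the subtle, slow wobble associated with tape and analog oscillators.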
Loosy
Fully coded this synthesizer for Microsoft's HoloLens 2. It attempts to solve some of the challenges that music applications face in mixed reality, such as the absence of physical references, the strict reliance on visual cues, and slight imperfections in hand-position detection. The interface was developed in Unreal Engine and the sound engine in the ChucK language; the two communicated via the OSC protocol.
The code and a demo can be found here.
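What travels between the two processes is an OSC message, typically carrying a control value such as a hand coordinate for the ChucK engine to pick up. As an illustration of the wire format (address padded to four bytes, a type-tag string, then big-endian arguments), here is a minimal encoder for a single-float message. The address is made up for the example, and real transport would go over UDP.

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Encode a one-float OSC message per the OSC 1.0 layout:
// padded address string + ",f" type tag + big-endian float32.
std::vector<std::uint8_t> encodeOscFloat(const std::string& address,
                                         float value) {
    std::vector<std::uint8_t> msg;
    auto padString = [&msg](const std::string& s) {
        for (char c : s) msg.push_back(static_cast<std::uint8_t>(c));
        // OSC strings are null-terminated and padded to a 4-byte boundary.
        do { msg.push_back(0); } while (msg.size() % 4 != 0);
    };
    padString(address);
    padString(",f");  // type tag: exactly one float argument
    std::uint32_t bits;
    std::memcpy(&bits, &value, sizeof bits);
    for (int shift = 24; shift >= 0; shift -= 8)  // big-endian byte order
        msg.push_back(static_cast<std::uint8_t>(bits >> shift));
    return msg;
}
```

A message like `encodeOscFloat("/hand/x", 0.5f)` would be sent over UDP by the Unreal side and matched by an OSC listener on the ChucK side.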

Thumb
Coded this art project in collaboration with the Berklee Network Orchestra; it was showcased at the DNA Festival at Berklee College of Music. The installation received a live feed from the social network X (formerly Twitter), ran sentiment analysis on the most relevant posts at the time, and used the resulting sentiment data to steer the generative music being played.
The code can be found here:
https://github.com/Natameme/BNO_DNA
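The mapping stage of such a pipeline can be sketched as follows: a sentiment score in [-1, 1] selects between scales and scales the tempo, so that negative posts darken and slow the generative music. The scale choice and tempo range below are illustrative assumptions, not taken from the linked repository.

```cpp
#include <cstdint>
#include <vector>

// Illustrative sentiment-to-music mapping for a generative installation.
struct MusicParams {
    std::vector<std::uint8_t> scale;  // pitch classes, semitones above C
    float bpm;
};

MusicParams sentimentToMusic(float sentiment) {
    static const std::vector<std::uint8_t> major = {0, 2, 4, 5, 7, 9, 11};
    static const std::vector<std::uint8_t> minor = {0, 2, 3, 5, 7, 8, 10};
    MusicParams p;
    // Non-negative sentiment -> major scale; negative -> natural minor.
    p.scale = (sentiment >= 0.0f) ? major : minor;
    // Linear tempo map: -1 -> 60 bpm, 0 -> 90 bpm, +1 -> 120 bpm.
    p.bpm = 90.0f + 30.0f * sentiment;
    return p;
}
```

A generator would then pick notes from `scale` at the pace set by `bpm`, updating both whenever a new batch of posts is analyzed.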
