The coding part was tricky. Once the fingers and palm were mapped from the camera feed using a machine-learning library, the hand effectively became an invisible mouse that could fire events in the browser (JavaScript). From there came the idea: a bass-drop sound and a guitar-riff sound triggered by different gestures, so that, like rock-paper-scissors, three gestures would each trigger a different sound. I succeeded in triggering the bass drop with the rock gesture. As a musician and a fan of DJ music, I think this technology could be adapted for raves and music festivals like Ultra Miami and Coachella!
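The gesture-to-sound mapping could be sketched roughly like this. This is a minimal sketch under assumptions: the per-finger "extended" flags would come from a hand-tracking library's landmarks (the original post doesn't name the library), the third gesture and all sound file names are placeholders, and in the browser the strings would be swapped for actual Audio playback.

```javascript
// Hypothetical sketch: classify a hand pose as rock/paper/scissors and
// map each gesture to a sound. The fingersExtended flags are assumed to
// be derived from a hand-tracking library's landmark output.

// fingersExtended: [thumb, index, middle, ring, pinky] booleans
function classifyGesture(fingersExtended) {
  const count = fingersExtended.filter(Boolean).length;
  if (count === 0) return "rock";   // closed fist
  if (count >= 4) return "paper";   // open palm
  const [, index, middle, ring, pinky] = fingersExtended;
  if (index && middle && !ring && !pinky) return "scissors"; // two fingers up
  return null;                      // no confident match, trigger nothing
}

// Placeholder sound names; in the browser these would be Audio objects.
const gestureToSound = {
  rock: "bass-drop.mp3",
  paper: "guitar-riff.mp3",
  scissors: "cymbal.mp3",
};

// Look up which sound a given hand pose should trigger.
function soundFor(fingersExtended) {
  const gesture = classifyGesture(fingersExtended);
  return gesture ? gestureToSound[gesture] : null;
}

console.log(soundFor([false, false, false, false, false])); // prints "bass-drop.mp3"
```

A real version would also need debouncing (so one held fist doesn't retrigger the bass drop on every camera frame), but the core idea is just this lookup from pose to sound.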