Based on previous observations and tests, the visualisation still needs more tuning, but it is working after some changes:
- Frame of reference for movement detection: xyz coordinate system from the Kinect skeleton tracker.
- An anchor point for measuring joint position and speed was defined, so it doesn't matter how tall or short the player is: measurements are normalised to the player's body lengths.
- The Y coordinates of the particles in the visual sketch are defined by the speed and position changes along the Y and Z axes of the Kinect coordinate system.
- The X coordinates are defined by the speed along the X axis of the Kinect coordinate system.
- Currently some variables are set randomly, but some of them will be controlled by other inputs from the sound feature extractor; for example, colour and size could be controlled by the pitch and length of a musical note. These variables are currently being read by the Quartz sketch but are not properly integrated yet.
- Rewards could also be represented by a particle trace that gets activated while the piece is performed correctly.
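
As a rough illustration of the mapping described above, here is a minimal sketch in Python. All names, the time step, and the exact way the Y and Z axes are combined are assumptions for illustration only; they are not the actual Quartz/Kinect code:

```python
def particle_coords(prev_pos, curr_pos, dt, body_length):
    """Map a tracked joint's motion between two Kinect frames to 2D
    particle coordinates (hypothetical mapping, for illustration).

    prev_pos, curr_pos: (x, y, z) joint positions from the skeleton tracker.
    dt: time between frames in seconds.
    body_length: anchor length used to normalise for player size, so a
                 tall and a short player produce comparable values.
    """
    # Per-axis speeds, normalised by the player's body length.
    vx = (curr_pos[0] - prev_pos[0]) / dt / body_length
    vy = (curr_pos[1] - prev_pos[1]) / dt / body_length
    vz = (curr_pos[2] - prev_pos[2]) / dt / body_length
    # Particle X follows speed along the Kinect X axis;
    # particle Y combines motion along the Kinect Y and Z axes
    # (a simple sum here, purely as a placeholder combination).
    return vx, vy + vz
```

For example, a joint moving 0.2 m along X and 0.1 m along both Y and Z in 0.1 s, with a body-length anchor of 1.0 m, yields particle coordinates (2.0, 2.0).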
Due to lack of space on Vimeo, click the following links to watch the videos:
Prototype testing 2: String changes exercise
Prototype testing 3: String changes exercise 2