As mentioned in the recent post about using the Kinect with TUIO, I’ve been working with the Microsoft device to create a method of classroom orchestration for teachers. With the Kinect now playing nice with our multi-touch framework, we’ve let a number of teachers loose to try it out. Some images of the work so far can be seen on the SynergyNet blog.
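To give a flavour of what "playing nice" with a TUIO-based multi-touch framework involves: TUIO cursor events use normalised [0, 1] screen coordinates, so tracked hand positions from the Kinect (metres, in sensor space) have to be mapped into that space before the framework can treat them like touches. The sketch below is purely illustrative — the function name, the calibrated interaction-area bounds, and the mapping itself are assumptions, not SynergyNet's actual bridge code.

```python
# Hypothetical sketch: mapping a Kinect hand-joint position (metres, in
# sensor space) to the normalised [0, 1] coordinates used by TUIO cursor
# events. The interaction-area bounds are illustrative defaults only.

def hand_to_tuio(x_m: float, y_m: float,
                 x_range=(-0.5, 0.5), y_range=(0.2, 1.2)):
    """Linearly map a hand position within a calibrated interaction
    area to TUIO's normalised coordinate space (0,0 = top-left)."""
    def norm(v, lo, hi):
        # Clamp to [0, 1] so hands outside the calibrated area
        # pin to the screen edge rather than going off-range.
        return min(1.0, max(0.0, (v - lo) / (hi - lo)))
    tx = norm(x_m, *x_range)
    ty = 1.0 - norm(y_m, *y_range)  # TUIO's y axis points downward
    return tx, ty
```

In a real bridge these coordinates would then be packed into `/tuio/2Dcur` messages over OSC, but the coordinate mapping is the part that makes the Kinect look like just another touch source to the framework.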
Using feedback from the system’s use with teachers and their classes, I’ve been able to implement incremental improvements, paving the way to a ubiquitous and intuitive method of controlling the multi-touch interfaces in our SynergyNet classroom. These improvements include:
- Support for multiple Kinects, with the ability to track a teacher’s movement between them.
- The ability for the Kinect to identify which interface a teacher points at.
- Improvements to the set of gestures generated by our guessability study, creating a more cohesive and intuitive lexicon.
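Identifying which interface a teacher points at can be sketched as a ray-cast: take the shoulder and hand joints from the Kinect skeleton, cast a ray from shoulder through hand, and test where it meets each tabletop's surface. The code below is a minimal illustration of that idea under made-up assumptions — the table definitions, coordinate conventions, and function name are mine, not the project's implementation.

```python
# Illustrative sketch (not SynergyNet's implementation): identify which
# tabletop a teacher points at by casting a ray from the shoulder joint
# through the hand joint and intersecting it with each table's surface.
# Coordinates are (x, y, z) in metres, y pointing up; tables are modelled
# as horizontal rectangles at a fixed height.

def pointed_table(shoulder, hand, tables):
    """tables: dict mapping name -> (surface_height,
    (x_min, x_max), (z_min, z_max)). Returns the name of the first
    table the pointing ray hits, or None."""
    sx, sy, sz = shoulder
    dx, dy, dz = (h - s for h, s in zip(hand, shoulder))
    for name, (height, (x0, x1), (z0, z1)) in tables.items():
        if abs(dy) < 1e-9:
            continue  # ray is parallel to the table plane
        t = (height - sy) / dy
        if t <= 0:
            continue  # table plane is behind the pointing direction
        px, pz = sx + t * dx, sz + t * dz  # intersection point
        if x0 <= px <= x1 and z0 <= pz <= z1:
            return name
    return None
```

A real system would also need per-Kinect calibration so that table rectangles and skeleton joints share one coordinate frame, which is presumably where tracking a teacher across multiple sensors comes in.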
I’m currently writing up our findings, though with several more data collection sessions planned it may be next year before anything is published. Hopefully we will have some videos of the system in full swing by the end of the year. In the meantime, I’m demoing our work at the Royal Society on November the 6th as part of the TEL stories event.