The SynergyNet project may be over, but its legacy lives on: the framework is still finding uses. One such use is at Bede's World, a museum in the North East of England, where SynergyNet now powers an app at the centre of an interactive exhibit.
In this previous post I mentioned that we at TEL in Durham had been running studies using the Kinect with SynergyNet. Though the data is still being analysed, I've decided to share some details of the system, how it works and what it can do, along with some of the initial findings.
We recently hosted a day of demos of SynergyNet in use in our lab at TEL-Durham. As a result, a number of articles about our work have appeared online, complete with professional pictures and videos. We also took the opportunity to make our own videos, which are available in this post on the SynergyNet blog.
As mentioned in the recent post about using the Kinect with TUIO, I've been working with the Microsoft device to create a method of classroom orchestration for teachers. With the Kinect now playing nicely with our multi-touch framework, we've let a number of teachers loose on it. Some images of the work so far can be seen on the SynergyNet blog.
We’ve recently been working on utilising the Microsoft Kinect with the SynergyNet project at TEL in Durham. I’m currently working on several publications concerning this work which I will post updates about when they’re finished. In the meantime I thought I would post a small application I wrote when I was getting to grips with the Kinect.
All current versions of SynergyNet, TEL's multi-touch framework for classrooms, have support for TUIO. However, anyone who has tried to use the software may have noticed that its TUIO support can be flaky at times. This has been fixed in a recent update to SynergyNet 2.5 and 3.
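For readers unfamiliar with TUIO: it is a protocol for describing touch and tangible-object events, carried over OSC (Open Sound Control). SynergyNet's own TUIO-handling code isn't reproduced here; purely as an illustration of what a TUIO cursor message looks like on the wire, below is a minimal, stdlib-only sketch of encoding and decoding a `/tuio/2Dcur` "set" message (the message names and argument layout follow the TUIO 1.1 specification; the helper function names are my own):

```python
import struct

def osc_pad_string(s: str) -> bytes:
    """Encode an OSC string: ASCII bytes, NUL-terminated, padded to a 4-byte boundary."""
    b = s.encode("ascii")
    return b + b"\x00" * (4 - len(b) % 4)

def encode_2dcur_set(session_id: int, x: float, y: float,
                     vx: float = 0.0, vy: float = 0.0, accel: float = 0.0) -> bytes:
    """Build a /tuio/2Dcur 'set' message: type tags ,sifffff
    (command string, session id, normalised x/y, x/y velocity, acceleration)."""
    msg = osc_pad_string("/tuio/2Dcur")        # OSC address pattern
    msg += osc_pad_string(",sifffff")          # OSC type tag string
    msg += osc_pad_string("set")               # TUIO command
    msg += struct.pack(">ifffff", session_id, x, y, vx, vy, accel)  # big-endian args
    return msg

def _read_osc_string(data: bytes, offset: int):
    """Read a NUL-terminated OSC string and skip its padding."""
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4                    # advance to next 4-byte boundary
    return s, offset

def decode_2dcur_set(data: bytes):
    """Parse the message back into (address, command, session id, x, y)."""
    addr, off = _read_osc_string(data, 0)
    _tags, off = _read_osc_string(data, off)
    cmd, off = _read_osc_string(data, off)
    session_id, x, y, _vx, _vy, _accel = struct.unpack_from(">ifffff", data, off)
    return addr, cmd, session_id, x, y
```

A real TUIO source wraps messages like this in OSC bundles (together with `alive` and `fseq` messages) and sends them over UDP, typically to port 3333; coordinates are normalised to the 0–1 range, which is why a client such as SynergyNet must scale them to its own display size.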