The SynergyNet project may be over but its legacy lives on, as the SynergyNet framework is still finding uses. One such use is at Bede’s World, a museum in the North East of England, where the framework now powers an app at the centre of an interactive exhibit.
In this previous post I mentioned that we at TEL in Durham had been running some studies using the Kinect with SynergyNet. Though data analysis of the results is still being carried out, I’ve decided to provide some details on the system, its workings and its capabilities, along with some of the initial findings.
We recently hosted a day of demos of SynergyNet in use in our lab at TEL-Durham. As a result, a number of articles about our work have appeared online, complete with professional pictures and videos. We also took the opportunity to make our own videos, which are available in this post on the SynergyNet blog.
As mentioned in the recent post about using the Kinect with TUIO, I’ve been working with the Microsoft device to create a method of classroom orchestration for teachers. With the Kinect now playing nicely with our multi-touch framework, we’ve let a number of teachers loose trying it out. Some images of the work so far can be seen on the SynergyNet blog.
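As a rough illustration of the kind of bridging involved (a sketch, not SynergyNet’s actual code), getting a Kinect hand position into the normalised 0–1 coordinate space that multi-touch cursors use might look like the function below; the interaction-zone bounds are hypothetical values I’ve picked for the example:

```python
def hand_to_touch_coords(hand_x, hand_y,
                         x_min=-0.6, x_max=0.6,
                         y_min=0.2, y_max=1.4):
    """Map a Kinect hand position (metres, skeleton space) into the
    normalised 0..1 range used for touch cursor coordinates.

    The interaction-zone bounds here are illustrative only, not the
    values SynergyNet uses.
    """
    nx = (hand_x - x_min) / (x_max - x_min)
    # Screen y grows downward, so flip the vertical axis.
    ny = 1.0 - (hand_y - y_min) / (y_max - y_min)
    # Clamp so a hand outside the zone pins to the display edge.
    return min(max(nx, 0.0), 1.0), min(max(ny, 0.0), 1.0)

print(hand_to_touch_coords(0.0, 0.8))  # centre of the zone → (0.5, 0.5)
```

From there, each tracked hand can be reported to the framework just like any other touch point.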
We’ve recently been utilising the Microsoft Kinect with the SynergyNet project at TEL in Durham. I’m currently working on several publications concerning this work and will post updates when they’re finished. In the meantime, I thought I’d post a small application I wrote while getting to grips with the Kinect.
Multi-touch technology is becoming more commonplace in both industrial and casual computing. With this rise in use, developers are producing a wider range of multi-touch interfaces, each with different elements in their design. One element that most have in common is the shape of the display.
I recently got rid of my iPhone and bought an HTC phone with Android (one of the best consumer decisions I’ve ever made). I decided to see if I could use the phone’s multi-touch input to produce TUIO. After some searching I found an app on the nuigroup forums called TUIO droid by TobiTobsen (available here).
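For anyone curious what an app like this actually puts on the wire: TUIO is layered on OSC over UDP (port 3333 by default), and each touch point is described by a `/tuio/2Dcur` "set" message carrying a session id, normalised x/y position, velocity and acceleration. Here’s a minimal sketch of encoding and decoding one such message with just the standard library — real senders wrap these in OSC bundles together with "alive" and "fseq" messages, and this is not TUIO droid’s or SynergyNet’s actual code:

```python
import struct

def osc_str(s):
    """Encode an OSC string: UTF-8 bytes, NUL-terminated, padded to 4 bytes."""
    b = s.encode("utf-8")
    return b + b"\x00" * (4 - len(b) % 4)

def tuio_cursor_set(session_id, x, y, vx=0.0, vy=0.0, accel=0.0):
    """Build a TUIO 1.1 /tuio/2Dcur 'set' message for one touch point."""
    return (osc_str("/tuio/2Dcur") + osc_str(",sifffff") + osc_str("set")
            + struct.pack(">ifffff", session_id, x, y, vx, vy, accel))

def parse_osc(data):
    """Minimal OSC message parser (int32 / float32 / string args only)."""
    def take_str(i):
        end = data.index(b"\x00", i)
        n = end - i
        return data[i:end].decode("utf-8"), i + n + (4 - n % 4)

    addr, i = take_str(0)
    tags, i = take_str(i)
    args = []
    for t in tags[1:]:  # skip the leading ','
        if t == "i":
            args.append(struct.unpack(">i", data[i:i + 4])[0]); i += 4
        elif t == "f":
            args.append(struct.unpack(">f", data[i:i + 4])[0]); i += 4
        elif t == "s":
            s, i = take_str(i)
            args.append(s)
    return addr, args

addr, args = parse_osc(tuio_cursor_set(12, 0.25, 0.5))
print(addr, args)  # /tuio/2Dcur ['set', 12, 0.25, 0.5, 0.0, 0.0, 0.0]
```

Any TUIO client listening on the receiving end (SynergyNet included) just decodes these cursor profiles and treats them as touch events.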
It’s been a while since anything was posted on this blog, so it’s time for an update. First of all, I’ll just point out that I’ve been working really hard on a thesis. I don’t want to go into any more detail about it until it’s accepted, so I haven’t posted any updates concerning it. Hopefully in the new year I’ll be able to make several posts detailing what it was about.
I’ve showcased the multi-touch plugin for Compiz before, but I thought I’d bring a new feature to light. Robert (whose website you can see here: http://highpointvantage.com/) has managed to make the multi-touch plugin compatible with Compiz’s firepaint plugin. The result is some amazing eye candy, as you can see below.