As part of my ongoing efforts to publish some of my older research, I have gotten round to producing a paper on a simple dynamic method for adapting content to different tabletop display shapes. The paper details how the technique works: first how it resolves the initial issues encountered when fitting content to a new display shape, then how it attempts to make the most of the new display. The paper also details a study assessing how the technique performs with several tabletop apps, and discusses where the technique could prove useful in the future.
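As a rough illustration of the fitting step, one simple approach is to scale the content uniformly until it fits within the new display's bounds. The sketch below assumes rectangular content being fitted to a circular tabletop; the function name and this particular strategy are my illustrative assumptions here, not necessarily the exact method from the paper.

```python
import math

def fit_scale(content_w, content_h, radius):
    # Illustrative assumption: largest uniform scale at which a centred
    # w x h content rectangle fits inside a circular display of the
    # given radius. The rectangle's half-diagonal (the distance from
    # its centre to a corner) must not exceed the radius.
    half_diag = math.hypot(content_w / 2, content_h / 2)
    return radius / half_diag
```

For example, 6×8 content on a circular display of radius 10 could be scaled up by a factor of 2 before its corners reach the display edge.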
I’ve recently taken the time to get a paper published on a technique I started developing as part of my final-year undergraduate studies several years ago. The technique uses a small number of user inputs on a touch screen, each corresponding to the known position of a landmark in the environment, to determine the position of an interface. The publication details a study investigating how much of an impact the user’s accuracy has on the technique.
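To give a flavour of how touches on known landmarks can pin down an interface's position, here is a minimal sketch assuming the two-touch case: two screen touches matched to two known landmark positions fully determine a 2D similarity transform (translation, rotation and scale), which can then map any screen point into the environment. The function names, and the choice of exactly two touches, are my assumptions for illustration rather than the paper's exact formulation.

```python
def estimate_transform(touch_a, touch_b, landmark_a, landmark_b):
    """Estimate the screen-to-world similarity transform from two
    touches and the known world positions of the landmarks touched.

    Points are (x, y) tuples. Returns (m, t) where m is a complex
    number encoding rotation and scale, and t is the translation.
    """
    p1, p2 = complex(*touch_a), complex(*touch_b)
    q1, q2 = complex(*landmark_a), complex(*landmark_b)
    # A single complex ratio captures both the scale and the rotation
    # between the touch pair and the landmark pair.
    m = (q2 - q1) / (p2 - p1)
    # Translation that carries the first touch onto its landmark.
    t = q1 - m * p1
    return m, t

def screen_to_world(point, m, t):
    # Apply the estimated transform to any screen-space point.
    w = m * complex(*point) + t
    return (w.real, w.imag)
```

With more than two touches, the same transform could instead be fitted by least squares, which is one way inaccurate touches would average out; the study mentioned above looks at how sensitive the result is to that input accuracy.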
Earlier this year I was involved in supporting a study which investigated how technology could support collaboration between primary-school-aged students in separate classrooms. The SynergyNet software framework was used as part of this study. This was the first time SynergyNet had been used in a study spanning multiple sites, and a few tweaks were required to get it working.
The SynergyNet project may be over, but its legacy lives on: the SynergyNet framework is still finding uses. One such use is at Bede’s World, a museum in the North East of England, where the framework now provides the app at the centre of an interactive exhibit.
We recently hosted a day of demos of SynergyNet in use in our lab at TEL-Durham. As a result, a number of articles about our work, featuring professional pictures and videos, have appeared online. We also took the opportunity to make our own videos, which are available in this post on the SynergyNet blog.
As mentioned in the recent post about using the Kinect with TUIO, I’ve been working with the Microsoft device to create a method of classroom orchestration for teachers. With the Kinect now playing nicely with our multi-touch framework, we’ve let a number of teachers loose trying it out. Some images of the work so far can be seen on the SynergyNet blog.