SynergyNet at Bede’s World

The SynergyNet project may be over, but its legacy lives on: the SynergyNet framework is still finding uses. One such use is at Bede's World, a museum in the North East of England, where the framework now powers an app at the centre of an interactive exhibit.

What’s happening?

The use of interactive displays in museums is becoming more commonplace. They are versatile and allow visitors to navigate large amounts of relevant media at their own pace. A growing share of these displays use multi-touch to encourage use by multiple visitors simultaneously, and existing research has repeatedly shown such displays to be effective at conveying information to groups of visitors.

However, little work has investigated how effective multi-touch displays are at collecting information from visitors. This is where SynergyNet comes in.  An app built on the SynergyNet framework has been produced which allows museum visitors to explore information and media relating to exhibits.  The app is designed for simultaneous interaction by multiple users through large multi-touch table interfaces, and it encourages visitors to talk to each other about the items on display.

The app also gives visitors the chance to save a recording of their conversation for future use.  The stored recordings can be easily moderated and added to the app's content for future visitors to listen to.  These recordings may then influence future discussion around the table interfaces, allowing continuous conversations about the exhibits to take place between many groups of visitors.

The app has been designed to be easy to configure, with tools that allow its content and appearance to be quickly tailored so that it can be used in various exhibits.  It has had its premiere at Bede's World's Banners of the North exhibit, where several medieval artefacts on loan from the British Museum are on display alongside a wealth of information about their significance and history.


The Setup

There are currently two multi-touch tables (Samsung SUR40s) in place at the Banners of the North exhibit. These tables support more than 50 simultaneous touches, which should let several visitors explore the content at once.  The tables are augmented with sets of speakers and high-quality microphones to improve playback and audio capture.


The App

The app is built on SynergyNet 3, making use of the framework's dynamic media loading features, and comes in two parts: the app itself and the configuration tool.

The app itself displays a large background image relating to an exhibit; this may be a map, a photo, an image of the items on display, and so on.  On this background appear Points of Interest.  When a user presses one of these Points of Interest, a 'spider-web' appears.  These 'spider-webs' are collections of related media (pieces of text, images, audio recordings, videos and 3D objects) connected by lines in the form of a web.  Visitors can move, rotate and scale content in these webs and the lines will update to keep the media items connected.  Content in these webs is collected from files stored in folders specified by the configuration tool.
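As a rough illustration of how content might be gathered from those folders, here is a minimal Java sketch; the class and method names (PointOfInterest, collectWebItems) and the file extensions checked are assumptions for illustration, not the framework's actual API.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.*;

// Hypothetical sketch: a Point of Interest referencing a folder of media
// items that form its 'spider-web'. Names and structure are illustrative.
public class PointOfInterest {

    private final String label;
    private final float x, y;          // position on the background image (0..1)
    private final Path contentFolder;  // folder specified by the configuration tool

    public PointOfInterest(String label, float x, float y, Path contentFolder) {
        this.label = label;
        this.x = x;
        this.y = y;
        this.contentFolder = contentFolder;
    }

    /** Collects the media files that make up this point's spider-web. */
    public List<Path> collectWebItems() throws IOException {
        List<Path> items = new ArrayList<>();
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(contentFolder)) {
            for (Path file : stream) {
                String name = file.getFileName().toString().toLowerCase();
                // Text, images, audio, video and 3D models, as described above.
                if (name.endsWith(".txt") || name.endsWith(".png") || name.endsWith(".jpg")
                        || name.endsWith(".wav") || name.endsWith(".mp4") || name.endsWith(".obj")) {
                    items.add(file);
                }
            }
        }
        return items;
    }

    public String getLabel() { return label; }
    public float getX() { return x; }
    public float getY() { return y; }
}
```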

These 'spider-webs' of content allow simultaneous but separate interaction by multiple visitors.  The webs contain items which link to other webs; as visitors follow these links the webs close and open accordingly.  The webs typically centre on specific items in the exhibit but can also be based on particular themes, events, people or periods of history.  The links between items, themes and other spider-web subjects occasionally unite visitors' paths of discovery where the content they are looking at overlaps with others'.

The 'spider-webs' also contain links to an audio recorder, with prompts encouraging visitors to record their thoughts on the content.  These recorders have simple record, stop and playback buttons.  When a visitor has made a recording they are prompted with the option to save it for use in research and for potential use as content in the app in the future.  If the visitor consents to their recording being used in research, the app stores the recording alongside several metrics which can be used for data analysis.  If the visitor also consents to the recording being used as content on the tables in the future, it is placed in a moderation queue which museum staff can review later to approve or reject its appearance in the appropriate 'spider-web'.  Visitor-contributed content has a slightly different appearance to content provided by the museum, allowing visitors to differentiate between recordings made by experts and those made by other visitors.
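A minimal sketch of how that consent and moderation flow could look in Java follows; the RecordingStore class, Consent enum and folder layout are hypothetical stand-ins, not the app's actual implementation.

```java
import java.io.IOException;
import java.nio.file.*;
import java.time.Instant;

// Hypothetical sketch of the consent/moderation flow described above.
public class RecordingStore {

    public enum Consent { NONE, RESEARCH_ONLY, RESEARCH_AND_CONTENT }

    private final Path researchFolder;
    private final Path moderationQueue;

    public RecordingStore(Path researchFolder, Path moderationQueue) {
        this.researchFolder = researchFolder;
        this.moderationQueue = moderationQueue;
    }

    /** Stores a finished recording according to the visitor's consent choice. */
    public void store(Path recording, Consent consent, String pointOfInterest) throws IOException {
        if (consent == Consent.NONE) {
            Files.deleteIfExists(recording);  // discard recordings made without consent
            return;
        }
        // Simple metrics saved alongside the audio for later analysis.
        String metrics = String.join("\n",
                "timestamp=" + Instant.now(),
                "pointOfInterest=" + pointOfInterest);
        Path dest = researchFolder.resolve(recording.getFileName());
        Files.copy(recording, dest, StandardCopyOption.REPLACE_EXISTING);
        Files.write(dest.resolveSibling(dest.getFileName() + ".meta"), metrics.getBytes());

        if (consent == Consent.RESEARCH_AND_CONTENT) {
            // Queue a copy for museum staff to approve or reject later.
            Files.copy(recording, moderationQueue.resolve(recording.getFileName()),
                    StandardCopyOption.REPLACE_EXISTING);
        }
    }
}
```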

The configuration tool is a straightforward Java app which allows users to change SynergyNet settings such as the display resolution and input device.  It also contains a content builder which lets users create workspaces containing different content and appearance settings.  The idea behind this is that museum staff can prepare separate workspaces for different exhibits on machines other than the interactive displays. The moderation queue is also part of this tool.
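To illustrate the idea of a workspace prepared away from the tables, here is a small sketch that writes one out as a plain properties file; the keys and the WorkspaceWriter class are illustrative assumptions, not the tool's real format.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.*;
import java.util.Properties;

// Illustrative only: one way a workspace definition could be saved to disk.
public class WorkspaceWriter {

    public static void saveWorkspace(Path file, String exhibitName,
                                     String backgroundImage, String contentFolder) throws IOException {
        Properties props = new Properties();
        props.setProperty("exhibit.name", exhibitName);
        props.setProperty("exhibit.background", backgroundImage);
        props.setProperty("exhibit.contentFolder", contentFolder);
        // Example SynergyNet-style display settings prepared on a staff machine.
        props.setProperty("display.width", "1920");
        props.setProperty("display.height", "1080");
        props.setProperty("input.device", "sur40");

        try (OutputStream out = Files.newOutputStream(file)) {
            props.store(out, "Workspace prepared with the configuration tool");
        }
    }

    public static void main(String[] args) throws IOException {
        saveWorkspace(Paths.get("banners-of-the-north.properties"),
                "Banners of the North", "background.png", "content/banners");
    }
}
```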


Research Goals

The app is designed to support various possible future research projects.  Currently it is being used to investigate how the design of multi-user interfaces in public spaces encourages contribution, specifically audio contributions.  This research looks for relationships between multi-user touch interaction and other forms of multi-user interaction with a system, such as contributions to shared audio recordings.  It also intends to find evidence supporting or refuting the claim that multi-user audio is richer (contains more useful and on-topic statements) than recordings made by contributors on their own.

The app also contains what are called 'filters'.  These are objects through which hidden Points of Interest can be seen. The app can contain various filters, each one revealing a specific selection of hidden Points of Interest.  The filters are intended to tie together Points of Interest under a common theme.  For example, each filter in the app could represent a historical age; visitors would set a filter to the age they are interested in, then explore the background looking for Points of Interest that would otherwise remain hidden.  Though not in use as part of the current Banners of the North exhibit, these filters may be deployed in the future to investigate whether adding an additional discovery task to navigation improves engagement.
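The filter idea could be sketched roughly as follows; the Filter and TaggedPoint classes and their theme tags are hypothetical stand-ins for however the app actually represents hidden Points of Interest.

```java
import java.util.*;
import java.util.stream.Collectors;

// Hypothetical sketch: a filter reveals only the hidden Points of Interest
// tagged with its theme (e.g. a historical age).
public class Filter {

    private final String theme;

    public Filter(String theme) {
        this.theme = theme;
    }

    /** Returns the hidden points that become visible through this filter. */
    public List<TaggedPoint> reveal(Collection<TaggedPoint> hiddenPoints) {
        return hiddenPoints.stream()
                .filter(p -> p.themes.contains(theme))
                .collect(Collectors.toList());
    }

    /** Minimal stand-in for a hidden Point of Interest carrying theme tags. */
    public static class TaggedPoint {
        final String label;
        final Set<String> themes;

        TaggedPoint(String label, String... themes) {
            this.label = label;
            this.themes = new HashSet<>(Arrays.asList(themes));
        }
    }
}
```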

A future update on this blog will discuss the approaches to any research carried out in more detail, along with any findings and resulting publications.


Challenges

In getting the app and its supporting technology ready for use by the public there have been several challenges to overcome, and some of the solutions have resulted in changes to the SynergyNet framework itself.

The biggest issue to resolve was the need to support a large number of media items at once.  The Banners of the North exhibit has resulted in hundreds of audio recordings, video clips and images needing to be loaded into the app. Originally this caused a lot of slow-down, making the app nearly unusable, so several changes were made to make better use of resources.  The first of these changes was to the third-party libraries used in the SynergyNet framework.  Several of these were updated to newer versions which allowed the application to run on 64-bit Java with a larger heap size (as a by-product of this change the framework is now also supported on Windows 8).  In addition, the framework was changed to load content on-the-fly rather than all at once on start-up, reducing the load further.  These changes resulted in a much more responsive, and as a result more intuitive, app.
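The on-the-fly loading change can be illustrated with a small sketch of a lazy media cache: items are only read from disk when a web first requests them, and can be released when the web closes. The LazyMediaCache class below is an illustration of the idea, not the framework's actual loader.

```java
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Sketch of on-the-fly loading: media is read from disk on first request
// rather than all at once at start-up.
public class LazyMediaCache<T> {

    private final Map<Path, T> loaded = new HashMap<>();
    private final Function<Path, T> loader;  // e.g. reads an image or audio clip

    public LazyMediaCache(Function<Path, T> loader) {
        this.loader = loader;
    }

    /** Loads the item on first request and reuses it afterwards. */
    public T get(Path file) {
        return loaded.computeIfAbsent(file, loader);
    }

    /** Frees an item once its web closes so the heap does not grow unbounded. */
    public void release(Path file) {
        loaded.remove(file);
    }
}
```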

The next major challenge was the introduction of 3D objects.  Previously only SynergyNet 2.5 could support 3D models, but as part of this project SynergyNet 3 has been updated to support .obj files and materials.  The .obj model type was chosen due to its wide use in 3D scanning software, specifically software that can utilise the Microsoft Kinect, which may be used in future exhibits.
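For a feel of what .obj support involves, here is a minimal, illustrative reader that pulls out vertices and triangular faces only; the framework's real loader also handles materials, and the model.obj filename is just an example.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.*;

// Minimal, illustrative .obj reader: geometric vertices and triangular faces only.
public class SimpleObjReader {

    public static void main(String[] args) throws IOException {
        List<float[]> vertices = new ArrayList<>();
        List<int[]> faces = new ArrayList<>();

        for (String line : Files.readAllLines(Paths.get("model.obj"))) {
            String[] parts = line.trim().split("\\s+");
            if (parts.length == 4 && parts[0].equals("v")) {
                // Geometric vertex: v x y z
                vertices.add(new float[] {
                        Float.parseFloat(parts[1]),
                        Float.parseFloat(parts[2]),
                        Float.parseFloat(parts[3]) });
            } else if (parts.length == 4 && parts[0].equals("f")) {
                // Triangular face: f v1 v2 v3 (indices may carry /vt/vn suffixes)
                faces.add(new int[] {
                        Integer.parseInt(parts[1].split("/")[0]),
                        Integer.parseInt(parts[2].split("/")[0]),
                        Integer.parseInt(parts[3].split("/")[0]) });
            }
        }
        System.out.println(vertices.size() + " vertices, " + faces.size() + " faces");
    }
}
```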

The app was also required to support PTM (Polynomial Texture Map) and RTI files, high-quality image formats used in archaeology, so PTM support was implemented in the app.  PTMs can be extremely high quality and require large amounts of resources; luckily, the changes made to support more media benefited the PTM support too.
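The core idea behind a PTM is that each pixel stores six coefficients of a biquadratic polynomial in the light direction, so the image can be interactively "re-lit". The sketch below evaluates that standard polynomial for one example pixel; it leaves out file parsing and colour handling, and does not reflect the app's actual PTM code.

```java
// Illustrative sketch of PTM relighting: per-pixel brightness recomputed
// for a chosen light direction (lu, lv).
public class PtmRelight {

    /** Standard PTM biquadratic: a0*lu^2 + a1*lv^2 + a2*lu*lv + a3*lu + a4*lv + a5 */
    static float luminance(float[] a, float lu, float lv) {
        return a[0] * lu * lu + a[1] * lv * lv + a[2] * lu * lv
                + a[3] * lu + a[4] * lv + a[5];
    }

    public static void main(String[] args) {
        float[] coefficients = {0.2f, 0.1f, 0.05f, 0.4f, 0.3f, 0.5f};  // example pixel
        // Moving the light across the surface changes the pixel's brightness,
        // which is what lets a viewer 'rake' light over an artefact.
        for (float lu = -1f; lu <= 1f; lu += 0.5f) {
            System.out.printf("lu=%+.1f -> L=%.3f%n", lu, luminance(coefficients, lu, 0.3f));
        }
    }
}
```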

An option was also introduced to the SynergyNet framework which allows apps to prevent multiple pieces of media playing simultaneously.  This was implemented because multiple sources of audio playing at once would likely be off-putting and confusing to visitors.
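A minimal sketch of what such an option could look like is shown below: a playback manager that stops whatever is currently playing before starting a new clip. The Playable interface and manager class are illustrative, not the framework's real API.

```java
// Sketch of a 'one item at a time' playback option.
public class ExclusivePlaybackManager {

    public interface Playable {
        void play();
        void stop();
    }

    private final boolean singlePlaybackOnly;
    private Playable current;

    public ExclusivePlaybackManager(boolean singlePlaybackOnly) {
        this.singlePlaybackOnly = singlePlaybackOnly;
    }

    public synchronized void play(Playable item) {
        if (singlePlaybackOnly && current != null && current != item) {
            current.stop();  // avoid several audio sources talking over each other
        }
        current = item;
        item.play();
    }

    public synchronized void stop(Playable item) {
        item.stop();
        if (current == item) {
            current = null;
        }
    }
}
```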

Other challenges focused on the hardware setup of the interactive tables used in the exhibit.  One of these related to sound levels in the museum environment.  It was discovered that the SUR40's audio card had issues with both playback and recording, so an external sound card was added to each table, in addition to the speakers and microphones.  This massively improved both playback and recording.

The calibration of the tables was a cause for concern throughout the design of the exhibit.  An attempt was made to use the touch tables in a previous exhibit, but the calibration made them unusable; this was due to them being placed in an environment with a lot of natural light.  For the Banners of the North exhibit the tables have been placed away from windows, which has had a positive effect on their calibration, resulting in very responsive and accurate input.


Special Thanks

I would like to thank everyone at Bede’s World and the British Museum for offering me the opportunity to develop this app and supporting this research.  I would like to specifically thank Kathy Cremin, Mike Benson, Diane Grey, Georgina Ashcroft and Craig Williams for their significant contributions.


If you’re interested in seeing the tables in action and getting hands on (literally) with the app then pay Bede’s World a visit.
