So now that we are well and truly on the road to delivery for phase 2 of Social Interpretation in July, I have been having fun with lots of data. At the beginning of April, six Social Interpretation kiosks and eight QR codes were installed in the Family in Wartime exhibition. We have been recording and tracking interaction with the devices for two months now, and it's nice to see some numbers. We have also been doing some observations and interviews in order to get some juicy qualitative data. But here are some of the numbers first, from month one in gallery.
One of the problems with R&D digital lifecycles and museum exhibition lifecycles is that they are completely different. The pace of technology change is misaligned with the fiscal, creation, development and installation cycles of museums.
In a climate in which new technology platforms emerge on a weekly basis, there is a dramatic mismatch between the cycle of technology and the long planning cycles that exist for most museum exhibitions. Social Interpretation is no exception. We came in very late to the build of the Family in Wartime exhibition, and it's fantastic that we could incorporate SI into it. It looks really good given the time and resource we had available. But that lack of time and resources does mean that a few issues are now cropping up.
This week I have been undertaking some visitor behaviour mapping in the main exhibition space up at IWM North. I had reservations about how well I would be able to complete visitor behaviour mapping, as the space is intentionally confrontational, with the aim of making visitors feel ill at ease. You get lost easily, and are never quite sure where the exit is, or if you have come from the right or the left. There is a chronology, but it isn't easy to follow, and visitors do get lost and distracted. The museum is really one large, wonkily shaped, dimly lit central room with small 'silos' focusing on particular themes. So it was fascinating to see how visitors interact with the space, what behaviours they display, and which objects interest them the most.
Hot on the heels of Claire's post about being agile and putting the visitor/user first, we wanted to share how the design of the kiosk interface is going – and the process we've gone through so far.
In essence, we've followed a pretty traditional design/development cycle: sketch, wireframe, pixel-design, refined design. BUT, at every phase we've tested, tested and tested again, with the results directly feeding into the next phase of design. This has been rapid, sometimes fraught, and often insightful. And here's a load of images showing exactly what each phase actually looked like.
These were the first wireframes. They’re paper-based and created very rapidly to illustrate how we thought the content/interaction should all fit together. Literally as soon as they were ready, Claire deployed her Artful Dodger skills, nabbed them, then tested them with visitors in-gallery.
Then we refined the design…
The Digital R&D Fund for Arts and Culture, funded by Nesta, AHRC and ACE, is funding two teams of researchers to investigate the SICE project. Our team consists of Gabriella Giannachi, Professor of Performance and New Media at the University of Exeter; Peter Tolmie, Senior Ethnographic Consultant for the Mixed Reality Lab at the University of Nottingham; Steve Benford, Professor of Collaborative Computing in the Department of Computer Science at the University of Nottingham; and Derek McAuley, Professor of Digital Economy in the School of Computer Science and Director of Horizon, an interdisciplinary research institute funded through the RCUK Digital Economy programme.