This is the blog for the University of Exeter’s JISC-funded Learning and Teaching Innovation project. The project is built around the innovative and creative use of Augmented Reality. This exciting new technology adds a layer of virtual information over the physical world, and enables mobile phone users (equipped with a suitable application) to interact with their surroundings, unlock a rich hidden curriculum and see the world in a different way.
The project is using Augmented Reality as a means of interpreting the rich biodiversity of the University of Exeter’s main campus. Smartphone users will be able to explore the campus and access scientific data as a layer of graphics and multimedia content superimposed on the living landscape viewed through the smartphone camera and viewing screen.
More information is available on our ‘About the Project’ page, with project outputs and resources compiled on our resources/outcomes page.
Recent conference events have provided many dissemination opportunities. We have presented at several events, including the International Blended Learning Conference, SOLSTICE, as well as various national JISC events.
We have been heartened by the warmth and enthusiasm shown by delegates towards the possibilities offered by this emerging technology. Our presentations have prompted a great many reflective points from participants, with many keen to use aspects of our forthcoming Augmented Reality toolkit to implement the technology in their own contexts.
A video from the April ELESIG Spring Symposium will be posted soon as part of our toolkit of Augmented Reality resources. In the meantime, here’s a photo from one of our sessions after our on-site ‘magician’ used AR to expose the internal organs of a willing volunteer!
Matt Jenner from UCL's Learning Technology Support Service as Augmented Reality Volunteer during our presentation at the International Blended Learning Conference (Image Credit: Danielle Hinton)
On Monday, we invited ~25 participants to our workshop: ‘Unlocking the Hidden Curriculum: An Augmented Reality Toolkit for Discovery-based Learning’.
The workshop was a hands-on practical day – touring the campus to showcase our biodiversity data in situ, and demonstrating the ease of AR creation when using a content management system.
Participants ranged from lecturers and e-learning professionals to communications staff at Wildlife Trusts, and we gathered some invaluable feedback on our system as well as participants’ reflections on the potential use of AR in education.
Thanks to everyone for your participation and reflections.
Doug Belshaw from JISC infoNet has also posted an excellent blog summary of the day, and a number of tweets from the event can be found under the #hiddencurriculum hashtag.
We have compiled video footage from the day into a short video…
We are delighted to announce that our project has been short-listed amongst the nominations for the Times Higher Education Leadership and Management Awards within the ICT Initiative of the Year category. This is a joint bid with the University’s Web Innovation Project, following their work with campus geo-location and an augmented reality navigational application.
The awards celebrate the leadership, management, financial and business skills within the UK higher education sector, and showcase the extraordinary innovation, teamwork and commercial acumen across UK higher education institutions.
This nomination follows on from the many tweets about our project earlier in the year, as well as the conference appearances we are making in the coming months. Together, this further demonstrates the interest and enthusiasm for Augmented Reality as a tool to inform high-quality learning and teaching in the future.
Times Higher Education Awards: Short-listed Nominee
[Bookings have now closed for this event]
Participants should have received programme and joining instructions by email.
Contact Steve Rose for more information.
We’ve just finished shooting for our short show-reel film. We anticipate this will be of value as a dissemination tool at our many forthcoming conferences and public events.
This 2-3 minute film will give an overall feel of how we have used Augmented Reality so far – we’ll be featuring a demonstration of the functions of our interim AR app, and focusing upon two species.
One particular challenge whilst filming was capturing a good quality screencast from a Samsung Galaxy Tab running Android 2.2. There are some applications available for this purpose; however, most require root access to the device (which wasn’t an option in our case, as we value the manufacturer’s warranty!).
The Shoot Me app offers non-root screencasts, albeit at a low resolution and frame rate. After evaluating the footage generated through this – and in true Blue Peter style – we created a bespoke tablet filming rig involving a cardboard box which has provided some good quality footage. Look out for our final edit in the coming days!
In the meantime, here are some shots from the film shoot to whet your appetites…
Here’s one we made earlier…!
Using the Tablet Screencast Box…
Filming in Progress
With valuable feedback and technical insights gained from the successful pre-pilot test activities of recent weeks, project work will now be focussed towards building an educational resource and investigating concepts of habitat interaction.
Seven significant habitats across the University’s campus have been identified, covering a variety of biotic and abiotic characteristics. These areas range from natural woodland and areas of heavy non-native planting to standing water and grassland/meadows.
Whilst species data for some habitats is rich, especially concerning habitat F (which has been the focus of pre-pilot work with students from the College of Life and Environmental Sciences), less is known about some of the other areas.
Data collection, as well as collation from existing sources, has already begun with further testing on a pilot application covering all seven habitat areas expected in the coming months.
Habitats to be Showcased using Augmented Reality
On Friday, project staff met with developers from the University of Exeter’s Web Innovation Project (WIP) to further share experiences of web and mobile technologies, including augmented reality and geolocation.
Meeting WIP project staff such as Darren Davies and Rich Osborne is always an inspiration: they are immersed in, and incredibly knowledgeable about, their field, offering many technological ideas for further development of the biodiversity resources. Indeed, this augmented reality in education project is very fortunate to be able to benefit from the Web Innovation initiative, as they have explored the implementation of a wide range of technologies across their 18-month project.
In particular, drawing on the WIP’s successful implementation of a purely navigational Layar application for the campus, we discussed possible solutions to the pre-pilot bugs and improvement suggestions that were mooted when we tested with students last week. For example, it was suggested that we could improve the user interface of our Layar pre-pilot by adding both a search bar within our layer and check boxes allowing users to select the points of interest most relevant to them.
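To illustrate the sort of search and check-box filtering discussed above – purely as a sketch, since the field names, categories and helper functions here are hypothetical and not part of the Layar platform itself – a layer’s server could narrow its points of interest before returning them:

```python
# Hypothetical points of interest; 'category' stands in for a check-box group.
pois = [
    {"title": "Quercus robur (English Oak)", "category": "trees"},
    {"title": "Sciurus carolinensis (Grey Squirrel)", "category": "mammals"},
    {"title": "Ilex aquifolium (Holly)", "category": "trees"},
]

def filter_pois(pois, selected_categories):
    """Keep only POIs whose category the user has ticked."""
    return [p for p in pois if p["category"] in selected_categories]

def search_pois(pois, query):
    """Case-insensitive substring search over POI titles."""
    q = query.lower()
    return [p for p in pois if q in p["title"].lower()]

# A user ticks only 'trees', then searches for holly:
trees = filter_pois(pois, {"trees"})
holly = search_pois(trees, "holly")
print([p["title"] for p in holly])
```

Combining the two – filtering first, then searching within the remaining POIs – mirrors how a search bar and check boxes would work together in the app.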
There were also some great ideas in relation to the possible web interface that we envisaged a couple of weeks ago, which would allow users to access campus biodiversity data irrespective of their geographical location. The WIP’s development of the 3D virtual campus – accessible through a browser-based Google Earth plug-in – is a great example of how technology can be used to aid understanding of, and interaction with, physical locations.
In terms of biodiversity, it is possible to envisage a similar set-up whereby users engage with campus habitats, and click on particular species of interest. As with some of the WIP work, development work could also generate a quick-response (QR) code to enable mobile device owners to access directions to the particular POI. In the meantime, a fully interactive version of the 3D campus is also available to explore.
3D University of Exeter Campus (Developed by the Web Innovation Project)
Over the past few days we’ve been impressed and overwhelmed by the level of interest, enthusiasm and imagination that the project has generated, particularly through the social media site Twitter.
Since the beginning of the week, the project has been the subject of many tweets and retweets, with contributions from some key thinkers in the technology sector – including many within the augmented reality industry itself, such as Claire Boonstra, co-founder of the Layar augmented reality platform. Many, many thanks to you all!
Indeed, further research has revealed that this project blog has been recommended to over 18,000 followers through Twitter this week. Following this, we also noticed a Dutch blog post linking our project to the potential for the technology in schools. Below are some sample tweets, with a full list also available.
Just a sample of the tweets we received this week!
Recognising the power of social media as a means to aid developing links within both the educational community and the wider augmented reality industry, we have created a Twitter feed to join our project YouTube channel. Please do follow us for the latest project updates: @uoebiodiversity.
We anticipate this will be a great platform to share experiences with technology and the pedagogy, as well as to disseminate our project outcomes and the augmented reality toolkit that we are developing.
Over the past 10 days we have begun testing our pre-pilot biodiversity application on-site, gaining some valuable feedback, bug reports and suggestions for improvement.
The UoE Biodiversity dataset (not currently publicly available) within Layar has been accessed by around 15 students from various academic disciplines using a variety of their own hardware devices – including the iPhone 3GS and iPhone 4 running iOS, HTC Hero and HTC Desire phones, as well as a Samsung Galaxy Tab.
Students testing the Layar Biodiversity app on Campus. Image credit: Emma Barker
General feedback has been very positive, and we have gained a valuable insight into how students have used the app to explore and engage with parts of the campus – some having never previously visited.
Most feedback and suggestions for improvement appear to be limited by the Layar platform itself, most notably:
- The option within Layar to locate a particular species/point of interest (POI) is tricky, as there is no search facility within specific ‘layars’ themselves. Indeed, the slow GPS refresh rate can also make locating a species using the proximity/radar indicators appear unreliable.
- It is a common scientific convention to set Latin species names in italics. This does not appear to be possible within the POI title in Layar, although names can of course be displayed in italics within the HTML web pages.
- There were also some problems with the display of ‘floaticons’: on occasion, default grey circles are displayed instead of the specified custom icons. We suspect this may be related to the beta Hoppola Augmentation content management system for Layar.
- There are inconsistencies in the way video was displayed, with some mp4 video files being unavailable. Further investigation suggests that Hoppola may prefer video files in the mobile-standard 3gp format.
- The default floaticon size may be too large, as several icons can hide a large part of the reality seen through the camera. This is particularly pertinent when users are within metres of more than one POI, as floaticons become larger and more prominent when close to a species.
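For readers curious about what lies behind a floaticon, here is a minimal sketch of how a POI might be assembled server-side. It follows the general shape of Layar’s getPOIs JSON responses, but the field names, URLs and coordinates below are illustrative assumptions rather than a copy of our actual layer:

```python
import json

def make_hotspot(poi_id, title, lat, lon, icon_url, info_url):
    """Build one Layar-style hotspot dict. Field names are illustrative,
    following the general shape of a getPOIs response."""
    return {
        "id": poi_id,
        "title": title,
        # Layar historically expected coordinates in microdegrees
        "lat": round(lat * 1_000_000),
        "lon": round(lon * 1_000_000),
        "imageURL": icon_url,  # the custom 'floaticon'
        "actions": [
            {"label": "More information", "uri": info_url},
        ],
    }

def get_pois(hotspots):
    """Wrap hotspots in a minimal getPOIs-style response envelope."""
    return {
        "layer": "uoebiodiversity",
        "errorCode": 0,
        "errorString": "ok",
        "hotspots": hotspots,
    }

oak = make_hotspot(1, "Quercus robur (English Oak)",
                   50.7371, -3.5351,
                   "http://example.org/icons/oak.png",
                   "http://example.org/species/oak.html")
print(json.dumps(get_pois([oak]), indent=2))
```

The ‘more information’ action is where our species web pages hang off each POI – and, as noted above, it is only within those HTML pages (not the POI title) that Latin names can be italicised.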
Despite these issues, students were impressed with the concept of the system, spending time exploring the survey area in detail. In particular, non-biology students were visibly engaged and interested by the current system – with students taking time to read the associated species ‘more information’ web pages in detail and share facts with the rest of the group.
Video footage of pre-pilot tests to be added soon to our YouTube channel.
With a pre-pilot underway showcasing the range of data collected by project students, we are beginning to conceptualise how data can be both accessed and presented in the best educational and pedagogic contexts.
For the former, we anticipate two main entry points into the biodiversity data. The first – and the main concern of this project – is access through augmented reality platforms such as Layar. At present this is compatible with mobile devices running the Android and iOS platforms, with further development rumoured for the Symbian and Bada platforms.
A short piece of research is underway within the College of Life and Environmental Sciences to determine current Android/iOS ownership trends amongst students, in the knowledge that this is a fast-moving sector with comparative market shares changing all the time, as discussed in detail within our Review of Available Augmented Reality Packages report.
As demonstrated through the current pre-pilot project, this is the current interface for mobile devices…
Equally, for those either a) without compatible mobile devices, or b) unable physically to access the survey area, an intuitive website could provide access through an interface such as Google Maps. The following mock-up demonstrates how such a system could appear to desktop/laptop web users, irrespective of their geographic location…
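One way such a web view could be fed with data – sketched here with made-up species records and a hypothetical helper, not our pilot code – is to export records as GeoJSON, a format that browser mapping tools (including Google Maps) can plot directly:

```python
import json

# Hypothetical sample records: (common name, Latin name, lat, lon, habitat)
species_records = [
    ("English Oak", "Quercus robur", 50.7371, -3.5351, "F"),
    ("Grey Squirrel", "Sciurus carolinensis", 50.7365, -3.5340, "B"),
]

def records_to_geojson(records):
    """Convert simple species tuples into a GeoJSON FeatureCollection
    suitable for plotting on a browser-based map."""
    features = []
    for common, latin, lat, lon, habitat in records:
        features.append({
            "type": "Feature",
            # GeoJSON uses [longitude, latitude] order
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": {
                "common_name": common,
                "latin_name": latin,
                "habitat": habitat,
            },
        })
    return {"type": "FeatureCollection", "features": features}

print(json.dumps(records_to_geojson(species_records), indent=2))
```

Because the same records would drive both the Layar layer and the web map, a single dataset could serve users on campus with smartphones and users exploring remotely from a desktop.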