Moving beyond the collection box: Connecting community with natural history collections via augmented reality
Immersive technologies such as augmented reality, virtual reality, and mixed reality are quickly gaining momentum as mass media tools deployed across industry and society. These technologies will undoubtedly affect how we interact with our world, and they have significant potential to transform our understanding of it.
Unlike virtual reality, augmented reality depends on the user staying connected to place and space to achieve the intended experience and impact. In the augmented reality ecosystem, information is superimposed on the physical world, creating an extra layer of interactivity as virtual objects and content appear within a spatial context.
Object-based pedagogy and constructivism have long been the foundation of learning in museum education. It stands to reason that current technologies can now serve as a virtual extension of the physical object, providing deeper learning opportunities. While the artifact or specimen will always be the “real” thing, there are real drawbacks and barriers that limit how one can engage with collections. Having worked in education and outreach at a university-based natural history collection for almost six years, I encountered these barriers frequently. The most obvious is the “hands off” rule imposed by the fragility of natural history specimens. Specimens may also be pressed or dried, as with botanical specimens, making it difficult to fully appreciate their true anatomy and morphology. Fossils often raise questions about how the organism looked in life.
To pursue the potential uses of augmented reality in natural history collections, a collaborative partnership was formed with iDigBio (www.iDigBio.org), co-developer and botanist Austin Mast at Florida State University, and ExplorMor Labs at Arizona State University to develop a prototype called Libraries of Life (www.libraries-of-life.org). The app has since served as a collaborative augmented reality platform for natural history museums and nonprofits involved in biodiversity and conservation education and outreach. The Florida Museum of Natural History, for example, used the platform in a public campaign on the endangered status of the Miami Blue butterfly via an augmented reality craft beer label distributed at restaurants in Disney’s Animal Kingdom theme park (https://www.youtube.com/watch?v=L0Op828foWw).
Users activate the augmented reality experience by scanning target images, which can appear on collection cards, floor graphics, stickers, and even tattoos at outreach events and exhibits, allowing the public to interact with the specimens. Target images can also be enlarged, allowing participants to engage with a larger-than-life praying mantis, walk around it, or take a picture standing next to it.
The initial prototype used photogrammetry to create 3-D models of specimens such as insects, plants, and fossils contributed by some 15 collection groups in the iDigBio network. The resulting models were stored in a cloud repository; on recognition of a target image, the specimen model and augmented reality scene would launch on the end user’s device. A 3-D repository that makes collections accessible to anyone with a mobile device will pave the way for new use cases as virtual and physical worlds become intertwined. This movement parallels the current massive digitization efforts of museums to output collections in both 2-D and 3-D digital formats. The resulting work is increasingly being shared on platforms such as Sketchfab (https://sketchfab.com/feed), creating online virtual museums.
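The flow described above, in which a scanned target image is recognized and the matching specimen model is fetched from a cloud repository, can be sketched in simplified form. Everything below is a hypothetical illustration, not the actual Libraries of Life implementation: the perceptual hashes, repository entries, URLs, and function names are all invented for this sketch, and a real system would use a computer-vision library rather than a hand-rolled hash lookup.

```python
# Hypothetical sketch of the target-image -> 3-D model lookup at the core of
# an AR launch flow: recognize a scanned target, then return the cloud-hosted
# model to render. Hashes and URLs are illustrative placeholders only.

MODEL_REPOSITORY = {
    # perceptual hash of a printed target image -> model metadata in the cloud
    "a3f0c19b": {"specimen": "praying mantis",
                 "model_url": "https://example.org/models/mantis.glb"},
    "7d42e8aa": {"specimen": "pressed fern",
                 "model_url": "https://example.org/models/fern.glb"},
}

def hamming_distance(h1: str, h2: str) -> int:
    """Bit-level distance between two equal-length hex perceptual hashes."""
    return bin(int(h1, 16) ^ int(h2, 16)).count("1")

def match_target(scanned_hash: str, max_distance: int = 4):
    """Return model metadata for the closest stored target image, or None
    if nothing is within max_distance bits (i.e., no target recognized)."""
    best = min(MODEL_REPOSITORY, key=lambda h: hamming_distance(h, scanned_hash))
    if hamming_distance(best, scanned_hash) <= max_distance:
        return MODEL_REPOSITORY[best]
    return None

# A slightly noisy re-scan of the mantis target still resolves correctly,
# while an unrelated hash matches nothing.
hit = match_target("a3f0c19f")   # one bit off the stored mantis hash
miss = match_target("00000000")  # no stored target nearby -> None
```

The tolerance threshold matters in practice: printed targets are scanned under varying lighting and angles, so recognition must accept near matches rather than exact ones.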
With the release of Apple’s ARKit and Google’s ARCore SDKs for developers, and mobile devices that have AR viewers built in, we will see the integration of this technology on a massive scale. AR-ready devices will allow museum visitors to easily engage with content in space and place, with or without target images or the need to download apps. The need to download an app is probably the biggest barrier to the use of augmented reality in museums today. If you are involved in the emerging-technology world, you already know that things don’t stay the same very long. The augmented reality ecosystem is about to shift again with WebAR. In the virtual museum of the near future, you will be able to search for a collection online, pull artifacts from the cloud, gather data, and engage with them in physical space, all through a web browser. With the emergence of haptic technologies and mixed reality, users will not only be able to view and interact with the object in front of them but also touch it. The cognitive and learning potential of technology that incorporates all the senses with objects in a spatial context will undoubtedly push the boundaries of how we learn and engage with collections.
As museums start to think outside the proverbial collection box, augmented reality and computer vision may well be the tools of choice for moving experiences beyond the object while enabling deeper engagement and accessibility. With this, new best practices and communities of practice will arise, along with the need to answer new questions.
As the landscape of reality computing unfolds, there is perhaps no better time to explore how these technologies might democratize the museum and move us beyond the collection box. This is a clear opportunity to change how our audiences understand and engage with collections and the natural world.