
Future Fauna


Vulpes Haliaeetus Capreolus - Fox, Eagle and Deer - Two variants

Future Fauna is an interactive art installation in Augmented Reality at The Swedish Museum of Natural History in Stockholm. Visitors interact with the taxidermy specimens in the showcases and see the long-since-deceased animals return to life.


The visitors can play with, feed and breed the animals. The virtual beings are decoupled from the laws of biology and can interbreed across species. Strange beings populate the exhibition rooms: owls with antlers, foxes with eagle wings, wolves with moose bodies.

What role do human perceptions of beauty play in our interaction with non-human beings? What happens to nature in virtual worlds outside human control? What cryptozoological potential comes with a reality freed from the implacable laws of nature?

Find the app for iOS here and for Android here.

You can use the app without visiting the museum. In the first info prompt you get an option to start with 9 animals. You breed the animals by cuddling with them.


A Roe Deer taxidermy comes alive in its showcase. Click for a walkthrough of the project.

The project enhances the traditional museum diorama with augmented reality, creating an augmented diorama. The static displays of taxidermy become interactive, and the visitors can play with and feed the animals. This explores the possibilities of using novel visualisation technology in a museum context. 

What happens when the visitors can interact with otherwise "dead" installations? AR opens up a deeper engagement with the exhibitions, and a deeper understanding of the objects on display.

Future Fauna is made by Untold Garden in collaboration with interaction designer Sandra Dang and The Swedish Museum of Natural History. The project is funded by Kulturbryggan and Stockholms stad.

The project has been used by schools visiting the museum to teach evolutionary processes. A fictionalised text version of the project was featured in NORK magazine #4.


We conducted a number of user tests at the museum before release, especially with kids, since they are the museum's primary audience.

The 3D models are created by scanning specimens in the archive of the museum.

The basement of the museum is filled with taxidermy animals, a large dusty archive of life. Beautiful African antelopes and towering bears, as well as ragged old cats and a reindeer broken in half. Nothing is thrown away in case it contains valuable genetic material, so that in the future we can recreate the beings we are now driving to extinction.


Creating a photogrammetry 3D model

The mixed animals are produced by a genetic algorithm that randomly selects genes from the two parents and assembles a new being from the corresponding limbs. The users choose which animals to breed, but cannot choose how they are combined.
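The crossover described above can be sketched roughly as follows. This is a minimal illustration in Python, not the project's actual code; the list of body parts and the dictionary layout are assumptions made for the example:

```python
import random

# Hypothetical body-part "genes"; the real project combines 3D model limbs.
PARTS = ["head", "body", "front_legs", "hind_legs", "tail"]

def breed(parent_a: dict, parent_b: dict) -> dict:
    """Create an offspring by randomly taking each part from one parent.

    The user picks the two parents; the combination itself is random,
    mirroring the behaviour described in the text.
    """
    return {part: random.choice([parent_a, parent_b])[part] for part in PARTS}

fox = {p: f"fox_{p}" for p in PARTS}
eagle = {p: f"eagle_{p}" for p in PARTS}

offspring = breed(fox, eagle)  # e.g. a fox head on an eagle body
```

Each run produces a different mix, which is what makes the menagerie of hybrids in the exhibition unpredictable.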

The initial aim was to use a neural network for this process, in collaboration with software engineer Hiroharu Kato, creator of the world's first Neural Renderer. Unfortunately, for several reasons this method was not viable, and we had to fall back on a simpler one. This is an initial test of a Tiger reimagined by Hiroharu's neural network.


AR Technicals - The animals follow a NavMesh based on the museum's floor plan. This forces the animals to avoid the same obstacles as you and creates the illusion of a shared space. The NavMesh is re-instantiated at each showcase, which solves drifting and misalignment, creating a seamless experience.

The exhibition has 9 active showcases, so the virtual world is made up of 9 separate virtual spaces, each instantiated from its own origin and aligned with the physical room. When one virtual space is swapped for another, the animals are translated to the corresponding point in the new space.
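The translation between spaces amounts to simple vector arithmetic: an animal's position relative to the old origin is converted into shared floor-plan coordinates, then expressed relative to the new origin. A minimal Python sketch of that idea, with names and a 2D layout assumed for illustration (the app itself is built differently):

```python
from dataclasses import dataclass

@dataclass
class Vec2:
    """A point on the museum's 2D floor plan."""
    x: float
    y: float

    def __add__(self, other: "Vec2") -> "Vec2":
        return Vec2(self.x + other.x, self.y + other.y)

    def __sub__(self, other: "Vec2") -> "Vec2":
        return Vec2(self.x - other.x, self.y - other.y)

def translate_between_spaces(local_pos: Vec2,
                             old_origin: Vec2,
                             new_origin: Vec2) -> Vec2:
    """Carry an animal from one showcase's virtual space to another.

    Each space's origin is anchored to the real room, so going via the
    shared floor-plan coordinates keeps the animal at the same physical
    spot when the spaces are swapped.
    """
    world = old_origin + local_pos   # local -> shared floor-plan coords
    return world - new_origin        # shared -> the new space's local coords
```

Because every space is re-anchored at its own showcase, any tracking drift accumulated in one space does not carry over into the next.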
