With the help of artificial intelligence, dead animals at Cambridge University’s Museum of Zoology have been given a voice so they can share their stories and experiences.
The project started on October 15th and ends tomorrow. It allows more than a dozen exhibits to “speak” to visitors, including an American cockroach, the remains of a dodo, a stuffed red panda, and a fin whale skeleton.
The long-dead creatures and models will each have their own personality and accent, and will be able to speak aloud or through text on visitors’ mobile phones.
The AI technology provides the animals with the ability to describe what their lives were like on Earth and the challenges they faced in an effort to foster a deeper understanding of the biodiversity crisis.
“Museums are using AI in a lot of different ways, but we think this is the first application where we’re speaking from the object’s point of view,” Jack Ashby, the museum’s assistant director, said.
“Part of the experiment is to see whether, by giving these animals their own voices, people think differently about them. Can we change the public perception of a cockroach by giving it a voice?”
The project was developed by Nature Perspectives, a company with a mission to build AI models that can cultivate profound connections between people and the natural world.
For each exhibit, the AI is programmed with specific details on the specimen’s habitat, natural environment, and how it arrived at the museum and became part of the collection.
The exhibits can change their tone and language depending on the age of the person they are speaking to, making the experience accessible to a wider audience. The animals can converse in more than 20 languages, including Spanish and Japanese.
For instance, the red panda has a subtle Himalayan accent, the mallard sounds British, and the platypus speaks with an Australian twang.
It is hoped that museum visitors will learn more than what fits on the standard labels that accompany the specimens.
The conversations that visitors have with the specimens will be analyzed as part of the project. The analyses will help reveal what type of information people are looking for.
The AI suggests several questions to ask, such as “Tell me about life in the open ocean” for the fin whale, but visitors are encouraged to ask whatever they want, which can result in more personalized, stimulating, and memorable interactions.
“When you talk to these animals, they really come across as personalities; it’s a very strange experience,” said Ashby.
“I started by asking things like ‘Where did you live?’ and ‘How did you die?’ but ended up with far more human questions.”
Hopefully, these talking animals will inspire more empathy and concern for biodiversity loss and environmental conservation.