Augmented Dartmouth lets users explore some of the College's masterpieces.
Why do the figures in the Anglo-American panel of the Orozco murals look like zombies? What's going on in the landscape behind the throne in Perugino's painting of the Virgin and Child with Saints?
A new mobile app—developed by Associate Professor of Russian Mikhail Gronas, Associate Professors of Art History Mary Coffey and Nicola Camerlenghi, and their students—uses augmented reality technology to let viewers discover these and other hidden details in two of Dartmouth's most prized works of art.
A beta version of the app, called Augmented Dartmouth, will be available for download from the App Store and Google Play in time for homecoming this weekend, says Gronas. This version will provide an augmented experience to viewers of José Clemente Orozco's Epic of American Civilization murals in Dartmouth Library's Baker-Berry Library as well as Perugino's masterpiece, now on view in the Hood Museum of Art.
"The analogy was a footnote—these are basically notes for an eye," Gronas says. "I thought it would be great to allow paintings or murals or architectural objects to talk back to us, the way books can do."
The app works a little bit like Pokémon Go—when users aim their phones at the art, the app shows points on the work that have been digitally augmented. Tap one of those points—say, on the white-robed figure of Quetzalcoatl in the Orozco murals, or on one of the saints in the Perugino—and you'll open a window with more information.
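The project's code isn't described here, but the interaction Gronas describes maps onto standard mobile image-tracking AR. The sketch below, written in Swift against Apple's ARKit, is only a rough illustration of the idea—recognize a reference photograph of the artwork, pin annotation markers to it, and open a note when a marker is tapped—not the Augmented Dartmouth or Eyenotes implementation; the asset group name and annotation identifier are assumptions.

```swift
import UIKit
import ARKit
import SceneKit

// Minimal sketch of image-anchored annotations, assuming ARKit on iOS.
// Illustrative only; this is not the Augmented Dartmouth/Eyenotes code.
class ArtworkViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
        sceneView.addGestureRecognizer(
            UITapGestureRecognizer(target: self, action: #selector(handleTap(_:))))
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Track calibrated reference photos of the artworks.
        // "ArtworkImages" is a hypothetical asset-catalog group name.
        let config = ARImageTrackingConfiguration()
        config.trackingImages = ARReferenceImage.referenceImages(
            inGroupNamed: "ArtworkImages", bundle: nil) ?? []
        sceneView.session.run(config)
    }

    // When a known artwork is recognized, pin a small marker node to it
    // at a spot that has been digitally annotated.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARImageAnchor else { return }
        let marker = SCNNode(geometry: SCNSphere(radius: 0.01))
        marker.name = "quetzalcoatl-note"   // hypothetical annotation ID
        node.addChildNode(marker)
    }

    // Tapping a marker would open the corresponding explanatory window.
    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        if let hit = sceneView.hitTest(point, options: nil).first,
           let noteID = hit.node.name {
            print("Show annotation for \(noteID)")  // real app: present a detail view
        }
    }
}
```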
Eventually, says Gronas, these explanatory windows might include reference images and videos, all linked to specific details in the art. The team ultimately hopes to expand the project to include other works of art on campus—and to provide the open-source software, known as Eyenotes, to other museums and institutions to use with their own collections.
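The shape of those explanatory notes is not publicly documented; purely as an illustration, one annotation record in a system like this might look like the hypothetical Swift structure below. Every field name here is an assumption, not the Eyenotes format.

```swift
import Foundation

// Hypothetical shape for a single "eyenote"; field names are illustrative
// assumptions, not the actual Eyenotes data format.
struct Annotation: Codable {
    let id: String            // e.g. "orozco-quetzalcoatl" (made-up identifier)
    let artworkID: String     // which mural panel or painting it belongs to
    let position: [Double]    // normalized x/y location on the work's surface
    let title: String
    let body: String          // explanatory text, like the students' drafts
    let imageURLs: [URL]      // optional reference images
    let videoURLs: [URL]      // optional video clips
}
```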
The idea grew out of Gronas' earlier experience collaborating on an augmented reality app that let users register real-time approval for political candidates during the 2016 presidential debates.
"We'd been developing technology that allows us to 'glue' a piece of augmented reality to part of a visual image, and I was thinking about ways to use this in educational and cultural contexts," says Gronas, a specialist in Russian literature who calls app development a "side interest" to his scholarly pursuits. "One object I thought about was the Orozco murals."
When Gronas first approached Coffey—an expert on Mexican muralism who frequently leads tours of the murals—she was a little skeptical.
"When Mikhail first came to me, I didn't know much about augmented reality," she says. "I found examples of how some museums use it, but they were all super-gimmicky—like, make the Mona Lisa dance, or put yourself in with the dinosaurs. I wasn't interested in that. But I did see a few examples that helped me understand the potential."
Coffey says she's excited about the potential of the Eyenotes technology to allow data about artworks such as murals to be crowdsourced. "Murals are everywhere, but they're poorly documented, and attempts to create resources to document them often fall apart because of the scale of the project," she says. "An Eyenotes wiki-like tool that people could crowdsource has the potential to create the kind of information that we have been wanting all these years for public art and murals. It's something that I think would greatly interest my colleagues in Mexico."
Camerlenghi was interested in the potential of augmented reality to help annotate virtual-reality projects—including his three-dimensional virtual-reality model of the Basilica of St. Paul's in Rome, an ancient church that burned to the ground in the 19th century.
"The model was instrumental for publishing a book and other scholarly endeavors, but I realized that what I had created was not very user friendly if I wasn't there to tell people what they were looking at and how I arrived at my conclusions," Camerlenghi says. "I want to show other scholars my thinking, and I want students and teachers to use this as a tool to understand what the space was like. That means annotations. So I was imaging some kind of augmented reality to the virtual reality—is there a word for that?"
"Yes—fused reality," Gronas says.
"So in comes Mikhail with this proposal, and I saw the potential application for virtual reality," Camerlenghi says. Thanks to funding from the Kress Foundation, this fused version of Eyenotes will be the project's next stage of development, once Augmented Dartmouth is launched.
In the spring term, students in Coffey's "Mexican Muralism" class drafted the first texts for the app, based on their research on the Orozco mural. One of these students, Grace Hanselman '20, stayed on campus over the summer to continue working on the project. Natalie Shteiman '21 helped test-drive and proofread the application prior to launch. Other contributors to the Orozco content include Yazmin Ochoa Flores '21, Jennifer Lopez '20, Alicia Massey '22, Jhon Ortiz '20, and Karla Rosas '20. A presidential scholar, Courtney McKee '21, helped Camerlenghi develop content for the Perugino part of the app.
"Working on the content has provoked for me a whole bunch of bigger conceptual questions," Coffey says. "When you're thinking about a book, you're always thinking about discrete chapters and some kind of through-line argument that runs across the text. When you're doing a guided tour, you think about how to move people through the space. This was a new challenge, and one I feel like we're still working on. We've had to think about where people get information about the whole panel before we elaborate on certain details. For me that's where the challenge lies, and I'll be interested to see how people use it."
Support for the project has come from the 250th Anniversary Committee, the Hood Museum of Art, the Leslie Center for the Humanities, the Dartmouth Center for the Advancement of Learning, and the Kress Foundation.
The Augmented Dartmouth app is available for download at the Apple App Store and Google Play.
Hannah Silverstein can be reached at hannah.silverstein@dartmouth.edu.