At the Fork is a documentary about humane meat & animal farming. I designed a 360-degree virtual reality app as part of the film’s marketing & impact campaign.
UX Research & Design
United States of Animals (the filmmakers), Whole Foods Market & the Humane Society of the United States
At the Fork is a documentary about humane animal farming. The filmmakers asked us to create a virtual tour of farms from the film, in order to increase shoppers' interest in and understanding of humane animal products.
We worked with them over several weeks at the start of the project to determine its key goals and to settle on the right approach and tone.
Desk research, interview notes, landscape analyses, stakeholder analyses + subject maps from early strategy sessions. What would "success" look like for this app, for all stakeholders?
The filmmakers wanted to show a variety of real conditions affecting five kinds of farm animals: egg-laying hens, broiler chickens, dairy cows, pigs, and cattle. Each animal had a range of possible improvements to its quality of life: more space, more freedom to express instinctive behaviors, different feeding strategies, and so on. But every variable we wanted to explore meant another video to organize into a larger content structure.
Our challenge was to give users a way to experience everything without forcing too many choices. At the same time, key stakeholders needed users to be able to explore environments modularly, especially for in-store experiences, where shoppers would be pressed for time.
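The structure we landed on can be sketched as a simple content model: one guided, narrated tour plus modular per-animal "deep dive" experiences. All of the names and fields below are illustrative assumptions for the sketch, not the app's actual schema.

```typescript
// Illustrative content model: a primary narrated tour, plus modular
// per-animal deep dives a user can jump into independently.
type Animal = "laying-hens" | "broiler-chickens" | "dairy-cows" | "pigs" | "cattle";

interface Clip {
  title: string;
  videoUrl: string;      // 360-degree video asset
  variables: string[];   // welfare variables shown, e.g. "more space"
}

interface DeepDive {
  animal: Animal;
  summary: string;       // shown before the user "commits" to the experience
  clips: Clip[];
}

interface AppContent {
  guidedTour: Clip[];    // the primary narrated experience
  deepDives: DeepDive[]; // modular "deeper dive" content
}

// Example: look up the deep dive for a given animal.
function deepDiveFor(content: AppContent, animal: Animal): DeepDive | undefined {
  return content.deepDives.find(d => d.animal === animal);
}
```

Keeping the modular experiences flat (one per animal) rather than nested is what made "everything is a tap or swipe away" possible.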
A low-fidelity prototype of the home screen. On opening the app, users would be presented a primary, narrated experience that would guide them through a variety of environments.
Below that, quick access to modular, "deeper dive" content — where users could select a particular animal to learn about in depth — was just a tap or swipe away.
A low-fidelity prototype of the "Explore" screen (a selection of modular experiences). We wanted to allow users to see details about an experience they were interested in before "committing" to it, without having to go "back" to explore other options.
Once users were in VR, we didn't want to force them out just to navigate to the next section, but we weren't sure whether in-VR navigation would behave the same across devices. We decided to test in-VR navigation on real devices, and identified major usability and consistency challenges in cross-platform navigation (below).
Because we started with these tests, we were able to iterate and pivot early on in design, and create affordances that worked across all devices.
Initially, we thought that conversational patterns like nodding in response to questions like "Continue?" would be platform-agnostic and intuitive for new users.
This worked really well on hands-free headsets: users nodded, the device's accelerometer registered the motion, and test subjects found it delightful.
But on handheld viewers (like Google Cardboard) it was a disaster: users nodded their heads while holding their hands (and the device) still. The device couldn't tell they were nodding, and with no feedback they grew frustrated and confused: "Why is nothing happening? Did I nod wrong? Should I nod harder? Did I break it?"
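The mechanism behind both outcomes is the same: a nod only registers if the motion sensors actually move with the head. A minimal sketch, assuming the device exposes a head-pitch reading in degrees each frame (class name and thresholds are my own, not the app's):

```typescript
// Detects a nod as two direction reversals in head pitch (down-up-down).
// On a hands-free headset the sensors ride on the head, so pitch oscillates;
// on a handheld viewer held still, pitch never changes and this never fires.
class NodDetector {
  private lastPitch: number | null = null;
  private direction = 0;   // +1 pitching down, -1 pitching up
  private reversals = 0;   // direction changes seen so far

  constructor(private readonly minDeltaDeg = 2) {}

  /** Feed one pitch sample; returns true when a full nod is observed. */
  sample(pitchDeg: number): boolean {
    if (this.lastPitch === null) {
      this.lastPitch = pitchDeg;
      return false;
    }
    const delta = pitchDeg - this.lastPitch;
    if (Math.abs(delta) < this.minDeltaDeg) return false; // ignore sensor jitter
    this.lastPitch = pitchDeg;
    const dir = delta > 0 ? 1 : -1;
    if (this.direction !== 0 && dir !== this.direction) this.reversals++;
    this.direction = dir;
    if (this.reversals >= 2) {
      this.reversals = 0;
      this.direction = 0;
      return true; // a complete nod gesture
    }
    return false;
  }
}
```

Note that the failure mode is silent: a still device simply produces no samples above the jitter threshold, which is exactly why users got no feedback at all.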
The "long stare" was easier to understand across more devices: users pointed their field of vision at a particular target for a set length of time to enter that experience. It took slightly longer to navigate, but it worked consistently on every device we tested, and it gave us a natural place to show progress feedback.
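Gaze-dwell selection like this is easy to reason about because it only needs head orientation, which every VR viewer reports. A minimal sketch, assuming the app can tell us which target (if any) sits under the view-center crosshair each frame; names and timings are illustrative:

```typescript
// "Long stare" (gaze-dwell) selection: a target fires once the user's gaze
// has rested on it continuously for dwellMs milliseconds.
class DwellSelector {
  private focused: string | null = null; // target currently under the crosshair
  private elapsedMs = 0;                 // how long the gaze has rested on it

  constructor(private readonly dwellMs = 2000) {}

  /**
   * Call once per frame with the id of the target under the crosshair
   * (or null) and the frame time in ms. Returns the target id the moment
   * its dwell threshold is reached, otherwise null.
   */
  update(targetId: string | null, dtMs: number): string | null {
    if (targetId !== this.focused) {
      // Gaze moved to a new target (or away): restart the timer.
      this.focused = targetId;
      this.elapsedMs = 0;
      return null;
    }
    if (targetId === null) return null;
    this.elapsedMs += dtMs;
    if (this.elapsedMs >= this.dwellMs) {
      this.elapsedMs = 0; // fire once, then require a fresh dwell
      this.focused = null;
      return targetId;
    }
    return null;
  }

  /** Progress in [0, 1], e.g. for driving a fill-ring affordance. */
  progress(): number {
    return this.focused === null ? 0 : Math.min(1, this.elapsedMs / this.dwellMs);
  }
}
```

Because `progress()` exposes how far along the dwell is, the UI can render a filling ring around the target, giving users continuous feedback that the nod gesture never could.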
The filmmakers had done some early interviews about the content of the film (and the VR footage), and surfaced an unexpected user concern that we might not have known to look for otherwise: anxiety about the material.
Oh, uh, I don't know. I don't think I want to see animals in situations like that.
What group are you with? Is this one of those undercover videos?
Quotes from test users & audience members approached by the filmmakers.
Around that time, a few undercover videos of farm animals being abused had gone viral. Users were worried that we'd strap a device to their face and trap them in footage that was painful to watch even on a flat screen.
To be clear: our footage was not like that at all. But it wasn't an unreasonable concern, and it meant that our final design challenge would be to build trust and put our users at ease, before they even downloaded the app.
Technology: Frank Flemming
Creative Direction & Script: Marshall Walker Lee
VR Footage: Canyon Darcy & Ryan Hunts
I started designing and building websites and games in 2006, but as my dad would tell you, I started playing with (and occasionally breaking) computers (usually his) long before that. After studying biology and literature in college, I returned to product design with a passion for research methodologies, human-centered technology, and an itch to make things.
My brain likes making connections between different perspectives; I’m happiest when I can be the “glue” in a project or team.