
At the Fork VR
A virtual tour of American farms & humane husbandry

In 2015, I collaborated on the UX/UI of At the Fork VR, a companion app to the documentary film At the Fork. Despite an aggressive timeline, our team was committed to testing, and as a junior designer, I had the opportunity to learn a lot about the challenges of designing software to fit within a physical context.

Role: 
UX/UI Designer

Client:
Whole Foods Market & HSUS

Timeframe:
2014 - 2015

the brief

The project had two goals: educate shoppers about animal welfare, and encourage users to watch the documentary.

Our challenge was that most people would be using the app in a distracting environment: we planned to show it in grocery store pop-ups and at film festivals. We needed to make the app delightful, engaging, and extremely easy to use.

Goals
- Promote the film
- Educate users about animal welfare

Challenges
- busy/distracting environments


virtual reality: show, don't tell

For shoppers who want to buy humane products, grocery stores are a maze of confusing labels. In our research, we found that lots of people were already trying to explain the differences, but their clinical descriptions left us feeling more confused. How big of a difference is 0.3 square feet of indoor space? What exactly is a "foraging area"? How much darkness do chickens like having?

We wanted to avoid yet another explanation of labels our users wouldn't recognize with jargon they didn't understand. Instead, we wanted to show them what humane animal farming actually looked like.

How might we: show users what
labels mean, without overwhelming them
with industry jargon?


One of many "what do all these labels mean?!" articles.

tone and
transparency

Animal welfare is an emotional topic, and seeing animals in conventional farming environments can be uncomfortable. We wanted to create a positive experience, but we didn't want to sweep less-ideal practices under the rug, either. 

We needed to balance being transparent about the bad stuff, while also empowering users to feel good about purchases from farms like this GAP-5 rated chicken farm (right).


Using a spinny chair,
like a pro

prototype,
test, repeat

Because we were on a tight timeline, and lots of crucial components were still in flux — the film's poster and visual language, the name of the app, whether or not we'd have a celebrity voiceover — we opted for a loosely agile approach to design and development.

We broke features and interactions down into modular components, built prototypes, and tested them internally. Below are a handful of our prototypes and key learnings.

"Err.. I don't know. I guess just
show me everything?"

the split home screen

The app needed to serve dual use cases. When users were given the app to try on their own, they often weren't sure what they wanted to see first. We wanted to provide those users with some guidance, so we foregrounded a narrated experience which toured all 10 farms featured in the app.

But in a pop-up scenario, someone working at the booth might want to show a bystander a particular species, and would need to access that experience quickly, too. So we created a split screen to give new users a strong recommendation, and "power-users" quick access to specific, modular content.


details without
commitment

For new users who did opt to Explore, we wanted to provide details about each experience before "committing" to it, and without having to go "back" to explore other options.

cross-platform
affordances

don't make me
leave VR

While we couldn't do much about the friction of using a VR device, we could make sure users didn't have to leave "VR-mode" to navigate to another video. We weren't sure whether navigating in VR would work the same way on different devices — and as it turned out, it doesn't. We identified major usability & consistency challenges in cross-platform navigation.


To try this experiment at home:
1. Hold your phone in front of your face
2. Nod "yes"
3. Note how awkward it is to move your
entire torso so your phone *and* your hands
move at the same time

problem:
humans don't nod with their hands 

We originally thought that conversational design — like nodding in response to questions like "Continue?" — would be platform-agnostic and intuitive for new users. This worked great for hands-free headsets, like the Oculus: users nodded, the device's accelerometer was triggered, and users found it delightful.

But for handheld devices, like Google Cardboard, it was a total disaster. Users nodded their heads, but held their hands still. The device couldn't tell they were nodding, and they grew frustrated and confused without any feedback — "Why was nothing happening? Did I nod wrong? Should I nod harder? Did I break it?"
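The failure mode above can be illustrated with a toy detector. This is a hypothetical sketch, not the app's actual code: it treats a nod as the device's pitch dipping below a threshold and recovering shortly after. With a headset, the phone pitches along with the head; with a handheld viewer, the pitch trace stays flat while the user nods, so the detector never fires — and the user gets no feedback.

```python
# Hypothetical nod detector from device pitch samples (degrees).
# Thresholds and window size are illustrative, not from the shipped app.

def detect_nod(pitch_samples, dip_threshold=15.0, max_window=10):
    """Return True if the pitch dips below -dip_threshold and recovers
    above -dip_threshold/2 within max_window subsequent samples."""
    for i, pitch in enumerate(pitch_samples):
        if pitch < -dip_threshold:  # head (and device) tilted down far enough
            for later in pitch_samples[i + 1 : i + 1 + max_window]:
                if later > -dip_threshold / 2:  # came back up: that's a nod
                    return True
    return False

# Headset: the device pitches with the head, so the dip-and-recover shows up.
detect_nod([0, -5, -20, -18, -4, 0])   # nod detected

# Google Cardboard held still while the user nods: the trace never dips.
detect_nod([0, -2, -6, -4, 0])         # nothing detected, no feedback
```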

In the live version, the tap target
became a "view" target, at the 
center of the device.

solution: the
"long stare"

The "long stare" was easier to understand, across more devices. Users would point their field of vision at a particular target for a certain length of time to go to that experience. It took slightly longer to navigate, but it had a few huge advantages:

  1. Through a progress bar, we could give users feedback about whether they were "doing it right," or had moved off their target.
  2. It relied on a physical behavior that users were familiar with — they could use the same way of looking around their environment that they'd already practiced in earlier VR experiences.
  3. It worked without a VR device, too — users who wanted the "360-degree" experience (without VR) could navigate just as easily from one experience to another.
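The dwell mechanic above can be sketched in a few lines. This is a minimal illustration under assumed names and timings (class name, two-second dwell), not the shipped implementation: progress fills while the gaze rests on a target, drives the feedback bar, resets when the gaze drifts away, and fires the selection at 100%.

```python
class GazeDwellSelector:
    """Illustrative dwell-to-select logic for gaze navigation."""

    def __init__(self, dwell_seconds=2.0):
        self.dwell_seconds = dwell_seconds
        self.elapsed = 0.0
        self.current_target = None

    def update(self, gazed_target, dt):
        """Advance by dt seconds; return the selected target, or None."""
        if gazed_target != self.current_target:
            # Gaze moved to a new target (or off all targets): restart.
            self.current_target = gazed_target
            self.elapsed = 0.0
        elif gazed_target is not None:
            self.elapsed += dt
        return gazed_target if self.progress >= 1.0 else None

    @property
    def progress(self):
        """0..1 fill level for the on-screen progress bar."""
        if self.current_target is None:
            return 0.0
        return min(self.elapsed / self.dwell_seconds, 1.0)
```

Because selection is purely "where is the view pointed, for how long," the same logic works on a hands-free headset, a handheld viewer, or a flat 360-degree player.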

"Oh, uh, I don't know. I don't think I want to see animals in situations like that."

branding +
emotional design

The filmmakers had done interviews about the content of the film, and surfaced an unexpected user concern: anxiety about the material.

Around that time, a few undercover videos of farm animals being abused had gone viral. Users were worried that we'd be strapping a device to their face and trapping them in an experience that was painful to watch even when it wasn't in VR. Our final design challenge would be to build trust before they even downloaded the app.

The app itself had to be built before we could finalize a visual direction for the film's marketing ecosystem, but we were able to apply a consistent visual design across the website and the app's marketing materials. We opted for friendly hues of green and blue, and rounded fonts to help create a sense of approachability.

Credits

Technology: Frank Flemming
Creative Direction & Script: Marshall Walker Lee
VR Footage: Canyon Darcy & Ryan Hunts

Hey there!
Let's talk.

Want to learn more about me? I love coffee and talking to people. Shoot me an email, and let's meet up!