Making fun of surveys

Exploring gamified patterns for self-response surveys

Role

UX Designer

Focus

Visual / Interaction Design; Research

Duration

2014–2016

Challenge

The Census collects critical national data through various methods, from self-response digital interfaces to field workers conducting door-to-door interviews using phones and tablets. However, there were no standardized usability guidelines for mobile survey interfaces across different Census operations.

Role

As the UI/Interaction designer on a specialized research team, I worked alongside a mobile developer, two product owners, and Census research psychologists to design and conduct comprehensive usability tests. My role focused on creating test scenarios and providing design insights that would inform evidence-based standards for government survey interfaces.

Project scope

The goal was to explore ways to overcome users' poor sentiment toward surveys and low engagement rates. Self-response surveys are optional, so stakeholders wanted to maintain and increase participation. These mobile interfaces also needed to work for an incredibly diverse user base with wide-ranging technical literacy and connectivity.

Field enumerators also needed systems that would function reliably in various environmental conditions, from urban apartments to rural areas with poor connectivity.

Research

Our approach combined multiple research methods to create a comprehensive understanding of mobile survey usability. We collected both qualitative feedback about user preferences and quantitative data about interaction patterns, completion rates, and error frequencies.

Developing the prototype

I created mobile prototypes specifically designed to measure user interactions and gather empirical data about interface effectiveness. These prototypes worked as functional test environments that captured time on task, touch-target accuracy, and the respondent's overall journey through the survey.
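As a rough illustration of that instrumentation, the sketch below shows how a web-based prototype might log those measures. The event shape and the recordAnswer helper are assumptions for this write-up, not the actual code used in the study.

```typescript
// Minimal instrumentation sketch for a browser-based survey prototype.
// Event shape and helper names are hypothetical.
type InteractionEvent = {
  question: string;        // survey question being answered
  gesture: "tap" | "swipe"; // gesture used to answer
  hit: boolean;            // did the touch land inside the intended target?
  elapsedMs: number;       // time on task for this question
};

const interactionLog: InteractionEvent[] = [];
let questionStart = performance.now();

function recordAnswer(
  question: string,
  gesture: "tap" | "swipe",
  target: DOMRect,
  x: number,
  y: number
) {
  const hit =
    x >= target.left && x <= target.right &&
    y >= target.top && y <= target.bottom;

  interactionLog.push({
    question,
    gesture,
    hit,
    elapsedMs: performance.now() - questionStart,
  });

  questionStart = performance.now(); // reset the timer for the next question
}
```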

To boost engagement, I tested novel interactions loosely adapted from simple games that use gestures like tap, slide, and swipe.
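For example, a swipe-to-answer pattern might look roughly like the sketch below; the handler, threshold, and answer labels are illustrative assumptions rather than the prototype's actual implementation.

```typescript
// Hypothetical swipe-to-answer handler: swipe right = "yes", left = "no".
function attachSwipeAnswer(
  el: HTMLElement,
  onAnswer: (answer: "yes" | "no") => void
) {
  let startX = 0;

  el.addEventListener("touchstart", (e: TouchEvent) => {
    startX = e.touches[0].clientX; // remember where the swipe began
  });

  el.addEventListener("touchend", (e: TouchEvent) => {
    const deltaX = e.changedTouches[0].clientX - startX;
    if (Math.abs(deltaX) > 60) {   // ignore small, accidental movements
      onAnswer(deltaX > 0 ? "yes" : "no");
    }
  });
}
```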

Testing for trust

A crucial component of our research focused on self-response trust factors. We tested how different visual design elements, including government logos and branding approaches, affected user confidence in the survey system. This research aimed to encourage broad public participation through a trauma-informed approach, given the sensitive nature of data collection by the government.

Results

The project established a framework for ongoing usability research, with documented plans for further rounds of field testing. This foundation ensured that Census mobile interfaces could continue to evolve based on empirical evidence rather than assumptions.

Key learning

Working on Census interfaces highlighted the unique challenges of government survey design, where accessibility, trust, and broad usability are not just user experience goals but civic responsibilities. The interfaces needed to work for every respondent, regardless of their technical background or circumstances.

Next steps

The research established a foundation for standardized mobile usability guidelines across all Census operations. The documented findings and testing framework provided the IOE Group with the tools needed to continue refining and improving mobile survey interfaces based on ongoing empirical evidence.

Colophon

IBM Plex Sans

Merriweather

Nathana Reboucas for Unsplash

© 2024 Lauren Russell
