Taking a page from the private sector, global public health organizations have become obsessed with human-centered design, applying it to an array of projects, from handwashing in Kenya to Ebola outbreaks in Nigeria. They've used it to understand their “customers”—the people targeted by their programs—through the familiar tools of the design-thinking trade: interviews, observations, co-designing with communities, rapid prototyping, and quick testing of solutions.
Yet design thinking alone won't help us understand the 800 million people living in poverty, our customers in global health. The complexity of their needs calls for more comprehensive efforts in both diagnosing the problem and designing solutions. Above all, we need to focus on better data that go beyond simple tallies. We must do more than collect the “what” information, such as how many women are breastfeeding. We must also know why our customers do what they do. We need to understand the structural factors, policies, laws, individual beliefs, motivations, biases, and influencers that play a significant role in how people make choices.
I saw the limitations of design thinking first-hand when I evaluated a human-centered design company's recommendations for getting more women in Bihar, India, to breastfeed their newborns. To come up with its plan, the firm spent two weeks doing qualitative research in the field as part of a 12-week engagement, interviewing and observing what I considered to be too few people. The proposed solution? An interactive mobile platform to build awareness of breastfeeding by delivering targeted messages to mothers and other community members. Other experts and I found it uncompelling and unactionable. Its flaws were rooted in the initial research, which failed to account for who held decision-making power in the women's families, the complex community dynamics that steered a mother's choices, and the fact that few women had mobile phones.
We almost made the same mistake at the Surgo Foundation. We hired a human-centered design firm to understand why nurses in rural Uttar Pradesh, India, didn't follow clinical practices when delivering babies, especially when deciding whether to refer a pregnant mother from a local care center to a bigger hospital. What stood out in the qualitative research was that nurses often referred patients to a larger facility unnecessarily, responding mainly to non-clinical factors such as pressure from families or worries about the consequences to themselves if anything went wrong with the patient.
Had we relied on this firm's insights alone, we would have developed a very narrow and potentially ineffective set of interventions. Instead we found guidance for understanding the problem from Amazon, a company that uses not just focus groups but also millions of data points on customer behavior. We coupled the consultants' qualitative insights with quantitative data gathered in a way that did not breach privacy or aggressively track survey subjects—two concerns about technology firms that we share with many others. And what did we find? A much broader set of factors affected a nurse's clinical actions, including the presence of medical officers, the facility where she worked, and the time of day.
Combining qualitative research with data from large samples of the population moves us beyond design thinking's anecdotal approach and closer to generating truly useful insights about our customers and the systems that surround them. But how? Here are three steps to guide the way:
1. Know What Data to Look For
Several behavioral models, such as COM-B and the Health Belief Model, can shape your decisions about which data to pursue to generate insights and build effective interventions. However, none of these models holistically captures all the drivers of behavior. At Surgo, we developed CUBES (Change behavior and Understand Barriers, Enablers, and Stages) for this purpose. The framework has three components:
- The path toward a desired behavior consists of a series of distinct stages: awareness, intention, and action. Is your research addressing one or more of these stages?
- The progression through each stage is influenced by two types of factors. Contextual factors include infrastructure, policy and laws, relevant processes, and social norms. Perceptual factors include beliefs, biases, emotions, and personality. Is your research accounting for all of the factors?
- Barriers to and drivers of behavior may come through influencers (friends, family, and community members) either directly or through media channels. Is your research identifying the barriers and drivers, and their origin?
At Surgo, we have used CUBES in several large-scale development programs, including HIV prevention, maternal health, child health, and family planning. For example, we have been studying why 20 percent of mothers in Uttar Pradesh, India, still give birth at home despite government incentives to use hospitals. Our state-wide quantitative survey of mothers identified a number of key reasons: beliefs (it’s safer to deliver at home); perceptions (home delivery is the norm); and structural factors (hospitals are far away). But listing the reasons the incentives weren't working for that 20 percent wasn't enough to produce a solution. We also segmented mothers into groups based on several factors, such as their perceptions of safety, which allowed us to target the right intervention to the right mother.
2. Know What Data Already Exist
Not all information needs to be gathered from scratch; the global health community already has an abundance of data. In Uttar Pradesh, the government has focused on building communities' awareness of modern contraceptive methods. But when Surgo examined the government's annual quantitative family planning survey, we found that awareness of modern contraceptives was already high, while intention to adopt them was low. This told us that awareness-building campaigns would be a waste, and that efforts should instead focus on bridging the gap between awareness and intention. This insight had huge implications for program design and budgets.
Still, you sometimes need to collect new information. We see many gaps in global development data around perceptual factors, such as biases and beliefs. Methods to gather this information include polling booth surveys, journey maps, discrete choice experiments, implicit association tests, and surveys with validated scales. While many large organizations in global health have research and analytic capabilities, others might need to buy them from independent contractors or organizations that specialize in data science, market research, or evaluation.
3. Spend Wisely
Significant time and money are often wasted trying to identify effective interventions through trial-and-error approaches that fail to interrogate the underlying “why.” We also see organizations spending large sums on human-centered design firms without first ensuring they have the right data and insights to inform the design work. While data collection and analysis may require investment up front, they end up being more cost-efficient because the resulting interventions are likely to be more effective.
Surgo learned this the hard way in a voluntary medical male circumcision program. We designed and evaluated close to a dozen interventions to promote the uptake of circumcision by men in Africa. After two years and several hundred thousand dollars, only four interventions demonstrated any efficacy. Subsequent research showed us why. It turned out that knowledge about the procedure was not a barrier to getting it, yet many interventions focused on increasing awareness. What actually got in the way was an expectation of pain—a belief. The data showed that men wanted honest answers about how much the procedure might hurt. Using the principles of design thinking, our partner then created a pain-o-meter to help facilitate conversations about circumcision, leading to more men choosing it.
If we had focused on unearthing the “why”—the fear of pain—up front and avoided jumping to a solution with partial information, our interventions would have been more likely to succeed from the start.
Expand the Toolbox
We must include our customers in the design of our solutions, but co-design can be only one element of our approach. We also need the rich insights found in robust data. Our thinking, language, and toolbox must expand. A customer-centered approach to global health must include data that ensure we understand not just the “what” but also the “why” of the people we're trying to help.