When the designer shows up in the design
How our assumptions and our biases shape what we design and how we design it
I want you to imagine the American Midwest.
Don't think about what it looks like as a landscape — instead, imagine you were to put the Midwest on a map. What would it look like geographically?
Have you got it?
Some of you may be picturing the Midwest the way the Census defines it: a region that includes North Dakota and South Dakota, as well as Wisconsin, Illinois and Indiana.
For others, things may look a little different. Published maps of the Midwest vary too, whether they come from an educational resources site, an interior design council or a manufacturing company.
Sometimes the Midwest is only states bordering the Great Lakes, sometimes it's larger. Sometimes Kentucky counts as the Midwest, other times it's in the South. We assume the Midwest is a generally fixed and stable thing, but that's not the case. It turns out that there are lots of Midwests.
What would happen if we mapped out all the different Midwests that came to mind for you and for others? What would that look like?
This is exactly what cartographer Bill Rankin did in 2013: he overlaid 100 published maps of the Midwest into a single composite image.
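Rankin's technique is easy to sketch in code: treat each published map as a set of states and count, for every state, how many maps include it. Here's a minimal illustration in Python; the three maps below are invented stand-ins, not Rankin's actual data.

```python
from collections import Counter

# Each published "Midwest" reduces to a set of states. These three are
# invented stand-ins for the 100 real maps Rankin overlaid.
published_midwests = [
    {"IL", "IN", "WI", "MI", "OH"},                    # Great Lakes states only
    {"IL", "IN", "WI", "MN", "IA", "MO", "KS", "NE"},  # a broader Midwest
    {"IL", "IN", "OH", "KY", "MO", "IA"},              # one that counts Kentucky
]

# Count how many of the maps claim each state for the Midwest.
votes = Counter(state for midwest in published_midwests for state in midwest)

# Agreement score: the share of maps that include each state. In a
# graphic, this number would drive each state's opacity.
for state, count in votes.most_common():
    print(f"{state}: in {count} of {len(published_midwests)} maps")
```

States that every map claims would render fully opaque; contested states like Kentucky would fade toward transparency, which is the visual effect of Rankin's overlay.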
That's just one map, one example. But it speaks to something broader. Just as we draw different mental maps of a single region, we each carry our own assumptions and ideas about the world.
Here we'll focus on the unintended ways that assumptions, perspectives and biases find their way into our work as journalists, designers and developers. We'll look at how the decisions we make — what data to base our stories on, what form those stories should take, how they're designed, who they're created for — always come out of our particular point of view.
Much of our work as visual journalists is based on data. Even before we interpret or analyze it, we've already made assumptions about what data we'll focus on, or what data will help us answer a question.
For example, if we're looking at the impact of crime on local communities, we might choose to focus on the victims of crime and seek out data about the frequency and locations of specific incidents. This is the general pattern of most "crime mapping" tools or visualizations, like this one from Trulia.
A few years ago Laura Kurgan of Columbia's Center for Spatial Research made a criminal justice map from a different perspective than the typical crime maps. Instead of looking at the frequency of crimes, she looked at neighborhoods in terms of the percentages of residents who were in prisons, with an eye toward understanding the effect of mass incarceration on the people left behind. In her research, she found that most incarcerated people in American cities come from just a few neighborhoods — mostly poor neighborhoods of color. The resulting project is called Million Dollar Blocks.
Kurgan found that these same neighborhoods are desperate for other kinds of public resources. In interviews, she makes the case that the typical crime-mapping tools are actually part of the problem of mass incarceration, because they frame crime in an oversimplified way — as bad acts to be eradicated, and not the product of a system whose heavy costs are often borne by the very population law enforcement is meant to protect.
What data set to focus on, and how to frame it, is a decision. Kurgan took the same subject, chose different data and framed it completely differently, and that changed what we were able to see.
The way in which data is collected often reflects something about the people who collect it.
A few years ago, the City of Boston released Street Bump, a smartphone app that uses GPS and accelerometer data to detect potholes as Bostonians drive through the city. Every bump gets submitted to the city, and three or more bumps at the same location trigger an inspection and, in theory, a quick repair.
But think about who is doing the reporting. The app can only log potholes encountered by residents who own smartphones, and smartphone ownership skews wealthier and younger, so lower-income neighborhoods could end up with fewer reported potholes no matter how bad their roads actually are. Luckily, Boston was aware of this issue and addressed it from the beginning by giving the app to city workers to use as they drove around the city. But the implication is clear. Data doesn't speak for itself — it echoes its collectors.
Or to take another example: A study came out a few years ago that looked at Twitter and Foursquare data generated during Hurricane Sandy, which included millions of tweets and posts.
The researchers found some interesting things in the data, like a spike in nightlife right after the storm, and some unsurprising things, like a surge in people buying groceries right before the storm. But the bulk of those messages came from Manhattan; very few came from harder-hit areas like the Rockaways, where power outages and lower smartphone ownership kept many residents offline.
In an article for Harvard Business Review, Microsoft researcher Kate Crawford calls this a "signal problem." Data is usually assumed to accurately reflect the social world, she says, but there are significant gaps, with little or no signal coming from particular communities.
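One rough guard against a signal problem like this is to compare activity against the population that could plausibly have produced it, rather than trusting raw counts. A toy sketch, with invented numbers chosen only to make the shape of the problem visible:

```python
# Invented numbers for illustration: raw post counts vs. posts per
# 1,000 residents. Raw counts make Manhattan look like the center of
# the storm; the per-capita rate shows where the data has gone quiet.
areas = {
    "Manhattan": {"posts": 350_000, "population": 1_600_000},
    "Rockaways": {"posts": 4_000, "population": 130_000},
}

for name, a in areas.items():
    per_thousand = a["posts"] / a["population"] * 1000
    print(f"{name}: {a['posts']:,} posts, {per_thousand:.0f} per 1,000 residents")
```

A low rate doesn't prove nothing happened there; it may mean the people there couldn't post at all, which is precisely Crawford's point.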
Beyond selecting and collecting a data set, the way we display data is the result of decisions that can carry political or cultural weight. Even small visual elements can have significant meaning.
Even a single line can take sides in a political dispute. The people who make Google Maps are careful to avoid doing that. When showing disputed territories, Google Maps will display the same reality differently for people in different parts of the world. For instance, Russian users see Crimea — the disputed region formerly part of Ukraine — as part of Russia, and marked off with a solid line. But Ukrainian users see it completely differently. Instead of a line, Ukrainians see the border as a subtler, almost invisible dashed stroke, leaving an open question as to who controls the area.
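Mechanically, this comes down to a rendering decision keyed to the viewer's region. Here's a hypothetical sketch of the idea in Python; the data structure and names are mine for illustration and have nothing to do with Google's actual implementation.

```python
# How a map renderer might vary a disputed border by viewer region.
# The styles paraphrase the Crimea example; none of this is Google's code.
DISPUTED_BORDER_STYLES = {
    ("crimea", "RU"): {"stroke": "solid"},   # shown as a settled border
    ("crimea", "UA"): {"stroke": "dashed"},  # shown as contested
}
DEFAULT_STYLE = {"stroke": "dashed"}  # what everyone else sees

def border_style(territory, viewer_region):
    """Pick a border style for a disputed territory based on who is looking."""
    return DISPUTED_BORDER_STYLES.get((territory, viewer_region), DEFAULT_STYLE)

print(border_style("crimea", "RU"))  # {'stroke': 'solid'}
print(border_style("crimea", "UA"))  # {'stroke': 'dashed'}
```

Even the default is a choice: someone has to decide what the rest of the world sees.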
Dots raise their own issues. Many news graphics represent individual human beings as dots on a map, and that raises larger questions about when it is acceptable to represent humans with dots at all, and what reactions a reader might have to seeing other people, or themselves, represented that way. Jacob Harris, then a newsroom developer at The New York Times and now at 18F, wrote a sobering piece about this conundrum of representing people with dots, and when it may not be such a good idea.
Designers and developers like to talk a lot about the user: user testing, user experience, user interface, etc. But your assumptions about the people you're designing for can have serious consequences.
Take something as seemingly trivial as intrusive ads on webpages. Of course, everyone thinks banner ads are annoying, but they can make websites literally impossible to use for people who are blind or have low vision, or for anyone who depends on a screen reader to navigate the web. Science journalist Rose Eveleth found that these ads often include no text for a screen reader to read, or lack a close button, leaving users with visual impairments unable to use the page at all.
Or take the period-tracking app Glow. The opening screen asks you to choose your reason for using the app. A female user's three choices are: avoiding pregnancy, trying to conceive or fertility treatments. Male users get their own option.
Engineer Maggie Delano, who describes herself as a "queer woman not interested in having children," wrote about her experience using Glow and other period tracking apps.
"I figured the apps couldn't be THAT bad and that I could probably just ignore the pregnancy stuff," Delano wrote. "But I was wrong … my identity as both a queer person and a woman with irregular periods was completely erased."
According to a spokesperson at Glow, the app's "main purpose, since its inception, has been to be the best fertility tracker out there … and remains focused on the experience of trying to conceive." But because the company knows that not every woman is focused on tracking her fertility, in August 2015 they launched a separate app called Eve by Glow, targeted at women who instead want to focus on tracking their cycle and sexual health.
Making assumptions about our users can have other unexpected, unhelpful and potentially dangerous results. For example, it shouldn't surprise you to know that if you say to Siri, the virtual assistant built into the iPhone, "I'm having a heart attack," it responds with a list of nearby hospitals along with a link to call emergency services.
But not all emergencies yield such helpful results. Last year, designer and author Sara Wachter-Boettcher tested Siri's responses to a few other emergencies. She found that Siri seemed to have a surprising lack of knowledge about rape and sexual assault.
She wasn't alone in noticing. A study published in JAMA Internal Medicine found that Siri and other smartphone assistants often responded poorly, or not at all, to statements about rape and abuse. After the JAMA study came out, Apple worked with the Rape, Abuse and Incest National Network (RAINN) and researchers at Stanford to revise Siri's responses. As of March 17, 2016, it now includes helpful resources, such as links to the National Sexual Assault Hotline.
All this comes despite the prevalence of such crises for women. According to some estimates, nearly one in five American women will be raped or sexually assaulted in their lifetime, and one in three women worldwide is a victim of physical or sexual abuse. As conversational agents like Siri, Alexa or Google Now become ubiquitous and speech becomes the dominant user interface, making sure this technology is responsive to all users will be even more critical.
Assumptions about what is a "normal" or "default" user come up over and over again. Take facial recognition software, which has had problems recognizing people of color, presumably because they did not look like the white faces that dominated its training data.
This pattern long predates software. For decades, color photography itself was calibrated against the "Shirley card," a photo of a white woman that photo labs used as their reference for correct skin tones. Lorna Roth, a Concordia University communications professor, wrote a fascinating historical overview of the Shirley card. She describes how, from 1959 until today, many versions of Shirley have circulated through photo labs, building an exclusively white standard of "normality" into photographic equipment.
It may be tempting to think that even if we all bring our own assumptions to our work, they'll just cancel each other out. The composite Midwest map roughly captures what we mean by the "Midwest," doesn't it?
But it's worth pausing for a moment to consider who is making most of the decisions in the fields of journalism, technology and design. It doesn't take long to realize that this community skews white, male, straight and able-bodied. So it's no surprise that the assumptions built into our work skew that way too.
Once we acknowledge that as a community we have imperfect assumptions, we can work to get better. One way to do that is to intentionally design against bias.
Some companies have actively tried to design against bias once it is discovered in their products. The social network Nextdoor is a free web platform that lets members send messages to people who live in their neighborhood. Sometimes this means announcing tag sales or giving tips about plumbers, but the site also has a section dedicated to crime and safety.
A few years ago the East Bay Express wrote about how Nextdoor was being used for racial profiling. Users in Oakland were reporting black folks as "suspicious" just for walking down the street or driving a car. They suggested that a black salesman and a black mail carrier might be burglars. They reported "suspects" with little more description than "black" or "wearing a hoodie."
Nextdoor quickly acknowledged this was a problem, and redesigned their interface to discourage it.
Most notably, they required people to give a few specific details about this supposedly "suspicious" person, such as what the person was wearing or the color of their hair. They also added more prominent warning screens about racial profiling and added the ability to report a post as racist.
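In software terms, that redesign is validation in service of de-biasing: a post about a "suspicious" person can't go up until it carries enough non-racial description to actually identify someone. A rough sketch of such a rule; the field names and threshold here are my invention, not Nextdoor's code.

```python
DESCRIPTIVE_FIELDS = ["clothing", "hair", "shoes", "vehicle"]
MIN_DETAILS = 2  # hypothetical: require specifics beyond race

def can_publish(post):
    """Allow a 'suspicious person' post only if it carries real description."""
    details = [field for field in DESCRIPTIVE_FIELDS if post.get(field)]
    if post.get("mentions_race") and len(details) < MIN_DETAILS:
        return False, ("A racial description alone isn't enough. "
                       "Add details such as clothing or hair to continue.")
    return True, "ok"

ok, message = can_publish({"mentions_race": True, "clothing": "red jacket"})
print(ok, message)  # False - one detail isn't sufficient to post
```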
Another way to address systematic assumptions is to expand who's making these decisions. When Diógenes Brito, a designer at the messaging company Slack, was assigned to create a graphic for the company, he assumed that he could make the hand in the graphic look like his own. So he made it brown.
The effects may be subtle, but if we pour so much of ourselves into the stories we tell, the data we gather, the visuals we design, the webpages we build, then we should take responsibility for them. And that means not just accepting the limits of our own perspective, but actively seeking out people who can bring in new ones.
A version of this story originally appeared in "Malofiej 25," published by the Spanish chapter of the Society for News Design (SNDE).
ProPublica is a Pulitzer Prize-winning investigative newsroom.