“Computers don’t have an implicit bias—they’re just machines.”
There’s an infamous example of a machine learning model that included Zip Code as a predictive feature. While Zip Code seems innocuous on its face, when researchers looked under the hood of the black box model, they realized that Zip Code was being used as a proxy for race. And as society continues to fast-track machine learning and AI-based solutions for everything from hiring to credit to recidivism, we as strategy and analytics practitioners need to be aware of and challenge these biases.
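To make the proxy problem concrete, here is a minimal sketch with entirely synthetic, hypothetical data. It shows how a model could "learn" race without ever seeing it: when residential patterns cluster, zip code alone recovers the protected attribute well above chance. The zip codes, group labels, and counts below are all invented for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical records: (zip_code, group) pairs with strong residential
# clustering. None of these values come from real data.
records = [
    ("90001", "A"), ("90001", "A"), ("90001", "A"), ("90001", "B"),
    ("90210", "B"), ("90210", "B"), ("90210", "B"), ("90210", "A"),
]

# Group the records by zip code.
by_zip = defaultdict(list)
for zip_code, group in records:
    by_zip[zip_code].append(group)

# How well does zip code alone "predict" the protected attribute?
# Score the best a model could do using only zip code: guess the
# majority group within each zip.
correct = sum(Counter(groups).most_common(1)[0][1] for groups in by_zip.values())
accuracy = correct / len(records)
print(f"zip code recovers the protected attribute with {accuracy:.0%} accuracy")
# prints "zip code recovers the protected attribute with 75% accuracy"
```

This is exactly why dropping a sensitive column from the training data does not remove bias: any correlated feature left in the model can reconstruct it.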
And don’t forget: computers are programmed and designed by people. Data is collected and observed by people. Conclusions, insights, and strategies are written by people. And people bring a whole host of biases (either directly or indirectly) that shape the narrative.
As a person who frequently uses data to draw conclusions, I understand the need for impartiality. After all, if X is better than Y, that’s what’s important, right? And while there are certainly simple questions that call for black and white answers, our conclusions can usually be improved and deepened with context. Take an organization whose goal is to align its supporter demographics more closely with the people it serves. If it were able to decrease new-donor average age by 15% and increase minority involvement by 10%, a temporary decline in donor value might be anticipated, even if the future ROI is expected to rise significantly and better support the long-term vision.
So, to marry objectivity (science) with context (art), I try to be realistic and transparent about ways that I may be biasing my conclusions and unintentionally feeding into a system that values the experiences of some above others. I was lucky to recently receive excellent counsel and new perspectives from current and former colleagues.
Bettina Papirio-Faerber, One & All’s VP of Strategy & CX, spoke to the need to shape data collection and analysis correctly. “If we don’t take care to structure questions without bias and if we don’t take care to interpret the data without bias and the understanding that the customer behavior is influenced by things we don’t know—the lived experiences,” then we will not truly be capturing the customer experience. I loved this quote, because question verbiage and intent have a huge impact on how individuals respond.
“Numbers are telling, but there are so many nuances in our lived experiences that numbers alone can’t read,” said Arianna Laila Dela Rosa, a career coach for first-gen professionals. Arianna is speaking to the importance of pairing qualitative insights with quantitative ones, which is ideal in any analysis. She added, “Make sure folks who are doing data [analysis] are diverse themselves.”
And O&A’s Director of Digital Strategy, Chris Tsai, advises, “I’ve always appreciated the elegance of a well-constructed predictive model. The power of being able to take an incredible amount of complex data and discover some underlying relationship within is enthralling. As a strategist, however, I’m more circumspect, because I recognize that the beauty of these models derives in part from their reductive nature.” That reductive quality of machine learning is exactly what we need to be cognizant of, and to challenge.
The advice and guidance from our experts remind us that throughout the planning, data collection, analysis, and insights process, we must be aware of our culturally encoded biases.
Data and analysis without context create noise. Conclusions and insights without understanding the art and the science (like the model that blindly used Zip Code as a proxy for race) can actively harm others, particularly in communities that are already marginalized. It’s our job as strategic analytics practitioners to be aware of the biases in data, understand the context in analysis, call out prejudices, and attempt to counteract preconceptions in all the ways we have at our disposal.
We'd love to hear what you think. If you'd like to engage with our experts, let’s connect.