
84.51°’s Jeanette Mumbeck seeks to eliminate algorithmic bias

May 21, 2020

Algorithms are everywhere.

They decide which TV shows appear at the top of your streaming recommendations, which online ads you see, and whether your loan application is approved or denied.

Ubiquitous in modern society, algorithms make millions of calculated decisions behind the scenes that affect countless aspects of your life. Jeanette Mumbeck has spent her career working with, and building a better understanding of, these cryptic lines of decision-making code.

Jeanette works for the data analytics powerhouse 84.51°, the company that manages data from Kroger’s expansive customer loyalty program. As a manager in the company’s Science function, Jeanette ensures that strategic plans exist for developing the Science team’s multiple disciplines. Implementing those plans relies on algorithms that help the team make decisions at the massive scale demanded by the sheer volume of Kroger data.

One of Jeanette’s areas of focus is ensuring that these algorithms are fair to the individuals they recommend products and allocate coupons to. Unfortunately, if left unaddressed, bias can creep into even something as seemingly objective as a computer program and cause unfair treatment of different groups.

Jeanette's work is key to promoting Algorithm Fairness 

84.51°’s work promoting Algorithm Fairness is part of the company’s comprehensive roadmap to improve Diversity & Inclusion across all areas of the business. The roadmap was initiated by 84.51° leadership and has been embraced by the company’s highly impassioned and active employees.

To accomplish her specific task in the overarching roadmap, Jeanette and Grant Gilkerson, her co-lead for this initiative, partner with a number of individuals in 84.51°’s Data Science and Science functions.

Jeanette explains her role: “Kroger has thousands of stores across North America, with tens of thousands of products per store. On a daily basis, we interact with millions of customers. As a result, we rely on data and science to make decisions at scale.

“These decisions range from the price of a product, to which product should go on promotion and when, to which assortment of products should be placed in a store based on those customers’ preferences. We are a ‘customer first’ company, and to stay true to that promise we must come to those decisions in a fair way.”

Regularly reviewing algorithms

Jeanette and Grant meet regularly with their team to review the algorithms produced by 84.51° data scientists, assess their potential for developing bias, and lay out any corrective measures needed to mitigate those risks. Most recently, the team reviewed Kroger’s new Start My Cart science. Start My Cart uses machine learning to help online shoppers, suggesting products they have purchased in the past as well as items that are currently on sale.
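
The article doesn’t describe Start My Cart’s internals, so the following is only a minimal sketch of the general idea in Python: score candidate products by the shopper’s purchase history, with a small boost for items on sale. Every name here (suggest_cart, Product, on_sale) is hypothetical rather than 84.51° code.

    from dataclasses import dataclass

    @dataclass
    class Product:
        name: str
        on_sale: bool

    def suggest_cart(purchase_counts: dict[str, int],
                     catalog: list[Product],
                     top_n: int = 5) -> list[str]:
        """Rank candidates by how often the shopper bought them,
        with a small boost for items currently on sale."""
        def score(p: Product) -> float:
            return purchase_counts.get(p.name, 0) + (0.5 if p.on_sale else 0.0)
        ranked = sorted(catalog, key=score, reverse=True)
        return [p.name for p in ranked[:top_n] if score(p) > 0]

    # A shopper who frequently buys milk and bread
    catalog = [Product("milk", False), Product("bread", True),
               Product("eggs", True), Product("caviar", False)]
    print(suggest_cart({"milk": 8, "bread": 5}, catalog, top_n=3))
    # ['milk', 'bread', 'eggs']

A production system would of course use far richer signals and learned models; the point is simply that suggestions combine purchase history with current promotions.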

“While this particular project is low on the risk spectrum for developing harmful biases, we want to make sure that all customers are treated fairly,” says Jeanette. “For example, we have other sciences where we allocate coupons, and since we have a limited number of coupons to recommend to customers, we want to be sure these coupons are being allocated fairly.”

Why is bias relevant to algorithms?

Humans are fallible and biased, but how can a computer program, ultimately just code executed by a machine, be biased? Bias creeps into algorithms in three main ways:

1. Data bias

Data sets can introduce representational bias: they reflect biases inherent in our society, such as gender norms, and some groups may be underrepresented in the data, whether by race, income, gender or other factors. For instance, because many diaper purchases are made by women, representational bias could lead an algorithm to exclude fathers from receiving a diaper coupon.
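
One simple way to surface that kind of skew is to compare the rate at which each customer segment actually receives an offer. Here is a minimal Python sketch, with entirely hypothetical segment names and toy numbers:

    def allocation_rates(offers: list[dict]) -> dict[str, float]:
        """Share of customers in each segment who received the coupon --
        a quick smoke test for representational bias."""
        totals: dict[str, int] = {}
        received: dict[str, int] = {}
        for o in offers:
            seg = o["segment"]
            totals[seg] = totals.get(seg, 0) + 1
            received[seg] = received.get(seg, 0) + int(o["got_coupon"])
        return {seg: received[seg] / totals[seg] for seg in totals}

    # Toy data: fathers receive the diaper coupon far less often
    offers = ([{"segment": "mothers", "got_coupon": True}] * 80
              + [{"segment": "mothers", "got_coupon": False}] * 20
              + [{"segment": "fathers", "got_coupon": True}] * 15
              + [{"segment": "fathers", "got_coupon": False}] * 85)
    print(allocation_rates(offers))  # {'mothers': 0.8, 'fathers': 0.15}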

2. Measurement bias

Another form of data set bias is measurement bias. It arises when the outcome you actually care about can’t be measured directly, so a proxy stands in for it; this can happen when data collection is poorly designed or the right data simply isn’t available. One recent example that made the news was a healthcare provider’s algorithm for determining which patients might require additional medical treatment. The model was trained on which patients had higher medical costs, and since African Americans’ healthcare costs are generally lower, the result was less additional medical treatment for African Americans. In this case, using cost to train the algorithm did not identify who really needed the additional treatment.
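
A toy Python example, with invented numbers, makes the failure mode concrete: when past cost stands in for medical need, patients with high need but low historical spending get passed over.

    # Measurement bias in miniature: "cost" is used as a proxy for
    # "medical need", but cost also reflects unequal access to care.
    patients = [
        # (name, true_need, historical_cost) -- all values hypothetical
        ("A", 9, 4200),  # high need, low past spending (less access)
        ("B", 9, 9800),  # same need, higher past spending
        ("C", 3, 6100),  # lower need, moderate spending
    ]

    # Selecting the top spenders for extra care misses patient A...
    by_cost = sorted(patients, key=lambda p: p[2], reverse=True)[:2]
    print([p[0] for p in by_cost])  # ['B', 'C']

    # ...while ranking on the outcome we actually care about does not.
    by_need = sorted(patients, key=lambda p: p[1], reverse=True)[:2]
    print([p[0] for p in by_need])  # ['A', 'B']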

3. Scientific bias

Every decision that goes into an algorithm is inherently shaped by the biases of the scientist creating it, and keeping that in mind helps reduce those biases. For instance, the field of artificial intelligence is dominated by men, and high-tech fields tend to overrepresent Caucasians, according to the Equal Employment Opportunity Commission*.

Ensuring inclusion and looking beyond 84.51°

In addition to ensuring the accuracy of the data set, it is important to have a diverse team of scientists bringing their unique personal perspectives to the code they write. Diversity & Inclusion is a key commitment at 84.51°, so building diversity into the scientific team is a natural outgrowth of those initiatives.

As an additional step in protecting against bias in its business practices, 84.51° looked outside its own organization for a third-party partner to help review its algorithms and train its employees. Some years back, Jeanette and her colleagues began a partnership with computer science professors at the University of Wisconsin-Madison. Professors Loris D’Antoni and Aws Albarghouthi have done extensive research in fair decision making, and since the partnership began they have held several company-wide workshops to raise awareness and provide training that promotes fairness in data science. Data scientists typically aren’t taught to evaluate their models for unfair results; adding this extra level of evaluation to 84.51°’s models greatly reduces the risk of unfair treatment of Kroger customers.
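
The article doesn’t say which checks the partnership introduced, but one common evaluation of this kind compares positive-outcome rates across groups, often summarized by the “four-fifths rule” from US employment guidance. A minimal Python sketch, with hypothetical group names and rates:

    def disparate_impact(rates: dict[str, float]) -> float:
        """Ratio of the lowest to the highest positive-outcome rate;
        values below ~0.8 often trigger a closer fairness review."""
        return min(rates.values()) / max(rates.values())

    # Hypothetical coupon-allocation rates per customer group
    rates = {"group_a": 0.42, "group_b": 0.38, "group_c": 0.25}
    print(f"disparate impact = {disparate_impact(rates):.2f}")  # 0.60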

Jeanette notes: “I’m really proud to be working for a company that takes seriously its promise of ‘customer first.’ We are really walking the talk and are leading the way in our industry. By searching for intrinsic bias, our leaders have committed to putting a priority on diversity, inclusion and fairness.”

Hear Jeanette discussing Algorithm Fairness 

Jeanette recently discussed Algorithm Fairness with Grant Gilkerson in an edition of 84.51°’s The Uplow’d podcast.

Listen to an insightful podcast with Jeanette and learn more about Algorithm Fairness. 



Join women like Jeanette at 84.51°

Interested in Jeanette's work?

84.51° is always keen to hear from women with a passion for data science and technology.

Search and apply for one of the exciting data roles with prime employer 84.51°.
 

*Joe Davidson, “High-Tech Firms Lack Diversity, EEOC Says,” The Washington Post, May 23, 2016.

  

Find out more

Stay connected by subscribing to our monthly newsletter and following us on LinkedIn, Twitter, Instagram and Facebook.

Disclosure: Where Women Work researches and publishes insightful evidence about how its paid member organizations support women's equality.
