Ann Arbor tech researcher explains how software can harm

This article is the first in a new series on promoting diversity, equity, and inclusion in Washtenaw County’s technology sector. Ann Arbor SPARK provides support for this series.

Bias is inherent in any decision-making process, but in the technology world, the bias of programmers can have a particularly damaging effect on the people their products are intended to serve. Meg Green (they/them), an Ann Arbor-based senior user experience researcher for Rocket Homes, has dealt extensively with this topic in their personal research. Green’s job is primarily focused on research, but they also collaborate with developers and designers, sharing with them the challenges that biased data can bring.

Green gives an example of how data can be skewed in the home-buying process.

“Customers buying or selling houses want to easily find a home in an area they’d love to live in, and there are public records about a neighborhood that are easy to search,” says Green. “One of the most frequently requested pieces of information is crime statistics.”

Green notes that the more police officers there are in an area, the more crime is reported – which can skew the data and any analysis built on it.
“Anyone can access publicly available information about the city they want to live in, but the way that information is used as data can perpetuate a pre-existing problem,” says Green.
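
The feedback loop Green describes is easy to demonstrate with a toy simulation. The short Python sketch below uses entirely invented numbers: two neighborhoods have the same underlying crime rate, but only incidents that are detected make it into the official statistics, so the more heavily patrolled neighborhood looks more dangerous on paper.

```python
import random

random.seed(0)

# Invented numbers for illustration only: two neighborhoods share the
# SAME underlying crime rate, but differ in police presence. Only
# incidents that are detected show up in the reported statistics.
TRUE_CRIME_RATE = 0.05   # chance of an incident per resident per year
RESIDENTS = 10_000

def reported_incidents(detection_rate: float) -> int:
    """Count the incidents that actually enter the official data."""
    count = 0
    for _ in range(RESIDENTS):
        incident = random.random() < TRUE_CRIME_RATE
        detected = random.random() < detection_rate
        if incident and detected:
            count += 1
    return count

# Heavier patrols mean a higher detection rate, so more *reported*
# crime -- even though the true rate is identical in both places.
print("Neighborhood A (heavy patrols):", reported_incidents(0.9))
print("Neighborhood B (light patrols):", reported_incidents(0.3))
```

An analyst who treats the two reported counts as ground truth would conclude that neighborhood A is roughly three times as dangerous, when the only real difference is how closely it is watched.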

Bias can also play an important role in artificial intelligence (AI), and Green and others in the industry are re-evaluating its effects. For example, Green points to an AI product called Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), which uses demographic information to assess the risk that offenders will commit further crimes. Ratings from COMPAS and similar programs are sometimes used to determine the length of prison or suspended sentences, despite evidence that they are biased against Black offenders.

“If it’s from a computer, people are more likely to trust the data,” says Green. “… The data is just statistics, but if you put skewed data in, you get skewed data out.”
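
Green’s “skewed data in, skewed data out” point can be shown in a few lines of code. The Python sketch below is a hypothetical example (the scenario and all data are invented, and it assumes scikit-learn is installed): a model is trained on historical yes/no decisions that were biased against one group, and it reproduces that bias for two applicants with identical qualifications.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Invented scenario: past decisions demanded a much higher skill level
# from group 1 than from group 0 before granting a "yes".
n = 5_000
group = rng.integers(0, 2, size=n)       # protected attribute (0 or 1)
skill = rng.normal(size=n)               # what we *should* judge on
threshold = np.where(group == 1, 1.0, 0.0)
noise = rng.normal(scale=0.5, size=n)    # keeps the labels from being
                                         # perfectly separable
historical_yes = (skill + noise > threshold).astype(int)

# Train naively on the biased historical labels.
X = np.column_stack([group, skill])
model = LogisticRegression().fit(X, historical_yes)

# Two applicants with identical skill, different group membership:
print(model.predict_proba([[0, 0.5]])[0, 1])  # noticeably higher odds
print(model.predict_proba([[1, 0.5]])[0, 1])  # noticeably lower odds
```

The model has done nothing wrong mathematically; it has faithfully learned the statistics it was given, which is exactly Green’s point.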

Bias can also come into play in the machine learning algorithms used to teach AI how language works. The ways we categorize people and the terminology we use have changed over time, and context is very important in language. Unfortunately, AI can absorb biases around both gender and race.

“The words ‘doctor’ and ‘nurse’ in English are gender-neutral, but when translated into German, Google Translate uses the masculine term for ‘doctor’ and the feminine term for ‘nurse,’” says Green. “AI tries to use context to determine whether the masculine or feminine word should be used in languages that have grammatical gender.”
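
The skew Green describes can be measured directly in word embeddings, the numeric word representations that translation systems build on. The Python sketch below uses tiny hand-made vectors rather than real embeddings, purely to show the standard measurement: form a “gender direction” from differences like he − she, then check which way a profession word leans along it. With real pretrained vectors (e.g. word2vec or GloVe), “doctor” typically leans masculine and “nurse” feminine.

```python
import numpy as np

# Toy, hand-made vectors (NOT real embeddings), just to illustrate the
# measurement. Real vectors would come from a pretrained model.
vecs = {
    "he":     np.array([ 1.0, 0.2, 0.1]),
    "she":    np.array([-1.0, 0.2, 0.1]),
    "doctor": np.array([ 0.6, 0.9, 0.3]),
    "nurse":  np.array([-0.7, 0.8, 0.3]),
}

gender_direction = vecs["he"] - vecs["she"]

def gender_lean(word: str) -> float:
    """Cosine of a word with the gender direction:
    positive leans masculine, negative leans feminine."""
    v, g = vecs[word], gender_direction
    return float(v @ g / (np.linalg.norm(v) * np.linalg.norm(g)))

print("doctor:", gender_lean("doctor"))  # > 0: masculine lean
print("nurse:",  gender_lean("nurse"))   # < 0: feminine lean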

Algorithms can also create negative associations with certain racial or gender identifiers such as “black,” “woman,” or “gay.” For example, a Google search for “black girls” used to return primarily pornographic results, and including the word “transgender” in video titles has resulted in YouTubers receiving lower ad revenue for their videos.

“Being gay, being Black, or being a trans woman doesn’t mean these things are negative and that you wouldn’t want to read this information,” says Green. “According to some of the biased data AI draws on, anything about being bisexual or gay is pornographic and unacceptable for children.”

Green suggests that one straightforward way to counteract these biases is to send software developers along with user experience researchers when they interview the target users of their software.

“You can empathize with people and see their situation,” says Green. “The developers are more empathetic and can then talk to their team about those situations. It’s about building that empathy and helping people recognize the bias that is being input.”

Bias in software can have harmful consequences, and Green says it is up to programmers to reverse them.

“A machine algorithm won’t learn unless we teach it better and help reprogram the data,” says Green.
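
In the word-embedding setting above, one published example of “teaching it better” is the hard-debiasing technique of Bolukbasi et al. (2016): for words that ought to be gender-neutral, subtract out their component along the gender direction. A self-contained toy sketch, again with invented vectors:

```python
import numpy as np

# Toy vectors, invented for illustration; real work would use
# pretrained embeddings and a carefully chosen list of neutral words.
he, she = np.array([1.0, 0.2, 0.1]), np.array([-1.0, 0.2, 0.1])
doctor = np.array([0.6, 0.9, 0.3])

g = he - she
g = g / np.linalg.norm(g)        # unit-length "gender direction"

# Neutralize: remove the projection of "doctor" onto the gender axis.
doctor_debiased = doctor - (doctor @ g) * g

print("lean before:", float(doctor @ g))           # nonzero
print("lean after: ", float(doctor_debiased @ g))  # ~0
```

This only removes the component that the gender direction captures; Green’s broader point stands that someone has to notice the bias and choose to correct it.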

For more information on AI and gender roles, see Green’s video presentation on “Gender and Artificial Intelligence.”

Monica Hickson is a freelance writer currently living in Ypsilanti. She joined Concentrate as a news writer in 2020 and is the author of the book “The COVID Diaries.” You can reach her at [email protected].

All photos by Doug Coombe.
