Racism in Artificial Intelligence (How STEM Education Can Help)

Guest Author
03 May 2021

About the Author: Nicole Anderson

Nicole Anderson is a social anthropologist with a Master's in Social Justice Education from the University of Toronto. Her research interests include anti-racism education, colonial history and material culture. She is currently pursuing a PhD in Social Anthropology at the University of Edinburgh, conducting ethnographic and archival research with the University's collection of North American human remains.

As an anthropologist, I am somewhat new to the world of coding. I recently completed a methods course where the professor constantly repeated that data analysis operates on a "garbage in, garbage out" principle. As scientists (social scientists or otherwise), working with data means we have to be careful about what information we feed our machines and computers. Feeding computers "garbage" means we will get garbage results out, which leads to inaccurate and misleading conclusions. The issue in working with big data and machine learning (often referred to as "artificial intelligence" or AI) is that sometimes the garbage we put in is hard to spot.
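
As a minimal illustration (with invented numbers), here is the principle in a few lines of Python: a single "garbage" record produces a wildly misleading summary statistic, and the error is easy to miss unless you go looking for it.

```python
# A minimal, invented example of "garbage in, garbage out": one bad record
# quietly drags a summary statistic far away from reality.
ages = [34, 29, 41, 38, 27, 999]    # 999 is a data-entry error (garbage in)
print(sum(ages) / len(ages))        # ~194.7 -- a misleading conclusion (out)

clean = [a for a in ages if 0 < a < 120]
print(sum(clean) / len(clean))      # 33.8 -- much closer to the truth
```

The harder cases, as the examples below show, are the ones where the garbage is not an obvious outlier but a systematic bias baked into how the data was collected.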

For individuals who do not belong to a marginalised group, it is sometimes hard to see that we are always surrounded by structures of power. These power structures create systems of inequality in our society. Recently, more attention has been paid to how our societies are structured through a white supremacist framework. White supremacy is a worldview that positions white people as superior. This framework affects our perceptions, our decision-making, how we relate to others and how we produce knowledge. The way we construct scientific knowledge is affected by this, particularly in our advancements in AI and machine learning, which have real and dangerous effects for racialised communities.

As we know, science is meant to be a force for good. Advancements in scientific knowledge are meant to solve some of the issues our societies are facing today. For example, AI is meant to help us prevent problems by predicting different outcomes based on information we already have. However, it is important that we are critical about the data we are using to make these predictions. Science is not as objective or impartial as we like to think. Instead, it is shaped by the unconscious biases of scientists who live, work and research within these racist structures.

An example of such a problem in AI technology is how systemic racism impacts predictive policing. Predictive policing uses algorithms to find patterns in historical crime data and predict where crime is likely to occur. Police institutions use these predictions to target specific locations in order to prevent crime. However, the underlying data is often flawed: it over-represents low-income and racialised communities, which falsely suggests racialised people are more likely to commit a crime. In turn, the predictions lead to over-policing of these communities, which generates yet more skewed data and perpetuates the issue.
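
To make that feedback loop concrete, here is a minimal sketch in Python. It is my own illustration with invented numbers, not any real policing system: two neighbourhoods have the same true crime rate, but one starts out with more patrols, so more crime is recorded there, and the "prediction" sends even more patrols there the following year.

```python
import random

# A toy model, not any real policing system: two neighbourhoods share the
# SAME true crime rate, but neighbourhood B starts out with far more
# patrols. Patrols only record crime they are present to observe.
TRUE_CRIME_RATE = 0.05
patrols = {"A": 10, "B": 90}          # 100 patrols in total
recorded = {"A": 0, "B": 0}

random.seed(0)
for year in range(10):
    for hood, n_patrols in patrols.items():
        recorded[hood] += sum(
            1 for _ in range(n_patrols) if random.random() < TRUE_CRIME_RATE
        )
    # The "predictive" step: next year's patrols are allocated in
    # proportion to *recorded* crime -- the skewed data, not the true rate.
    total = sum(recorded.values()) or 1
    patrols = {hood: round(100 * recorded[hood] / total) for hood in recorded}

print(recorded)   # B's recorded crime dwarfs A's, despite identical true rates
```

Run it and the recorded statistics diverge sharply even though the underlying behaviour in the two neighbourhoods is identical; the algorithm has manufactured the very pattern it claims to have discovered.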

Similar problems also arise in the use of facial recognition. Facial recognition tries to protect the public by using surveillance technology to analyse our facial features and confirm our identity. However, it is often not accurately trained to recognise Black and racialised faces, which means racialised individuals are sometimes accused of crimes they did not commit. Bias in these AI technologies can lead to wrongful arrests, which adds to the inaccurate statistics reflected in crime data.
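
One way researchers have exposed this kind of disparity is by auditing error rates per demographic group rather than relying on a single overall accuracy figure, which can hide group-level failures. Here is a minimal sketch of such an audit; the records and numbers are entirely invented for illustration.

```python
from collections import defaultdict

# Hypothetical evaluation records: (group, truly_a_match, model_said_match).
# All numbers are invented to illustrate the audit, echoing studies that
# found far higher error rates on darker-skinned faces.
records = [
    ("lighter", True, True), ("lighter", False, False),
    ("lighter", True, True), ("lighter", False, False),
    ("lighter", True, True), ("lighter", False, False),
    ("darker", True, True), ("darker", False, True),    # false match
    ("darker", False, False), ("darker", True, False),  # missed match
]

stats = defaultdict(lambda: [0, 0])   # group -> [errors, total]
for group, truth, predicted in records:
    stats[group][0] += truth != predicted
    stats[group][1] += 1

for group, (errors, total) in stats.items():
    print(f"{group}: {errors}/{total} errors ({errors / total:.0%})")
# Overall accuracy is 80%, which sounds fine -- but every error falls on
# one group, a disparity a single headline number would hide.
```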

Indeed, Safiya Noble, a professor of Information Studies and African American Studies, shows in her book "Algorithms of Oppression" (2018) that racist algorithms are not only based on these racist structures but can actively create them. Big data does not just reflect society; it can also change how we perceive, understand and study the world around us. Challenging these algorithms means we have to be aware of how our histories, prejudices and biases are shaping these technologies, and how they are impacting the lives and futures of our communities.

Challenging Social Injustice Within Big Data

Until there are better regulations for AI technology, it is important that as scientists we are aware that we cannot always take data at face value. Instead, we have to be critical of how "garbage" data is distorting the way we see reality. To limit "garbage" data we need to ask ourselves: how was this data collected, and by whom? What biases might it contain? What are its implications: who is it harming, and who is profiting from this harm? Lastly, and importantly, we need to think about what kind of scientists we need to fix these problems. Critical STEM education has the potential to create scientists who directly challenge these issues and develop technologies that lead to a more equitable and socially just world.
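
Some of those questions can even be made concrete in code before any model is trained. As a minimal sketch (the groups, counts and thresholds below are all invented for illustration), we can compare who appears in a dataset against who lives in the population it claims to describe:

```python
# All groups, counts and thresholds below are invented for illustration.
population_share = {"group_1": 0.80, "group_2": 0.15, "group_3": 0.05}
dataset_counts   = {"group_1": 300,  "group_2": 550,  "group_3": 150}

total = sum(dataset_counts.values())
for group, share in population_share.items():
    observed = dataset_counts[group] / total
    if observed > 1.5 * share:       # arbitrary illustrative threshold
        note = "  <-- over-represented"
    elif observed < 0.5 * share:     # arbitrary illustrative threshold
        note = "  <-- under-represented"
    else:
        note = ""
    print(f"{group}: {observed:.0%} of data vs {share:.0%} of population{note}")
```

A check like this will not catch every bias, but it turns "what biases might this data contain?" from an abstract worry into a question we can start to answer.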

About She Maps

She Maps is Australia’s leading expert in drone and geospatial education. 

She Maps assists schools with purchasing drones, provides school-industry created drone and geospatial teaching resources, and delivers highly supportive teacher professional development.

 
You’re in Safe Hands!
She Maps is a CASA-approved commercial operator, authorised to fly microdrones indoors with students and teachers. CASA holds commercial operators to a higher standard than recreational users and educators. This means that She Maps has been assessed by CASA as having rigorous training and risk mitigation procedures in place.
 

Ready to buy drones for your school? We are an authorised DJI reseller in Australia.

Want some help?

Schedule a call with one of our team members to get some personalised recommendations.
