
Removing unconscious bias right from the start

What is unconscious bias?

Bias is a leaning in favor of or against a particular person, thing, or group that creates unfairness. Biases can stem from beliefs a person holds about a group of people or things and can feed into their actions unconsciously. Certain environments can trigger people’s unconscious biases.

Types of Bias

There are several types of unconscious bias a person may hold, but for brevity, we will highlight only three that commonly appear in technology development:

Bias can creep even into technology. Photo by Monstera from Pexels
  1. The first is sampling bias: The data used to train Artificial Intelligence can over-sample one community and under-sample another. Sampling bias means certain characteristics appear more often than they should and skew the results. Ideally, samples should be drawn randomly and should represent the population being modeled.
  2. The second is confirmation bias: Here, you already have a stance or preconceived notion you are trying to validate. Even when the data flies in the face of your hypothesis, you look only for information that aligns with your belief. The result is reduced objectivity and poor decision-making.
  3. The last is the bandwagon effect: In data collection, a trend may emerge, and incoming data may appear to confirm it. When this happens, developers may overrepresent the trend. The bandwagon effect is risky because the trend may disappear once more data is collected.
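To make the first of these concrete, the check below is a minimal sketch of how a team might flag sampling bias: it compares each group’s share of a dataset against its share of the modeled population. The function name, population figures, and tolerance threshold are illustrative assumptions, not part of any real Edtech product.

```python
# Minimal sketch: flag groups whose share of the sample deviates from
# their share of the population being modeled. All names and numbers
# here are hypothetical, chosen only to illustrate the idea.

from collections import Counter

def representation_gaps(sample_labels, population_shares, tolerance=0.05):
    """Return {group: observed_share - expected_share} for every group
    whose absolute deviation exceeds `tolerance`."""
    counts = Counter(sample_labels)
    total = len(sample_labels)
    gaps = {}
    for group, expected in population_shares.items():
        observed = counts.get(group, 0) / total
        if abs(observed - expected) > tolerance:
            gaps[group] = round(observed - expected, 3)
    return gaps

# Hypothetical reading-assessment dataset: region "A" supplies 80% of
# the samples but only 50% of the learner population.
sample = ["A"] * 80 + ["B"] * 20
population = {"A": 0.5, "B": 0.5}
print(representation_gaps(sample, population))  # {'A': 0.3, 'B': -0.3}
```

A check like this only surfaces the imbalance; correcting it still requires collecting more data from the under-sampled group or re-weighting the training set.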

Bias in Edtech

You are probably wondering how human biases relate to Edtech.

Let us bear in mind that humans develop Edtech tools, and in creating them, their biases can seep into the technologies they build. Bias can be present in any activity, from creating a course syllabus to assessing learners.

Take a child born in sub-Saharan Africa completing an Edtech course. The learning material is full of examples that focus on snow, winter, and temperate climates. Yes, the child can research these topics, but they will have no relatable examples. There may even be vocabulary specific to such climates that the child does not know, or subtleties that change the meaning of a sentence or paragraph. Artificial Intelligence (AI) that has not accounted for geographic and cultural nuances might flag the child as a low reader during assessments when that is not the case.

How to reduce bias in Edtech

Our goal should be to develop technology that is inclusive and unbiased. Photo by Katie Rainbow 🏳️‍🌈 from Pexels


The first step to reducing bias in Edtech is commitment. Developers must commit to becoming more aware of their implicit biases and confronting them, and they must learn to be more open-minded when building. Edtech firms must also commit by putting objective checks and balances in place to root out bias. Commitment from both businesses and developers needs to be continual: letting go of such prejudices is an iterative process and will take time.

Remove bias from the whole process

A common pitfall when addressing bias is focusing on only one aspect of the development process. For example, developers may realize that the dataset collected for their algorithms leads to harsher assessments for one group of students, so they identify the blind spots in their datasets and correct them. However, that is only one part of development. Especially for Edtech tools that use Machine Learning and Artificial Intelligence, everything from ideation to logical assumptions, content, datasets, training algorithms, user experience design, testing, evaluation, and implementation must be free from bias. Only then can prejudices be thoroughly eliminated.

Edtech tools should be user-focused

A user-focused approach leads to an iterative design in which every stage of the design process centers on the user. Many Edtech tools focus on students, and in such cases providers should spend time with them: learning from the students, understanding their cultural and social contexts, paying attention to nuances, and noting their specific needs and interests. Moreover, the users involved should be diverse to achieve a holistic approach.

A diverse team brings varying experiences. Photo from Pexels

Diversify the pool of developers

Beyond ensuring that datasets reflect broad perspectives and lived experiences, it helps when the developers themselves are diverse. Where possible, the people building and overseeing the software should resemble its end-users: developers need to think like users to improve the user experience, and that is much easier when they share the users’ backgrounds. Developers from different backgrounds also help double-check for bias, for example in the language used in communications and on the platforms.

Collaboration with stakeholders

Aside from the end-users, other groups will be interested in or affected by the Edtech tools created, including the community, investors, academic institutions, and more. Edtech providers should therefore partner with these stakeholders and incorporate their feedback into the design of their solutions. This ensures that the solutions meet the needs of end-users and satisfy the interests of stakeholders.


As humans, because of personal and societal beliefs, we tend to hold unconscious biases that favor or disfavor a person or group of people. Because humans develop Edtech, these biases can creep in, harm end-users, and ruin the user experience. Edtech providers can do a lot to ensure their technology is fair:

  1. Edtech providers must commit to fighting such prejudices in their digital solutions. Fighting unconscious bias requires continuous learning and adapting: an iterative process that incorporates any feedback received.
  2. From ideation to implementation, the whole development process must be rigorously checked for implicit biases.
  3. Edtech tools should have the user at their core, and the users involved in design should represent the broad range of people who will use the services.
  4. Ideally, the developers should be diverse to help identify biases that would have otherwise gone unchecked.
  5. Edtech companies and relevant stakeholders should make a joint effort so that removing implicit biases is a holistic endeavor.

Once these are ingrained in the business, we will be well on our way to having impartial Edtech tools.


About Chalkboard Education

Chalkboard Education provides a mobile-based, offline-first Learning Management System tailored for training in underserved communities. Lightweight, inclusive, and complete with full analytics capabilities, Chalkboard Education helps you reach your beneficiaries everywhere in the world, seamlessly. Currently used in 12+ countries across Africa, South America, and North America, Chalkboard Education is available worldwide.





