Break the Algorithmic Bias, or Bias in Technology
There’s an Instagram account I find interesting: @taskforce_kbgo. They’re focused on breaking gender bias online, including the bias in Google Translate.
Here’s the thing about the Indonesian language: it’s genderless. The third-person pronoun ‘dia’, for example, can refer to anyone regardless of gender. I personally think that’s a good thing. I prefer language as a neutral medium, not categorized into certain genders. But when it comes to translation, things get tricky, especially when the ‘translator’ doesn’t have the right context. The result can be confusing, or come across as biased.
Two weeks ago, Crispin Porter Bogusky London’s powerful campaign for International Women’s Day went viral. No matter how progressive we think we are, we can’t deny that unconscious bias can linger within us.
Geoffrey Colon (whose LinkedIn post was among the first I saw for this campaign) summed it up in one of his comments: the root cause of unconscious bias is the years of impressions and images passed down to us through media and imagery.
Then what is the root cause of Google Translate’s unfair results? The answer is algorithmic bias. Wikipedia describes it as systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others. Artificial intelligences are biased because they are created by humans. Those humans generate, collect, and determine the datasets and other variables the algorithm learns from to make predictions. The whole process is prone to biases, which can end up embedded in the systems.
Google Translate is a free service from Google that long relied on a translation algorithm called Statistical Machine Translation (SMT), which is based on language pattern matching. It breaks each sentence down into individual words or phrases and matches them against the ones in its database, then strings together the most frequently occurring translation for each component to construct the complete sentence. (Google has since moved to neural machine translation, but that, too, learns from the statistics of its training data.)
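To make that mechanism concrete, here is a minimal Python sketch of pick-the-most-frequent-match translation. The phrase table, its counts, and the translate function are all invented for illustration; the real system learns from enormous parallel corpora and is far more sophisticated.

```python
# Toy sketch of phrase-based pattern matching (not Google's actual code).
# Each source word maps to counts of translations observed in a hypothetical corpus;
# the "translation" is simply the most frequent option for each word, strung together.
from collections import Counter

phrase_table = {
    "dia": Counter({"he": 7200, "she": 2800}),       # 'dia' is genderless in Indonesian
    "adalah": Counter({"is": 9500, "are": 500}),
    "seorang": Counter({"a": 9000, "an": 1000}),
    "dokter": Counter({"doctor": 9900, "physician": 100}),
}

def translate(sentence: str) -> str:
    """Translate word by word, always choosing the most frequently seen option."""
    return " ".join(phrase_table[w].most_common(1)[0][0] for w in sentence.lower().split())

print(translate("Dia adalah seorang dokter"))  # -> "he is a doctor"
```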
It depends heavily on that database, so when we translate individual words or a handful of sentences, the accuracy is relatively higher than when we translate a whole page or article. That’s because Google Translate doesn’t apply context; the translations are basically just statistics. So if there are thousands of usages of the word ‘CEO’ in the database and most of them refer to men, Google will automatically use that information and translate a ‘CEO’ as a ‘he’.
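Here is the same kind of toy sketch applied to a genderless pronoun: with nothing but co-occurrence counts to go on, the most frequent pairing wins every time. The counts and the resolve_pronoun helper below are hypothetical, not Google’s actual data or code.

```python
# Toy illustration of how pure co-occurrence statistics gender a genderless pronoun.
# The counts mimic a skewed corpus where 'dia' near 'CEO' was mostly rendered as 'he'.
from collections import Counter

cooccurrence = {
    "ceo":   Counter({"he": 9100, "she": 900}),
    "nurse": Counter({"he": 800, "she": 9200}),
}

def resolve_pronoun(job_title: str) -> str:
    """Pick whichever pronoun co-occurred with the job title most often."""
    return cooccurrence[job_title].most_common(1)[0][0]

print(resolve_pronoun("ceo"))    # -> "he"
print(resolve_pronoun("nurse"))  # -> "she"
```

No matter what the sentence actually meant, the minority pairing never gets picked, so the skew in the data becomes a rule in the output.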
I took a shot and combined phrases and sentences from the @taskforce_kbgo and CPB London posts in Google Translate, and got the same results.
Disappointed, but not surprised. Men are ‘translated’ as people who hold higher-level jobs and are personally tougher than women. Most AI models are trained on large open-source datasets compiled by scraping content, including images, from the internet. The big problem is that the internet lacks gender-representative datasets and is littered with misinformation, disinformation, and xenophobic and sexist content. The Centre for International Governance Innovation, a think tank on global governance, believes that without the necessary filters and mitigation in place, generative AI tools are being trained on and shaped by flawed, sometimes unethical, data.
Gender bias in technology will not only give us sexist and stereotypical results, but also affect the future of women in tech.
As the Executive Director of UN Women, Sima Bahous, said in her remarks to participants of CSW67, “the gender digital divide is the new face of gender inequality.” That means girls are disadvantaged when it comes to digital adoption, have lower levels of access to and use of digital technology than boys, and often do not benefit from digital technology in the same way boys do.
So how do we improve fairness in artificial intelligence outcomes? I believe it all comes back to those who created it: humans. Project F has done a great job of summarizing the steps we need to take to break the vicious cycle and improve algorithm performance. Two of them are Inclusive Workspaces, where we create inclusive work environments that challenge stereotypes and promote collaboration and well-being, and Support Girls in STEM, where girls are supported with mentorship, internships, and awareness programs to pursue STEM education and careers.
Updated on Friday, July 26, 2024