Ask Alexa a question about gender bias and she will most likely not be able to answer you. Why, even in the 21st century, is there still such a bias?
Children ask the most unexpected questions.
My little one put Alexa on the spot with one such question and the algorithm copped out. The question was simple enough, “Why is it called mankind and not womankind when it is the woman who gives birth?”
Alexa heard it, processed it and came up with, “Sorry, I’m not sure.”
The kid smirked. I smirked. And we agreed that Alexa’s answer should have been, “It is neither mankind nor womankind; the correct word is humankind.”
But you know how kids are: once they smell defeat, they want to keep going. The next question was, “Are women better drivers than men?”
Alexa’s reply was no different.
Next came “Are women better leaders than men?”
Different questions but same answer.
We wondered if some exception triggered this response to gender comparison questions. Because if the dataset behind the algorithm was large enough, it should have produced an answer to both questions, and given the data, that answer would likely have favoured men.
Whether it is driving or leadership, men overwhelmingly occupy those positions, and their recorded success rates are correspondingly high. In cases of gender bias, the bigger the dataset, the bigger the problem.
In practically all technological fields, there is an overrepresentation of men. This is simply because women started getting into the workforce practically a fortnight ago, in comparison to men. If you don’t believe me, just look at the share of women among Nobel laureates: a dismal 3.29 percent.
As for the written word, the presence of women in text is not very encouraging. In the British National Corpus, ‘Mr’ occurs more often than ‘Mrs’, ‘Miss’ and ‘Ms’ combined. Biased descriptions in newspapers don’t paint a pretty picture either: men are more frequently described in terms of their behaviour, while women are described in terms of their appearance and sexuality.
I’m almost afraid to ask Alexa to make a sentence using the words “bitchy” and “bossy” together. Add to this the fact that the developers behind the design of AI technologies are almost all male. Pretty soon, Alexa, Sophia and Vyommitra all start resembling the Stepford wives.
The creators of AI have been raised in a patriarchal society which repeatedly devalues women. And the historical data seemingly backs up the claim that men are inherently more capable than women.
We’ve all heard of Amazon’s self-learning resume filtering algorithm that ranked male applicants favourably over equally qualified female applicants. A machine learning model is only as good as the data fed into it, and the bias (conscious or unconscious) that comes with that data.
Feed a model tons of historical data laced with gender bias, and it doesn’t just perpetuate that bias; it reproduces it more efficiently and even amplifies it.
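To see how this happens, here is a toy sketch, with entirely made-up data and a deliberately naive word-counting “model” (nothing like Amazon’s actual system): when past hires are overwhelmingly male, a model trained on those records learns to penalize words that merely correlate with women’s resumes.

```python
# Toy illustration of bias amplification in training data.
# All resumes and the scoring scheme are hypothetical, for demonstration only.
from collections import Counter

# Hypothetical historical records: past hires were overwhelmingly male.
hired = [
    "chess club captain", "football team lead", "chess tournament winner",
    "football coach", "chess club member",
]
rejected = [
    "women's chess club captain", "women's debate team lead",
]

def train(hired, rejected):
    """Weight each word by how much more often it appears among hires."""
    pos, neg = Counter(), Counter()
    for resume in hired:
        pos.update(resume.split())
    for resume in rejected:
        neg.update(resume.split())
    return {w: pos[w] - neg[w] for w in pos | neg}

def score(resume, weights):
    """Sum the learned weights of a resume's words."""
    return sum(weights.get(w, 0) for w in resume.split())

weights = train(hired, rejected)

# Two equally qualified candidates; the only difference is one word,
# yet the model ranks the second candidate lower.
print(score("chess club captain", weights))
print(score("women's chess club captain", weights))
```

The model never sees gender as an explicit feature; it simply learns that “women's” appeared among rejections, which is exactly how the reported Amazon system came to penalize resumes mentioning women’s colleges and clubs.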
Years and years of patriarchy have provided us with the most skewed datasets and training models. And letting these loose into the world will only reinforce gender stereotypes and take us right back to where we started.
Solving for unfairness, be it with regard to race, culture or gender, is a very real problem. And women certainly do not need another voice in the system that drags them down further.
Picture credits: Pexels
Roopa Prabhakar describes herself as a mother, a working woman, a closet feminist and blogger.