Google “human evolution.” The theory of human evolution is almost always represented by the lives of men, as if men have been taken to represent all of humankind. When it comes to the other half of humanity – women – there is often nothing but silence. And these silences are everywhere.
Films, news, literature, science, city planning, economics, the stories we tell ourselves about our past, present and future. And now even our “advanced scientific innovations” have come to carry forth this legacy of “missing women.”
Caroline Criado Perez, author of ‘Invisible Women: Exposing Data Bias in a World Designed for Men,’ has found that many of the services and products we interact with on a daily basis were designed with sexism hiding in plain sight.
For example, research shows women are more likely than men to own an iPhone. The average smartphone screen is now 5.5 inches, and that massive screen is certainly impressive.
But it is interesting to note that the average man can use his device one-handed fairly comfortably, while the average woman’s hand is not much bigger than the handset itself. More annoying is the justification offered: phones are no longer designed for one-handed use, and besides, many women actually opt for larger phones, thanks to their ‘handbags.’
But women carry handbags in the first place because, thanks to gender stereotypes, women’s apparel rarely (and barely) has pockets. So designing phones to be handbag-friendly rather than hand-friendly feels like adding injury to insult.
Plus, it’s rather odd to claim that phones are designed for women to carry in their handbags. Especially when so many passive-tracking apps clearly assume your phone will be either in your hands or in your pockets at all times instead of in your bags.
Anyway, since we’ve been talking about phones, let’s also talk a little about a more popular feature of our devices – our virtual assistants. Voice is going to become the chief way that we make our wants known to our “hi-tech” devices and when they respond, they will do so with female voices.
I mean, most virtual assistants like Siri, Cortana, and Alexa powered by artificial intelligence (AI) by default have female names, voices, and mannerisms. For example, Alexa, named after the library of Alexandria, could have been Alex.
Apple’s Siri translates to “a beautiful woman who leads you to victory” in Old Norse. Microsoft’s Cortana’s namesake is a fictional synthetic intelligence character in the Halo video-game series; she has no physical form, but projects a holographic version of herself. As designers make these choices, they run the risk of reinforcing gender stereotypes. How, you might wonder? Well, it’s always been like this!
In the late 1990s, BMW had to withdraw the female-voiced navigation system from its 5-series cars. Why? Because German drivers complained that they did not want to take instructions from a female voice.
Further, BMW was guilty of sexism even in how its customer service representatives tried to calm the callers down: by explaining that men had actually been behind the making of both the car and the directions.
According to Prof. Clifford Nass of Stanford University, German drivers at that time considered a female voice in their GPS a deal breaker. Taking directions from a woman, essentially in the ‘driver’s seat,’ challenged the traditional ideas of their society.
Moreover, men tend to consider women less knowledgeable and distrust a female voice offering directions. His research also showed that a call centre in Japan operated by Fidelity relied on an automated female voice to give stock quotes, but transferred customers to an automated male voice for transactions. His findings suggest that even iPhone owners in the U.S. might be more inclined to overlook a male Siri’s shortcomings than a female Siri’s.
Consider IBM’s Watson, an AI of a higher order: it speaks with a male voice as it works alongside physicians on cancer treatment and handily wins Jeopardy. When choosing Watson’s voice for Jeopardy, IBM went with one that was self-assured and had it use short, definitive phrases, both typical of male speech. And since research shows people prefer to hear a masculine-sounding voice from a leader, Watson got a male voice.
Hence, our 21st-century digital universe is fraught with 17th-century social problems, problems that stem from tendencies deeply ingrained in our brains and now skilfully programmed into our machines.
Although they lack bodies, virtual assistants embody what we think of when we picture a personal assistant: a competent, efficient, deferential, and meek woman who does every big and small task she is ordered to, yet remains always polite and obeisant. Thus, it’s no accident that Siri, Cortana, and Alexa all have female voices, female names, and female mannerisms “by default.” In these instances, technology adapted to pre-existing stereotypes, and helped to perpetuate them. The gendering of AI became inevitable.
UNESCO’s 2019 report titled ‘I’d Blush If I Could’ mentioned that “their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves. To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.” The title of the report is, in fact, a reference to the standard answer given by the default female-voice of Apple’s Siri, in response to insults from users.
The whole point of having a digital assistant is to have it do stuff for you. You’re supposed to boss it around. But if we’re going to bring home machines which can be given orders so casually, why do so many of them have to have female voices, names, and mannerisms? The simplest explanation is that people are conditioned to expect women, not men, to be in “assistance” roles. And that the makers of digital assistants are influenced by these social expectations.
Our historical familiarity with telephone operators, who were traditionally always female, has made us accustomed to getting help from a disembodied woman’s voice. VAs’ predecessors, recorded voicemail systems, were predominantly female as well.
Why, then, are most computerised voices female? Perhaps because consumers expect a friendly, helpful female in this scenario, and that is what companies give them.
In a 2018 interview with Business Insider, Daniel Rausch, Amazon’s vice president of Smart Home, commented, “We carried out research and found that a woman’s voice is more ‘sympathetic’ and better received.”
There is psychological research showing that lower-pitched voices are perceived as more authoritative by both men and women, while higher-pitched voices are associated with submissive, helpful and caring characters. And guess who has the higher-pitched voices, and is therefore more profitable when coded into an AI? The companies behind AI are cashing in on “this” bias, because doing so makes your “Alexa” look more and more real. But when we can only see a woman, even an artificial one, in that position, we enforce a harmful culture.
According to Psychologist James W. Pennebaker, women use more pronouns and tentative words than men. Pronoun use, particularly of the word “I,” is indicative of lower social status.
AI assistants are very prone to using “I,” particularly in taking responsibilities for mistakes. Ask Siri a question she can’t process and she says, “I’m not sure I understand.”
So does Siri talk like a woman? Should she? Data consistently show that women use I-words (I, me, my) about 12% more often than men. Women also use “cognitive” words (understand, know, think) and “social” words (words that relate to other human beings) more than men. Replicating these patterns in the language of AIs might serve to perpetuate the existing gender inequalities in society.
And we like that!
My very well-educated friend, a student of computer science, asked me: what’s the big deal? So what if Alexa is a woman? These are the same people who recently had problems with male TikTokers because they dressed as women, mimicked them, and/or performed to female background scores.
What’s the big deal? So what if he’s “being a woman”, isn’t that voice pleasing any more? Oh, women assistants are okay but men behaving like or doing things “usually” done by women IS NOT okay. And you say this is not stereotyping?
Because we have been “taught” that a nurse is always a woman and a doctor is always a man. Perception matters. We cannot “not see” women being obedient assistants. This is how it has always been and this is how it should always be.
We can keep building rockets and landing on Mars, but women, my God, they have to continue being the servile assistants they were born to be. Otherwise, what kind of progress would we even have to boast of?
Why do you even want a machine to have a gender? A machine is all about science, and gender is social. Gender is the crude way society categorises a person as a man or a woman.
So, why do you want to put this social limit on science and technology? You advocate for permissionless innovation but seek society’s permission to choose the voice of your AI. Could you be any more hypocritical? (FYI, innovation is transformative, not regressive!)
One of the Amazon Echo ads shows how Alexa is now a part of the family. But if AI is a simulation of human intelligence, whom does it simulate, and why does it have a gender? Have you ever even asked where “this so-called” research, which companies keep citing to justify Alexa’s being a woman, derives its data from?
Were its respondents mostly men? Did it have the “required diversity” to so quickly generalise and pass the verdict that “people preferred Alexa’s womanly voice”? Were the questions asked in the survey completely unbiased? And were the researchers themselves free of their personal biases?
Have the companies reported any sampling errors? Because as far as my knowledge of statistics goes, a sample is never a perfectly accurate representation of the population; that is exactly why its result is an estimate.
And if we should always do what our consumers desire, then consider this: consumers want more and more products at the same prices, with heavy discounts, regardless of reason and season. Did companies do that?
Economics might tell you that consumers are rational, but we know that rationality is assumed, which means it may not hold in reality. Had our consumers been rational enough, they would not be buying AIs in the first place while regularly complaining of indolence-induced health problems!
The field of AI is predominantly white and male. Eighty percent of AI academics are men; just 15 percent of AI researchers at Facebook, and just 10 percent at Google, are women. You want to retain the female voice, but how many times did you outrage for more female scientists and/or AI researchers? Have you ever asked companies what they have done to incentivise the induction of more female AI researchers in their units?
Ask yourself: if there is an option of a genderless voice assistant, why are you so hell-bent on retaining the female voice? You are ready for genderless or male “alternatives” only alongside the existing female one, just so you get to retain the female voice option. Because, you know, you will still always have the option to boss a female around, and can thus keep gratifying your patriarchal urges, right?
As I’d written earlier in The HuffPost, why must men and women have “essential traits”? We must not chain ourselves to the ‘shoulds’ of gender-based preconceptions and misconceptions. But here we are, busy gendering technology and fighting to justify why it’s important for technology to have a gender.
We already have too many people in society (and sometimes even in our homes) who stereotype. For example, as recently as February 2020, the Indian Supreme Court had to remind the Indian government that its arguments for denying women command positions in the Army were based on stereotypes.
And a recent UNDP report entitled ‘Tackling Social Norms’ found that about 90% of people (both men and women) hold some bias against women. Do we really now need to pay for a device that “further” reinforces gender stereotypes?
Should we actually ALSO make science join hands with society to proliferate the existing subjugation of women? We haven’t fixed sexism or racism or any other -ism in humanity yet, but here we are, halfway through coding them into technology.
Technology is, and should be, meant to benefit all members of society, regardless of their age, gender, religion or status, rather than replicate human biases, perpetuate disparities or widen the gap between the haves and have-nots.
Hence, I am asking Amazon to lead the way by removing problematic gender stereotypes in Alexa. Let’s #UngenderAI together.
It’s high time that we shoulder our responsibility of creating and inspiring technology that is not only effective but ungendered enough to break gender stereotypes.
P.S. To my Computer Science Engineer friend who asked “so what if Alexa is a woman”: I think you should go and check your course registration again. Maybe you signed up for a 17th-century history course, and you’re living in it even now!
Picture credits: This Is Engineering on Pexels
Women's Web is an open platform that publishes a diversity of views. Individual posts do not necessarily represent the platform's views and opinions at all times. If you have a complementary or differing point of view, sign up and start sharing your views too!