Alexa, Tay, Siri, Cortana, Xiaoice, and Google Now. These technologies all have one thing in common – they are digital servants aimed at a mass-market audience that feature a “female” voice or persona.
Am I the only one who thinks this is more than a little creepy?
No doubt you’ve also noticed that I’ve been putting the word “female” inside quotes, and that’s very deliberate on my part. Despite how these technology solutions are packaged to look, sound, or, in the case of Tay and Xiaoice, behave, these digital characters are not female. It’s all fake.
They are not a “they” at all, or for that matter, a “she.” Instead, it’s code that’s been programmed to have a “female” persona.
Note: In some territories, such as the UK, the default persona for Siri is “male.”
I find it doubly strange, given the huge gender disparity in the technology workforce, that these companies would choose to create digital assistants and chat bots that seem to be almost exclusively “female.”
Based on figures released for 2015, the gender breakdown for the tech workforce (as opposed to the workforce as a whole) at Microsoft, Google, Apple, and Amazon was as follows:
- Microsoft: 83.0% male | 16.9% female
- Google: 82.0% male | 18.0% female
- Apple: 79.0% male | 22.0% female
- Amazon*: 61.0% male | 39.0% female
(* these are figures for the company as a whole, as Amazon didn’t break its workforce figures down by sector)
So not only do we have code that’s been programmed with a “female” persona, but that code has originated from companies where the workforce is overwhelmingly male.
And it’s not just the voice or persona of the digital persona we interact with that is biased. The results of those interactions also demonstrate male favouritism. It took Apple more than four years to fix Siri’s responses to questions about abortion services, and yet the company didn’t seem to have any problem programming Siri to search for prostitutes and Viagra.
This is a side of Siri that Apple didn’t highlight in any of its commercials.
In the words of freelance journalist Amanda Marcotte, “Siri behaves much like a retrograde male fantasy of the ever-compliant secretary: discreet, understanding, willing to roll with any demand a man might come up with, teasingly accepting of dirty jokes.”
Now, don’t ask me whether giving digital servants “female” traits objectifies women; that’s an issue I’m going to leave to those better qualified than I am to discuss (I’m particularly interested in hearing what women think about this). I’m also going to avoid commenting on whether this is a form of sexism (again, that’s far outside my area of expertise). That said, this is 2016, and tech firms are still holding events at which women are hired to wear revealing costumes and dance on podiums. Yes, this happened at a Microsoft party at the Game Developers Conference just the other month. And while promises were later made to “do better in the future,” this is still a thing that happened in 2016.
Diversity and inclusion are, sadly, still a very real problem for even the biggest and highest-profile tech firms. And it’s plainly still leading companies to make some disappointing and regressive choices.
Think this is a new phenomenon? It isn’t.
Back in the 1990s, Wildfire Communications developed a digital “secretary” that featured a “female” voice and persona that was revolutionary for its time. Every in-car GPS receiver I’ve owned (and I’ve owned a lot of them) has come with a “female” voice as the default. The reason for this, or so I’ve been told, dates back to World War II, when female voices were used for airplane navigation devices because the voice would stand out from those of the male pilots.
It’s also a recurring theme in pop culture, from the computer in the TV show Star Trek and the movie Alien, to the artificial intelligence character in the video game Halo, which is where the digital assistant in Windows 10 gets its name.
In fact, off the top of my head I can only think of two examples of fictional AI with a “male” voice: Tony Stark’s J.A.R.V.I.S. and KITT from Knight Rider. There are undoubtedly more (answers on a postcard, please), but my mind is a blank after those two.
Oh, wait, I just remembered about HAL 9000 (how the heck did I forget about HAL?).
So I’ve already proved to myself that fictional AI has better gender diversity than what we see in the real world. That’s a pretty sad state of affairs.
Let me be clear, though: I doubt that a male gaze pervading tech firms is the only reason behind this army of “female” digital servants. I’m certain that there are lashings of marketing influence, focus group data, cultural norms, and “doing what others have done rather than reinventing the wheel” going on. And it’s pervading tech at an alarming rate. Just the other day I installed a Nest smoke alarm and a Foscam IP camera, both of which talked at me with “female” voices.
If I were simply a tech consumer, I might only find it a little odd that all these digital servants are “female.” But when I factor in issues such as gender disparity, and the fact that tech sectors such as gaming are suffering from some severe gender-related problems, things start taking a turn for the worrying.
Take just a small subset such as graphics cards. I honestly can’t remember the last time I bought a graphics card that didn’t come in a box with a “female” wearing what can only be described as “lingerie battle-armor” printed on it (has that ever been “a thing” at any point in history? I doubt it). It’s like the packaging has been designed specifically to target the teenage protagonists of the 1985 movie Weird Science.
I certainly hope that when it comes to the bots and digital assistants of the future, tech companies don’t make the same mistakes the gaming sector made. There, the problem of inequality was allowed to get to a point where not only was the sexism and bias not seen as a problem, but it became such a norm that there was a backlash when it was pointed out how bad the problem had become.
Not only is diversity essential, but it’s important that these digital assistants can cater for the needs of all users, and not be an unconscious reflection of the wants, needs, and desires of their creators.