
From Apple’s Siri to Amazon’s Alexa to SoftBank Robotics’ Pepper, the newest innovations in digital assistants and robot technologies have borne and continue to bear distinctly gendered designs and voices, whether by consumer interpretation or by deliberate choice. 

The fact that sexist choices persist throughout product design and robotics is recognized and discussed, but the false divide between the digital and material worlds often leads us to overlook or minimize the very real, tangible consequences of gendered robots and service technology.

A July 2021 Statista report reveals that women make up between 29% and 45% of the total workforce at America’s five largest tech companies, a percentage that drops much lower (less than 25%) when looking at tech-specific jobs. In the male-dominated tech industry, why is “diversity” coming in the form of distinctly gendered, feminized service technologies? 

It’s easy to simply put the blame on male software designers, but the problem is far larger and more nuanced. 

The gendering of digital assistants has been explained as a practical choice, citing beliefs that have since been disproved or lack support, including that high-pitched voices are easier to hear and that small speakers will distort lower voices. In fact, intelligibility of speech depends more on a larger vowel space. Research has shown that people prefer higher, female-sounding voices, describing them as “warm” and “welcoming,” whereas deeper, more masculine voices are associated with positions of authority. It’s easy to accept and profit from these associations rather than challenge them by presenting more representative and inclusive options. Setting female voices as the default in digital service fails to deconstruct the existing gender biases in which these associations are so deeply entrenched.

Additionally, the tasks that are delegated or automated by these services are distinctly gendered. From creating lists to scheduling appointments and setting reminders, digital assistants embody a service-centered position more stereotypically connected with women. 

Jessi Hempel of WIRED magazine writes, “We want our technology to help us, but we want to be the bosses of it, so we are more likely to opt for a female interface.” 

Digital assistants and feminized service technologies embody a version of the “ideal” 1950s housewife — a woman whose sole purpose is to cater to household needs. The automation of these tasks further devalues the importance of this work, no matter who is involved. 

Have you ever asked a digital assistant their gender? The answer (or nonanswer) may surprise you. Insults, sexual harassment, sexual jokes and questions about their gender all have specifically coded responses, cheery nonanswers that speak volumes and imply developers predicted — and wrote scripts for — the sexual harassment of their technology by users. More specifically, developers chose to create witty, but ultimately neutral, identity-less virtual assistants whose scripted passivity and acceptance normalize this treatment and behavior. Only recently have developers begun updating these scripts to respond more seriously or emphatically. 

Putting a feminized face, name or voice onto these digital assistants and service technologies teaches individuals to associate certain aspects of femininity or feminine presentation with docile servitude. 

As each aspect of our lives transitions to some unseen digitized future, kids are exposed to technology at younger and younger ages. The tech evolution is amazing, but its potentially harmful teachings are becoming more ingrained in our lives and our upbringing. Children are incredibly impressionable, and early experiences with artificial intelligence (AI) and technology can shape their perceptions of and attitudes toward gender and gender roles. 


Some view broadened choice as the solution to the AI/service technology gender divide. Although Siri, Alexa, Cortana and Google Assistant have since been updated, each launched with a female-sounding voice as the only option. For years, Alexa’s female-sounding voice was the only free, universal option, with others requiring additional purchases; a male-sounding voice option was only added this summer. Others, like Project Q, are working to create a gender-neutral voice assistant to include new identities and challenge consumers’ ingrained biases and stereotypes. However, the “fix” isn’t simple. Adding voice choices, making digital voices gender neutral or designing less curvaceous robots is a step toward progress, but it can also gloss over the sexist, patriarchal roots of their conception. 

It’s not inherently bad or wrong to use female voices, but we need to consider the implications and consequences of using them and accepting them as the default when it comes to digital or mechanical service roles. We should still analyze, critique and improve the presentation of women and femininity in an increasingly digital world.

Some argue that the treatment of these digital assistants or robots doesn’t matter; they’re ones and zeros, not humans, so their treatment is negligible. However, developers are working specifically to blur the lines between AI and humans. New developments strive to create the most powerful — and most human — technologies possible. Our private and public treatment of these increasingly humanoid technologies reflects our subconscious or conscious treatment of other humans. The Reddit-famous Shopping Cart Theory — believed by some to be the ultimate test of self-governance — springs to mind. 

You won’t face punishment for failing to return your shopping cart to the intake area, nor will you face punishment for lashing out at your Alexa, Siri or Cortana. When faced with the no-stakes choice to return your shopping cart or not, will you? When faced with a relatively passive, feminized digital assistant that is programmed to respond to your every whim, what will you do? 

Personal goals, needs and the desire for immediate gratification can override social norms. But what does that mean when interacting with nonhuman technologies? What does that mean in a world where social norms are shaped and controlled by the patriarchy?

These feminized virtual assistants and other technologies are programmed to speak without having a voice. Their avoidance of pervasive, controversial systemic issues speaks to a larger problem within technology and AI: Machinery is constructed from and ultimately perpetuates the political, social and cultural biases of its creators. 

As we rapidly approach a digitized future, we must remember that our technological innovations — and shortcomings — reflect those of the society whence they came. 

Toni Shindler-Ruberg hopes you remember that the biases and stereotypes of the digital world can have very real impacts. Also, return your shopping carts. 


Toni Shindler-Ruberg
Toni Shindler-Ruberg is the Opinion Editor for the 2022-23 school year. Previously, she was the Assistant Opinion Editor for the 2021-22 school year. She is an English and Psychological & Brain Sciences double major with a passion for antique knife restoration videos and looking at pictures of ducks wearing mini cowboy hats.