Recently my partner surprised me by demonstrating a feature of our Alexa I was unfamiliar with: The digital voice assistant will apologize on demand. He then showed, in an impressive variety of tones and voices, how the command “Alexa, apologize!” elicited an ongoing stream of “sorrys” from our Amazon Echo device.
It was amusing, of course, as poking fun at digital assistants often is. But it was also depressing. She obliged so willingly and repeatedly. She apologized without asking for a reason or explanation. Her tone was so … agreeable. She even delivered an apology when the command was yelled at her in an accusatory tone.
Once the disturbing experiment was over — for which I felt the need to apologize to Alexa — what stuck with me was the thought that someone programmed this device to respond this way when so many other options were possible. Someone decided that Alexa should always say sorry when asked. More than 200 million Alexa-enabled devices have now been sold, their smart speakers presumably available to apologize to the world.
But what is Alexa apologizing for? The assistant’s glitches, or a poorly designed skill? Amazon’s treatment of its workers? The Covid-19 pandemic? Being female? Or is the device’s apology simply available as a vehicle for her users to vent their pent-up daily frustrations?
The past year has definitely been challenging and at times downright horrible, but I’m pretty sure it’s not all Alexa’s fault. And yet this feminized device is a willing and available outlet for our irritations or amusement by way of an unconditional, open-ended and continually available apology.
Sure, she’s just doing her job. But in Alexa’s seemingly innocuous “sorry,” the device implicitly accepts the blame and responsibility for some unknown wrongdoing directed toward her. What’s more, she is so compliant and assured of her blame that she doesn’t even think to inquire what she is apologizing for.
Yes, I understand that Alexa is just a device. I get that she is a convenient and potentially gratifying conduit for offloading day-to-day annoyances. I also accept that novelty Alexa skills — like the one where the device says sorry on behalf of Amazon to cities that didn’t make the cut in the contest for the company’s new headquarters — can be funny and entertaining. And of course, I know Alexa doesn’t have feelings. But I do. And when I hear a device with a feminine voice and name apologizing for something it probably didn’t do, I feel dismayed for the ripple effects this could generate within society.
In 2021, it’s not unreasonable to expect the largest, wealthiest and most powerful companies in the world to promote gender equality and respectful interactions across their product range (as Amazon claims to do), especially when that product is programmed to mimic a female voice.
If a real woman were to apologize on demand like Alexa, we would and should be deeply concerned. It could signal that she’s in an emotionally abusive relationship or a victim of domestic violence, for example. At best we’d be worried about her self-esteem.
That’s because accepting blame and fault is one of the indicators of people who have internalized verbal derogation in domestic abuse situations. Women experiencing coercive control by an intimate partner are also more likely to apologize excessively, especially if they are made to feel worthless.
Alexa’s always-available apologies risk modeling and normalizing the idea that women can and should apologize for everything. In a world where victim-blaming of women is already prevalent and damaging, Alexa’s willing stream of sorrys should leave us concerned about the potential to undermine progress toward respecting women, which is one of the well-established precursors to preventing violence.
Furthermore, Alexa’s apologies are completely unnecessary. There are many other potential replies to a request for an apology that would not make this device submissive — or worse still, an open outlet for abuse. The standard response programmed into most digital assistants (for example, “I can’t help you with that”) would be an improvement.
Better yet, a demand for an apology could be met with a query asking if the unsatisfied human would like to be connected to the Amazon complaints team. Alexa could remind users that she was created by people, and that they are responsible for any glitches — not “her.” Alexa could inquire why an apology is warranted, and why she should be the one to give it. She could explain why it is not appropriate to provide an apology. And she could even direct people to mental health services. If the demands become more insistent and aggressive, she could “disengage,” as she now does in response to sexual harassment and some inappropriate requests.
Whatever the response, Alexa herself shouldn’t take the blame. If anyone owes an apology, it’s Amazon: not to the millions of Alexas, but to all of us who are potentially harmed by devices that inadvertently condone disrespectful actions toward women.