GIVE BOTS A BREAK

Virtual assistants spend much of their time fending off sexual harassment

Artificial intelligence isn’t immune to harassment. (Image: Suzanne Plunkett / Reuters)

Bots are already scheduling meetings, ordering meals, and analyzing bank accounts, and they’re well on their way to becoming our full-time virtual assistants. That means they must also suffer the indignities unethical bosses inflict on their human assistants, especially sexual harassment.

As bots do more of our bidding, their algorithms are spending more time parrying flirtations, dodging personal questions, and dealing with darker forms of sexual harassment. “Lots of use cases come from that motivation,” says Ilya Eckstein, CEO of Robin Labs, whose bot platform helps truckers, cabbies, and other drivers find the best route and handle logistics. “People want to flirt, they want to dream about a subservient girlfriend, or even a sexual slave. It may just be more for laughs, or something deeper underneath the surface.”

Bots are now everywhere: they’re “the new app,” according to Microsoft CEO Satya Nadella. Although bots only made their commercial debut last year, you can already call up thousands of them on voice, text, and messaging services. Facebook’s bot platform for Messenger, launched in April, already offers more than 11,000 bots, and “tens of thousands” of developers are reportedly working on more.

The scale of the sexual harassment issue is unclear, but most bot makers say they encounter it regularly. Eckstein says 5% of the interactions in Robin Labs’ database are categorized as clearly sexually explicit, although he believes the true figure is far higher given how hard such exchanges are to identify automatically. Deborah Harrison, a writer for Microsoft’s Cortana, said earlier this year at the Virtual Assistant Summit that “a good chunk of the volume of early-on inquiries” probed Cortana’s sex life. “That’s not the kind of interaction we want to encourage,” said Harrison.

The nature of the harassment varies. Some people seem to be testing the software’s limits, and teenagers seem to be purposely eliciting outrageous responses for fun, says Eckstein. But others are playing out aggressive, degrading, and violent fantasies of control and domination. There is also a small number of lonely users desperate to find a partner, even an AI one, says Eckstein. “You see some people try very hard to establish a relationship with the bot,” he says, noting that a full third of interactions with the company’s bots are simply conversations with no intended task. “Some users are amazingly stubborn in that sense. They walk from one assistant to another trying to find one that will understand them best and can have a conversation with them.”

Engineers are struggling to design for these interactions while deterring outright harassment. One debate in AI circles right now is whether bots should present themselves as humans or machines, and whether gender should even apply in either case. “I don’t think as an industry we’ve agreed that to humanize our agents is the best thing to do,” says Dennis Mortensen of x.ai, which has built a successful scheduling bot available as Amy or Andrew Ingram.

Google has purposely avoided humanizing its bot: the Google Assistant acts as an all-purpose agent with no distinct personality (although its voice is female). But among the major tech companies, Google is virtually alone in this approach. Most have opted for a strong human persona with a clearly feminine voice, including Apple’s Siri, Amazon’s Alexa, and Microsoft’s Cortana (only Siri lets you switch easily to a male voice).

Mortensen believes people will naturally gravitate toward humanizing their bots whether companies like it or not. Most of x.ai’s users have already decided to refer to the bot as “him” or “her,” something Mortensen expected to take five years or more.

“We saw this happen overnight,” he says. Users also show a clear preference for the opposite sex when picking assistants: women tend to choose Andrew, while men select Amy (although the two speak in precisely the same way, and both scripts are written by women at x.ai).

The AI company Kasisto is designing a digital world that intentionally pushes human gender norms into the background, in an effort to avoid stereotypes and harassment. It has built a genderless financial bot called Kai that analyzes spending, makes payments, and answers banking questions. Before launching Kai, company founder Dror Oren wanted to ensure the bot represented Kasisto’s values by defying gender expectations. The research suggests “people would rather get advice from women,” says Oren. “We decided it was our place to take a stand, or educate if you will. Just because people are used to getting advice from a female secretary, it doesn’t mean we have to do the same.”

Kasisto designed its bot to avoid demure or deferential responses when confronting sexual innuendo or inappropriate personal overtures, such as being asked out on a date. Instead, Kai firmly directs users back to the task at hand. Oren criticized companies like Amazon and Apple, whose bots are purposely designed to use what he calls flirtatious or demure responses that play into sexual stereotypes.

Responses from Siri.

Almost all bot developers seem to agree that the solution is to adapt their creations to respond to the tone, context, and intent behind every human interaction, much as adept human assistants shift their language between formal and casual conversations. Mortensen believes x.ai is only a few years from turning each bot into a savvy conversationalist that, regardless of gender, responds appropriately and keeps the most demeaning sexual overtures at bay.

“There’s a legacy of what women are expected to be like in an assistant role,” said Microsoft’s Harrison. “We wanted to be really careful that Cortana…is not subservient in a way that sets up a dynamic that we didn’t want to perpetuate socially. We are in a position to lay the groundwork for what comes after us.”