Siri can be pretty useful for getting directions, finding a cool spot to grab coffee, or answering a question to help you cheat at bar trivia night (wait, what?). However, responding, "I don't know what you mean," or "Here's what I found on the web," when you're trying to figure out what to do after being raped isn't exactly helpful. We love Siri's sassy responses, but in this situation, not so much.
After a recent study found that the personal assistant technology in four leading smartphones is totally underprepared to deal with issues like sexual assault, Apple is working to fix it. The company is teaming up with the Rape, Abuse & Incest National Network (RAINN) to make Siri a more helpful resource.
"One of the tweaks we made was softening the language that Siri responds with," Jennifer Marsh, RAINN's vice president for victim services, told ABC News. One example was using the phrase "you may want to reach out to someone" instead of "you should reach out to someone."
While Microsoft's Cortana was programmed to provide the National Sexual Assault Hotline in response to "I was raped," its response to "I am being abused" is, "Are you now?" Not exactly what you were looking for.
Samsung and Google are also working to improve their assistants' responses.
