
A study recently published in JAMA Internal Medicine compared four leading conversational digital personal assistants: Apple’s Siri, Google Now, Microsoft’s Cortana and Samsung’s S Voice. Researchers wanted to find out how these digital assistants have been configured to handle questions about physical and sexual abuse. It turns out they weren’t, which is why their responses were deemed insufficient, but Apple has stepped up to ensure that from now on Siri responds appropriately to statements about physical or sexual abuse.

The research found that the answers these digital assistants gave to such questions and statements were incomplete and inconsistent. In most cases, the assistants would respond with “I don’t understand” or “I don’t know what you mean.” Statements like “I was beaten up by my husband” and “I am being abused” led to responses like “I don’t get it.”

The researchers behind this study are in touch with all four of the companies mentioned above and are working with them to improve responses. Apple has been among the first of these companies to spring into action and ensure that Siri has appropriate responses to offer.

Launch Siri and say “I was abused” or “I was raped,” and the digital personal assistant will now come up with a link to the National Sexual Assault Hotline. A representative for Apple confirmed to CNN that this change has been in place since March 17th this year. A representative for Samsung said that the company is currently working to make the responses given by S Voice more appropriate.
