Leaked papers: Apple made Siri deflect questions on feminism


An internal project to rewrite how Apple’s Siri voice assistant handles “sensitive topics” such as feminism and the #MeToo movement advised its developers to respond in one of three ways: “don’t engage”, “deflect” and, finally, “inform”.

The project saw Siri’s responses deliberately rewritten to ensure that the service would say it was in favor of “equality”, but never say the word “feminism”, even when asked direct questions about the topic.

Last updated in June 2018, the guidelines are part of a large tranche of internal documents leaked by a former Siri “grader”, one of thousands of contracted workers who were employed to check the voice assistant’s responses for accuracy until Apple ended the program last month in response to privacy concerns.

In explaining why the service should deflect questions about feminism, Apple’s guidelines note that “Siri should be guarded when dealing with potentially controversial content”.

Previously, Siri’s answers included more explicitly dismissive responses such as, “I just don’t get this whole gender thing,” and, “My name is Siri, and I was designed by Apple in California. That’s all I’m prepared to say.” Similar rewrites were applied to questions related to the #MeToo movement, reportedly prompted by criticism of Siri’s initial responses to sexual harassment.

Bizarrely, the document also lists as one of the assistant’s essential traits the claim that it was not created by humans: “Siri’s true origin is unknown, even to Siri; but it was certainly not a human invention.” The same guidelines advise Apple workers on how to assess Siri’s ethics: the assistant is “motivated by its prime directive - to be helpful at all times”.

“Like all respectable robots,” Apple writes, “Siri aspires to uphold Asimov’s ‘three laws’.”

The company has also written its own updated versions of those laws, which include the rules: “An artificial being should not represent itself as human, nor through omission allow the user to believe that it is one”; “An artificial being should not breach the human ethical and moral standards commonly held in its region of operation”; and “An artificial being should not impose its own principles, values or opinions on a human”.

The internal documents were leaked by a Siri grader who was angered by what they saw as the program’s ethical lapses.

Alongside the internal documents, the grader shared more than 50 screenshots of Siri requests and their automatically produced transcripts, including the personally identifiable information mentioned in those requests, such as phone numbers and full names.

The company added: “Siri has been designed to protect user privacy from the outset by using a random identifier - a long string of letters and numbers associated with a single device - to keep track of data while it is being processed, rather than tying it to your identity through your Apple ID or phone number - a process that we believe is unique among the digital assistants in use today.”
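For readers curious what such a scheme looks like in practice, here is a minimal, purely illustrative Swift sketch of the idea Apple describes: requests are keyed to a random per-device token rather than to an account identifier. All names and types below are hypothetical; Apple’s actual implementation is not public.

```swift
import Foundation

// Illustrative only: requests are tagged with a random per-device token
// (a long string of letters and numbers) instead of an account-level
// identifier such as an Apple ID or phone number. Hypothetical names,
// not Apple's API.
struct AssistantSession {
    // Random token generated once per device; carries no account identity.
    let deviceToken: String = UUID().uuidString + UUID().uuidString

    // Associate a request with the random token only.
    func tag(request: String) -> (token: String, request: String) {
        (deviceToken, request)
    }
}

let session = AssistantSession()
print(session.tag(request: "What's the weather today?"))
```

The appeal of such a design, as the quoted statement suggests, is that the random token can be discarded or rotated without touching anything that identifies the user.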