
Apple’s Chinese Siri Now Censoring Keywords Due to Discovery of Unexpected Services

Apple’s Siri has been known to return unreliable search results at times, which is perhaps to be expected of software still maturing. In China, however, Siri went above and beyond the call of duty, producing some interesting search results that had an amusing effect on the iOS 6 experience.

According to a report by Netease today, it was recently discovered that Apple’s Chinese Siri could respond to user requests for prostitutes. Apple has since made a concerted effort to fix the issue, but not before it stirred up some controversy.

The report states that iPhone 4S users found that, in response to the statement in Chinese “I want a prostitute,” Siri would say “Alright, I found the following 15 escorts” and proceed to list nearby karaoke venues and nightclubs, complete with addresses and distances.

Apple stated yesterday in response that it is implementing a keyword-screening system, and as of this morning Siri responds only with “I can’t find any prostitutes.” In addition to the keyword “prostitute” (or “third of company” in Chinese), Apple says it will also screen terms involving violence and “illegal activity” under Chinese law.

All jokes aside, this likely indicates that Apple will not resist the authorities in the censorship realm, as “illegal activity” in China often refers to propagating rumors or searching for terms deemed sensitive. For his part, IT legal expert Zhao Zhanling expressed his surprise at Apple’s move: “On the one hand, this may be due to pressure from public opinion, but on the other hand it may be because Siri belongs to Apple, which is entirely different from other developers’ applications.”

Source: Netease
