Gay Phones now Caught as your SEX SPY sniffing out your SEX LIFE SECRETS online! HUAT!

tun_dr_m

Alfrescian
Loyal
https://www.rt.com/news/465181-apple-siri-human-contractors/


Siri ‘regularly’ records sex encounters, sends ‘countless’ private moments to Apple contractors
Published: 27 Jul, 2019 03:53 | Edited: 27 Jul, 2019 11:59
[Image: © Reuters / Robert Galbraith]
Apple’s Siri AI assistant sends audio of sexual encounters, embarrassing medical information, drug deals, and other private moments recorded without users’ knowledge to human ‘graders’ for evaluation, a whistleblower has revealed.
Recordings from Apple’s Siri voice assistant are fed to human contractors around the world, who grade the AI based on the quality of its response and whether its activation was deliberate, according to an anonymous contractor who spoke to the Guardian. They claimed accidental activations are much more frequent than Apple lets on, especially among Apple Watch users, and they want the company to own up to the problem.
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data,” the whistleblower revealed.
“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it,” the whistleblower said. They explained that they are concerned these recordings, produced whenever Siri thinks it hears its “wake word,” could be used against the people who (accidentally) made them, especially given the “broad” amount of user data they claim contractors are “free to look through.” In what sounds like a sick joke on the part of some programmer, the sound of a zipper unzipping often triggers Siri to wake up.
“If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”
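To see why a zipper noise can wake the assistant, here is a minimal wake-word gating sketch. It is purely illustrative: the detector, the scores and the WAKE_THRESHOLD value are all invented for this example, and none of it is Apple's actual code.

    # Hypothetical sketch of wake-word gating, NOT Apple's implementation.
    # A detector emits a confidence score for each audio snippet; anything
    # above the threshold wakes the assistant and starts recording.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        score: float       # detector confidence, 0.0..1.0 (made up here)
        heard: str         # what the microphone actually picked up

    WAKE_THRESHOLD = 0.60  # assumed value; the lower it is, the more false wakes

    def should_wake(d: Detection) -> bool:
        return d.score >= WAKE_THRESHOLD

    # Acoustically similar sounds (a zipper, a raised wrist, nearby speech)
    # can score above the threshold and trigger recording by mistake.
    for d in [Detection(0.92, "hey siri"),
              Detection(0.71, "zipper unzipping"),   # false accept
              Detection(0.40, "background chatter")]:
        print(d.heard, "->", "RECORDING" if should_wake(d) else "idle")

The point of the sketch is that the device has to guess from sound alone, so any fixed threshold trades missed wakes against accidental recordings.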
While Apple’s documentation does not explicitly mention any human involvement in Siri’s training, the company acknowledged when asked about its practices that “a small portion of Siri requests are analyzed to improve Siri and dictation.” It insisted that this amounted to less than one percent of all daily activations of the AI and that the recordings were “typically only a few seconds long.”
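Apple’s “less than one percent” figure is easy to misread. A toy sampling sketch (the rate, seed and activation IDs are assumptions for illustration, not Apple’s pipeline) shows that even a 1% sample of a large activation volume is still a very large number of clips, and accidental activations get sampled right along with deliberate ones:

    # Hypothetical illustration of the "less than 1%" sampling claim.
    import random

    SAMPLE_RATE = 0.01  # "less than one percent of daily activations"

    def select_for_grading(activation_ids, rate=SAMPLE_RATE, seed=0):
        rng = random.Random(seed)
        return [a for a in activation_ids if rng.random() < rate]

    day = [f"activation-{i}" for i in range(100_000)]
    graded = select_for_grading(day)
    print(len(graded), "of", len(day), "clips sent to human graders")
    # Random sampling cannot tell a deliberate request from a zipper noise,
    # so accidental recordings end up in front of contractors too.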
While Apple emphasized that a user’s Apple ID and name are not attached to clips reviewed by contractors, it also took pains to explain that recordings are “analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements” – suggesting the company is aware of how easily even a recording stripped of its user ID can be connected to the user who made it.
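That caveat makes sense once you consider how little stripping an Apple ID buys if a clip still carries the location, contact and app metadata described above. A hedged toy example of a simple metadata join (the records and the auxiliary lookup are fabricated for this sketch, not real data or Apple’s schema):

    # Toy illustration of metadata re-identification; all values invented.
    clips = [
        # Apple ID stripped, but the accompanying metadata remains
        {"clip": "0381.wav", "location": "51.5007,-0.1246", "app": "Tinder"},
    ]

    # Auxiliary data an attacker might already hold (assumed here)
    aux = {"51.5007,-0.1246": "sole registered occupant: J. Doe"}

    for c in clips:
        owner = aux.get(c["location"])
        if owner:
            print(f"{c['clip']} re-identified via location -> {owner}")

One precise GPS fix or a distinctive contact list is often enough to single a person out, which is exactly the re-identification risk the researchers in the linked story describe.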
Siri isn’t the only voice assistant that transmits users’ private moments back to the mothership, of course. Amazon’s Alexa infamously has entire chat rooms where its human trainers discuss difficult-to-understand audio clips (or mock funny recordings), and Google Home uses a similar system of outsourced “language experts,” which lets the company claim that no one at Google has access to the recordings its devices make.
 

zhihau

Super Moderator
SuperMod
Asset
KNN!!! Why hasn’t SIRI floated “Tantric sex is the best” to the top of the list?
 

laksaboy

Alfrescian (Inf)
Asset
I don't trust any AI assistant: Apple's Siri, Google Assistant, Amazon's Alexa, Microsoft's Cortana, Xiaomi's Xiao AI, Yandex's Alice.
 