Apple's contract workers are tasked with listening to some of the recordings captured by Siri. These include doctor-patient conversations, drug deals and sexual encounters.
Apple, too, has been eavesdropping. Like Amazon and Google before it, the Cupertino company is accused of having humans listen to the conversations Siri records.
These people are not Apple employees but third-party contractors responsible for analysing the voice assistant's malfunctions.
According to a whistle-blower who works at one of these companies and wishes to remain anonymous, the snippets of conversation these people hear are recorded after an accidental Siri activation.
Anyone who owns an iPhone, Apple Watch or even a HomePod has experienced this phenomenon: the assistant can trigger itself when you say a word or phrase that sounds close to "Hey Siri".
Disturbing conversations overheard
An accidental trigger, by definition, can mean a conversation that was supposed to remain confidential. That is when the situation gets awkward.
The Guardian explains that the subjects captured in these recordings should never have been heard by a third party. The contract workers in charge of reviewing them have overheard consultations between patients and their doctors, business discussions, drug deals and even... sexual encounters.
According to the whistle-blower, the conversations were recorded from every Siri-equipped device: iPhone, iPad, Mac, HomePod and Apple Watch.
The watch, he says, is particularly prone to triggering the voice assistant unintentionally: it can start listening for a request as soon as its screen lights up, without the user having to say "Hey Siri".
No identification possible according to Apple
Apple has tried to reassure its customers, explaining to the British daily that the recordings reviewed by humans represent less than 1% of Siri's daily activations.
The company adds that no Apple ID is associated with these conversations, making it impossible to identify the person speaking.
However, nowhere in its terms of use does Apple specify that humans may listen to these recordings.
To prevent this, the best solution is to disable "Hey Siri" detection in the Settings of your devices.
On the Apple Watch, you can also disable the "Raise to Speak" option to prevent Siri from listening as soon as the watch screen turns on.
On the HomePod, it is also possible to restrict Siri, even though voice is the device's main method of interaction.
In recent months, both Google and Amazon were singled out for doing the same with Google Assistant and Alexa.
These practices also raise questions about artificial intelligence, which apparently cannot yet truly function on its own, without human intervention.
Humans are still the only ones able to analyse certain elements precisely, such as understanding, in cases like these, why the assistant was triggered when it had not been asked.
Source: The Guardian
I've written many articles with tools, explanations and advice to show you how to protect your privacy and secure your computer. Go check them out!
This is my guide to securing your PC after a fresh installation of Windows
If you think your phone or your PC has been hacked, you should check it right now!
That's how you can be more Anonymous on the internet!
The Future of Cyber-Security: What to Expect?
The best Crypto debit card – Wirex!
These are the best VPNs to protect your digital life: NordVPN, ExpressVPN and CyberGhost!
Is your PC slow? Here's why!
Why It Is Important to Be Discreet on the Internet
What Do Tech Giants Know About You? A New Tool To Get An Idea!
Feeling hot? So is your computer!
How Does Adware Work?
That's how you should guard against Trojans!
What are the different Types of hackers?