Apple changes privacy policy after admitting it allowed workers to listen to Siri recordings

Apple has come under fire for allowing contractors to listen to audio recordings made by Siri users containing private information.

The tech giant has apologised for allowing its workers to listen to recordings of Siri users so they could grade them.

As part of the company’s grading system, contractors were reviewing or ‘grading’ Siri recordings to assess the accuracy of Siri’s responses.

However, many of these recordings reportedly included confidential information, illegal activities and even users having sex.

In a statement posted to the Apple website, the company wrote: ‘As a result of our review, we realise we have not been fully living up to our high ideals, and for that we apologise.

‘As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users.’


Although the grading program is currently on hold, Apple plans to resume it after making several changes, including no longer retaining audio recordings of Siri users by default, although it will still keep automatically generated transcripts of requests.

Apple has also committed to giving Siri users the choice of whether to share their recordings with the company.

‘We hope that many people will choose to help Siri get better,’ it said.

Finally, only Apple employees, rather than hired contractors, will be permitted to listen to the Siri user recordings.

The news may come as a shock to frequent Siri users. However, the company said that the sample of audio recordings reviewed made up less than 0.2 per cent of all Siri recordings, meaning the chances of yours having been listened to are slim.


For more information on the new Siri privacy policy, visit the Apple website or iOS settings (Settings > Siri & Search > About Ask Siri & Privacy).

With the rise of voice assistants such as Siri, Alexa and Google Assistant, it’s more important than ever to manage your privacy settings so you don’t get accidentally caught out.