Apple is resuming the use of humans to review Siri commands and dictation with the latest iPhone software update.
In August, Apple suspended the practice and apologized for the way it used people, rather than just machines, to review the audio.
While common in the tech industry, the practice undermined Apple's attempts to position itself as a trusted steward of privacy. CEO Tim Cook has repeatedly declared the company's belief that "privacy is a fundamental human right," a phrase that cropped up again in Apple's apology.
Now, Apple is giving consumers notice when they install the update, iOS 13.2. Individuals can choose "Not Now" to decline audio storage and review. Users who opt in can turn off audio storage and review later in the settings. Apple also specifies that Siri data is not associated with a user's Apple ID.
Tech companies say the practice helps them to improve their artificial intelligence services.
But the use of humans to listen to audio recordings is particularly troubling to privacy experts because it increases the chances that a rogue employee or contractor could leak details of what is being said, including parts of sensitive conversations.
Apple previously disclosed plans to resume human reviews this fall, but hadn't specified when. At the time, Apple also said it would stop using contractors for the reviews.
Other tech companies have also been resuming the practice after giving more notice. Google restarted the practice in September, after taking similar steps to make sure people know what they are agreeing to. Also in September, Amazon said users of its Alexa digital assistant could request that recordings of their voice commands be deleted automatically.