U.S. District Judge Jeffrey White in Oakland, California, ruled last week that Apple must face a lawsuit alleging that Siri, the company's voice assistant, improperly recorded private conversations – adding to fears over privacy and data protection.
Despite Apple's request to have the case thrown out, the judge allowed most of the lawsuit to proceed, dismissing only a claim that users suffered economic harm. This means the plaintiffs, who are seeking class-action status, can continue to pursue their claims that Siri listened in on and recorded conversations without a prompt and passed that data to third parties.
This lawsuit is the latest in a string of cases against Apple, Google and Amazon alleging privacy violations by their voice assistants. The assistants – Siri, Google Assistant and Alexa – are marketed as helping users with everyday tasks.
The companies insist that their voice assistants do not listen to conversations unless prompted. In response to the lawsuit, Apple declined to comment, pointing instead to its court filings. Google said it would fight the similar lawsuit it faces, while Amazon denied the claims, saying that its voice assistant Alexa only records audio when "wake words" are used and that, even then, only a small fraction of that audio is manually reviewed.
According to Noah Goodman, a computer science and psychology professor at Stanford University, the technology that enables voice assistants to respond to "wake words" is challenging in itself -- primarily because everyone's voice is different. To cope with that variation, voice assistants tend to respond to anything they perceive as a wake word, even when these are false alarms.
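To illustrate the tradeoff Goodman describes, the sketch below models wake-word detection as nothing more than a similarity score compared against a threshold. The wake phrase, the text-based scoring and the threshold value are illustrative assumptions only; real detectors run trained acoustic models on raw audio, and none of this reflects Apple's actual implementation.

# A minimal, illustrative sketch of threshold-based wake-word matching.
# It scores transcribed text with a generic string-similarity measure;
# production systems work on raw audio with trained acoustic models.
from difflib import SequenceMatcher

WAKE_PHRASE = "hey siri"  # illustrative wake phrase

def wake_score(heard: str) -> float:
    """Similarity between what was heard and the wake phrase (0.0-1.0)."""
    return SequenceMatcher(None, heard.lower(), WAKE_PHRASE).ratio()

def should_wake(heard: str, threshold: float = 0.75) -> bool:
    # A strict threshold misses quiet or accented speakers;
    # a lenient one wakes on phrases that merely sound similar,
    # producing the false triggers described above.
    return wake_score(heard) >= threshold

if __name__ == "__main__":
    for phrase in ["hey siri", "hey seriously", "a serious question"]:
        print(phrase, "->", should_wake(phrase))

Lowering the threshold makes the assistant easier to trigger for everyone, but it also raises the odds that ordinary conversation wakes the device by mistake -- the core of the behavior the plaintiffs describe.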
The lawsuit against Apple and a similar one against Google are expected to land these Big Tech companies in hot water, in particular over how they collect and handle private information from millions of users. (Related: Amazon is about to start sharing Alexa owners’ bandwidth with their neighbors.)
Voice assistants like Apple's Siri and Amazon's Alexa have gained much popularity in recent years. A recent survey published by Statista counted over 110 million voice assistant users in the United States alone, and data from eMarketer indicates that at least 128 million people use one at least monthly. But their rising popularity has also brought concerns that these devices may be listening to people a little too closely.
In response to the lawsuit, Apple insists that it does not sell Siri recordings to third parties and that the recordings are not associated with an "identifiable individual."
"Apple believes that privacy is a fundamental human right and designed Siri so users could enable or disable it at any time," the company said in its motion to dismiss. "Apple actively works to improve Siri to prevent inadvertent triggers and provides visual and audio cues (acknowledged by several Plaintiffs) so users know when Siri is triggered."
Voice assistants like Siri are designed to turn on after hearing a prompt -- like "Hey, Siri" or "OK, Google" -- but the lawsuit claims the plaintiffs saw their devices turn on even when they had not said these "wake words." They added that the resulting recordings were then sent to third-party contractors for review.
"These conversations occurred in their home, bedroom, and car, as well as other places where Plaintiffs Lopez and A.L. were alone or had a reasonable expectation of privacy," the lawsuit alleged.
For those concerned about Big Tech's reach, the lawsuits are an ominous sign of how intrusive voice assistants can be -- and how much of a person's life they have access to.
Learn more about the dangers of voice assistants at PrivacyWatch.news.