Lawsuit: Apple, Google recording users without their knowledge
09/08/2021 // Ralph Flores

Big Tech has long insisted that voice assistants are the next step in convenience, saying the technology can help people with every little need that pops up. But a new lawsuit challenges that narrative, alleging that voice assistants from Google and Apple listen in on, and even record, conversations -- even when they're not supposed to.

Judge Jeffrey White of the federal district court in Oakland ruled last week that Apple must face a lawsuit brought in federal court. The suit states that Siri, the company's voice assistant, has improperly recorded private conversations -- adding to fears over privacy and data protection.

Despite Apple's request to have the case thrown out, the judge said that most of the lawsuit could proceed. He did, however, dismiss one piece of the suit that alleged economic harm to users. This means that the plaintiffs, who are seeking class action status, can continue to pursue their claims that Siri listened in and recorded conversations, even without a prompt, and passed this data to third parties.

This lawsuit is the latest in a string of user privacy complaints over voice assistants brought against Apple, Google and Amazon. The voice assistants -- Siri, Google Assistant and Alexa -- are touted as helping their users with everyday tasks.

The companies insist that their voice assistants do not listen to conversations unless prompted. In response to the lawsuit, Apple declined to comment, pointing instead to its court filings. Google said that it will fight the lawsuit, while Amazon denied the claims, saying that its voice assistant Alexa only records audio when "wake words" are used. Even then, only a small fraction of user audio is manually reviewed.


According to Noah Goodman, a computer science and psychology professor at Stanford University, the technology that enables voice assistants to respond to "wake words" is inherently challenging -- primarily because everyone's voice is different. As a result, voice assistants are more likely to respond to anything they perceive to be a "wake word," even when it is a false alarm.
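To illustrate the tradeoff Goodman describes, here is a minimal, hypothetical sketch in Python -- not any vendor's actual detection code. It assumes a detector that compares a confidence score against a threshold: set the threshold low enough to recognize many different voices, and similar-sounding speech starts triggering the device too. The function names, scores and threshold below are illustrative assumptions only.

```python
# Hypothetical wake-word spotter: scores a snippet and compares it to a cutoff.
# A lower cutoff tolerates more voice variation but admits more false triggers.
import random

WAKE_WORD_THRESHOLD = 0.6  # assumed confidence cutoff, purely illustrative


def similarity_score(snippet: str) -> float:
    """Stand-in for an acoustic model: returns a pseudo-confidence that the
    snippet contains the wake word. Real systems score audio features, not text."""
    text = snippet.lower()
    if "hey assistant" in text:
        return random.uniform(0.7, 1.0)   # genuine wake word, but voices vary
    if "hey" in text:
        return random.uniform(0.4, 0.8)   # similar-sounding phrase: possible false alarm
    return random.uniform(0.0, 0.3)       # unrelated conversation


def is_triggered(snippet: str, threshold: float = WAKE_WORD_THRESHOLD) -> bool:
    """True if the detector would wake up and start recording."""
    return similarity_score(snippet) >= threshold


if __name__ == "__main__":
    phrases = [
        "hey assistant, set a timer",
        "hey Anna, see you Sunday",
        "what's for dinner tonight?",
    ]
    for phrase in phrases:
        print(f"{phrase!r} -> triggered: {is_triggered(phrase)}")
```

In this toy setup, the second phrase sometimes scores above the cutoff even though no one addressed the device, which is the kind of false alarm the lawsuit says led to unintended recordings.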

Voice assistants listening a little too well

The lawsuit against Apple, and a similar one against Google, is expected to land these Big Tech companies in hot water, particularly over how they collect and handle private information from millions of users. (Related: Amazon is about to start sharing Alexa owners’ bandwidth with their neighbors.)

Voice assistants like Apple's Siri and Amazon's Alexa have gained much popularity in recent years. A recent survey published by Statista revealed there are over 110 million voice assistant users in the United States alone, and data from eMarketer says that at least 128 million people use one at least monthly. But their rising popularity has also brought about concerns that these devices may be listening to people a little too closely.

In response to the lawsuit, Apple insists that it does not sell Siri recordings to third parties and that these aren't associated with an "identifiable individual."

"Apple believes that privacy is a fundamental human right and designed Siri so users could enable or disable it at any time," the company said in its motion to dismiss. "Apple actively works to improve Siri to prevent inadvertent triggers and provides visual and audio cues (acknowledged by several Plaintiffs) so users know when Siri is triggered."

Voice assistants like Siri are designed to turn on after hearing a prompt -- like "Hey, Siri" or "OK, Google" -- but the lawsuit claims the plaintiffs saw their devices turn on even without them saying these "wake words." They added that these recordings were then sent to third-party contractors for review.

"These conversations occurred in their home, bedroom, and car, as well as other places where Plaintiffs Lopez and A.L. were alone or had a reasonable expectation of privacy," the lawsuit alleged.

For those concerned about Big Tech's potential reach, the lawsuits are an ominous sign of how intrusive voice assistants are -- and how much of a person's life they have access to.

Learn more about the dangers of voice assistants at PrivacyWatch.news.

Sources include:

Breitbart.com

News.Yahoo.com

Statista.com


