The lawsuit alleges that Apple’s Siri intelligent assistant kept listening and using information for advertising even when it had not been activated


Tencent Technology News, September 3 — For years, technology companies have encouraged consumers to place listening devices in their homes and even carry them in their pockets, persuading them to rely on intelligent voice assistants for any sudden need. But many people increasingly worry that these devices keep listening even when they should not be activated, and out of this concern they have taken Apple, Google, and Amazon to court.
On Thursday, a judge ruled that Apple must continue to fight a lawsuit filed by users in a California federal court accusing Siri, Apple’s intelligent voice assistant, of improperly recording private conversations. Although Apple had asked the court to dismiss the case, the judge said most of the claims could proceed.
Earlier, Jeffrey S. White, a judge of the federal district court in Oakland, did dismiss a claim involving users’ economic damages. But he ruled that the plaintiffs, who are seeking to turn the suit into a class action, could continue to sue Apple. The plaintiffs claim that Siri activated without being prompted, recorded conversations it should not have heard, and even passed the data to third parties, thereby violating their privacy.
The case is just one of several lawsuits accusing the intelligent voice assistants of Apple, Google, and Amazon of privacy violations. These assistants were designed to help users with everyday tasks, such as playing music through a connected speaker, setting a timer, or adding items to a shopping list. The companies deny that the assistants monitor conversations for any purpose other than completing such tasks. Amazon did not immediately respond to requests for comment. Apple reiterated its position in court documents, while Google said it was prepared to fight the lawsuit.
Noah Goodman, an associate professor of computer science and psychology at Stanford University, said that ideally these intelligent voice assistants should activate only when they hear their “wake words.” In practice, though, this is very challenging, because people’s voices vary widely. However hard the companies try, he said, they are unlikely to “completely eliminate accidental wake-ups.”
This lawsuit, and a similar one against Google, could again put the companies in an awkward position, as the way they handle private information collected from millions of users comes under scrutiny. Voice assistants have soared in popularity: market research firm eMarketer estimated that by the end of last year at least 128 million people in the United States were using a voice assistant every month. But as the assistants have become more widespread, more and more people have come to feel uneasy that they hear too much.
A 2019 report found that Amazon kept copies of everything Alexa recorded after it believed it heard its name, even when users were unaware of it.
In response to the latest lawsuit, Apple said that it does not sell Siri recordings and that the recordings are not associated with an “identifiable individual.” In a motion asking the judge to dismiss the suit, the company said: “Apple believes privacy is a fundamental human right and allows users to enable or disable Siri at any time. Apple also continually improves Siri to prevent accidental activations, and provides visual and audio cues so that users know when Siri is triggered.”
In an emailed statement on Thursday, Google said it keeps users’ information secure. “By default, we do not retain users’ audio recordings, and we make it easy to manage your privacy preferences, with simple answers to your privacy questions or by enabling Guest Mode,” said José Castañeda, a spokesman for the company. “We dispute the claims in this case and will vigorously defend ourselves.”
The intelligent voice assistants are supposed to activate only when prompted with a phrase such as “Hey Siri,” but the lawsuit claims the plaintiffs saw their devices activate even when no wake word had been spoken. They allege that some conversations were recorded without their consent, and that the information was then used to target them with advertising and sent to third-party contractors for review.
In 2019, Apple suspended a practice in which human reviewers listened to Siri recordings and graded them, saying it would instead rely on computer-generated transcripts for review. Technology companies say such reviews help them understand what works and what does not, so they can improve their products. But just a few months later, The Associated Press reported that Apple had resumed using human reviewers to listen to recordings, though users could now opt out.
Nicole Ozer, director of technology and civil liberties at the American Civil Liberties Union (ACLU) of California, said the lawsuits show that people are realizing how much information voice technology collects. “I think these lawsuits are part of people finally starting to realize that Siri works not for us, but for Apple,” she said. (Tencent Technology, reviewed by Jinlu)