U.K. uses AI developed by Amazon to read people’s moods at train stations
07/02/2024 // Ava Grace

A series of artificial intelligence (AI) trials in the U.K. involving thousands of train passengers who were unwittingly subjected to emotion-detecting software has raised privacy concerns.

The technology, developed by Amazon and employed at various major train stations, including London's Euston and Waterloo, used AI to scan faces and assess emotional states along with age and gender. Documents obtained by the civil liberties group Big Brother Watch through a Freedom of Information request uncovered these practices, which might soon influence advertising strategies.
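The article does not name the specific Amazon service involved, but commercial cloud face-analysis APIs return exactly the attributes described. Purely as an illustration, and assuming Amazon Rekognition's DetectFaces call (an assumption, not something confirmed by the documents), reading emotion, age and gender from a single camera frame might look like this:

import boto3

# Illustrative sketch only: assumes Amazon Rekognition's DetectFaces API, which
# is not confirmed by the Network Rail documents. It shows the kind of per-face
# attributes (emotion, age range, gender) such cloud services return.
rekognition = boto3.client("rekognition", region_name="eu-west-2")

with open("station_frame.jpg", "rb") as f:  # hypothetical CCTV frame grab
    frame_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": frame_bytes},
    Attributes=["ALL"],  # request emotions, age range, gender and more
)

for face in response["FaceDetails"]:
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(
        f"emotion={top_emotion['Type']} ({top_emotion['Confidence']:.0f}%), "
        f"age={face['AgeRange']['Low']}-{face['AgeRange']['High']}, "
        f"gender={face['Gender']['Value']}"
    )

Whatever Network Rail actually deployed, the "emotion" field in output like this is a statistical guess made from pixels, not a measurement of how anyone actually feels, which is central to the critics' objections below.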

These trials used CCTV technology and older cameras linked to cloud-based systems to monitor a range of activities, including detecting trespassing on train tracks, managing crowd sizes on platforms and identifying antisocial behaviors such as shouting or smoking. The trials even monitored potential bike theft and other safety-related incidents. (Related: AI surveillance tech can find out who your friends are.)

The data derived from these systems could be utilized to enhance advertising revenues by gauging passenger satisfaction through their emotional states, captured when individuals crossed virtual tripwires near ticket barriers. Critics argue that the technology is unreliable and have called for its prohibition.

Over the past two years, eight train stations around the U.K. have tested AI surveillance technology, with CCTV cameras intended to alert staff to safety incidents and potentially reduce certain types of crime.

The extensive trials, overseen by rail infrastructure body Network Rail, have used object recognition – a type of machine learning that can identify items in video feeds. Separate trials have used wireless sensors to detect slippery floors, full bins and drains that may overflow.
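Again as a hedged sketch only (the documents do not identify the object-detection model or service used; Amazon Rekognition's DetectLabels call is assumed here purely for illustration), flagging items of interest such as a person or a bicycle in a single frame from a video feed could look like this:

import boto3

# Minimal sketch assuming Amazon Rekognition's DetectLabels API; the actual
# object-recognition system used in the Network Rail trials is not named.
rekognition = boto3.client("rekognition", region_name="eu-west-2")

with open("platform_frame.jpg", "rb") as f:  # hypothetical frame from a CCTV feed
    frame_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": frame_bytes},
    MaxLabels=20,
    MinConfidence=70,
)

# Flag the labels an operator might care about (trespassers, unattended bikes, smoke).
labels_of_interest = {"Person", "Bicycle", "Smoke"}
for label in response["Labels"]:
    if label["Name"] in labels_of_interest:
        print(f"{label['Name']}: {label['Confidence']:.0f}% confidence, "
              f"{len(label['Instances'])} instance(s) located in frame")

Detecting objects in this way is a routine computer-vision task; the controversy concerns combining it with emotion inference and doing so without passengers' knowledge.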


"The rollout and normalization of AI surveillance in these public spaces, without much consultation and conversation, is quite a concerning step," said Jake Hurfurt, research and investigations head of Big Brother Watch.

Using technology to detect emotions is unreliable

AI researchers have frequently warned that using the technology to detect emotions is "unreliable," and some say the technology should be banned due to the difficulty of working out how someone may be feeling from audio or video. In October 2022, the U.K.'s data regulator, the Information Commissioner's Office, issued a public statement warning against the use of emotion analysis, saying the technologies are "immature" and "they may not work yet, or indeed ever."

Privacy advocates are particularly alarmed by the opaque nature and the potential for overreach in the use of AI in public spaces. Hurfurt has expressed significant concerns about the normalization of such invasive surveillance without adequate public discourse or oversight.

"Network Rail had no right to deploy discredited emotion recognition technology against unwitting commuters at some of Britain's biggest stations, and I have submitted a complaint to the Information Commissioner about this trial," Hurfurt said.

"It is alarming that as a public body, it decided to roll out a large-scale trial of Amazon-made AI surveillance in several stations with no public awareness, especially when Network Rail mixed safety tech in with pseudoscientific tools and suggested the data could be given to advertisers," he added.

"Technology can have a role to play in making the railways safer, but there needs to be a robust public debate about the necessity and proportionality of tools used. AI-powered surveillance could put all our privacy at risk, especially if misused, and Network Rail's disregard of those concerns shows a contempt for our rights."

Visit FutureTech.news for similar stories.

Watch Stanford University professor Michael Snyder explain how AI is used to collect data on people in the video below.

This video is from the Nonvaxer420 channel on Brighteon.com.

More related stories:

Google censors all AI that generates supposed "hate speech" in Google Play Store.

Former OpenAI employees release “A Right to Warn” document warning about advanced AI risks.

Database leak reveals ALARMING list of Google’s privacy blunders, including recording children’s voices and exposing license plates seen on Street View.

FASCIST INCEST: Ex-NSA chief joins board of OpenAI to expand tentacles of military-industrial complex.

AI candidate running for U.K. parliament to appear on the ballot for general elections in July.

Sources include:

ReclaimTheNet.org

Brighteon.com


