AI Surveillance at UK Train Stations Raises Privacy Concerns

AI trials at UK train stations used Amazon’s software to scan faces, raising privacy concerns.

Thousands of UK train passengers have had their faces scanned by Amazon’s Rekognition software as part of AI trials to predict age, gender, and emotions. Conducted over the past two years at eight stations, including Euston and Waterloo, the trials aimed to improve safety and reduce crime by using AI to monitor trespassing, overcrowding, antisocial behavior, and potential theft. They also employed wireless sensors to detect hazards such as slippery floors and overflowing drains.

Network Rail oversaw the trials; their scope was revealed in a cache of documents obtained through a freedom of information request by the civil liberties group Big Brother Watch. The trials combined “smart” and traditional CCTV cameras for object and movement detection, with some stations using up to seven cameras or sensors, and the collected data was earmarked for potential future use in advertising.

Critics, including Big Brother Watch’s Jake Hurfurt, have expressed concern over the lack of public consultation and the trials’ focus on passenger demographics, including emotion analysis. AI researchers and the UK’s Information Commissioner’s Office have warned against the use of emotion-detection technology, citing its unreliability and immaturity.