Over the past two years, artificial intelligence ("AI") surveillance technology has been tested at eight train stations across the UK. CCTV cameras were used to scan the faces of thousands of people passing through ticket barriers, and the collected data was sent to Amazon Rekognition for analysis.

These AI trials involved a mix of "smart" CCTV cameras capable of detecting objects or movements in the images they capture, and older cameras whose video feeds were connected to cloud-based analysis. The image recognition system was used to predict the age, gender, and even potential emotions of travelers, with the possibility that this data could feed future advertising systems.

AI researchers have repeatedly cautioned against relying on the technology to accurately detect emotions. Some argue that the difficulty of inferring someone's feelings solely from audio or video makes the technology unreliable, and have called for it to be banned. In October 2022, the UK's data regulator, the Information Commissioner's Office, issued a public statement warning against the use of emotion analysis, deeming such technologies "immature" and of uncertain effectiveness.

The full extent of these AI trials, previously only partially disclosed, has now been revealed in a collection of documents obtained by the civil liberties group Big Brother Watch in response to a freedom of information request.

Document: Executive Summary — Managed Stations Service Realisation Report (redacted), PDF, 3.1 MB