{"id":4655,"date":"2025-12-06T21:16:43","date_gmt":"2025-12-06T21:16:43","guid":{"rendered":"https:\/\/uang69.id\/?p=4655"},"modified":"2025-12-06T21:16:44","modified_gmt":"2025-12-06T21:16:44","slug":"latest-biometric-surveillance-scandal-in-uk-reveals-another-dark-side-of-ai-powered-big-brother","status":"publish","type":"post","link":"https:\/\/uang69.id\/?p=4655","title":{"rendered":"Latest Biometric Surveillance Scandal in UK Reveals Another Dark Side of AI-Powered Big Brother"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div>\n<p>Where AI-powered surveillance and control technologies meet capitalism 101. <\/p>\n<p>A fresh expose by civil rights group Big Brother Watch has revealed that over the past two years eight train stations across the UK \u2014 including busy hubs such as London\u2019s Euston and Waterloo, Manchester Piccadilly, and several smaller stations \u2014 have conducted facial and object recognition trials using AI surveillance technology. By rigging Amazon\u2019s AI surveillance software to the stations\u2019 CCTV cameras, the initiative was ostensibly meant to alert station staff to safety incidents and potentially reduce certain types of crime.<\/p>\n<p>The data collected was sent to Amazon Rekognition, according to a Freedom of Information Act (FOIA) request obtained by Big Brother Watch. As WIRED magazine reports, \u201cthe extensive trials, overseen by the government-owned rail infrastructure body Network Rail, have deployed object recognition \u2014 a type of machine learning that can identify items in videofeeds \u2014 to detect people trespassing on tracks, monitor and predict platform overcrowding, identify antisocial behaviour (\u201crunning, shouting, skateboarding, smoking\u201d) and spot potential bike thieves.\u201d<\/p>\n<p>In other words, it was all intended to help keep rail passengers safe, train stations clean and tidy and bikes in their place. 
A Network Rail spokesperson said:<\/p>\n<p>We take the security of the rail network extremely seriously and use a range of advanced technologies across our stations to protect passengers, our colleagues, and the railway infrastructure from crime and other threats.<\/p>\n<p>When we deploy technology, we work with the police and security services to ensure that we\u2019re taking proportionate action, and we always comply with the relevant legislation regarding the use of surveillance technologies.<\/p>\n<p>That is probably not as comforting as it may sound. As I will show later in this article, the (almost certainly outgoing) Sunak government has tried everything it can to gut the limited safeguards protecting the British public from the potential downsides and dangers of AI-empowered surveillance.<\/p>\n<p>Measuring Passenger \u201cSatisfaction\u201d<\/p>\n<p>A particularly \u201cconcerning\u201d aspect of the train station trials is their focus on \u201cpassenger demographics,\u201d says Jake Hurfurt, the head of research and investigations at Big Brother Watch. According to documents released in response to the FOIA request, the AI-powered system could use images from the cameras to produce \u201ca statistical analysis of age range and male\/female demographics,\u201d and is also able to \u201canalyse for emotions\u201d such as \u201chappy, sad and angry.\u201d<\/p>\n<p>This is where AI-powered surveillance and control technologies meet capitalism 101. From the WIRED article (emphasis my own):<\/p>\n<p>The images were captured when people crossed a \u201cvirtual tripwire\u201d near ticket barriers, and were sent to be analysed by Amazon\u2019s Rekognition system, which allows face and object analysis. 
It could allow passenger \u201csatisfaction\u201d to be measured, the documents say, noting that \u201cthis data could be utilised to maximum advertising and retail revenue.\u201d<\/p>\n<p>The article offers no indication as to how that might be achieved, but the proposal itself should hardly come as a surprise. Besides serving as an instrument of government surveillance and control, biometric systems will be used to maximise corporate revenues and profits \u2014 whether for the tech giants providing the hardware and software, in this case Amazon, the large financial institutions facilitating the transactions, or the retail companies honing their targeted advertising techniques.<\/p>\n<p>It brings to mind two scenes from \u201cMinority Report,\u201d the 2002 sci-fi movie based loosely on a Philip K Dick short story. In the first, a camera takes a retina scan of the protagonist John Anderton and a billboard calls out to him: \u201cJohn Anderton! You could use a Guinness right about now.\u201d In the second, Anderton visits a mall where he is met by an attractive female hologram advising him what clothes to buy. Set in 2054, the film imagines that advertisers will be able to personalise messages on billboards or through holograms via retinal scans.<\/p>\n<p>Apart from the occasional stillborn attempt, this particular dystopian scenario has yet to creep into most of our lives, though the widespread use of augmented-reality \u201cwearables\u201d like Apple Vision Pro will certainly bring it closer. 
As the WIRED article notes, AI researchers have frequently warned that using face analysis technology \u201cto detect emotions is \u2018unreliable\u2019 and some say the technology should be banned due to the difficulty of working out how someone may be feeling from audio or video.\u201d<\/p>\n<p>On the other side of the English Channel, the EU Parliament has voted for a broad ban on the use of Live Facial Recognition systems in public spaces, as have some US cities. By contrast, as we reported in October last year, the UK government is escalating its deployment of the controversial surveillance technology.<\/p>\n<p>Prime Minister Rishi Sunak, the son-in-law of Indian tech billionaire N R Narayana Murthy, is determined to transform the UK into a world leader in AI governance. Said governance apparently involves gutting many of the limited safeguards protecting the public from the potential downsides and dangers of AI, of which there are many\u2026<\/p>\n<p>As we reported in early August, live facial recognition (LFR) surveillance, where people\u2019s faces are biometrically scanned by cameras in real time and checked against a database, is being used by an increasing number of UK retailers amid a sharp upsurge in shoplifting \u2014 with the blessing, of course, of the UK government. Police forces are being urged to step up their use of LFR. The technology has also been deployed at the Coronation of King Charles III, sports events including Formula 1, and concerts, despite ongoing concerns about its accuracy as well as the huge ethical and privacy issues it raises.<\/p>\n<p>In what is surely one of the most brazen and egregious examples of mission creep you\u2019re likely to find, the government has also authorised the police to create a vast facial recognition database out of passport photos of people in the UK. The ultimate goal, it seems, is to get rid of passports altogether and replace them with facial recognition technology. 
In January, Phil Douglas, the director general of UK Border Force, said he wanted to create an \u201cintelligent border\u201d that uses \u201cmuch more frictionless facial recognition than we currently do\u201d.<\/p>\n<p>From The Guardian:<\/p>\n<p class=\"dcr-iy9ec7\">Douglas has been touting the potential benefits of biometrics and data security in managing the UK\u2019s borders in recent months. In February 2023, he suggested the paper passport was becoming largely redundant \u2013 even as some celebrated the post-Brexit\u00a0return of the blue document.<\/p>\n<p class=\"dcr-iy9ec7\">He told an audience at the Airport Operators Association conference in London at the time: \u201cI\u2019d like to see a world of completely frictionless borders where you don\u2019t really need a passport. The technology already exists to support that.\u201d Douglas added: \u201cIn the future, you won\u2019t need a passport \u2013 you\u2019ll just need biometrics.\u201d\u2026<\/p>\n<p>According to polling carried out by the International Air Transport Association in 2022, 75% of passengers worldwide would be happy to ditch passports for biometrics.<\/p>\n<p>\u201cSnooping Capital of the West\u201d<\/p>\n<p>This is a reminder that most of these trends \u2014 particularly the tech-enabled drift toward authoritarianism and centralised technocracy \u2014 are global, at work not only in the ostensibly democratic nations of the so-called \u201cFree West\u201d but across the world as a whole. But the UK is at the leading edge of most of them.<\/p>\n<p>In an article earlier this year, Politico described the UK as \u201cthe snooping capital of the West,\u201d snarkily noting that the country \u201cis finally leading the world\u2026 on AI-powered surveillance.\u201d The government last year passed the Online Safety Act, opening up the possibility of tech firms being forced to scan people\u2019s mobile messages \u2013 ostensibly for child abuse content. 
As Open Democracy warns, this is likely to make people\u2019s digital communications less, rather than more, secure:<\/p>\n<p data-block-key=\"8jrc0\">The more of daily life that becomes digital, the more we rely on secure connections to ensure our data is not exploited. Encryption is the main method stopping miscreants from stealing passwords or personal information.<\/p>\n<p data-block-key=\"amm6e\">If firms are forced to weaken security, more attacks will ensue, just at a time that we need to boost security across society.<\/p>\n<p data-block-key=\"8uesh\">For example, if WhatsApp were instructed to make messages visible to law enforcement, that back door could be found by others, exposing personal messages. It is a pillar of information security theory that the more ways there are to access a system, the more likely an attacker will be to gain access.<\/p>\n<p>The UK government has also granted police new powers to shut down protests as well as force\u00a0employees to work during industrial action \u2013 or face being sacked. Police forces are also\u00a0resorting to Section 60AA to require protesters to remove any item being worn for the purpose of concealing their identity, including, presumably, KN95 masks. Plus, as readers may recall, the Sunak government has also granted full management of the National Health Service\u2019s federated data platform to Palantir, a US tech giant with intimate ties to US defense and intelligence agencies.<\/p>\n<p>In its Data Protection and Digital Information (DPDI) Bill, the Sunak government even planned to abolish the role of the Biometrics and Surveillance Camera Commissioner (BSCC), an independent watchdog that was, to some extent, helping to hold the public sector to account for its use of AI. As we pointed out late last year, the government clearly wanted to have even freer rein to surveil and control the lives of British citizens. 
The proposed legislation also sought to scale back the UK GDPR and Data Protection Act of 2018.<\/p>\n<p>The former Biometrics and Surveillance Camera Commissioner, Professor Fraser Sampson, described the move as \u201cshocking\u201d and \u201ctantamount to vandalism.\u201d In the end, the DPDI was excluded from the \u201cwash-up\u201d process before Parliament\u2019s dissolution in the lead-up to the UK\u2019s general elections, leaving the BSCC intact \u2014 for now.<\/p>\n<p>Another Slippery Slope<\/p>\n<p>When it comes to biometric surveillance technologies, the UK\u2019s independent watchdogs appear to hold limited influence anyway. The two-year trials in the eight train stations all took place despite previous warnings from the UK\u2019s Information Commissioner\u2019s Office (ICO) against using the technology. Speaking in 2022, the ICO\u2019s deputy commissioner Stephen Bonner said:<\/p>\n<p>Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever. While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgments about a person that are inaccurate and lead to discrimination\u2026<\/p>\n<p>The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science\u2026 As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.<\/p>\n<p>If there\u2019s one silver lining to the technology used in the station trials, it is that it does not identify people, Carissa V\u00e9liz, an associate professor in philosophy at the Institute for Ethics in AI at the University of Oxford, told WIRED. 
But there is always the risk of a slippery slope, she said, citing similar AI trials on the London Underground that had initially blurred faces of people who may have been dodging fares before changing tack, unblurring photos and keeping images longer than initially planned.<\/p>\n<p>Lastly, if British voters are expecting a reversal of policy on the use of digital surveillance and control technologies by a future Keir Starmer government, they are likely to be sorely disappointed, especially given the Starmer team\u2019s cosy ties to the Tony Blair Institute for Global Change, which often touts digital technologies and biometric surveillance systems as the cure-alls for many of the world\u2019s deep-seated problems.<\/p>\n<p>In a speech at the WEF\u2019s 2020 cyber attack simulation event, \u201cCyber Polygon\u201d, Blair said that Digital Identity would form an \u201cinevitable\u201d part of the digital ecosystem being constructed around us, and so governments should work with technology companies to regulate its use \u2014 as the EU and Australia have already done. It is the perfect manifestation of the 21st-century Public Private Partnership \u2014 a digital panopticon designed and built by global tech companies, paid for with taxpayer funds, so that the government, security agencies and their corporate partners can more easily track, trace and control the populace.<\/p>\n<p>As a recent report by Big Brother Watch documents, the Labour Party under Jeremy Corbyn\u2019s leadership pledged to ban facial recognition, but the Labour 2024 Manifesto includes no such commitment. There is also \u201cno formal commitment in the manifesto to reject bank spying powers in the future\u201d or to reject the adoption of a central bank digital currency. 
Nor is there any commitment to prevent mandatory ID or digital identity.<\/p>\n<\/div>\n<p><a href=\"https:\/\/www.nakedcapitalism.com\/2024\/06\/latest-facial-recognition-scandal-in-uk-reveals-an-even-darker-side-to-ai-powered-big-brother.html\" target=\"_blank\" rel=\"noopener\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Where AI-powered surveillance and control technologies meet capitalism 101. A fresh expos\u00e9 by civil rights group Big Brother Watch has revealed that over the past two years eight train stations across the UK \u2014 including busy hubs such as London\u2019s Euston and Waterloo, Manchester Piccadilly, and several smaller stations \u2014 have conducted facial and object 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":491,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"tdm_status":"","tdm_grid_status":"","footnotes":""},"categories":[1],"tags":[],"class_list":["post-4655","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized"],"_links":{"self":[{"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/posts\/4655","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/uang69.id\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=4655"}],"version-history":[{"count":1,"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/posts\/4655\/revisions"}],"predecessor-version":[{"id":11321,"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/posts\/4655\/revisions\/11321"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/media\/491"}],"wp:attachment":[{"href":"https:\/\/uang69.id\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=4655"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/uang69.id\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=4655"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/uang69.id\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=4655"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}