UK Government Goes Full Orwellian. DNA, Fingerprint, Face, Voice: Biometric Data for Every Single Citizen in the UK

We’ve been warning about this moment since the first day TruePublica went online. We said that the government would eventually take the biometric data of every single citizen living in Britain and use it for nefarious reasons.  DNA, fingerprint, face, and even voice data will be included. But that’s not all.

The excuse to be used, as ever, will be national security or terrorism, despite the huge fall in fatalities from terrorism and terror-related incidents since the 1970s.

Apart from crime-fighting, the Home Office also proposes in its long-awaited report that it will use the centralized database for vetting migrants on the streets and borders of Britain.

Not for the first time, civil rights groups argue that systems such as facial recognition are faulty, of dubious legality, and fed with data collected without public consent. The outcry over Facebook, Cambridge Analytica and the EU referendum should, if nothing else, confirm that bulk data collection, used without either public debate or a legal basis, is emphatically against our civil liberties.

However, the legality of creating a centralised biometric database will not stop a government that has been repeatedly caught breaking the law when it comes to privacy and data collection. Police, immigration, and passport agencies already collect DNA, face, and fingerprint data. On the latter, police forces now deploy fingerprint scanners on the streets of Britain, with officers offering no more than a promise that the fingerprint data taken will be erased if the person stopped is innocent of any crime.

The government’s face database already holds 12.5 million people – or at least, that is the number it has admitted to. The Home Office, already embroiled in all sorts of privacy and surveillance legal cases, caused a scandal last April when an official said it would simply be too expensive to remove innocent people from its facial databases of mugshots.

Without proper, enforceable regulation that can be fully scrutinised by civil society, there are many opportunities for the misuse of biometric data. It means nothing when the Home Office says its collection of biometric data will be “lawful” when it has been found by the highest courts in both Britain and the EU to have broken basic surveillance and data protection laws. And the laws that do exist remain deliberately ambiguous about how biometric data will be ethically collected, stored, or shared.

Without any obstacles put in its way, the Home Office has essentially granted itself the right to end anonymity of any kind for all the people of Britain.

Big Brother Watch recently released a report detailing a staggering 90 per cent false positive rate for police facial recognition systems, and went on to describe the Home Office’s defence of those systems as “misleading, incompetent and authoritarian.”
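To make that figure concrete, here is a minimal sketch of the arithmetic, assuming – as reports of this kind usually mean – that the “false positive rate” is the share of matches flagged by the cameras that turned out not to be the person sought. The counts used are illustrative, not taken from the Big Brother Watch report.

```python
# Illustrative arithmetic only: in this context, a "90% false positive rate"
# usually means that 90% of the matches the cameras flagged were not the
# person being sought. The counts below are made up for the example.

def false_match_rate(total_alerts: int, correct_matches: int) -> float:
    """Fraction of alerts that did not identify the right person."""
    if total_alerts == 0:
        return 0.0
    return (total_alerts - correct_matches) / total_alerts

# Hypothetical deployment: 100 alerts raised, only 10 of them genuine.
rate = false_match_rate(total_alerts=100, correct_matches=10)
print(f"False match rate: {rate:.0%}")  # prints: False match rate: 90%
```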

The fact that on Remembrance Sunday 2017 the Metropolitan Police used automated facial recognition to find so-called ‘fixated individuals’ – people not suspected of any crime, but who might be suffering from mental health issues – should be a wake-up call for us all.

TruePublica has just reported on one local authority, Thurrock, using databases and algorithms to deliver public services. More particularly, it is surveilling its own systems and citizens to pinpoint and target certain families, vulnerable people, the homeless and cases of anti-social behaviour. The system is called a “predictive modelling platform” and was only revealed through a freedom of information request by a local journalist.

Council data from housing, education, social care, benefits and debt all contribute to the creation of a profile that is used to predict whether a person is at risk and what services should be provided. The profiles assign each person a score that indicates whether they need attention from social services. That risk score is then stored with the identifiable details replaced by artificial ones, a process known as pseudonymisation.
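As an illustration of what that step involves, here is a minimal sketch of a risk score being attached to a pseudonymised record. The field names, weights and threshold are entirely hypothetical; nothing here reflects Thurrock’s actual platform.

```python
import secrets

# Hypothetical citizen record assembled from the council data sources
# mentioned above. Field names and values are invented for illustration.
record = {
    "name": "Jane Doe",
    "address": "1 Example Street",
    "housing_arrears": True,
    "school_absences": 12,
    "benefits_claims": 2,
}

# Toy risk score: a weighted count of flags. A real "predictive modelling
# platform" would use a far more elaborate statistical model.
risk_score = (
    (3 if record["housing_arrears"] else 0)
    + record["school_absences"] // 5
    + record["benefits_claims"]
)

# Pseudonymisation: swap the identifying details for an artificial token and
# keep the token-to-identity mapping in a separate, access-controlled table.
pseudonym = secrets.token_hex(8)
identity_lookup = {pseudonym: {"name": record["name"], "address": record["address"]}}

stored_profile = {
    "subject_id": pseudonym,
    "risk_score": risk_score,
    "flag_for_social_services": risk_score >= 5,
}
print(stored_profile)
```

The obvious weakness is that whoever holds the lookup table can re-identify every profile, which is why pseudonymised data is not the same as anonymised data.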

The warning we gave was that it wouldn’t be long before all citizens were given such scores by local councils, local authorities, the police and various other government agencies. The speed of implementation has surprised even us, though. One should not forget that there are 78 high-profile government agencies and a further 401 public bodies closely associated with them.

To be fair to Thurrock council, the system has become so embedded within its social services that it is now responsible for 100 per cent of referrals to the Troubled Families programme, a government-led scheme aimed at early social work intervention. The council also claims an 80 per cent success rate in predicting children who are at risk and should enter safeguarding. It does not say how the system failed the other 20 per cent or how it affected them.

However, there is a dark side to this. TruePublica warned two years ago that social scoring systems were on the way. We wrote in 2016 and then again in early 2017 as a result of an in-depth report by Civil Society Futures regarding a new wave of surveillance:

“Citizens are increasingly categorised and profiled according to data assemblages, for example through data scores or by social credit scores, as developed in China. The purpose of such scores is to predict future behaviour and allocate resources and eligibility for services (or punishment) accordingly. In other words, rules will be set for citizens to live by through data and algorithms.”

The government is now building, without debate, just such a system for all of its agencies to access and feed into. Once complete, the next step will be to ‘manage’ population behaviour through social credit scores.

Current common forms of biometric data collection include: fingerprint templates, iris and retina templates, voiceprints, 2D or 3D facial structure maps, hand and/or finger geometry maps, vein recognition templates, gait analysis maps, blood DNA profiles, behavioural biometric profiles and others.

Until recently, gathering biometric data generally required the cooperation (or coercion) of the subject: for your iris to get into a database, you had to let someone take a close-up photograph of your eyeball. That is no longer the case. Researchers at Carnegie Mellon University in the USA have perfected a camera that can take rapid-fire, database-quality iris scans of every person in a crowd from a distance of 10 metres. Consent is not required – indeed, the technology would be worthless if it were.

In the meantime, biometric data will not be secure. It never can be, especially in the hands of the government. That’s because the hacking industry, already costing a mind-blowing $1.6 trillion annually across the world, is expected to reach $2.1 trillion in just three months’ time. That’s the sum spent fighting off cyber-crime, not the sum spent on conducting it.

Identity theft directly affected 174,523 individuals in Britain last year – an increase of 125 per cent in little more than ten years. The authorities have simply been unable to stop this inexorable rise. Recent research has found that fraudsters operating on the dark web can buy a person’s entire identity – everything, the lot – for just £820. At that point your bank accounts are emptied, your credit cards maxed out – the horrendous list goes on. £4.6 billion was stolen from Brits through cyber-crime last year.

Would a new form of identity theft develop with biometric data added to the armoury of criminals?

At the very least, the government should restrict the collation of different types of biometric data into a single database. And it should certainly require that all biometric data be stored in the most secure manner possible. Currently, it is proposing neither, as the database will be available to thousands of government workers and hundreds of technology contractors.

And how easy is that theft going to be? Edward Snowden, a third-party contractor for the NSA, stole 58,000 GCHQ files while sitting at his desk in Hawaii and then calmly flew off to Moscow for protection from the USA and UK. If GCHQ can have such sensitive information – information it claims is a matter of national security – stolen so easily, what guarantees can the government give that your biometric data will be safe? The short answer is that it can’t.

Big Brother Watch director Silkie Carlo said:

“The Government’s biometrics strategy is a major disappointment. After five years of waiting, it reads like a late piece of homework with a remarkable lack of any strategy.

While Big Brother Watch and others are doing serious work to analyse the rights impact of the growing use of biometrics, the Home Office appears to lack either the will or competence to take the issues seriously. For a government that is building some of the biggest biometric databases in the world, this is alarming.

Meanwhile, the Met today is surveilling Londoners with facial recognition cameras that they have no legal basis to even use. The situation is disastrously out of control.”

