Council’s benefit algorithm is secretive, unevidenced, incredibly invasive and more likely than not discriminatory and unlawful
Artificial intelligence (AI) systems and algorithmic decision-making are mainstays of every sector of the global economy.
From search engine recommendations and advertising to credit scoring, predictive policing and the digital welfare state, algorithms are deployed in an expansive range of uses. They are often posited by advocates as a dispassionate and fairer means of making decisions, free from the influence of human prejudice.
In the last five years, tens of thousands of households across the Folkestone & Hythe District who’ve claimed Housing Benefit (HB) and/or Council Tax Reduction (CTR) have had their claims processed by Xantura’s Risk Based Verification black box algorithm. None of the claimants were made aware they were being screened by an algorithm or that they had been assigned a fraud risk score, and none gave their explicit consent.
In the last few days Xantura’s RBV algorithm has come under scrutiny from Big Brother Watch (BBW), whose Poverty Panopticon report uncovers vast quantities of evidence that the use of algorithms – including Xantura’s – to process HB & CTR claims is secretive, unevidenced, incredibly invasive and more likely than not discriminatory and unlawful.
What is Risk Based Verification?
Risk Based Verification (RBV) in local government is used to assess the risk of ‘fraud and error’ in Housing Benefit & Council Tax Reduction claims. It was promoted to councils by the Department for Work and Pensions (DWP) in Circular HB/CTB S11/2011 as a modern way to streamline benefit applications: low-risk applicants for housing benefit and council tax support supply fewer documents to support their claims, allowing councils to focus resources on verifying riskier applications.
The rules applying to Risk Based Verification (RBV) are very clear: the risk score each claimant is assigned by Xantura’s algorithm cannot be downgraded; it can be upgraded, for a whole host of reasons. There is no process of appeal, nor are HB & CTR claimants even aware they are being processed by an algorithm.
What did the DWP Circular say?
Once a claimant is assigned a risk score and category, “individual claims cannot be downgraded by the benefit processor to a lower risk group. They can however, exceptionally, be upgraded if the processor has reasons to think this is appropriate.”
The information held in the [Council’s RBV] Policy, which would include the risk categories, should not be made public due to the sensitivity of its contents.
It also stated:
The Policy must allow Members, officers and external auditors to be clear about the levels of verification necessary. It must be reviewed annually but not changed in-year as this would complicate the audit process.
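Returning to the re-grading rule quoted above, here is a minimal sketch of that one-way rule in Python. The three risk groups and the enforcement logic are assumptions for illustration; neither the circular nor the Council publishes how Xantura actually implements it.

```python
# Illustrative sketch of the DWP circular's one-way re-grading rule.
# The risk groups are assumed; Xantura's real implementation is secret.

RISK_GROUPS = ["low", "medium", "high"]  # ascending order of risk

def regrade(current: str, proposed: str) -> str:
    """A claim may exceptionally be upgraded to a higher risk group,
    but can never be downgraded to a lower one."""
    if RISK_GROUPS.index(proposed) > RISK_GROUPS.index(current):
        return proposed  # exceptional upgrade by the processor
    return current       # any attempted downgrade is ignored

assert regrade("medium", "high") == "high"   # upgrade: allowed
assert regrade("medium", "low") == "medium"  # downgrade: blocked
```

Note the asymmetry: once the algorithm places a claimant in a higher-risk group, no human decision can move them back down.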
When did Folkestone & Hythe District Council introduce Risk Based Verification?
On the 7th Sept 2016, Folkestone & Hythe District Council Audit & Governance Committee met to discuss Agenda item 14 – Report C/16/41. Seven days later this went to the Cabinet for sign off. RBV went live in Oct 2016.
Report C/16/41 states at Para 3.3:
“Due to the nature of the content of the policy, it is not made publicly available.”
There is no mention of Xantura in Report C/16/41, so how do we know Xantura provide the Council with the algorithm?
In July 2018 Data Justice Lab made an FoI request to the Council; on the 16th August the Council responded by saying: “The only software that Folkestone & Hythe District Council operate that uses algorithms is Xantura. This is used to return the “risk score” of a benefit claim…”
How many people who have claimed HB or CTR have had a risk score for fraud assigned to them?
Since the Council started using the Xantura algorithm back in Oct 2016, tens of thousands of claimants for HB and CTR have been assigned a risk score for fraud without their knowledge and without their explicit consent. These are predominantly people on low incomes.
What is the aim of the policy?
The aim of the policy according to the Council is to prevent fraud & error in HB & CTR claims and save the council money.
Has it achieved that?
No. Each year the council must disclose the total number of fraud/irregularity cases investigated. In 2016/17 that was 1,172. Since then it has not published this data. It must also disclose the total amount spent on the investigations and prosecutions of fraud, which in 2016/17 was £42,041. Since then it has not published this data. This contravenes statutory guidance.
Has the RBV Policy ever been made public?
No!
Is that lawful?
That is very unlikely, as it is not in line with the First Data Protection Principle, which makes clear that data must be processed lawfully, fairly and in a transparent manner in relation to individuals; the Council do not inform HB and CTR claimants that their information was, or is, being processed by an algorithm. As such the Council’s RBV Policy is more likely than not unlawful, unfair and opaque. In this respect it has failed to respect basic human rights.
Are there any legal cases which support this stance?
Yes. The Hague District Court’s judgement in February 2020 on Systeem Risico Indicatie (SyRI), determined that the legislation enabling their risk profiling system does not comply with Article 8 of the European Convention on Human Rights (ECHR). Given the close functional similarities between SyRI and RBV algorithms as they both risk profile claimants for state benefits, it is legitimate to ask whether the use of RBV does in fact violate British law.
Has the RBV Policy been reviewed by Cllrs, their internal auditors EKAP, or their external auditors, Grant Thornton?
There is no evidence in the public domain to show that, since the RBV Policy was introduced in Oct 2016, any reviews of it have been undertaken by any of those mentioned above.
That appears to contradict what the DWP Circular says about reviewing the policy every year.
It does, but no Cllr has cottoned on to the fact that the Council are in breach of their governance principles and disregarding their legal responsibilities.
If the Council cannot determine how the black box algorithm rules are being applied, or adjust them, how can they know how those rules are applied to different groups?
They can’t, and Report C/16/41 states at Para 5.3: “The policy has been produced in line with Department of Work and Pensions guidance on the use of Risk Based Verification circular S11/11 and is the basis for seeking to reduce the amount of fraud and errors within the process. It is therefore not deemed to have an impact on the protected characteristics and will in fact, seek to ensure a fairer and more robust process in the system.”
How can they say there is no impact on the protected characteristics as set out in the Equality Act 2010?
The Council simply cannot determine how, or even if, the Xantura RBV algorithm is taking into account the Protected Characteristics under the Equality Act. This is made clear in the Big Brother Watch Poverty Panopticon report. Nor can they know which characteristics are being processed, or what weightings (if any) are being applied to them – other than by very carefully designed post hoc sampling, a sketch of which follows below.
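To give a flavour of what such post hoc sampling could look like, here is a hedged sketch: compare how often a protected group is placed in the high-risk category against everyone else. The records, field names and figures are entirely invented; real data would have to come from the Council’s own claim records.

```python
# Hypothetical post hoc disparity check on RBV outcomes.
# All records and field names below are invented for illustration.

claims = [
    {"disabled": True,  "risk": "high"},
    {"disabled": True,  "risk": "low"},
    {"disabled": False, "risk": "low"},
    {"disabled": False, "risk": "high"},
    {"disabled": False, "risk": "low"},
    # ... a large, representative sample would be needed in practice
]

def high_risk_rate(group: list) -> float:
    """Proportion of a group assigned the high-risk category."""
    return sum(c["risk"] == "high" for c in group) / len(group)

disabled = [c for c in claims if c["disabled"]]
others = [c for c in claims if not c["disabled"]]

print(f"High-risk rate, disabled claimants: {high_risk_rate(disabled):.0%}")
print(f"High-risk rate, other claimants:    {high_risk_rate(others):.0%}")
# A persistent gap across a well-designed sample would be evidence of
# disparate impact that the Council would be obliged to investigate.
```

Nothing in the public record suggests the Council runs any check of this kind.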
How can the Council determine whether people with Protected Characteristics are not being discriminated against without opening up the black box algorithm?
They can’t, as they cannot open the Xantura algorithm.
Why not?
The algorithm is not theirs.
So what’s all the fuss about, if nobody really cares?
Well, Big Brother Watch have submitted their report – Poverty Panopticon: the hidden algorithms shaping Britain’s welfare state – to the ICO for investigation, as they and many others believe the algorithms, including Xantura’s, are secretive, unevidenced, incredibly invasive and more likely than not discriminatory and unlawful.
What does the Xantura algorithm look like?
Like this – the original image of the formula is not reproduced here, but the variable definitions below imply an additive regression model of the form:

S = Const + (D₁ × C₁) + (D₂ × C₂) + … + (Dₙ × Cₙ)

where:
S = the risk score from the model
D = whether a given variable (such as having a partner) is present
C = the regression coefficient associated with the variable (the larger the coefficient, the greater the impact on the final score and therefore the higher the perceived risk associated with that variable)
Const = a constant value in the model
n = the total number of variables in the model
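A minimal sketch of that additive model in Python follows. The variable names and coefficient values are invented purely for illustration; Xantura’s real variables and coefficients are secret.

```python
# Minimal sketch of the additive scoring model described above.
# Variable names and coefficient values are invented; Xantura's
# actual variables and coefficients are not public.

CONST = -1.5  # hypothetical constant term (Const)

coefficients = {  # hypothetical regression coefficients (C)
    "has_partner": 0.4,
    "self_employed": 0.9,
    "new_claim": 0.2,
}

def risk_score(claim: dict) -> float:
    """S = Const + the sum of D_i * C_i over all n variables,
    where D_i is 1 if variable i is present in the claim, else 0."""
    return CONST + sum(
        c * int(bool(claim.get(name, False)))
        for name, c in coefficients.items()
    )

print(f"{risk_score({'has_partner': True, 'self_employed': True}):.2f}")  # -0.20
```

The larger a coefficient, the harder that single variable pushes the final score – which is exactly why keeping the coefficients secret means nobody outside Xantura can say what makes a claim ‘high risk’.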
How did Xantura develop their algorithm?
They used data from four diverse local authorities gathered over a period of sixteen months; the model now includes more than 50 variables. The regression coefficients, which describe the influence of each variable on the outcome (i.e. the risk score), are also kept secret, which means that nobody knows what makes a particular HB or CTR application high or low risk.
Does having more children assign you a riskier or safer fraud risk score?
What we can say is – the number of children you have is not a good proxy for honesty and should not normally be used as such.
Does Age, Sex or Disability assign you a riskier or safer fraud risk score?
What we can say is – none of these three characteristics are a good proxy for honesty and should not normally be used as such.
The Council’s head of legal, Amandeep Khroud, said in Report C/16/41: [The RBV Policy] “is therefore not deemed to have an impact on the protected characteristics [such as the three above] and will in fact, seek to ensure a fairer and more robust process in the system. No Equality Impact Assessment therefor has been undertaken.”
The RBV Policy was waved through without any consideration of the real risk of discrimination the algorithms pose, and nobody has a clue about the damage these algorithms could do because bias and disproportionality are not monitored.
So the Council’s lawyer is saying the Xantura RBV algorithm does not take into account any of the protected characteristics covered by the Equality Act – how does she know?
Put simply, neither she nor the Council can determine that those with Protected Characteristics are not being adversely affected by the Xantura algorithm. The algorithm may not be fed such characteristics directly – the Council’s claim being that no information about age, sex or disability is entered into it – yet the images below, taken from the Council’s website, clearly show that such characteristics do appear to be taken into account.
Sssshhh!
Why?
The Council don’t want you to know that they don’t know if the Xantura RBV algorithm is lawful or not.
Who are the Cabinet Members & Council Officer responsible for the Xantura RBV algorithm?
Cllr Tim Prater – Cabinet Member for Revenues, Benefits, Anti-Fraud and Corruption
Cllr Ray Field – Cabinet Member for Information technology, information access & security, RIPA and Customer service
Amandeep Khroud – Assistant Director for Governance, Law and Regulatory Services
Has Mrs Khroud investigated?