Report on AI in UK public sector: Some transparency on how government uses it to govern us would be nice

Ya think?

A new report from the Committee on Standards in Public Life has criticised the UK government's stance on transparency in AI governance and called for ethics to be "embedded" in the frameworks.

The 74-page treatise noted that algorithms are currently being used or developed in healthcare, policing, welfare, social care and immigration. Despite this, the government doesn't publish any centralised audit of the extent of AI use across central government or the wider public sector.

Most of what is in the public realm at present is thanks to journalists and academics making Freedom of Information requests or rifling through the bins of public procurement data, rather than public bodies taking the proactive step of releasing information about how they use AI.

The committee said the public should have access to the "information about the evidence, assumptions and principles on which policy decisions have been made".

In focus groups assembled for the review, members of the public themselves expressed a clear desire for openness, as you'd expect.

"This serious report sadly confirms what we know to be the case – that the Conservative government is failing on openness and transparency when it comes to the use of AI in the public sector," shadow digital minister Chi Onwurah MP said in a statement.

"The government urgently needs to get a grip before the potential for unintended consequences gets out of control," said Onwurah, who argued that the public sector should not accept further AI algorithms in decision-making processes without introducing further regulation.

Simon Burall, senior associate with the public participation charity Involve, commented: "It's important that these debates involve the public as well as elected representatives and experts, and that the diversity of the communities that are affected by these algorithms are also involved in informing the trade-offs about when these algorithms should be used and not."

Predictive policing programmes are already being used to identify crime "hotspots" and to make individual risk assessments, where police use algorithms to determine the likelihood of someone committing a crime.

But human rights group Liberty has urged police to stop using these programmes because they entrench existing biases. Using inadequate data and indirect markers for race (like postcodes) could perpetuate discrimination, the group warned. There is also a "severe lack of transparency" with regard to how these techniques are deployed, it said.

The committee's report noted that the "application of anti-discrimination law to AI needs to be clarified".

In October 2019, the Graun reported that one in three local councils were using algorithms to make welfare decisions. Local authorities have bought machine learning packages from companies including Experian, TransUnion, Capita and Peter Thiel's data-mining biz Palantir – which has its fans in the US public sector – to support a cost-cutting drive.

These algorithms have already caused cock-ups. North Tyneside council was forced to drop TransUnion, whose system it used to check housing and council tax benefit claims, when welfare payments to an unknown number of people were delayed thanks to the computer's "predictive analytics" wrongly classifying low-risk claims as high risk.

The report stopped short of recommending an independent AI regulator. Instead it said: "All regulators must adapt to the challenges that AI poses to their specific sectors."

The committee endorsed the government's intention to establish the Centre for Data Ethics and Innovation as "an independent, statutory body that will advise government and regulators in this area". So that's all right then. ®
