Leader – Risk business


In the era of GDPR, the most stringent data protection rules ever introduced, and in the wake of the Cambridge Analytica scandal, it seems unfathomable that a local council would continue to secretly use information it collects from its citizens to decide which of them is a risk worth keeping an eye on.

Yet that is exactly what Hackney is doing, testing out software created by private data firm Xantura to see if it can accurately predict which families should be targeted by social workers – and paying an exorbitant amount for the privilege.

It may well be that this algorithm ends up saving lives, but that’s not the point.

What’s troubling is that our data is being used, without our knowledge or say-so, for a programme designed by humans, with all their opinions, ideas and biases, to cast judgement on our ability to raise a family.

Does our skin colour set us apart? Our religious beliefs? If we fail at school, are we marked out, ready for the day we have children?

We don’t know, because Xantura doesn’t want other treasure hunters to see its map.

We don’t know, because the council is happy to play along, no doubt in the hope it will help stretched social workers, but also in the knowledge that it could lead to extra funding to fill its emptying coffers.

As campaigners have pointed out, companies that choose to go mining on taxpayers’ land should expect to do so in plain sight.

More than that, the councils that let them in on our behalf should demand it.