British local government councils are gathering data on hundreds of thousands of people and using artificial intelligence to predict child abuse and intervene before it can happen.
Predictive analytics systems are using a range of social data to algorithmically identify families that may need attention from child services. The data analysed includes school attendance and exclusion data, housing association repairs and arrears data, and police records on antisocial behaviour and domestic violence.
According to a recent study, 4.5 million children live in poverty in the UK. At the same time, local government budgets are under unprecedented pressure. The Local Government Association says government funding for councils will be cut by £16bn by 2020 and that children's services are at “breaking point.” It is in this context that councils are turning to predictive analytics to better manage stretched resources. Fears have been raised, however, about the repercussions this could have for privacy and whether it could inadvertently discriminate against minorities.
Could schemes such as these help people at risk, or do they invade families' privacy? And will other local governments turn to AI to manage tight financial resources?