This could change the circumstances of those suffering abuses – such as child labour – and help prosecute perpetrators, by scanning millions of photographs on social media for signs of abuse.
However, the sheer volume of images generated by security cameras, social media users and other sources means that to inspect every image in turn would be tedious and time-consuming work for humans.
“Our aim is to make the lives of those combatting abuse much easier, as at the moment they are drowning in data,” said Professor Klaus McDonald-Maier of the University of Essex, who is leading the study. The computer scientists at the University of Essex are joined by collaborators at the University of Bonn and the University of Birmingham in the Economic and Social Research Council-funded project: The Human Rights, Big Data and Technology Project.
“With this system, which can identify abuse and then categorise it according to the type of abuse, they can go through images very quickly to narrow down the field and identify pictures which need to be looked at in more detail.”
The computer vision system uses convolutional neural networks, a type of artificial neural network inspired by the organisation of the biological visual cortex. These neural networks have proved particularly effective at processing huge data sets of images or words to search for patterns.
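The core pattern-detecting operation inside a convolutional layer can be illustrated with a small sketch. This is not the researchers' actual system – just a minimal, self-contained example of how a hand-picked kernel slides over an image and responds strongly where a local pattern (here, a vertical edge) appears; a trained network learns many such kernels from data.

```python
def conv2d(image, kernel):
    # Valid-mode 2D cross-correlation: slide the kernel over the image
    # and take the sum of element-wise products at each position.
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

# Toy 4x4 greyscale image with a dark-to-bright vertical boundary.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
# A simple vertical-edge kernel: responds where brightness jumps left-to-right.
edge_kernel = [[-1, 1]]
feature_map = conv2d(image, edge_kernel)
# feature_map -> [[0, 1, 0], [0, 1, 0], [0, 1, 0], [0, 1, 0]]
# The response peaks exactly along the dark/bright boundary.
```

In a real convolutional network, many such feature maps are stacked, passed through non-linearities and pooling, and fed to further layers, so the network can build up from edges to textures to whole-scene patterns.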
The system was trained and tested using 5,000 images: a small number by machine-learning standards. This included the Human Rights Understanding data set, which contains 100 images each portraying child labour, refugees, police violence and child soldiers. The system, trained on this data, achieved “promising” results, detecting human rights violations in these categories with 88.1 per cent accuracy.
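Accuracy figures like the 88.1 per cent quoted above are computed by comparing a model's predictions against ground-truth labels on held-out test images. The labels below are purely illustrative toy data, not drawn from the actual data set:

```python
# Hypothetical ground-truth and predicted labels for five test images.
true_labels = ["child_labour", "refugees", "police_violence",
               "child_soldiers", "refugees"]
predicted   = ["child_labour", "refugees", "police_violence",
               "refugees", "refugees"]

# Accuracy = fraction of test images whose predicted category
# matches the ground-truth category.
correct = sum(t == p for t, p in zip(true_labels, predicted))
accuracy = correct / len(true_labels)  # 4 of 5 correct -> 0.8
```

On a real evaluation, the same ratio would be taken over the full held-out test split rather than five examples.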
In the next stage of the project, the researchers will use a much larger data set of photographs, including new categories of human rights violations, in order to improve the accuracy and scope of the system. It is hoped that in the future, a similar technique could be used to review video footage for potential abuses of human rights.