In the first week of December 2016 we shared news that indicated a major breakthrough over existing technology-based solutions for tackling online child sex offences. It was later learnt that not many organizations received that mailing, hence I am resending the news items.
I have in the past, on several platforms, shared my observation that empirical research on pornography is physically, mentally and legally hazardous. Online sexual offences against children cannot be tackled merely with software-based solutions, although such solutions can facilitate enforcement. The existing software-based solutions have severe limitations arising from legal multiplicity and diversity, from uneven political will across the States of the world to combat the crime, and from hugely unreliable databases. Many of the existing solutions also cannot address the peer-to-peer transfer of images, which is rampant.
Against that background, the new solution described in the news below appears to be a significant breakthrough and a big leap forward. Trials of the solution have also shown high accuracy. It is quite surprising that the news has not received the attention it deserved.
Our governments, international organizations, UN agencies and the media would do well to facilitate the adoption of this solution, or to work to evolve our own versions (given our computing and IT capacity), and to involve the enforcement agencies in the process, which will foster their sense of ownership.
Artificial intelligence toolkit spots new child sexual abuse media online
1 December 2016 10:00
New artificial intelligence software
designed to spot new child sexual abuse media online could help police catch
child abusers.
The toolkit, described in a paper published in Digital Investigation, automatically
detects new child sexual abuse photos and videos in online peer-to-peer
networks.
The research behind this technology was conducted in the international research project iCOP – Identifying and Catching Originators in P2P Networks – funded by the European Commission Safer Internet Program, by researchers at Lancaster University, the German Research Center for Artificial Intelligence (DFKI), and University College Cork, Ireland.
There are hundreds of searches for child abuse images every second worldwide,
resulting in hundreds of thousands of child sexual abuse images and videos
being shared every year. The people who produce child sexual abuse media are
often abusers themselves – the US National Center for Missing and Exploited
Children found that 16 percent of the people who possess such media had
directly and physically abused children.
Spotting newly produced media online can give law enforcement agencies the fresh
evidence they need to find and prosecute offenders. But the sheer volume of
activity on peer-to-peer networks makes manual detection virtually impossible.
The new toolkit automatically identifies new or previously unknown child sexual
abuse media using artificial intelligence.
“Identifying new child sexual abuse media is critical because it can indicate recent or
ongoing child abuse,” explained Claudia Peersman, lead author of the
study from Lancaster University's School of Computing and Communications. “And because originators of such media can be hands-on abusers, their
early detection and apprehension can safeguard their victims from further
abuse.”
There are already a number of tools available to help law enforcement agents monitor
peer-to-peer networks for child sexual abuse media, but they usually rely on
identifying known media. As a result, these tools are unable to assess the
thousands of results they retrieve and can’t spot new media that appear.
The iCOP toolkit uses artificial intelligence and machine learning to flag new and
previously unknown child sexual abuse media. The new approach combines
automatic filename and media analysis techniques in an intelligent filtering
module. The software can identify new criminal media and distinguish it from
other media being shared, such as adult pornography.
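To make that description concrete, here is a minimal, hypothetical sketch in Python of what such an intelligent filtering module could look like. It is not the iCOP implementation: the Candidate fields, the equal fusion weights and the 0.9 threshold are all illustrative assumptions, standing in for classifiers that a real system would train on labelled data.

# Hypothetical sketch of a two-stream filtering module like the one
# described above. This is NOT the iCOP code; the fields, fusion
# weights and threshold are illustrative assumptions only.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Candidate:
    path: str              # location of the shared file on the network
    filename_score: float  # output of a filename/keyword classifier, 0..1
    media_score: float     # output of an image/video content classifier, 0..1

def filter_candidates(candidates: List[Candidate],
                      threshold: float = 0.9) -> List[Tuple[float, Candidate]]:
    """Fuse the two per-file scores and return only high-confidence hits,
    ranked so that investigators see the most urgent cases first."""
    flagged = []
    for c in candidates:
        # Simple late fusion: average the two classifier outputs.
        # A deployed system would learn these weights from labelled data.
        fused = 0.5 * c.filename_score + 0.5 * c.media_score
        if fused >= threshold:
            flagged.append((fused, c))
    return sorted(flagged, key=lambda pair: pair[0], reverse=True)

The point of such a filter is volume reduction: of the thousands of results retrieved from a peer-to-peer network, only the short ranked list above the threshold would ever reach a human specialist.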
The researchers tested iCOP on real-life cases and law enforcement officers trialed
the toolkit. It was highly accurate, with a false positive rate of only 7.9%
for images and 4.3% for videos. It was also complementary to the systems and
workflows they already use. And since the system can reveal who is sharing
known child sexual abuse media, and show other files shared by those people, it
will be highly relevant and useful to law enforcers.
“When I was just starting as a junior researcher interested in computational
linguistics, I attended a presentation by an Interpol police officer who was
arguing that the academic world should focus more on developing solutions to
detect child abuse media online,” said Peersman. “Although he clearly
acknowledged that there are other crimes that also deserve attention, at one
point he said: ‘You know those sweet toddler hands with dimple-knuckles? I see
them online… every day.’ From that moment I knew I wanted to do something to
help stop this. With iCOP we hope we’re giving police the tools they need to
catch child sexual abusers early based on what they’re sharing online.”
AI technology to identify child sexual abuse offenders
A new toolkit using artificial
intelligence will help police identify new images of child sexual abuse online
and find offenders who present the highest risk to the public, according to a
UCC researcher behind the technology.
According to Maggie Brennan, researcher and lecturer in the
Schools of Applied Psychology and Criminology at UCC, who worked on the UCC
research team with Sean Hammond of UCC's School of Applied Psychology, the iCOP
toolkit will automatically identify new and previously unseen images of child
sexual abuse for police and help to reduce the volumes of materials specialists
have to view in order to find children.
“It's common to seize computers and collections of child sexual
abuse materials containing enormous volumes of illegal materials, terabytes of
individual files. Having to view this material to find victims can be traumatic
and distressing for the specialists working to find these children.”
UCC researchers Maggie Brennan and Sean Hammond develop tool for identifying child abuse images: http://bit.ly/2glr89G
Although there are already a number of tools available to help
the police monitor peer-to-peer networks for child sexual abuse media, they
usually rely on identifying known media. As a result, these tools are unable to
assess the thousands of results they retrieve, whereas the iCOP toolkit uses
artificial intelligence and machine learning to flag new and previously unknown
child sexual abuse media.
The new approach combines automatic filename and media analysis
techniques in an intelligent filtering module. The software can identify new
criminal media and distinguish it from other media being shared, such as adult
pornography.
BBC News – Toddler hand inspired AI child sex abuse tool: http://www.bbc.co.uk/news/technology-38171457
According to Brennan, “law enforcement urgently need these kinds of supports to help them
manage the volumes of cases they are being faced with - to find the children
who are victimised in these images and videos, as well as those offenders who
present the highest risk to the public.”
The research behind this technology was
conducted in the international research project iCOP – Identifying and Catching
Originators in P2P Networks – funded by the European Commission Safer Internet
Program by researchers at UCC, Lancaster University and the German Research
Center for Artificial Intelligence (DFKI).
The team at UCC worked closely with
international law enforcement specialists in online child sexual abuse
investigation to understand their needs and develop a tool that allows them to
find the most urgent cases for intervention. “Our role also involved developing
a psychological profiling system to identify viewers of child sexual abuse
images who may be at risk of committing hands-on abuse.”
Researchers in Cork develop technology that will identify child sexual abuse images online (via thejournal.ie): http://jrnl.ie/3116461
Current systems mean specialists have to view “traumatic and distressing” material
“We have been researching this topic with international law
enforcement agencies like Interpol for many years, since the early 2000s. The volumes of child sexual abuse images and videos now in circulation are a real
concern, and it can be overwhelming for law enforcement. Trying to find recent
or ongoing cases of child sexual abuse is an absolute priority, but the sheer
volume of illegal materials in circulation online makes this task incredibly
difficult for the police,” Brennan said.
The researchers tested iCOP on real-life cases and police trialed the toolkit. It was highly accurate, with a false positive rate of only 7.9%
for images and 4.3% for videos. As the system reveals who is sharing known
child sexual abuse media, and shows other files shared by those people, it will
be highly relevant and useful to police.