Report urges fixes to online child exploitation CyberTipline before AI makes it worse
2024-05-21 13:06:47
A tipline set up 26 years ago to combat online child exploitation has not lived up to its potential and needs technological and other improvements to help law enforcement go after abusers and rescue victims, a new report from the Stanford Internet Observatory has found.
The fixes to what the researchers describe as an “enormously valuable” service must also come urgently as new artificial intelligence technology threatens to worsen its problems.
“Almost certainly in the years to come, the CyberTipline will just be flooded with highly realistic-looking AI content, which is going to make it even harder for law enforcement to identify real children who need to be rescued,” said researcher Shelby Grossman, an author of the report.
The service was established by Congress as the main line of defense for children who are exploited online. By law, tech companies must report any child sexual abuse material they find on their platforms to the system, which is operated by the National Center for Missing and Exploited Children. After it receives the reports, NCMEC attempts to find the people who sent or received the material, as well as the victims, if possible. These reports are then sent to law enforcement.