Facebook will open-source two algorithms it uses to identify child sexual exploitation, terrorist propaganda, and graphic violence, the company said today. PDQ, a photo-matching technology, and TMK+PDQF, a video-matching technology, represent files as compact digital hashes and compare them against known examples of harmful content; both have been released on GitHub, Facebook said in a blog post.
Facebook said it hopes that other tech companies, nonprofit organizations, and individual developers will use the technology to identify more harmful content and add it to shared databases. That helps platforms remove the content more quickly when people attempt to upload it.
“For those who already use their own or other content matching technology, these technologies are another layer of defense and allow hash sharing systems to talk to each other, making them that much more powerful,” the company said in a blog post.
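The matching scheme described above can be sketched in a few lines: a platform hashes an uploaded file, then checks whether that hash falls within a small Hamming distance of any entry in a shared database of known harmful content. The hash values and the distance threshold below are illustrative assumptions for a 256-bit perceptual hash like PDQ's, not Facebook's actual implementation.

```python
# Hypothetical sketch of hash-based matching against a shared database.
# PDQ produces a 256-bit perceptual hash; placeholder hex strings stand in
# for real hashes here, and the threshold is an assumed value.

def hamming_distance(h1: str, h2: str) -> int:
    """Count differing bits between two equal-length hex hash strings."""
    return bin(int(h1, 16) ^ int(h2, 16)).count("1")

def matches_known_content(upload_hash: str, database: set, threshold: int = 31) -> bool:
    """True if the upload's hash is within `threshold` bits of any known hash.

    Perceptual hashes tolerate small edits (re-encoding, cropping), so
    matching uses a distance threshold rather than exact equality.
    """
    return any(hamming_distance(upload_hash, known) <= threshold
               for known in database)

known_hashes = {"f" * 64}  # one placeholder 256-bit hash (64 hex chars)
print(matches_known_content("f" * 63 + "e", known_hashes))  # near-duplicate: True
print(matches_known_content("0" * 64, known_hashes))        # unrelated: False
```

Because every participant computes hashes the same way, databases built by different platforms can be merged and queried interchangeably, which is what lets hash-sharing systems "talk to each other."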
Platforms have come under increasing pressure to remove harmful content this year. After the Christchurch shooting, Australia threatened to punish executives with hefty fines and jail time if they did not swiftly remove video of the attack. In May, Facebook joined other big tech platforms in signing the Christchurch call, a pledge to devote more resources to removing harmful content and to collaborate better with other companies.
Facebook’s move also comes as the volume of child exploitation videos posted online is rising, the company said.
“In just one year, we witnessed a 541% increase in the number of child sexual abuse videos reported by the tech industry to the CyberTipline,” said John Clark, president and CEO of the National Center for Missing and Exploited Children (NCMEC), in a blog post. “We’re confident that Facebook’s generous contribution of this open-source technology will ultimately lead to the identification and rescue of more child sexual abuse victims.”
Today’s move marks the first time Facebook has open-sourced photo- or video-matching technology, the company said. Microsoft and Google previously contributed similar technologies.
In addition to the open-source release, Facebook has a partnership with the University of Maryland, Cornell University, the Massachusetts Institute of Technology, and the University of California, Berkeley to investigate ways of stopping people from making subtle alterations to banned photos and videos in order to circumvent safety systems.
Facebook announced the news today at its fourth annual child safety hackathon in Menlo Park.