Facebook chooses Farid’s photo-recognition, anti-child pornography tool

By Emily Baer

Facebook became the first online service to implement PhotoDNA, photo-recognition software developed by computer science professor Hany Farid in 2008 to quickly identify images of child pornography circulated on the web, and will use it to target illegal photos and their distributors, Facebook announced on May 19. Farid, who teamed up with Microsoft three years ago to develop PhotoDNA, said he is happy his tool is being used to “disrupt the global flow of child pornography.”

“It’s amazing to create something that has real-world application,” Farid said as he stepped away from a computer screen displaying complex code.

Since the advent of the Internet, the distribution of child pornography has exploded, according to Farid.

“It used to be that if you wanted this material you had to go to the seedy part of town,” he said. “There was a huge obstacle and barrier to entry if you will. Now of course it’s much easier, and with demand comes supply, of course.”

Although PhotoDNA cost Microsoft millions of dollars to develop, the company donated the software to the National Center for Missing and Exploited Children in 2009, Farid said. NCMEC has since been contacted by law enforcement officials worldwide expressing interest in the program, and it plans to provide PhotoDNA free of charge to companies that request it, Ernie Allen, chief executive officer of NCMEC, said in an interview with The Dartmouth.

Microsoft — which currently uses PhotoDNA in the programs Bing, SkyDrive and Hotmail — hopes that Facebook’s use of the software will pressure other major web services to follow suit, according to Farid.

“I think it would probably be a little grandiose to think [PhotoDNA] will eradicate [the distribution of child pornography on the Internet],” Allen said. “I think it’s going to send a message to the child pornographers and to the pedophiles that the online world is going to become a hostile place to them.”

The software will help locate thousands of pictures of children, and will hold individuals who are using the Internet for child pornography responsible for their actions, Allen said.

Once images containing child pornography are interdicted by PhotoDNA, the issue needs to be addressed from both a policy and legal standpoint, Farid said.

If an online company using PhotoDNA, such as Facebook, discovers an illegal image, it must report the photo to law enforcement officials. The government already receives more reports of child pornography than it can logistically investigate, and the deployment of PhotoDNA will likely inundate law enforcement with far more reports, Farid said.

Beyond reporting images of child pornography to government officials, the action that other online companies take is an internal decision, he said.

Although any company that decides to use PhotoDNA is given all pertinent information regarding how to use the program to identify sexual offenders, Microsoft decided to keep the fundamental technology “as confidential as possible” so as to prevent criminals from discovering a way to evade PhotoDNA, Farid said.

The photos that PhotoDNA identifies are “the worst of the worst,” Allen said. These “crime scene photos” show prepubescent children under the age of 12 being violently sexually abused, according to Allen.

The software is remarkably accurate, Farid said. At Microsoft, 1 billion images have been scanned without a single false positive, he said.

PhotoDNA avoids the problems that have plagued efforts to identify offensive photos by meeting three criteria, according to Farid. First, it extracts a signature that does not change as the image is compressed or otherwise altered. Second, no two distinct images share the same signature. Third, the signature takes only four milliseconds to compute, which means that a single computer can extract roughly 20 million signatures per day, he said.
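The actual PhotoDNA algorithm is confidential, as the article notes, so the following is only a minimal illustration of the general idea behind such signatures: a simplified “average hash,” a well-known perceptual-hashing technique that reduces an image to a compact fingerprint which survives mild alterations like recompression. All function names and the synthetic test image here are illustrative, not part of PhotoDNA.

```python
# Simplified perceptual hash ("average hash"), illustrating the signature
# idea only -- NOT the confidential PhotoDNA algorithm.

def average_hash(pixels, grid=8):
    """Reduce a grayscale image (2D list of 0-255 ints) to a 64-bit hash."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // grid, w // grid
    # Downsample: average each grid block to a single intensity value.
    blocks = []
    for gy in range(grid):
        for gx in range(grid):
            total = sum(
                pixels[y][x]
                for y in range(gy * bh, (gy + 1) * bh)
                for x in range(gx * bw, (gx + 1) * bw)
            )
            blocks.append(total / (bh * bw))
    mean = sum(blocks) / len(blocks)
    # Each block brighter than the mean becomes a 1 bit, else a 0 bit.
    bits = 0
    for value in blocks:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits; a small distance means near-identical images."""
    return bin(a ^ b).count("1")

# A synthetic 64x64 "image": bright left half, dark right half.
image = [[200 if x < 32 else 50 for x in range(64)] for y in range(64)]
# Mild per-pixel noise, simulating recompression, barely moves the hash.
noisy = [[min(255, p + (x + y) % 3) for x, p in enumerate(row)]
         for y, row in enumerate(image)]

print(hamming_distance(average_hash(image), average_hash(noisy)))  # → 0
```

The hash is unchanged by the simulated noise because thresholding block averages against the image-wide mean discards exactly the kind of small pixel-level variation that compression introduces, which is the robustness property the first criterion describes.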

The number of images of child pornography that PhotoDNA has identified so far leads Allen to believe that the sharing of such images is a much greater problem than initially expected, he said.

Before Farid was commissioned to develop his software, the U.S. Department of Justice asked major technology companies including Microsoft, Google, AOL, Earthlink and Yahoo! to develop a solution to address the flow of child pornography circulating on the Internet, according to Farid. These companies banded together to form the Technology Coalition, he said.

“They would dutifully meet in Washington once or twice a week for a few years and do nothing — absolutely nothing,” Farid said. “They kept talking about how hard this problem is.”

The coalition struggled to differentiate between acceptable photos and those containing sexual abuse without obstructing the efficiency of Internet programs, Farid said.

The Microsoft team, then headed by Tim Cranton, had read an article in The New York Times that featured Farid’s work in digital forensics. Cranton, eager to enlist the help of a professional well-versed in photo-identification technology, invited Farid to a coalition meeting in Washington, D.C., in early 2008, Farid said.

Before Farid joined the team, the coalition had been unable to develop software capable of scanning the billions of photos that are uploaded to the Internet each day, Farid said.

Farid proposed that the coalition use images in the NCMEC database to block the redistribution of those same photos. The database currently holds 50 million images of child pornography and adds 250,000 each week, according to its website. Because photos are copied and redistributed to hundreds of computers, the identification of one photo as a match to a photo in the database could incriminate hundreds of individuals, Farid said.
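The matching side of this proposal can be sketched as a lookup of each uploaded image’s signature against a database of known signatures, flagging anything within a small bit-distance. This is a hypothetical sketch: the signature values, threshold, and function names below are invented for illustration and do not reflect how PhotoDNA or NCMEC actually store or compare signatures.

```python
# Sketch of matching uploaded-image signatures against a database of
# known signatures. All values here are illustrative placeholders.

KNOWN_SIGNATURES = {0x0F0F0F0F0F0F0F0F, 0x123456789ABCDEF0}  # hypothetical
MATCH_THRESHOLD = 4  # max differing bits still counted as the same image

def is_known(signature):
    """Return True if the signature nearly matches any known signature."""
    return any(bin(signature ^ known).count("1") <= MATCH_THRESHOLD
               for known in KNOWN_SIGNATURES)

print(is_known(0x0F0F0F0F0F0F0F0E))  # one bit off a known value → True
```

Because a near-match threshold is used rather than exact equality, the same database entry catches every recompressed or lightly edited copy of a known image, which is why one identified photo can implicate many distributors.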

“My idea was, don’t try to go after things you haven’t seen before — go after the things you have seen before,” Farid said. “Go after the images that you know are child porn, that you know are horrible, that you know who the victims are and that you know people keep trafficking.”

Copyright 2024 The Dartmouth