Why Anti-Terror Technology Has Its Limits

SCOTT SIMON, HOST:

President Obama has called on U.S. technology companies to crack down on terrorists' use of social media. But would they have to roll back the encryption technology that millions of users rely on to keep their accounts from getting hacked by foreign governments, competitors, strangers and, for that matter, terrorists?

Hany Farid co-developed software a few years ago to try to track child pornography. And he says that similar technology could be used to deter terror-related content. He's now chair of the Dartmouth computer science department. Professor Farid joins us from Dartmouth Studios in Hanover. Thanks very much for being with us.

HANY FARID: Good to be with you.

SIMON: And how would this technology work?

FARID: Well, the way the technology we developed for finding and removing child pornography worked was to actually start with a cache of known child pornography images. And what we understand from the transmission of this material is that the same images, year in and year out, keep getting transmitted. And so we had the idea, well, let's at least stop the transmission of those images. Let's at least stop people transmitting things that we know are bad content.

And what we developed is a technology that extracts from a digital image a signature of sorts. We call it PhotoDNA. And that signature has two very important properties. It is distinct - two different images will not share the same signature. And the signature is stable over the lifetime of the image, the way our DNA is stable as we age, cut our hair, change our clothes, etc. And that allows us, when images are being uploaded to, for example, Facebook or Dropbox or SkyDrive, to compare against this known bad content and simply remove and report the offending material.
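
To make that concrete: PhotoDNA itself is proprietary and its exact signature is not public, so the sketch below uses a simple, hypothetical average-hash-style signature purely to illustrate the matching idea Farid describes - extract a compact, stable signature from an image, then compare uploads against a database of signatures for known bad content. All names and thresholds here are illustrative assumptions, not the actual PhotoDNA method.

```python
# Illustrative sketch only: PhotoDNA is proprietary, so this uses a simple
# hypothetical "average hash" signature to show the general matching idea.
# An image is assumed to be a 2D list of grayscale pixel intensities.

def signature(pixels, size=8):
    """Compute a 64-bit signature by downsampling to size x size cells and
    recording, for each cell, whether it is brighter than the overall mean.
    Small edits (resizing, mild recompression) leave most bits unchanged,
    loosely mimicking the "stable over the lifetime of the image" property."""
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            # Average the block of original pixels that maps onto this cell.
            rows = range(r * h // size, max((r + 1) * h // size, r * h // size + 1))
            cols = range(c * w // size, max((c + 1) * w // size, c * w // size + 1))
            block = [pixels[i][j] for i in rows for j in cols]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    bits = 0
    for value in cells:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming(a, b):
    """Count the bits on which two signatures differ."""
    return bin(a ^ b).count("1")

def matches_known_content(upload_sig, known_sigs, threshold=5):
    """True if the upload's signature is close to any signature in the
    database of known bad content."""
    return any(hamming(upload_sig, s) <= threshold for s in known_sigs)
```

In a real deployment, the database of known signatures would come from vetted, human-reviewed material, and a match would typically be flagged for review rather than acted on blindly - a point Farid returns to below.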

SIMON: How could this technology then be applied to potential terrorism?

FARID: Right. So, for example, when images are being shared on social media that are terrorist-related or are known to be part of campaigns to radicalize people, those can similarly be added to the database and removed. The same holds true for audio recordings. The same holds true for video recordings. I will say, though, that technology won't solve all the problems. At the end of the day, you need boots on the ground. You need humans looking at this material once it gets flagged by the technology.
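
Continuing the illustration, again with hypothetical names: the sketch below shows the "human in the loop" flow Farid describes, where a signature match is not removed automatically but is placed on a queue for a human reviewer to judge.

```python
# Illustrative sketch only (hypothetical names): a match is not removed
# automatically; it is queued for a human reviewer to make the final call.

from dataclasses import dataclass

@dataclass
class FlaggedUpload:
    upload_id: str
    matched_signature: int  # the database signature the upload resembled

def hamming(a, b):
    """Count the bits on which two 64-bit signatures differ."""
    return bin(a ^ b).count("1")

def screen_upload(upload_id, upload_sig, known_sigs, review_queue, threshold=5):
    """Compare an upload's signature to the known-content database; if it is
    close to any entry, append it to the human review queue and report a hit."""
    for known in known_sigs:
        if hamming(upload_sig, known) <= threshold:
            review_queue.append(FlaggedUpload(upload_id, known))
            return True   # flagged; a person reviews before anything is removed
    return False          # no match; the upload proceeds normally
```

As Farid notes, the same screening step would apply to signatures of audio or video content; only the signature extraction would differ.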

SIMON: I wonder about the chances for abuse. And I'm thinking of, let's say, the teenager who has to do a paper on terrorism, and he downloads a picture of ISIS and finds he's targeted by the FBI.

FARID: Yeah, that's the right question to ask. And that's always sort of the danger of this type of technology. And that's why I said at the end of the day, somebody has to be looking at the content. There has to be a human in the loop because there are subtleties and nuances to the way this technology is being used.

SIMON: And help us understand why technology companies are so protective of encryption.

FARID: You know, I can tell you that in our work with the child exploitation images, technology companies were very reluctant. They were reluctant to get into this business of, essentially, what is seen as a law-enforcement issue. And I don't see it that way. I see it as a terms-of-service issue. I happened to partner with Microsoft and the National Center for Missing & Exploited Children to develop the technology. And Microsoft got to a point where they said these are bad people doing bad things on our network, and we don't want it anymore. So it was really a business decision to remove this type of material, and I think that's what has to happen with technology companies. They have to stop thinking about it as a law-enforcement issue or a national security issue and start thinking of it as a business issue - that we do not want people using our networks to radicalize people and incite violence. And I think once you get to that position, you start deploying technology and you start deploying resources in order to remove this type of material.

SIMON: Hany Farid is chairman of the Dartmouth computer science department. Thanks so much for being with us.

FARID: Good to talk to you, Scott.

Transcript provided by NPR, Copyright NPR.