The NSA Has Been Using An Algorithm To Decide Who Gets Killed With Drone Strikes

By Dan Seitz

February 16, 2016 "Information Clearing House" - "Ars Technica" - It’s popular, in media, to depict governments as vast machines that know exactly what they’re doing. The truth, though, is that a government is just a group of people, with the same weaknesses and failings as any other group of people. The NSA is no different, whether it’s making AT&T do all the work or blatantly violating your privacy for laughs. And that would be fine if one of the NSA’s methods of blowing off work weren’t using what amounts to a marketing algorithm to decide who’s getting killed by drone strikes. And it’s a badly engineered one, to boot.

Ars Technica has a detailed breakdown of the NSA’s SKYNET program, which is an apt name: SKYNET is a “big data” application that pulls metadata from cell phones, like where you made calls and who you talked to, and feeds it to a machine-learning algorithm. It’s built on some questionable assumptions as well: if you turn off your phone or let your buddy borrow it, the algorithm marks that as an attempt to avoid surveillance.
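
To make that kind of pipeline concrete, here is a minimal sketch in Python. The feature names, the made-up training data, and the choice of a random forest classifier are all assumptions for illustration, not a claim about the NSA’s actual code; the point is only to show how call metadata can be scored by an off-the-shelf classifier.

    # Illustrative sketch only: hypothetical features and made-up data,
    # not the actual SKYNET system.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical per-phone features derived from call metadata:
    # [calls_per_day, distinct_contacts, cell_towers_visited, sim_swaps, hours_powered_off]
    X_train = np.array([
        [12, 40,  5, 0,  1],   # ordinary user
        [20, 55,  3, 0,  0],   # ordinary user
        [ 3,  6, 18, 4, 30],   # travels widely, swaps SIMs, phone often off
        [ 2,  4, 22, 6, 40],   # same pattern
    ])
    y_train = np.array([0, 0, 1, 1])  # analyst-supplied labels: 1 = "suspect"

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X_train, y_train)

    # Score a new phone. The output is only a probability built from correlated
    # behaviour, which is why lending your phone out or switching it off
    # can push the score up.
    new_phone = np.array([[4, 7, 15, 3, 25]])
    print(model.predict_proba(new_phone)[0, 1])

Nothing in a setup like this knows why a phone went dark or changed hands; it only sees that the numbers resemble the numbers of previously labelled suspects.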

Based on this, it decides how sketchy the places you visit and the people you talk to are, and from that it determines how likely you are to be a terrorist. Ars Technica broke down how the algorithm is engineered and found that, the way the NSA uses it, 99,000 people in countries known to harbor terrorists, such as Pakistan, Somalia, and Afghanistan, would be “false positives.” Keep in mind that this is a list of people who may be shot with a Hellfire missile.
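
That 99,000 figure is a base-rate problem, and a few lines of arithmetic show why. As an illustration only, assume roughly 55 million monitored phone users, a 0.18 percent false-positive rate (figures of the kind discussed in the Ars Technica analysis, treated here as assumptions), and a hypothetical 2,000 genuine targets:

    # Base-rate arithmetic: a tiny false-positive rate over a huge population
    # still flags a large number of innocent people.
    # All figures below are illustrative assumptions.
    phone_users = 55_000_000        # assumed number of monitored phone users
    false_positive_rate = 0.0018    # assumed 0.18% false-positive rate
    real_targets = 2_000            # assumed number of genuine targets

    false_positives = phone_users * false_positive_rate
    print(f"Innocent people flagged: {false_positives:,.0f}")   # about 99,000

    # Even if every real target is caught, almost everyone flagged is innocent.
    flagged = real_targets + false_positives
    print(f"Share of flagged people who are innocent: {false_positives / flagged:.1%}")

Under those assumptions, even a classifier that catches every real target would produce a list where nearly everyone on it is innocent.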

In fact, the NSA should be fully aware that the algorithm can’t make these kinds of life-or-death decisions. The Intercept uncovered this program last year and noted that it marked Ahmed Zaidan as a likely terrorist. One problem: Ahmed Zaidan is the Al-Jazeera bureau chief in Islamabad. His entire job is to find and interview sketchy people, but he’s obviously not a terrorist. If that weren’t enough, drone strikes often cause civilian casualties. So in addition to an innocent person potentially being marked for death by a computer, a bunch of people might die for the crime of standing near somebody a computer decided was a terrorist because his pizza delivery job made his movements look suspicious.

The good news is that it seems unlikely the NSA is killing everybody this algorithm marks as a terrorist. The bad news is that somewhere between 2,500 and 4,000 people have still been killed by drone strikes since 2004, and this algorithm appears to have been in testing since at least 2007. In other words, thousands of innocent people could have died because a computer couldn’t figure out they weren’t terrorists.

 
