DYSPEPSIA GENERATION

We have seen the future, and it sucks.

Palantir, the War on Terror’s Secret Weapon

29th November 2011

Read it.

None of Fikri’s individual actions would raise suspicions. Lots of people rent trucks or have relations in Syria, and no doubt there are harmless eccentrics out there fascinated by amusement park infrastructure. Taken together, though, they suggested that Fikri was up to something. And yet, until about four years ago, his pre-attack prep work would have gone unnoticed. A CIA analyst might have flagged the plane ticket purchase; an FBI agent might have seen the bank transfers. But there was nothing to connect the two. Lucky for counterterror agents, not to mention tourists in Orlando, the government now has software made by Palantir Technologies, a Silicon Valley company that’s become the darling of the intelligence and law enforcement communities.

One can easily imagine two reactions to this story: (1) “Thank God we’re finally getting the tools needed to prevent future atrocities!”; (2) “Oh my God, Big Brother has finally arrived!” And both would be correct, because it’s impossible to know beforehand what sort of picture “connecting the dots” is going to produce.

The thing I find most disturbing about this illustrative scenario is that at no time has this subject done anything illegal, or even anything for which a perfectly innocent explanation is not just as likely as the one posited in the scenario. If one approaches it with the assumption that ‘this is a terrorist planning a strike’, then his activities will point strongly in that direction. But if one does not approach it with that assumption, then these activities don’t necessarily compel that conclusion. There are a lot of people who go to Disneyland without enjoying themselves. (I’ve met a few.) People with foreign relatives often send them money. (If the guy’s name was Gonzalez, and his relatives in Mexico, nobody would think twice about it.)

This would seem to be another case where technology is enabling government oversight of personal activity to a degree not anticipated by current law, and I suspect that many in government are going to rejoice over their new toolbox without spending much time pondering what the appropriate limits on its use might be. The unfortunate aspect is that those who would put such considerations at the top of their priority list are typically unaware of new technological developments until they run across a news story about them, such as this one. If (as is often the case) the companies and agencies involved make an effort to keep this sort of thing out of the public eye (a typical argument being that letting the existence of the technology leak out will seriously impair its usefulness, and that’s a persuasive argument), then the chances of there being a serious discussion among interested parties about appropriate limits on its use grow vanishingly small. And that bothers me.
