Image: Justin Sullivan/Getty Images
Much noise has rightly been made about the role Facebook played in the 2016 presidential election. Critics have pointed to a targeted ad campaign by Russian groups as proof that the Menlo Park-based company wasn't minding the store — and alleged that disaster followed as a result.
But that argument overlooks one key point: In showing microtargeted "dark ads" to users, Facebook was doing exactly what it was designed to do. The larger problem is not these specific Russian ads (which Facebook refuses to disclose to the public) — or even that Donald Trump was elected president — but the very system upon which the company is built.
Mark Zuckerberg's plan to increase transparency on political advertisements, while welcome, falls into the same trap. Yes, more disclosure is good, but what is the remedy when the underlying architecture itself is gangrenous?
Zeynep Tufekci, author of Twitter and Tear Gas and associate professor at the University of North Carolina at Chapel Hill, made this point painfully clear in a September TED Talk that examined how the same algorithms designed to serve us better-targeted ads on platforms like Facebook can be deployed for much darker purposes.
"So Facebook's market capitalization is approaching half a trillion dollars," Tufekci told the gathered crowd. "It's because it works great as a persuasion architecture. But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics. The algorithms do not know the difference. The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change."
Tufekci further argued that when machine learning comes into play, humans can lose track of exactly how algorithms work their magic. And, she continued, not fully