
The FTC is worried about algorithmic transparency, and you should be too

Katherine Noyes | April 10, 2015
Don't assume your Facebook friends are ignoring you -- it could simply be the site's algorithms at work.

Algorithms in the media are also increasingly used to facilitate decisions that border on censorship, Diakopoulos noted, as when automated systems help moderators screen online comments and determine what counts as valid commentary and what should not be published at all.

Then, too, there are companies such as Automated Insights and Narrative Science producing news stories at scale "based on nothing more than structured data inputs," he said. Automated Insights, for instance, recently announced that it is producing and publishing 3,000 earnings stories per quarter for the Associated Press, all automatically generated from data.
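To make the mechanics concrete, here is a minimal sketch, in Python, of how template-driven story generation from structured data can work. The field names, template wording, and figures are invented for illustration; they are not Automated Insights' actual system.

    # Hypothetical template-based earnings brief, rendered from one
    # row of structured data. Everything here is illustrative only.
    def earnings_story(d: dict) -> str:
        direction = "rose" if d["eps"] > d["eps_prior"] else "fell"
        return (
            f"{d['company']} reported quarterly earnings of ${d['eps']:.2f} "
            f"per share, which {direction} from ${d['eps_prior']:.2f} a year "
            f"earlier. Revenue came in at ${d['revenue_m']:,} million."
        )

    print(earnings_story({
        "company": "Acme Corp.",
        "eps": 1.42,
        "eps_prior": 1.18,
        "revenue_m": 980,
    }))

Note that nothing in such a pipeline questions its inputs: a wrong number in the data flows straight into the published sentence.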

Besides being the stuff of journalists' nightmares, that scenario also raises a host of accuracy problems. "A quick search on Google shows that the thousands of Automated Insights earnings reports are also yielding a range of errors, leading to corrections being posted," Diakopoulos said. "What if these were market-moving errors? What is the source of those errors: the data, or the algorithm?"

Algorithms can even lead technology users to think and behave differently than they would otherwise.

"Say I notice that one of my posts on Facebook gets no likes" Sandvig explained. Whereas the likely explanation is that Facebook's algorithm simply filtered the post out of friends' news feeds, "we found that people will sometimes assume it's because their friends don't want them to post about that topic, and so they won't post about it ever again," he said.

What may seem to its creator like a fairly straightforward filter, in other words, could quickly snowball into something much bigger that changes people's behavior as well.
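A toy example makes the dynamic concrete. The sketch below hides posts whose naive predicted-engagement score falls below a threshold; the scoring rule is invented for illustration, since Facebook's actual feed-ranking model is proprietary and far more complex.

    # Toy news-feed filter: show a post only if a naive predicted-
    # engagement score clears a threshold. The scoring is hypothetical.
    def predicted_engagement(post: dict, viewer: dict) -> float:
        score = 0.0
        if post["topic"] in viewer["liked_topics"]:
            score += 0.5
        # Reward recent author-viewer interactions, capped at 10.
        score += min(post["author_interactions"], 10) * 0.05
        return score

    def visible_feed(posts, viewer, threshold=0.4):
        return [p for p in posts
                if predicted_engagement(p, viewer) >= threshold]

A post on a topic the viewer has never engaged with can fall below the threshold and never be shown at all -- producing exactly the silence Sandvig describes, even though no friend ever chose to ignore it.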

So what can be done about all this?

Efforts such as the FTC's to increase transparency are one approach.

"One possibility is that companies will need to start issuing transparency reports on their major algorithms, to benchmark things like error rates, data quality, data use and emergent biases," Diakopoulos said.

Another possibility is that new user interfaces could be developed that give end users more information about the algorithms underlying the technologies they interact with, he suggested.

Regulation may also need to be part of the picture, particularly when it comes to ensuring that elections cannot be manipulated at scale algorithmically, he said.

"Government involvement will be very important," agreed Sandvig. "Illegal things are going to happen with and without computers, and the government needs to be able to handle it either way. It's not an expansion of government power -- it's something that's overdue."

Sandvig isn't convinced that increased transparency will necessarily help all that much, however. After all, even if an algorithm is made explicit and can be inspected, the light it will shed on potential consequences may be minimal, particularly when the algorithm is complicated or performs operations on large sets of data that aren't also available for inspection.

 
