The FTC is worried about algorithmic transparency, and you should be too

Katherine Noyes | April 10, 2015
Don't assume your Facebook friends are ignoring you -- it could simply be the site's algorithms at work.

It's no secret that algorithms power much of the technology we interact with every day, whether it's to search for information on Google or to browse through a Facebook news feed. What's less widely known is that algorithms also play a role when we apply for a loan, for example, or receive a special marketing offer.

Algorithms are practically everywhere we are today, shaping what we see, what we believe and, to an increasing extent, what our futures hold. Is that a good thing? The U.S. Federal Trade Commission, like many others, isn't so sure.

"Consumers interact with algorithms on a daily basis, whether they know it or not," said Ashkan Soltani, the FTC's chief technologist. "To date, we have very little insight as to how these algorithms operate, what incentives are behind them, what data is used and how it's structured."

A few weeks ago, the FTC's Bureau of Consumer Protection established a new office dedicated to increasing that understanding. Called the Office of Technology Research and Investigation, it will focus on algorithmic transparency among other issues, both by supporting external research and by conducting studies of its own.

The idea is to better understand how these algorithms work -- what assumptions underlie them and what logic drives their results -- with an eye toward ensuring that they don't enable discrimination or other harmful consequences.

The term "algorithm" can seem intimidating for those not well acquainted with computer programming or mathematics, but in fact "a good synonym for 'algorithm' is simply 'recipe,'" said Christian Sandvig, a professor in the School of Information at the University of Michigan. "It's just a procedure to accomplish a task."

The concern arises when the algorithms guiding aspects of our lives produce results we don't want. The potential examples are numerous. Some of them are fairly clear-cut: discrimination in credit, housing, labor and jobs, for example, or unfair pricing practices.

In what has become a classic illustration, a 2013 Harvard study found that Google searches on "black-sounding" names such as Trevon Jones were more likely to generate ads from public-records search services that suggested the person in question had an arrest record.

Google did not respond to a request to comment for this story.

Other examples are more subtle.

"One of the problems is that algorithms are increasingly mediating the media and information that we're exposed to, which can have implications for things like politics," said Nick Diakopoulos, a professor in the University of Maryland's College of Journalism. "We know, for example, that simply increasing the amount of hard news in the Facebook news feed can result in a larger number of people turning out to vote."
