It’s no longer about trendy. It’s about trending.
As far as I can tell, “trending” means becoming popular at supersonic speeds among a mass of people. It’s a measure of what’s hot right now. Or at least, it’s supposed to be.
In “How Twitter’s Trending Algorithm Picks Its Topics” (http://m.npr.org/news/front/143013503), Laura Sydell of NPR reveals that math is not always as objective as we trust it to be. An algorithm (again, as far as I can tell) is a set of rules by which you crunch data to purportedly get a reliable answer. Cool.
So how was it that Twitter’s algorithm missed #OccupyWallStreet in late November and early December of 2011, at the height of the Occupy Wall Street protests?
“There had been thousands of tweets for Occupy Wall Street regularly over many weeks, so Twitter’s algorithms stopped putting it on the trending topics list. In some ways, Twitter’s algorithms act like a lot of human news editors who are more interested in the latest news than an ongoing story, says Tarleton Gillespie, a communications professor at Cornell University.”
So not only are these rules made by human beings, and thus susceptible to error like human beings… they also make human-style judgment calls, like: I’m so over Occupy Wall Street.
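Twitter hasn’t published its exact formula, but the behavior Gillespie describes is consistent with a common approach: flag topics whose recent volume spikes relative to their own historical baseline, rather than topics with the highest raw volume. Here’s a minimal sketch of that idea — all the names, numbers and thresholds are my own illustrative choices, not Twitter’s:

```python
def trending_topics(recent_counts, baseline_counts, min_count=100, threshold=3.0):
    """Flag topics whose recent volume spikes above their historical baseline.

    A topic with steady, high volume (like #OccupyWallStreet over many
    weeks) never spikes relative to its own baseline, so it never
    "trends" -- exactly the bored-editor bias described above.
    """
    trending = []
    for topic, recent in recent_counts.items():
        if recent < min_count:
            continue  # ignore low-volume topics entirely
        baseline = baseline_counts.get(topic, 0)
        # Ratio of recent activity to expected activity (+1 avoids divide-by-zero)
        ratio = recent / (baseline + 1)
        if ratio >= threshold:
            trending.append((topic, ratio))
    # Biggest spike first
    return sorted(trending, key=lambda pair: pair[1], reverse=True)

# A brand-new topic spikes past its (tiny) baseline; a long-running topic
# at the same steady volume does not, so only the new one trends.
recent = {"#NewMeme": 5000, "#OccupyWallStreet": 5000}
baseline = {"#NewMeme": 10, "#OccupyWallStreet": 4800}
print(trending_topics(recent, baseline))
```

Note that both hashtags have identical recent volume; the only thing separating them is their past. That’s the whole bias in one line of arithmetic.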
And algorithms don’t just get bored; they get biased. Check out what happened over at Amazon when it set up an algorithm to calculate its best-seller list:
“Naively, you would say, ‘Well, the most-selling book is No. 1 and the second-selling book is No. 2,'” Gillespie says.
But a couple of years ago, there was a dust-up because all the gay-themed books disappeared from the list. It turns out that Amazon doesn’t allow any books classified as adult onto its best-seller lists, and someone had accidentally put the gay-themed books in that category.
“It’s a curated list,” Gillespie says. “It’s a list that will never show us if something that they or their publishers had classified as adult would ever show up there.”
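Amazon’s actual ranking code is private, but Gillespie’s description implies something simple: sort by sales, after first filtering out anything tagged with an excluded category. Here’s a hypothetical sketch — every name and number is invented for illustration — showing how a single misfiled category tag silently erases a top seller, with no error and no trace:

```python
def best_sellers(books, excluded_categories=frozenset({"adult"}), top_n=10):
    """Rank books by sales, silently dropping any excluded category.

    One misclassification -- tagging a gay-themed novel "adult" --
    removes it from the list before the ranking even happens.
    """
    eligible = [b for b in books if b["category"] not in excluded_categories]
    ranked = sorted(eligible, key=lambda b: b["sales"], reverse=True)
    return [b["title"] for b in ranked[:top_n]]

books = [
    {"title": "Thriller",         "category": "fiction",    "sales": 900},
    {"title": "Gay-Themed Novel", "category": "adult",      "sales": 1200},  # misfiled
    {"title": "Memoir",           "category": "nonfiction", "sales": 800},
]
print(best_sellers(books))  # the actual top seller never appears
```

The naive version — “the most-selling book is No. 1” — would put the misfiled novel first. The curated version never even sees it, which is exactly what makes the bias so hard for readers to notice.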
While on one level we’re just talking about books and tweets, on another we’re talking about public opinion and what people understand to be not just “hot” but, more fundamentally, relevant, worthwhile and acceptable to engage with. LGBTTQQ lit’s absence from Amazon’s best-seller list inadvertently contributes to a consciousness in which non-heterosexual people, experiences, identities and cultures are not acceptable for mainstream consumption or conversation. Seeing LGBTTQQ books on that list, by contrast, gives the general public more exposure to diverse sexualities as a regular part of everyday life, on and offline. More people end up talking about and purchasing the books, and perhaps heterosexism as a social norm takes one step back.
Of course, maybe the presence of LGBTTQQ lit on the best-seller list stirs up controversy, but that, too, helps us engage and reckon with our personal and social biases.
If you’re not dependent on Amazon for your book choices or hungry to follow what’s trending on Twitter, you may still be experiencing the effects of biased algorithms. As Sydell concludes, “Everything from restaurant reviews to your friend’s baby pictures to your local news is getting served up to you by an algorithm. As much as programmers may think their algorithms will deliver objective results, those calculations may be just as biased as a real human being.”
Good thing we still have human beings around to notice the biases we, and our math, have.