Celie Deagle
Individual news organizations tweet upwards of 100 times per day—a content diet even the most obsessive tweeter can’t digest. Instead, we pick out small bites, our personal interest and bias helping us choose what tweets we see and which accounts aren’t worth a follow. With each retweet and mention, Twitter’s algorithm goes to work, shaping our feeds for us. And if we’re not careful, we’re soon stuck inside a chamber of the algorithm’s making, where only the things we want to hear (or see) are echoed back.
But Ania Medrek has built an app for that.
It’s called Echology. Medrek developed the app while researching the echo chamber phenomenon on social networking sites during her final year as a Master’s candidate at OCAD University in Toronto. An extension for Twitter, Echology takes note of what accounts you follow and scrapes each news tweet for important keywords. After you click the small Echology button below each tweet, the app randomly generates related tweets from news providers you don’t follow. The suggestions appear under the heading “People who’ve read this may not have read,” a conscious spin on the way sites like Amazon and Facebook recommend products and content. But, Medrek clarifies, Echology doesn’t just show opposing political views. It presents everything in between, the diversity of each news story and the context needed to understand each headline.
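The article doesn't describe Echology's internals, but the general idea it sketches — pull keywords out of a news tweet, then surface related tweets from outlets the user doesn't follow — can be illustrated with a toy model. Everything below is hypothetical (the function names, the stopword list, the outlet data); it is a minimal sketch of the concept, not Echology's actual code:

```python
import random
import re

# A tiny stopword list for illustration; a real system would use far more.
STOPWORDS = {"the", "a", "an", "of", "to", "in", "on", "for", "and", "says", "with", "at", "is"}

def keywords(tweet: str) -> set:
    """Naive keyword extraction: lowercase words, minus stopwords and short tokens."""
    words = re.findall(r"[a-z']+", tweet.lower())
    return {w for w in words if w not in STOPWORDS and len(w) > 2}

def suggest(tweet: str, candidates: dict, followed: set, k: int = 3):
    """Return up to k related tweets from outlets the user doesn't follow.

    candidates maps outlet name -> list of recent tweets (hypothetical data).
    A candidate counts as 'related' if it shares at least one keyword
    with the original tweet.
    """
    kws = keywords(tweet)
    related = [
        (outlet, t)
        for outlet, tweets in candidates.items()
        if outlet not in followed          # only outlets outside the user's bubble
        for t in tweets
        if kws & keywords(t)               # must share a keyword
    ]
    random.shuffle(related)                # Echology reportedly randomizes suggestions
    return related[:k]
```

For example, `suggest("Zuckerberg testifies before Congress on Facebook data privacy", candidates, followed={"CNN"})` would return only keyword-matching tweets from outlets other than CNN — the "People who've read this may not have read" list.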
If you were to click the Echology button under a news tweet summarizing the latest congressional testimony from Mark Zuckerberg regarding Facebook’s data privacy, for instance, the app might generate a tweet from Politico with new reactions from lawmakers, a story from NPR that highlights a different section of the day’s testimony, and BBC coverage on the U.K. Parliament’s response. These are tweets from news providers you don’t follow, and would not have appeared in your feed otherwise.
Before the idea for Echology was born, Medrek read dozens of studies about the echo chamber phenomenon. What she found surprised her: there were distinct, non-human reasons for the polarization, aggression, and ignorance she had seen percolate on her own Twitter feed for years, reasons explaining how and why one news story plays out in countless different ways with countless different consequences, almost all of it unseen by the average user. Medrek “tweezed out” the 25 most important and compiled them onto a deck of cards, one factor per card. One reads “misleading headlines,” another “hashtags,” and a third, especially important one: “personalization.”
Personalization is just what it sounds like: the ability to make your Twitter feed unique by choosing whom you follow and whom you don’t. When it comes to news, Medrek says, personalization is dangerous. “You start seeing only one perspective,” she says. “You’re not understanding the people around you.” And following a range of news sources won’t necessarily help you break free of a social network’s algorithm, which is programmed to show you more of the content you interact with. “It can trump your decisions,” Medrek says. “The algorithm can decide, oh, but you only actually ever click on this point of view, so we’re actually going to hide and suppress the others, even though you chose to follow those [accounts] too.” You may follow CBC on Twitter, but if you only ever click on articles from CTV News, you’re less likely to see CBC tweets on your feed. It’s the algorithm giving you what it thinks you want.
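The ranking behavior Medrek describes — an algorithm weighting outlets by past clicks, so that followed-but-ignored accounts sink out of view — can be sketched as a toy model. This is an assumption-laden illustration of the dynamic, not Twitter's actual algorithm; the outlet names echo the CBC/CTV example above:

```python
from collections import Counter

def rank_feed(tweets, click_history, top_n=5):
    """Toy engagement-based ranking: outlets you click on more rank higher.

    tweets: list of (outlet, text) pairs from accounts you follow.
    click_history: list of outlet names you've clicked in the past.
    Followed outlets you never click sink to the bottom and fall
    out of the top_n cut - the 'hide and suppress' effect.
    """
    clicks = Counter(click_history)
    ranked = sorted(tweets, key=lambda t: clicks[t[0]], reverse=True)
    return ranked[:top_n]
```

With a click history dominated by CTV News, CBC tweets drop below the cutoff even though the user follows both — the feed reflects past clicks, not the follow list.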
Medrek knew that to break free of the echo chamber, Twitter users would need to see news stories the algorithm was blocking. So, armed with her deck of 25 contributing factors, Medrek sat down with a group of news industry professionals for what she called “participatory design” workshops. After three meetings, Echology had taken shape.
Once she linked Echology to her own Twitter account, Medrek was intrigued by what she noticed. “The different tones, the hierarchy of words, what you chose to highlight and what you didn’t totally shapes people’s news experiences,” she says. “Those little differences mean a lot.” Visitors to OCAD’s graduate exhibition, where Medrek debuted the project in May, tried out Echology and had similar reactions. “People were like, ‘Wow, I didn’t even know I needed this, but now that I see it, I need this,’” she says.
Echology isn’t ready to be distributed to the public yet, but Medrek aims to find more developers and work out the app’s kinks, like improving how Echology recognizes keywords in each news tweet. And the workshops she held during the development stage could morph into tools for teaching in high schools, colleges, and newsrooms.
Thinking about and engaging with the echo chamber phenomenon can bring change, Medrek says, pointing out that most social media companies are genuinely open to finding new solutions. “But it’s not just on tech giants to solve this,” she says, “it’s on journalists, it’s on designers. So there’s hope.”
UPDATE (05/24/2018): This story has been slightly modified to clarify information about social networks’ algorithms and the titles of those in the news industry involved with Echology.