Are we too apathetic when it comes to social media user experiments?
A few months ago, Facebook got into trouble for experimenting with some of their users. In the name of “science,” the company decided to start tweaking people’s newsfeeds with an excess of either positive or negative status updates from friends. The study showed that exposure to these updates could make people more positive or negative themselves. In short, Facebook made some people sad. On purpose.
Shortly after Facebook published its study, the dating site OkCupid admitted that it, too, was screwing with its users. The company told people who weren’t matches that they were perfectly compatible. It removed photos from profiles. It tracked conversations between people. Its motive was simple: to see what would happen and maybe improve its own matching algorithm.
Both of these experiments were wildly fascinating. They were also wholly unethical. Neither company had anything even close to informed consent from the people they toyed with. These sites treated their users like guinea pigs, which is weird because I’m not entirely sure it’s even legal to treat guinea pigs like guinea pigs anymore.
The response to these experiments was strange. Some people were outraged, obviously. But most either didn’t notice or didn’t care. Facebook—a site with more than a billion active users—decided to screw with random people’s emotions and the general response was an overwhelming “Meh.” Perhaps data collection is simply too bland and uninteresting to provoke outrage.
Maybe we just aren’t surprised when this stuff happens anymore. If you use the Internet, you’re experimented on. It isn’t new. In the early 2000s, I worked for a digital agency and one of our clients was a large retail website. For about six weeks, we showed half the site’s visitors yellow “buy” buttons, while the other half saw shiny new green buttons. The green ones showed a marginally higher click rate, which, extrapolated over a year, meant about $50 million. So all the buttons turned green. This is basic A/B testing—an exercise in using data to determine and influence behaviour.
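The mechanics of that button test are easy to sketch. Using hypothetical traffic numbers (the real figures from that client are long gone), a two-proportion z-test is the standard way to check whether the green button’s “marginally higher click rate” is a real effect or just noise:

```python
import math

def ab_test(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: is variant B's click rate
    significantly different from variant A's?"""
    rate_a = clicks_a / views_a
    rate_b = clicks_b / views_b
    # Pooled click rate under the null hypothesis (no difference)
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (rate_b - rate_a) / se
    return rate_a, rate_b, z

# Hypothetical numbers: yellow buttons (A) vs green buttons (B)
rate_a, rate_b, z = ab_test(4_850, 100_000, 5_150, 100_000)
print(f"yellow: {rate_a:.2%}  green: {rate_b:.2%}  z = {z:.2f}")
# A |z| above 1.96 means the difference is significant at the 95% level
```

Even a lift that small—under half a percentage point—clears the significance bar once you have enough visitors, which is exactly why high-traffic sites run these experiments constantly.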
By today’s standard, that kind of experiment is quaint. In the last decade, the Internet has become exceedingly good at tracking and manipulating people. Amazon uses browsing and purchase history to flog products, Google “scans” (but doesn’t “read”) email to try targeting ads, and pretty much every website you visit weighs and measures the actions you take for their own gain. As far as the Internet is concerned, you are the sum total of your clicks, likes and purchases. You are a data profile they can apply an algorithm to and nothing more.
It’s not the worst deal. You get a worldwide network of infinite information and constant communication; they get to sell you stuff. As far as Faustian pacts go, that seems sort of fair. But how far does it go? We get upset when a government starts peeking at our data, but we willingly hand it over to Facebook, Amazon, Apple and, well, everyone else, assuming that the “Terms of Service” we didn’t actually read are reasonable. (It’s worth noting that Facebook inserted the clause saying they could experiment on you only after their emotion experiment had been conducted, but before they told anyone about it.)
I’m not bringing this up to fear-monger about the evils of modern technology. I like technology, I use Facebook and I shop with Amazon. And I understand that sometimes it brings on big sweeping cultural shifts. But we’re on the cusp of owning Apple Watches that can send our heartbeat to our spouse. Or, theoretically, Facebook. Or a doctor. Or an insurance company. Given where technology is headed, it’s not too much to ask that the companies handling our data be honest about what exactly they’re doing with it (or that we bother to pay attention).
Dave Eggers’ 2013 novel, The Circle, examined people willfully (even gleefully) handing over their information, their privacy and, ultimately, their humanity. People who didn’t like the book complained that Eggers doesn’t understand technology; that he just doesn’t get it. After seeing the crowd at Apple’s iPhone and Watch announcement react with almost religious fervour, though, I’m convinced that saying Eggers doesn’t understand technology is a lot like reading 1984 and saying George Orwell doesn’t understand government.