Don’t make us lab rats, even happy ones

Editorials

July 1, 2014 8:54PM

Updated: July 2, 2014 2:19AM

If you’re not feeling very chipper today, don’t blame yourself. A social media company might secretly be manipulating your emotions.

We learned about that possibility in recent days with the revelation that in 2012 Facebook toyed with the feelings of nearly 700,000 users. By tweaking its algorithms, the social media company secretly supplied mostly positive news feeds to one group and mostly negative ones to another. By analyzing the subjects’ writing, Facebook found that people who saw positive posts tended to be sunnier in their own writing, while those who saw negative posts leaned toward gloom and doom. It turns out there’s a way to spread “emotional contagion” even among people who have never met.

Should we be alarmed, even though Facebook said the effect was slight? Yes.

Social media has become so pervasive that even a slight effect can mean a large number of people were significantly affected. And individuals’ emotional states are not something to be trifled with. Most suicides follow a mood disorder, and depression is known to raise the risk of heart failure.

Moreover, secret manipulation can have broad policy implications. The New Republic reported last month that researchers found posting the photos and voting status of Facebook friends who had gone to the polls, along with information on casting ballots, inspired an extra 340,000 people to head to the voting booths. If social media has that kind of power to drive people to the polls, it would be a simple step to encourage only those likely to vote the way the people in charge of the algorithms prefer.

It’s a brave new world Vance Packard’s hidden persuaders never imagined.

Drug companies and others do tests on people all the time, of course, but the difference is that they disclose information about the study in advance and give people the choice of opting out. You don’t have to be a lab rat if you don’t want to be.

But Facebook is playing by different rules, and there’s nothing to stop other social media companies from doing similar “research.”

The only “permission” that Facebook could point to was a reference to research in its voluminous terms of service, which few people read. And even that reference reportedly was added only after the study was completed. That isn’t informed consent.

Standing at the edge of a whole new frontier, we don’t know what other manipulative studies have been done or are underway right now. And secret tests on large population groups have led to many horror stories in the past.

The answer can’t be that people are free to delete their Facebook accounts if they don’t like its policies. Social media plays such a big role in modern lives that to cut oneself off is to sever ties with friends, family and others. Instead, Facebook and other companies should abide by the standard rules of ethical research.

Businesses tinker all the time with their products to find ways to increase sales. We should draw the line when they try to tinker with us.
