This is just one of the reasons I don't use Facebook.
Facebook Conducted Psychological Experiments On Unknowing Users
By Annie-Rose Strasser
Original Link from ThinkProgress
The latest way that Facebook has been peeking into its users’ personal lives may be the most surprising yet: Facebook researchers have published a scientific paper revealing that the company has been conducting psychological experiments on its users to manipulate their emotions.
The experiments sought to prove the phenomenon of “emotional contagion” — as in, whether you’ll be happier if the people in your Facebook news feed appear to be. They took place over the week of January 11th-18th, 2012, and targeted 689,003 English-speaking Facebook users.
The study, which was published in the Proceedings of the National Academy of Sciences, was successful. It found that, indeed, manipulating the algorithm to show more “positive” posts in your news feed will actually inspire you to write more “positive” posts yourself. So, for example, if you see a lot of people happy about their jobs or excited to see their favorite band in concert, then you’re more likely to post that you are happy about something in your life, too.
While that little fact in itself may be interesting, there’s one disturbing aspect of the study: None of the people involved in the experiment were explicitly told that they would be a part of it.
Facebook does have terms of service — ones that every Facebook user has agreed to — that specify users’ data may be used “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” The researchers of this psychology experiment argue that their experiments fall under these terms of use because “no text was seen by the researchers.” Rather, a computer program scanned for words that were considered either “positive” or “negative.”
“As such,” the researchers write, “it was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”
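To illustrate the kind of automated scanning the researchers describe, here is a minimal sketch of word-list-based sentiment classification. The word lists, function name, and tie-breaking rule are illustrative assumptions, not details taken from the study itself.

```python
# Minimal sketch of word-list sentiment scanning, as described above.
# The word lists below are illustrative assumptions, not the actual
# lists the researchers used.

POSITIVE_WORDS = {"happy", "great", "love", "excited", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def classify_post(text: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by counting
    matches against fixed word lists -- no human ever reads the text."""
    words = (word.strip(".,!?") for word in text.lower().split())
    counted = list(words)
    pos = sum(word in POSITIVE_WORDS for word in counted)
    neg = sum(word in NEGATIVE_WORDS for word in counted)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("So excited to see my favorite band tonight!"))  # positive
```

A purely mechanical filter like this is what lets the researchers claim that “no text was seen by the researchers,” since only aggregate word counts, not the posts themselves, are inspected.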
People have long been familiar with the double-edged sword of using Facebook: The social network may be useful for staying in touch with your friends, but Facebook can find out who you are from your data — information as specific as where you went on vacation or what movie you went to see on a given Friday night.
Sheryl Sandberg, Facebook’s chief operating officer, explained recently what the company aims to do with the massive amounts of data it collects from users. “Our goal is that every time you open News Feed, every time you look at Facebook, you see something, whether it’s from consumers or whether it’s from marketers, that really delights you, that you are genuinely happy to see,” she said.
The company doesn’t just hang onto that data or use it for research. It also gives other entities access to it. Sometimes it’s for something as banal as a targeted advertisement. Other times, data on users can be handed over for intelligence agency investigations, or to an insurance company.
While users may have some awareness of the terms they’ve agreed to when signing up, they aren’t fully aware of what their information is being used for. A Consumer Reports survey, for example, found that “only 37 percent of Facebook users say they have used the site’s privacy tools to customize how much information apps are allowed to see.” Those third parties are one of the biggest collectors of data. And, the report explains, this touches everyone, not just those who don’t change their app settings. “Even if you have restricted your information to be seen by friends only, a friend who is using a Facebook app could allow your data to be transferred to a third party without your knowledge.”
Advocates have said one way to combat this absolute lack of knowledge — and to avoid being unwittingly used in a psychological experiment — is to mandate that tech companies put their terms of use in “plain English.”
“There’s a burden on the individual to get educated, but there’s also a burden on the companies,” Dr. Pamela Rutledge, director of the Media Psychology Research Center, told ThinkProgress earlier this year. “We’re not all lawyers, we’re not all IT guys,” she said.