Facebook Study of Users’ Behavioral Data Prompts Privacy Concerns

July 1, 2014 in News

A weeklong Facebook experiment in which the site collected behavioral data on how users were affected by posts in their news feeds has drawn mixed reactions and raised privacy concerns, Computerworld reports (Gaudin, Computerworld, 6/30).

Details of Experiment

For one week in 2012, Facebook manipulated the main news feeds of 689,003 users in an effort to determine how the social media website affects users' emotions (Meyer, NextGov, 6/30). Researchers adjusted those users' news feeds to show either mostly positive or mostly negative comments and posts, then measured how the altered content affected the users' emotions.

The social media site currently has more than one billion users, and only about 0.04% of users — or one in 2,500 — were affected by the experiment.

Researchers found that users who viewed more negative messages in their news feeds tended to post more negative comments and status updates, and those who viewed more positive messages wrote more positive comments.

A report on the experiment was published in the Proceedings of the National Academy of Sciences.

Privacy Concerns

Some observers and Facebook users have voiced trust and privacy concerns related to the experiment.

Patrick Moorhead, an analyst with Moor Insights & Strategy, said the experiment “violates the trust of Facebook users who rely on their protection” by manipulating news feeds and publishing the results externally (Computerworld, 6/30).

NextGov reports that the experiment was “almost certainly legal” because users of the social media site surrender the use of their data for analysis, screening and research when agreeing to the company’s terms of service.

Susan Fiske, a psychology professor at Princeton University who edited the study before it was published, said that an institutional review board also approved the experiment “apparently on the grounds that Facebook apparently manipulates people’s news feeds all the time” (NextGov, 6/30).

However, Leslie Meltzer Henry — a bioethicist and lawyer with the Johns Hopkins Berman Institute of Bioethics and the University of Maryland Carey School of Law — said, “The type of one-click consent that Facebook users provide when they agree to the site’s data use policy is very different from the type of informed consent that’s ethically and legally required” for most behavioral health studies (Lupkin, ABC World News, 6/30).

Response From Facebook

Adam Kramer, a Facebook data scientist who participated in the study, apologized in a Facebook post for upsetting users. He wrote, “Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone.”

He said, “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused,” adding, “In hindsight, the research benefits of the paper may not have justified all of this anxiety” (Computerworld, 6/30).


