04-07-2014, 09:15 AM
Great.
Quote: Facebook conducted 'widespread experiments' on user data to 'alter people's behaviour'
Former member of data science team says researchers conducted tests with little oversight in wake of revelations about company's human emotion experiments
Photo caption: New information claims that the secret Facebook experiments were more widespread than the controversial mood manipulation study (Photo: Bloomberg)
By Philip Sherwell, New York
7:48PM BST 03 Jul 2014
Researchers at the social media giant Facebook were given almost free rein to manipulate the news feeds and sometimes the emotions of many of the company's 1.3 billion users without their knowledge, a former employee has disclosed.
The company's data science team is said to have operated with virtually no supervision and little corporate oversight as it conducted experiments with such regularity that researchers worried that they were repeatedly using the same information.
The revelations, reported by The Wall Street Journal, follow the controversy over the disclosure that Facebook ran psychological experiments to determine how the emotions of almost 700,000 users were affected by highlighting negative or positive postings from their friends on their news feeds, without informing them.
"There's no review process, per se," said Andrew Ledvina, who worked as a Facebook data scientist from Feb 2012 to July 2013. "Anyone on that team could run a test. They're always trying to alter people's behaviour."
His disclosures signal that the secret experiments were more widespread than the controversial mood manipulation study, revealed last weekend, that sought to discover whether the content presented on users' pages made them happier or sadder.
In one experiment described by the newspaper, thousands of users received a message that they were being locked out of the social network because Facebook believed they were robots or using fake names. The message was actually a test designed to help improve Facebook's antifraud measures.
Mr Ledvina described discussions within the team about conducting other tests without informing users. "I'm sure some people got very angry somewhere," he said. "Internally, you get a little desensitised to it."
Facebook's data science team has reportedly run hundreds of tests and studies without the knowledge or explicit consent of participants since its creation in 2007, relying on the broad terms of service agreement, which states that user data can be used for research.
The controversy about the human emotions experiment took Facebook by surprise after its organisers published the details in an academic study.
The company said that it has now implemented stricter controls on the activities of its data science team, including reviews by a 50-person panel of unnamed experts in data security and privacy. "We are taking a very hard look at this process to make more improvements," a spokesman said.
Sheryl Sandberg, Facebook's chief operating officer, this week apologised that the study was "poorly communicated", but not for the research itself.
Although all major internet players such as Google, Yahoo and Twitter study users' habits and data, there are particular concerns about Facebook's activities because its contributors post such personal information about their lives, beliefs and emotions.
The latest revelations have fuelled criticism that the company's regular changes to users' news feeds, its key structure, are at best shameless and at worst sinister attempts to mould and shape how people act online.
The company has recruited psychologists, behavioural experts and social scientists from the academic world and its internet rivals to bolster its data science and research operations.
Adam Kramer, the lead author of the emotions study, took to Facebook itself with a posting this week to address concerns about the work.
"In hindsight, the research benefits of the paper may not have justified all of this anxiety," he wrote. "The experiment in question was run in early 2012, and we have come a long way since then. Those [new] review practices will also incorporate what we've learned from the reaction to this paper."
But in an earlier interview on the company's website, Mr Kramer, who holds a doctorate in social psychology, explained the allure of conducting research at Facebook. It was "the largest field study in the history of the world", he said. "I just message someone on the right team and my research has an impact within weeks, if not days."
Cameron Marlow, the head of the data science operations, was similarly enthusiastic in an interview with the Technology Review website. "For the first time, we have a microscope that not only lets us examine social behaviour at a very fine level that we've never been able to see before but allows us to run experiments that millions of users are exposed to," he said.
"This is the first time the world has seen this scale and quality of data about human communication."
Even before the controversy about the human emotions study, Technology Review reported that one of Mr Marlow's researchers had developed a way to calculate a country's "gross national happiness" from the Facebook activity of its citizens, by logging the occurrence of words and phrases that signal positive or negative moods.
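The article only says the researcher logged the occurrence of words and phrases signalling positive or negative moods; it does not describe the actual lexicon or formula used. As a rough illustration of that kind of word-counting approach, here is a minimal sketch in Python. The word lists, the scoring formula and the sample posts are all invented for the example and are not Facebook's method.

```python
# Illustrative sketch only: score a batch of posts by tallying words that
# signal positive or negative moods. Word lists and formula are assumptions,
# not the lexicon or method described in the article.

POSITIVE = {"happy", "great", "love", "excited", "wonderful"}
NEGATIVE = {"sad", "angry", "awful", "hate", "terrible"}

def mood_score(posts):
    """Return a crude happiness index: (positive - negative) / total words."""
    pos = neg = total = 0
    for post in posts:
        for word in post.lower().split():
            total += 1
            if word in POSITIVE:
                pos += 1
            elif word in NEGATIVE:
                neg += 1
    return (pos - neg) / total if total else 0.0

# Example usage with made-up posts:
sample = [
    "Feeling happy and excited about the weekend!",
    "Terrible day, everything went wrong.",
]
print(mood_score(sample))
```

Aggregated over the posts of a whole country's users, a score like this is one plausible way such a "gross national happiness" figure could be computed, though the real study's details are not given here.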
The shadow is a moral problem that challenges the whole ego-personality, for no one can become conscious of the shadow without considerable moral effort. To become conscious of it involves recognizing the dark aspects of the personality as present and real. This act is the essential condition for any kind of self-knowledge.
Carl Jung - Aion (1951). CW 9, Part II: P.14