Facebook experiment to manipulate human behaviour and emotions
#1
Great.

Quote:Facebook conducted 'widespread experiments' on user data to 'alter people's behaviour'

Former member of data science team says researchers conducted tests with little oversight in wake of revelations about company's human emotion experiments

[Image: FACEBOOK_2963641b.jpg]
New information claims that the secret Facebook experiments were more widespread than the controversial mood manipulation study Photo: BLOOMBERG


By Philip Sherwell, New York

7:48PM BST 03 Jul 2014



Researchers at the social media giant Facebook were given almost free rein to manipulate the news feeds and sometimes the emotions of many of the company's 1.3 billion users without their knowledge, a former employee has disclosed.

The company's data science team is said to have operated with virtually no supervision and little corporate oversight as it conducted experiments with such regularity that researchers worried that they were repeatedly using the same information.

The revelations by The Wall Street Journal follow the controversy over the disclosure that Facebook ran psychological experiments to determine how the emotions of almost 700,000 users were affected by highlighting negative or positive postings from their friends on their public news feeds, without informing them.

"There's no review process, per se," said Andrew Ledvina, who worked as a Facebook data scientist from Feb 2012 to July 2013. "Anyone on that team could run a test. They're always trying to alter people's behaviour."

His disclosures signal that the secret experiments were more widespread than the controversial mood manipulation study, revealed last weekend, that sought to discover whether the content presented on users' pages made them happier or sadder.

In one experiment described by the newspaper, thousands of users received a message that they were being locked out of the social network because Facebook believed they were robots or using fake names. The message was actually a test designed to help improve Facebook's antifraud measures.
Mr Ledvina described discussions within the team about conducting other tests without informing users. "I'm sure some people got very angry somewhere," he said. "Internally, you get a little desensitised to it."
Facebook's data science team has reportedly run hundreds of tests and studies without the knowledge or explicit consent of participants since its creation in 2007, relying on the broad terms of service agreement, which states that user data can be used for research.
The controversy about the human emotions experiment took Facebook by surprise after its organisers published the details in an academic study.
The company said that it has now implemented stricter controls on the activities of its data science team, including reviews by a 50-person panel of unnamed experts in data security and privacy. "We are taking a very hard look at this process to make more improvements," a spokesman said.
Sheryl Sandberg, Facebook's chief operating officer, this week apologised that the study was "poorly communicated", but not for the research itself.
Although all major internet players such as Google, Yahoo and Twitter study users' habits and data, there are particular concerns about Facebook's activities, as its contributors post such personal information about their lives, beliefs and emotions.
The latest revelations have fuelled criticisms that the company's regular changes to users' news feeds, its key structure, are at best shameless and at worst sinister attempts to mould and shape how people act online.
The company has recruited psychologists, behavioural experts and social scientists from the academic world and its internet rivals to bolster its data science and research operations.
Adam Kramer, the lead author of the emotions study, took to Facebook itself with a posting this week to address concerns about the work.
"In hindsight, the research benefits of the paper may not have justified all of this anxiety," he wrote. "The experiment in question was run in early 2012, and we have come a long way since then. Those [new] review practices will also incorporate what we've learned from the reaction to this paper."
But in an earlier interview on the company's website, Mr Kramer, who holds a doctorate in social psychology, explained the allure of conducting research at Facebook. It was "the largest field study in the history of the world", he said. "I just message someone on the right team and my research has an impact within weeks, if not days."
Cameron Marlow, the head of the data science operations, was similarly enthusiastic in an interview with the Technology Review website. "For the first time, we have a microscope that not only lets us examine social behaviour at a very fine level that we've never been able to see before but allows us to run experiments that millions of users are exposed to," he said.
"This is the first time the world has seen this scale and quality of data about human communication."
Even before the controversy about the human emotions study, Technology Review reported that one of Mr Marlow's researchers had developed a way to calculate a country's "gross national happiness" from the Facebook activity of its citizens, by logging the occurrence of words and phrases that signal positive or negative moods.
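
The word-and-phrase logging described above is, in essence, lexicon-based sentiment counting. Here is a minimal Python sketch of the idea; the word lists, function name, and sample posts below are invented for illustration and are not Facebook's actual lexicon, method, or code:

Code:
# Illustrative sketch only: toy word lists and scoring, not Facebook's
# actual lexicon, method, or code.
import re

POSITIVE = {"happy", "great", "love", "awesome", "wonderful"}
NEGATIVE = {"sad", "angry", "hate", "terrible", "awful"}

def mood_score(posts):
    """Return (positive_rate, negative_rate) over all words in the posts."""
    pos = neg = total = 0
    for post in posts:
        for word in re.findall(r"[a-z']+", post.lower()):
            total += 1
            pos += word in POSITIVE
            neg += word in NEGATIVE
    if total == 0:
        return 0.0, 0.0
    return pos / total, neg / total

# A country-level "net mood" would then be the positive rate minus the
# negative rate over its users' posts.
posts = [
    "What a wonderful day, I love this!",
    "So happy with this great news.",
    "Awful traffic, terrible commute.",
]
p, n = mood_score(posts)
print(f"net mood: {p - n:+.3f}")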
The shadow is a moral problem that challenges the whole ego-personality, for no one can become conscious of the shadow without considerable moral effort. To become conscious of it involves recognizing the dark aspects of the personality as present and real. This act is the essential condition for any kind of self-knowledge.
Carl Jung - Aion (1951). CW 9, Part II: P.14
#2
...[quantifying] a country's gross national happiness....?!!? Hey, one could then market them as derivatives! Modern Western Society is fucking insane and immoral - totally out of touch with Natural/moral ethics and values - more so those who run it in secret! I have never used Facebook, and have personally suffered a lot because of that decision.....but this only reinforces my determination not to be put on the NSA/CIA/FBI/NRO/DJI/NASDAQ/DHS/ et al. spy and hit lists....which is now impossible, but try to minimize it, we must! Fight back!
"Let me issue and control a nation's money and I care not who writes the laws. - Mayer Rothschild
"Civil disobedience is not our problem. Our problem is civil obedience! People are obedient in the face of poverty, starvation, stupidity, war, and cruelty. Our problem is that grand thieves are running the country. That's our problem!" - Howard Zinn
"If there is no struggle there is no progress. Power concedes nothing without a demand. It never did and never will" - Frederick Douglass
#3

The Troubling Link Between Facebook's Emotion Study and Pentagon Research



By Natasha Lennard
July 4, 2014 | 3:05 am

Facebook users were rightfully unnerved on discovering that the social media giant had been experimenting with manipulating users' emotions through tweaking the content of news feeds.
Without informing the experiment subjects, data scientists skewed what almost 700,000 Facebook users saw when they logged in. Some were shown posts containing more happy and positive words; some were shown content analyzed as sadder and more negative. After one week, the study found that manipulated users were more likely to post either especially positive or negative words themselves.
They called the effect "emotional contagion." It's troubling both because the experiment was carried out without user consent and, above all, because it seems to have worked. So much for autonomous subjects, freely spreading ideas over social media platforms. Cybernetic dreamers wept.
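
To make the measurement concrete: the published study compared the rate of emotion words in what each group posted during the test week. Below is a toy sketch of that comparison; the word list, group names, and sample posts are invented and are not the study's actual code or data:

Code:
# Toy sketch of the measurement step: compare the positive-word rate of
# posts written by users whose feeds were skewed in each direction.
# Word list and sample posts are invented; this is not the study's code.
from statistics import mean

POSITIVE_WORDS = {"happy", "great", "love", "good", "wonderful"}

def positive_rate(post):
    """Fraction of a post's words found in the positive lexicon."""
    words = post.lower().split()
    return sum(w in POSITIVE_WORDS for w in words) / max(len(words), 1)

# Posts written during the test week by each (hypothetical) condition.
saw_less_negative = ["such a great day", "love this so much"]
saw_less_positive = ["rough week", "nothing feels right lately"]

effect = (mean(positive_rate(p) for p in saw_less_negative)
          - mean(positive_rate(p) for p in saw_less_positive))
print(f"difference in positive-word rate: {effect:+.3f}")

The published effects were tiny in absolute terms; it was the sample size of nearly 700,000 users that made such small average shifts statistically detectable.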
Facebook has defended the experiment as a means to improve its product. However, evidence has emerged that links the "emotional contagion" experiment to US Department of Defense research into quelling social unrest. One of the authors of the Facebook study, Jeffrey T. Hancock of Cornell University, also received funding from the Pentagon's Minerva Research Initiative to conduct a similar study on the spread of ideas through social media under authoritarian regimes.
As I have written, the Minerva initiative, a series of projects across universities that, in sum, aims to provide a picture both descriptive and predictive of civil unrest, is a pernicious example of using the academy for the militaristic purposes of studying and stemming dissent.
The link between the Facebook study and the DoD research comes only through Hancock, and it is to be expected that scientists receive funding and directives from a number of sources. The DoD did not pay for the Facebook study, but the research link is not irrelevant. The Minerva research on the spread of social unrest through online vectors mirrors Facebook's interest in emotional resonance.
Both a Silicon Valley giant and the Pentagon are pouring funds into tracing how we relate emotionally online. I'd argue that this illustrates the limits of platforms like Facebook for radical social change. As the "emotional contagion" experiment suggests, these platforms are all too easily manipulated by those in positions of power. There is an insurmountable asymmetry between those behind Facebook and those using it. There is a vast power differential between those interested in preventing dissent and those interested in spreading it. As we know from the existence of Minerva, too, networked societies are closely observed by those with a stake in social control.
The parlance of "contagion" infects both the DoD and the Facebook research projects. It's a significant metaphor for tracing affect (emotional or political) across networked societies. It is significant too because contagion, the passing of disease, is that which is understood as necessary to control. Facebook wants control over its users' experiences in order to monetize them better. The government wants control because, at base, it is in the business of control. The task then, for those of us unnerved by Facebook and DoD efforts here, is to be less predictable than a contagious disease.
https://news.vice.com/article/the-troubl...n-research
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea” Gandhi, when asked about Western Civilisation.
#4

Ex-Facebook Data Scientist: Every Facebook User Is Part Of An Experiment At Some Point




Andrew Ledvina used to be a data scientist at Facebook. He recently made the mistake of talking to a reporter about that, notably telling a WSJ reporter that when he was there from 2012 to 2014, there was no internal review board that might have had qualms about Facebook's now infamous emotion manipulation study, that he and other data scientists were allowed to run any test they wanted as long as it didn't annoy users, and that people working there get "desensitized" to the number of people included in their experiments, as it's such a tiny percentage of Facebook's overall user base. Ledvina, like many a person quoted in the media, didn't like the way the reporter presented his words and so took to his blog to defend himself, Facebook, and the Facebook study, but I think he simply dug a deeper hole for the company, which he quit this April. He facetiously titled the blog post "10 Ways Facebook Is Actually The Devil" to needle those who see the study as evil, and then went on to confirm the WSJ's report and shed new light on how Facebook's data science team views users.
1. The Facebook emotion manipulation study didn't get vetted before it was run on users, but it likely did get vetted by Facebook's PR and legal team before it went into a scholarly journal; they apparently didn't think it would make people angry and result in a reported investigation in Europe and a legal complaint with the FTC in the States.
Ledvina: "While I was at Facebook, there was no institutional review board that scrutinized the decision to run an experiment for internal purposes. Once someone had a result that they decided they wanted to submit for publication to a journal, there definitely was a back and forth with PR and legal over what could be published."
2. If you're on Facebook, you have definitely been a test subject at some point.
Ledvina: "Experiments are run on every user at some point in their tenure on the site…"
3. But you may have been a test subject in a very boring experiment.
Ledvina: "…Whether that is seeing different size ad copy, or different marketing messages, or different call to action buttons, or having their feeds generated by different ranking algorithms, etc." [Ed Note: Link on "call to action" is mine not Ledvina's.]
4. This ex-employee of Facebook still doesn't understand why people are upset that Facebook researchers tried to see if they could upset people.
Ledvina: "The fundamental purpose of most people at Facebook working on data is to influence and alter people's moods and behaviour. They are doing it all the time to make you like stories more, to click on more ads, to spend more time on the site. This is just how a website works, everyone does this and everyone knows that everyone does this, I don't see why people are all up in arms over this thing all of a sudden."
5. Facebook researchers forget that what they're doing has an effect on the real live people that use Facebook.
Ledvina: "Every data scientist at Facebook that I have ever interacted with has been deeply passionate about making the lives of people using Facebook better, but with the pragmatic understanding that sometimes you need to hurt the experience for a small number of users to help make things better for 1+ billion others. That being said, all of this hubbub over 700k users like it is a large number of people is a bit strange coming from the inside where that is a very tiny fraction of the user base (less than 0.1%), and even that number is likely inflated to include a control group. It truly is easy to get desensitized to the fact that those are nearly 1M real people interacting with the site."
Ledvina expressed surprise that the experiment playing with the emotional content of users' News Feeds was getting so much play in the press while other Facebook research has been ignored. He pointed to an event last year, where Facebook researchers and academic researchers got together to talk about work done to see how "Facebook and social networks in general can be more compassionate," and prevent bullying particularly among teens. "I am a bit taken aback by the fact that the most recent paper has gotten as much press as it has, when the work done as part of the compassion research days has never been mentioned," he writes. "Some of these papers are based on experiments that influence people's behavior in similar ways, but I guess they do not have as much cachet for whatever reason."
I went ahead and watched the hours of presentations archived by Facebook from the "Compassion Research Day." There were a few key differences from the January 2012 manipulation study. First, none of the work done by the researchers aimed to make people feel worse. Second, the research on people's behavior was far less surreptitious. Third, researchers didn't interfere with people's "natural" experience of the site beyond obvious (and some not-so-obvious) prompts for feedback.
One of the videos about "new tools to understand people" doesn't have a presenter talking about trying to influence people's emotional state in a negative way and then measure it by monitoring their status updates. "We asked users, 'What are you trying to do? Why did you click that button?'" says the presenter. "We learned a lot from asking people for feedback."
This is a transparent way of "running tests" on users, presenting them with questions and asking them for feedback, a rather traditional approach to experimentation. Another presenter talked about measuring emotion by asking people to put "emoticon" faces on their status updates. Okay, Facebook users may not have realized they got these cute digital stickers for Facebook to measure their emotional states, but it's at least a translucent way of taking users' emotional temperature.
[Image: Facebook-emoticons.jpg] Facebook researchers worked up these digital stickers so they could measure users' emotions across the world

In trying to see how teens deal with stuff they don't like on the site, Facebook proactively asked the youngish ones how they felt when they flagged something they didn't like. That's a far more aboveboard approach than the one taken by the Facebook data scientist in the 2012 emotion study. The researchers were trying to find out why people get offended and how they resolve the issue, and giving them better tools to aid the process, which is all a far cry from turning the tone of News Feeds negative to see if it makes someone blue and turns them off using Facebook. The perhaps unexpected thing done by the "compassion researchers" is collecting the messages Facebook users send to other users when asking them to take down an embarrassing photo or offensive post; they do this to try to understand why people object to content on the site.
[Image: Facebook-asking-people-how-they-feel.jpg] Facebook asking users how a post made them feel, rather than secretly divining it from their status updates

[Image: Facebook-asking-for-feedback.jpg] Facebook's "innovation" in gauging feedback, as presented at a company event in December 2013

Ledvina is pessimistic about the "hubbub" over the study leading to any change at Facebook, though he makes very clear that he is an ex-employee who knows little about the inner workings at Facebook now. "The only thing I see changing from this is not whether similar experiments will be run, but rather [whether they will] be published," he writes, confirming the outcome that many commentators, including the New York Times' Farhad Manjoo, fear. "Similar experiments have been and will continue to be run, but you probably just won't see a paper about it anymore."
http://www.forbes.com/sites/kashmirhill/...ome-point/
"The philosophers have only interpreted the world, in various ways. The point, however, is to change it." Karl Marx

"He would, wouldn't he?" Mandy Rice-Davies. When asked in court whether she knew that Lord Astor had denied having sex with her.

“I think it would be a good idea” Gandhi, when asked about Western Civilisation.

