Facebook experiment to manipulate human behaviour and emotions - Printable Version

Deep Politics Forum (https://deeppoliticsforum.com/fora)
Forum: Science and Technology
Thread: Facebook experiment to manipulate human behaviour and emotions (/thread-12811.html)
Facebook experiment to manipulate human behaviour and emotions - David Guyatt - 04-07-2014

Great.

Quote: "Facebook conducted 'widespread experiments' on user data to alter people's behaviour"

Facebook experiment to manipulate human behaviour and emotions - Peter Lemkin - 04-07-2014

...[quantifying] a country's gross national happiness...?!!? Hey, one could then market them as derivatives! Modern Western Society is fucking insane and immoral - totally out of touch with natural/moral ethics and values - more so those who run it in secret! I have never used Facebook, and have personally suffered a lot because of that decision... but this only reinforces my determination not to be put on the NSA/CIA/FBI/NRO/DJI/NASDAQ/DHS et al. spy and hit lists... which is now impossible, but try to minimize it, we must! Fight back!

Facebook experiment to manipulate human behaviour and emotions - Magda Hassan - 05-07-2014

The Troubling Link Between Facebook's Emotion Study and Pentagon Research
By Natasha Lennard
July 4, 2014 | 3:05 am

Facebook users were rightfully unnerved on discovering that the social media giant had been experimenting with manipulating users' emotions by tweaking the content of news feeds. Without informing the experiment subjects, data scientists skewed what almost 700,000 Facebook users saw when they logged in. Some were shown posts containing more happy and positive words; some were shown content analyzed as sadder and more negative. After one week, the study found that manipulated users were more likely to post especially positive or negative words themselves. The researchers called the effect "emotional contagion."

It's troubling both because the experiment was carried out without user consent and, above all, because it seems to have worked. So much for autonomous subjects, freely spreading ideas over social media platforms. Cybernetic dreamers wept. Facebook has defended the experiment as a means to improve its product.
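The procedure the article describes (scoring posts as positive or negative by their emotion-word content, filtering a feed accordingly) can be sketched roughly as follows. This is only an illustrative approximation: the word lists, function names, and scoring rule below are invented placeholders, not the LIWC dictionaries or the code the actual study used.

```python
# Illustrative sketch only: toy word lists and a crude score, standing in
# for the LIWC-style word counting the study reportedly relied on.

POSITIVE = {"happy", "great", "love", "wonderful", "glad"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "awful"}

def emotion_score(post: str) -> int:
    """Positive minus negative emotion-word count for one post."""
    words = [w.strip(".,!?").lower() for w in post.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return pos - neg

def feed_skewed(posts, direction):
    """Drop posts whose tone opposes the experimental condition."""
    if direction == "positive":
        return [p for p in posts if emotion_score(p) >= 0]
    return [p for p in posts if emotion_score(p) <= 0]

posts = ["I love this wonderful day!", "Feeling sad and awful.", "Lunch."]
print(feed_skewed(posts, "positive"))
# Neutral and positive posts survive; the negative one is suppressed.
```

Measuring the tone of what manipulated users then posted themselves would amount to running the same kind of score over their subsequent status updates and comparing conditions.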
However, evidence has emerged that links the "emotional contagion" experiment to US Department of Defense research into quelling social unrest. One of the authors of the Facebook study, Jeffrey T. Hancock of Cornell University, also received funding from the Pentagon's Minerva Research Initiative to conduct a similar study on the spread of ideas through social media under authoritarian regimes. As I have written, the Minerva initiative (a series of projects across universities that, in sum, aims to provide a picture, both descriptive and predictive, of civil unrest) is a pernicious example of using the academy for the militaristic purposes of studying and stemming dissent.

The link between the Facebook study and the DoD research comes only through Hancock, and it is to be expected that scientists receive funding and directives from a number of sources. The DoD did not pay for the Facebook study, but the research link is not irrelevant. The Minerva research on the spread of social unrest through online vectors mirrors Facebook's interest in emotional resonance. Both a Silicon Valley giant and the Pentagon are pouring funds into tracing how we relate emotionally online.

I'd argue that this illustrates the limits of platforms like Facebook for radical social change. As the "emotional contagion" experiment suggests, these platforms are all too easily manipulated by those in positions of power. There is an insurmountable asymmetry between those behind Facebook and those using it. There is a vast power differential between those interested in preventing dissent and those interested in spreading it. As we know from the existence of Minerva, too, networked societies are closely observed by those with a stake in social control.

The parlance of "contagion" infects both the DoD and the Facebook research projects. It's a significant metaphor for tracing affect (emotional or political) across networked societies.
It is significant too because contagion, the passing of disease, is that which is understood as necessary to control. Facebook wants control over its users' experiences in order to monetize them better. The government wants control because, at base, it is in the business of control. The task, then, for those of us unnerved by Facebook and DoD efforts here, is to be less predictable than a contagious disease.

https://news.vice.com/article/the-troubling-link-between-facebooks-emotion-study-and-pentagon-research

Facebook experiment to manipulate human behaviour and emotions - Magda Hassan - 10-07-2014

Ex-Facebook Data Scientist: Every Facebook User Is Part Of An Experiment At Some Point

Andrew Ledvina used to be a data scientist at Facebook. He recently made the mistake of talking to a reporter about that, notably telling a WSJ reporter that when he was there from 2012 to 2014, there was no internal review board that might have had qualms about Facebook's now infamous emotion manipulation study, that he and other data scientists were allowed to run any test they wanted as long as it didn't annoy users, and that people working there get "desensitized" to the number of people included in their experiments, as it's such a tiny percentage of Facebook's overall user base.

Ledvina, like many a person quoted in the media, didn't like the way the reporter presented his words and so took to his blog to defend himself, Facebook, and the Facebook study, but I think he simply dug a deeper hole for the company that he quit this April. He facetiously titled the blog post "10 Ways Facebook Is Actually The Devil" to needle those who see the study as evil, and then went on to confirm the WSJ's report and shed new light on how Facebook's data science team views users.

1. The Facebook emotion manipulation study didn't get vetted before it was run on users, but it likely did get vetted by Facebook's PR and legal team before it went into a scholarly journal; they apparently didn't think it would make people angry and result in a reported investigation in Europe and a legal complaint with the FTC in the States.

Ledvina: "While I was at Facebook, there was no institutional review board that scrutinized the decision to run an experiment for internal purposes. Once someone had a result that they decided they wanted to submit for publication to a journal, there definitely was a back and forth with PR and legal over what could be published."

2. If you're on Facebook, you have definitely been a test subject at some point.

Ledvina: "Experiments are run on every user at some point in their tenure on the site…"

3. But you may have been a test subject in a very boring experiment.

Ledvina: "…Whether that is seeing different size ad copy, or different marketing messages, or different call to action buttons, or having their feeds generated by different ranking algorithms, etc." [Ed note: The link on "call to action" is mine, not Ledvina's.]

4. This ex-employee of Facebook still doesn't understand why people are upset that Facebook researchers tried to see if they could upset people.

Ledvina: "The fundamental purpose of most people at Facebook working on data is to influence and alter people's moods and behaviour. They are doing it all the time to make you like stories more, to click on more ads, to spend more time on the site. This is just how a website works, everyone does this and everyone knows that everyone does this, I don't see why people are all up in arms over this thing all of a sudden."

5. Facebook researchers forget that what they're doing has an effect on the real live people that use Facebook.
Ledvina: "Every data scientist at Facebook that I have ever interacted with has been deeply passionate about making the lives of people using Facebook better, but with the pragmatic understanding that sometimes you need to hurt the experience for a small number of users to help make things better for 1+ billion others. That being said, all of this hubbub over 700k users like it is a large number of people is a bit strange coming from the inside where that is a very tiny fraction of the user base (less than 0.1%), and even that number is likely inflated to include a control group. It truly is easy to get desensitized to the fact that those are nearly 1M real people interacting with the site."

Ledvina expressed surprise that the experiment playing with the emotional content of users' News Feeds was getting so much play in the press while other Facebook research has been ignored. He pointed to an event last year where Facebook researchers and academic researchers got together to talk about work done to see how "Facebook and social networks in general can be more compassionate," and to prevent bullying, particularly among teens. "I am a bit taken aback by the fact that the most recent paper has gotten as much press as it has, when the work done as part of the compassion research days has never been mentioned," he writes. "Some of these papers are based on experiments that influence people's behavior in similar ways, but I guess they do not have as much cachet for whatever reason."

I went ahead and watched the hours of presentations archived by Facebook from the "Compassion Research Day." There were a couple of key differences from the January 2012 manipulation study. First, none of the work done by the researchers aimed to make people feel worse. Second, the research on people's behavior was far less surreptitious. Third, researchers didn't interfere with people's "natural" experience of the site beyond obvious (and some not-so-obvious) prompts for feedback.
One of the videos, about "new tools to understand people," doesn't feature a presenter talking about trying to influence people's emotional state in a negative way and then measuring it by monitoring their status updates. "We asked users, 'What are you trying to do? Why did you click that button?'" says the presenter. "We learned a lot from asking people for feedback." This is a transparent way of "running tests" on users, presenting them with questions and asking them for feedback, a rather traditional approach to experimentation.

Another presenter talked about measuring emotion by asking people to put "emoticon" faces on their status updates. Okay, Facebook users may not have realized they got these cute digital stickers so that Facebook could measure their emotional states, but it's at least a translucent way of taking users' emotional temperature.

[Image caption: Facebook researchers worked up these digital stickers so they could measure users' emotions across the world]

In trying to see how teens deal with stuff they don't like on the site, Facebook proactively asked the youngish ones how they felt when they flagged something they didn't like. That's a far more aboveboard approach than the one taken by the Facebook data scientists in the 2012 emotion study. The researchers were trying to find out why people get offended and how they resolve the issue, and to give them better tools to aid the process, which is all a far cry from turning the tone of News Feeds negative to see if it makes someone blue and turns them off using Facebook.

The perhaps unexpected thing done by the "compassion researchers" is collecting the messages Facebook users send to other users when asking them to take down an embarrassing photo or offensive post; they do this to try to understand why people object to content on the site.
[Image caption: Facebook asking users how a post made them feel, rather than secretly divining it from their status updates]

[Image caption: Facebook's "innovation" in gauging feedback, as presented at a company event in December 2013]

Ledvina is pessimistic about the "hubbub" over the study leading to any change at Facebook, though he makes very clear that he is an ex-employee who knows little about the inner workings of Facebook now. "The only thing I see changing from this is not whether similar experiments will be run, but rather [whether they will] be published," he writes, confirming the outcome that many commentators, including the New York Times' Farhad Manjoo, fear. "Similar experiments have been and will continue to be run, but you probably just won't see a paper about it anymore."

http://www.forbes.com/sites/kashmirhill/2014/07/07/ex-facebook-data-scientist-every-facebook-user-is-part-of-an-experiment-at-some-point/