
Facebook Steps In It


Over the weekend Facebook stepped in it in a big way. One of the earliest sites breaking this was A.V. Club:

Scientists at Facebook have published a paper showing that they manipulated the content seen by more than 600,000 users in an attempt to determine whether this would affect their emotional state. The paper, “Experimental evidence of massive-scale emotional contagion through social networks,” was published in The Proceedings Of The National Academy Of Sciences. It shows how Facebook data scientists tweaked the algorithm that determines which posts appear on users’ news feeds—specifically, researchers skewed the number of positive or negative terms seen by randomly selected users. Facebook then analyzed the future postings of those users over the course of a week to see if people responded with increased positivity or negativity of their own, thus answering the question of whether emotional states can be transmitted across a social network. Result: They can! Which is great news for Facebook data scientists hoping to prove a point about modern psychology. It’s less great for the people having their emotions secretly manipulated.

This issue has grown across the net and is now being covered by the major media, so this is building into a possible flash point. For reference, the full paper is here.

I’ve had an increasingly frustrated relationship with Facebook over the years as they’ve changed the site from a place where people interact with each other into one designed primarily to push brand advertising into your face, while you’re never quite sure whether people saw what you posted (and chances are, increasingly, they didn’t, unless you paid for placement).

My first reaction to this was that I’m done. But I realized I was reacting from anger, so I decided to sit on it for a few days while we waited for more information to come out. I was also somewhat fascinated by how viscerally I (and many of the people I talked to) reacted to this. There were early questions about whether Facebook and the researchers followed protocol for approval of this study (answer: yes), and if you step back and think about it objectively, most of us who build and run web sites have done A-B testing and done work designed to encourage one behavior over another. So I think one big unanswered question for the industry in general is where to draw the line between that kind of operational behavioral management and, well, this.

It’s clear most of us think this kind of manipulation is over that line, but I doubt any of us can explain exactly where the line is. That’s something we need to grapple with and find consensus on. Much of the web is doing behavioral modification in some way or another, but this was explicitly studying emotional modification. The two are very closely related, so where do we draw the line that says one is okay and the other isn’t?

My problem with this (well, beyond the fact that nobody was told it was being done to them or had any option to opt in or out) is the possible side effect on someone already fighting depression whose account was tweaked here to push their mood more negative. What if that pushes them into a suicidal state, folks? This isn’t “hey, if we make the button orange, sales go up 3%” territory.

On top of that, the results were trivially small, and some researchers think the study was so badly designed as to be worthless. The impact was small enough that I don’t think it was statistically useful, and even if it was, reading through the analysis of the study, the data is crap.
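To make that statistics point concrete, here’s a minimal sketch (my own illustration, not the study’s data or code; every number below is an assumption) of why a sample in the hundreds of thousands can make an effect of a couple hundredths of a standard deviation “statistically significant” while staying practically meaningless:

```python
# Illustrative only: hypothetical numbers, not the study's actual data.
# With ~350k users per condition, even a shift of 0.02 standard
# deviations produces a vanishingly small p-value, despite being a
# trivial effect by any conventional standard.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 350_000  # roughly one condition's share of a ~700k-user experiment

control = rng.normal(loc=5.00, scale=1.0, size=n)
treated = rng.normal(loc=5.02, scale=1.0, size=n)  # 0.02 SD shift

t_stat, p_value = stats.ttest_ind(treated, control)
pooled_sd = np.sqrt((treated.var(ddof=1) + control.var(ddof=1)) / 2)
cohens_d = (treated.mean() - control.mean()) / pooled_sd

print(f"p-value:   {p_value:.2e}")   # tiny, so "significant"
print(f"Cohen's d: {cohens_d:.3f}")  # ~0.02, far below even "small" (0.2)
```

Significance at that scale tells you almost nothing about whether the effect matters; that’s exactly the gap between “statistically detectable” and “statistically useful.”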

I think Business Insider nails the essence: those of us reacting to this aren’t reacting to what they did, but to the fact that Facebook did it.

Even Susan Fiske, the professor of psychology at Princeton University who edited the study for Proceedings of the National Academy of Sciences of America, had doubts when the research first crossed her desk.

“I was concerned,” she told me in a phone interview, “until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people’s News Feeds all the time… I understand why people have concerns. I think their beef is with Facebook, really, not the research.”

After thinking about it for a while, I’ve concluded she’s right. I’m reacting to this because it’s Facebook.

Facebook’s initial response was pulled from the big book of generic responses and focused on privacy, which came across as tone deaf and completely missing the point. That hasn’t helped settle this controversy at all, and to me it reinforced that Facebook seems oblivious to the larger issues a study of this type touches. That, unfortunately, doesn’t surprise me one bit.

I am probably leaving Facebook

I’ve had a muddled relationship with Facebook for a while. I removed any reference to my Facebook account from my sites over a year ago to discourage anyone from following me there. There are lots of things they do that I’m not happy with, from their Terms of Service and their rights grabs on content to their continuing manipulation of feeds, which makes it hard to actually use the site to stay in touch with people. They’ve effectively killed the ability of a small or medium-sized business to use it as a marketing site unless they want to start paying for placement, for instance. I’ve been using it less and less over time and putting less content on it. The primary use I have for it now is to keep track of people I can’t keep track of anywhere else.

I’m going to give it another week to think it through, but as of now, I believe this is the last straw; this camel’s back has had it with Facebook. It’s not about the study, per se, but about Facebook’s attitude toward its users. From watching the reaction out on the wider web, I’m not alone in feeling that Facebook doesn’t see its user base as people, but as things to be manipulated to improve the revenue stream.

My bottom line is that I keep looking for ways to spend less time on Facebook because I get very little value for the time spent there — I’d rather put that time into more productive things. Facebook just isn’t that interesting or useful, and I’m not thrilled with their tendency to set policies that ignore the needs and interests of their users in favor of things that benefit Facebook.

So I think I’m done. But it’s not something I want to make a final decision on until I see how (if…) Facebook responds. Maybe they’ll surprise me. This controversy is growing fast enough that it really needs a strong response from Zuckerberg for Facebook to regain any control over it. I’m going to be really curious to see if that happens.

But assuming nothing changes my mind, a week from now, I’ll be closing my Facebook account. If you follow me on Facebook, you probably ought to hook in with me somewhere else. And you should ask yourself if you really get value out of the hours you invest in Facebook every month. If not, maybe you should consider a change of venue, too.

We need to figure out where to draw the lines in the sand

A more important issue is this: A lot of companies are doing a lot of manipulation of a lot of people online, and there’s a lot we don’t know about what that does to the people involved — and if your team does any A-B testing at all, you’re part of that.
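If your team runs A-B tests, the mechanics are usually mundane. Here’s a minimal, hypothetical Python sketch of what that typically looks like: deterministic bucketing plus a conversion tally. Every identifier in it (the experiment name, the user IDs, the event log) is invented for illustration:

```python
# Hypothetical sketch of garden-variety A/B bucketing, not any
# particular company's system.
import hashlib

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into 'control' or 'treatment'."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 == 0 else "control"

# Tally conversions per bucket from a made-up event log.
events = [("alice", True), ("bob", False), ("carol", True), ("dave", False)]
tallies = {"control": [0, 0], "treatment": [0, 0]}  # [conversions, exposures]
for user, converted in events:
    bucket = assign_variant(user, "button-color")
    tallies[bucket][0] += int(converted)
    tallies[bucket][1] += 1

for bucket, (conversions, exposures) in tallies.items():
    rate = conversions / exposures if exposures else 0.0
    print(f"{bucket}: {conversions}/{exposures} converted ({rate:.0%})")
```

Hashing the user ID keeps the assignment stable across visits without storing any per-user state.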

Most of that manipulation is harmless, of the “red buttons generate more sales” variety, but there’s a huge amount we don’t know about the impacts on our users. The Facebook study could have been an interesting step toward filling some of those gaps if it had gotten good data and been well designed, but it did neither.

We as a community need to figure out where we cross the line from simple tweaking of a user’s preferences to encourage a specific action into manipulating a user at a deeper level, in a potentially dangerous and damaging way. We have no clue where those limits are, and there are no standards or oversight for teams experimenting in these areas or building sites that use these techniques. We need them.

The question is, who’s going to bell that cat? I have no idea, but I felt the question had to be asked.
