Either way, this post concerns a story by Erika Morphy published on October 3rd, entitled Being Facebook Means Never Having to Say You're Sorry. And the subtitle/blurb for it reads as follows:
Facebook may seem apologetic about the way it manipulated users in a study to gauge their emotional responses to certain types of posts, but it stopped short of actually making an apology. It also didn't quite say it wouldn't do it again. In fact, the guidelines Facebook has promised to follow in conducting future research are general, vague and, well, they're guidelines.
The article then begins in earnest with a brief summary of the most recent installment in Facebook's soap opera relationship with the public:
Facebook on Thursday announced it had developed a framework for conducting research on its 1.3 billion or so users. Although Facebook so far has revealed only the general outlines, this framework clearly is a response to the onslaught of criticism the company received this summer, when it blithely reported the findings of a study about how News Feed content affected a user's mood.
In carrying out that research, Facebook withheld certain posts and promoted others to see how users would react.
And then on to how Facebook's actions were received by most of us:
When its methodology became public, reactions were immediate and harsh.
"While all businesses of this scale constantly experiment with the factors that influence their customers and users, there's something especially spooky about the idea of being experimented on in our digital lives," said Will McInnes, CMO of Brandwatch.
Facebook apparently was caught off guard by the vitriol.
At this point, a new section begins with the heading, "A Look at the Framework":
The new framework includes giving researchers clearer guidelines and, in certain cases—such as when dealing with content that might be considered deeply personal—putting the project through an enhanced review process before the research begins.
Further review is required if the work involves collaboration with someone in the academic community.
Toward that end, Facebook has created a panel comprised of its most senior subject-area researchers, along with people from its engineering, research, legal, privacy and policy teams to review projects.
Facebook also will put its new engineers through a six-week training boot camp on privacy and research related issues.
Now that all of the basic information is established, it's time for some discussion, analysis, and evaluation, and this begins a new section under the heading of "Informed Consent?":
Facebook's new guidelines appear to be missing some fundamental ingredients, starting with actual policies on what will or won't be permissible.
For instance, it is unclear whether Facebook would repeat this summer's study under the new guidelines.
The company has not exactly said that it shouldn't have tinkered with users' News Feeds—just that it should have considered other, perhaps nonexperimental, ways to conduct its research. Facebook acknowledged that its study would have benefited from review by a more senior group of people. It also owned up to having failed to communicate the purpose of its research.
And now it's time to hear from the peanut gallery, aka moi:
Facebook has not promised to inform users the next time it conducts a research project, noted Lance Strate, professor of communications and media studies at Fordham University.
"Instead, Facebook is in effect saying, 'I'm sorry, I made a mistake, I won't do it again, I can change, I promise—just trust me,' while giving their users absolutely no concrete reason why they should be trusted," he told TechNewsWorld.
The irony is that Americans usually are very willing to participate in consumer research and divulge all sorts of personal, private information in focus groups, interviews, surveys and opinion polls—as long as they are asked whether they are willing to take part in the study first, Strate pointed out.
Indeed, asking permission to conduct such studies goes beyond privacy and business ethics to common courtesy and basic human decency, he said. "It's the sort of thing we teach children long before they enter kindergarten—to ask for permission, to say, 'Mother, may I' and 'please' and 'thank you.'"
Facebook's apparent sense of entitlement regarding the collection of user data and the violation of user privacy is one reason for the extraordinary amount of buzz surrounding the launch of Ello as an alternative social network, Strate added.
It is funny how the overall decline in civility in American society seems to correlate with the rise of new media, first becoming apparent in relation to the electronic culture of television. So perhaps it should not come as a surprise that Facebook would be a part of this trend, and that it would be reflected in the ways in which Facebook treats its users.
The term user itself suggests what Martin Buber would call an I-It relationship. And while the alternative, an I-You relationship, is based on a sense of mutual respect between participants, a mutual recognition of each other as entities, the I-It relationship is fundamentally asymmetrical. Calling us users suggests that we are the I in the relationship, and Facebook is the It. But Facebook's behavior indicates that it sees the situation as quite the opposite: the users are the It, and Facebook is the I.
So maybe, rather than referring to us as Facebook users, we should be called the Facebook used?
Anyway, here's how the article ends:
That thought surely has occurred to Facebook's executive team, which might have been one factor behind the release of the guidelines, McInnes told TechNewsWorld.
"Facebook's greatest fear and business risk is a user exodus, and so it knows that the trust of users is crucial," he said. "This move represents Facebook stepping up and looking to close down the risk of such a backlash again."
So, how about we set up a pool to see when the Facebook exodus will actually occur? I'm thinking some time around 2019, but it could be as soon as 2016. I guess it all depends on when the New Media Moses arrives to part the Web(2.0) Sea...
Anyway, just for the record, here's the original comment that my quotes were taken from:
The essential ethical principle for research involving human subjects is informed consent, and that is exactly what Facebook has failed to address, or apparently to accept. Instead, Facebook is in effect saying, "I'm sorry, I made a mistake, I won't do it again, I can change, I promise, just trust me," while giving their users absolutely no concrete reason why they should be trusted. The irony is that Americans by and large are quite willing to participate in consumer research and divulge all sorts of personal, private information in focus groups, interviews, surveys and opinion polls, as long as they are given a choice, as long as they are asked whether they are willing to take part in the study first. This goes beyond issues regarding privacy and business ethics to common courtesy and basic human decency. It's the sort of thing we teach children long before they enter kindergarten, to ask for permission, to say "mother may I" and please and thank you. Facebook's apparent sense of entitlement regarding the collection of user data and the violation of user privacy is undoubtedly the reason for the extraordinary amount of buzz surrounding the launch of Ello as an alternative social network, and for this reason we can expect to see continued erosion of Facebook's near monopoly over the social media sector.
It's not much different from the article, and I thought Morphy did a good job with it, but this way you can see my complete thought. And since then, aside from Ello, I've heard a little about another alternative, MeWe. And no doubt there are dozens of folks out there working on alternative social media platforms, hoping that theirs will be the next one to catch fire and take over as the internet's burning bush.