After riding a public relations roller-coaster for several days, Facebook appears to be ending the week on a peak rather than in a valley: it's apologized to the LGBTQ community for a real-name policy warning that targeted drag queens using the site, it revealed a new framework aimed at defusing criticisms of its good-news/bad-news research on nearly 700,000 unsuspecting users, and it won the European Commission's approval for its planned acquisition of WhatsApp.
As always, however, the devil is in the details. While the WhatsApp acquisition now seems settled and LGBTQ users might now be less tempted to jump ship for rival social networking site Ello, Facebook's stance on human research continues to draw scrutiny.
FTC, EU WhatsApp Hurdles
Facebook's planned acquisition of the messaging service WhatsApp -- for $4 billion in cash and around $12 billion in Facebook shares -- was announced this past February and has faced several hurdles since then. In March, the Electronic Privacy Information Center and the Center for Digital Democracy both filed complaints with the U.S. Federal Trade Commission expressing concern about the acquisition's possible effect on the privacy protections WhatsApp had promised to users.
The FTC responded in April by sending a letter to Facebook and WhatsApp specifying the privacy requirements both companies needed to meet. The letter noted that any "material changes" made in how user data is collected would require the companies to first get "affirmative consent." Without that consent, Facebook could face a repeat of 2011, when it settled FTC charges that it had deceived users by failing to keep its privacy promises.
Facebook in May then requested a European Union review of its proposed WhatsApp purchase, apparently in a bid to prevent individual reviews by each of the EU's 28 member nations. The European Commission on Friday announced its approval of the acquisition, with Joaquin Almunia, vice president in charge of competition policy, noting: "While Facebook Messenger and WhatsApp are two of the most popular apps, most people use more than one communications app. We have carefully reviewed this proposed acquisition and come to the conclusion that it would not hamper competition in this dynamic and growing market. Consumers will continue to have a wide choice of consumer communications apps."
'Authentic, Real-Life' Names OK
On Wednesday, Facebook also issued a mea culpa to its community of lesbian, gay, bisexual, transgender and queer (LGBTQ) users for having sent out numerous messages warning them to either use their legal names on their Facebook accounts or risk having their accounts disabled. The messages were sent out after Facebook received a large number of user reports regarding, for example, drag queens' use of names such as "Sister Roma" or "Lil Miss Hot Mess" on their Facebook accounts.
In the ensuing days, numerous news reports described how some Facebook users offended by such warnings were jumping ship to Ello, a rival social networking site. Facebook switched gears and met with a number of those users -- including the drag queen Sister Roma -- on Wednesday.
Following that meeting, Facebook Chief Product Officer Chris Cox posted a lengthy apology about the real-name policy, noting that the company would allow users to post under the "authentic name they use in real life," rather than under their legal names.
"We owe you a better service and a better experience using Facebook, and we're going to fix the way this policy gets handled so everyone affected here can go back to using Facebook as you were," Cox said. "We're already underway building better tools for authenticating the Sister Romas of the world while not opening up Facebook to bad actors. And we're taking measures to provide much more deliberate customer service to those accounts that get flagged so that we can manage these in a less abrupt and more thoughtful way."
The 'Emotional Contagion' Furor
Finally, Facebook Chief Technology Officer Mike Schroepfer on Thursday posted an update on the company's news pages about "some changes we're making to the way we do research."
The changes are being made in response to a furor that erupted in June after news emerged that Facebook had experimented with the news feeds of 689,003 users without their knowledge. The results of that weeklong 2012 experiment -- in which some users' news feeds were manipulated to highlight good news, while others received feeds of predominantly negative news -- were eventually published in the Proceedings of the National Academy of Sciences under the title "Experimental evidence of massive-scale emotional contagion through social networks."
Facebook came under considerable criticism for that experiment in the weeks and months that followed, as many pointed out that informed consent is an essential element of ethical research involving human subjects. Facebook claimed its data use policy covered that concern, but it did not add "research" to that policy until after the "emotional contagion" study came to light.
In his Thursday post describing Facebook's new research framework, Schroepfer wrote: "We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism. It is clear now that there are things we should have done differently. For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it."
Schroepfer said Facebook's new policies call for "clearer guidelines" for researchers, a review panel "including our most senior subject-area researchers," additional research training during Facebook's six-week boot camp for new engineers, and a new single location for all of Facebook's published academic research.
As many have already noted, the new framework is short on many specifics and does not include any mention of review by independent, outside experts. Nor does it specifically state that "informed consent" will be a research requirement for any other Facebook experiments going forward.