That's true. However, first, most A/B testing doesn't result in an academic paper, and there are strict ethical rules governing academic work on human subjects. Second, obviously not all A/B tests are the same. If you're testing two different fonts and comparing bounce rates, there are far fewer ethical concerns than if you're intentionally spreading an "emotional contagion" (that's the term Facebook used). And note that if you wanted to publish an academic study based on the font experiment, you would still need informed consent, not just "these people agreed to let us do whatever we want when they signed up 10 years ago, so it's fine".
There is nuance to this. The point of IRB review is ethics.
1. If the experiment is minimal risk and expected to benefit participants (as the notorious fb experiment was—it was emotional contagion of positive emotions), then informed consent may be waived.
2. When there is a greater risk from the act of gathering informed consent (e.g., because you need to store identity for that purpose), then informed consent may be waived, provided the study itself is minimal risk.
> as the notorious fb experiment was—it was emotional contagion of positive emotions
No, they tried both positive and negative.
> When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred.
> If affective states are contagious via verbal expressions on Facebook (our operationalization of emotional contagion), people in the positivity-reduced condition should be less positive compared with their control, and people in the negativity-reduced condition should be less negative.
The experiment was expected to make half of the subjects less positive.
Edit: This discussion of whether consent could be waived is sort of irrelevant when Facebook didn't even have an IRB at the time, and Cornell's IRB inexplicably declined to review the study even though two of the authors were affiliated with Cornell.
I still find it hard to accept that it would be completely acceptable for Facebook to run private A/B tests with the same variables to optimize advertising revenue, but that the unethical part is making the results public after peer review.
I guess I just don't get or agree with the ethics of that situation.
It absolutely is not acceptable to do this without publishing it, although obviously they do it anyway. Imagine if you worked at Facebook and your boss said "Hey can you crank up the negativity on everyone's feed for a few days and then track their emotions? Pretty sure it's gonna make them all angrier, but I want to be certain." You would refuse, because that's obviously unethical. Maybe doing it for published research makes the engineers more likely to go along with it, I don't know.
I guess the main point is that just because there wasn't much negative PR about this before Facebook published their study, that doesn't mean what they were doing was acceptable, it just means most people didn't know it was happening. And experimenting on people without consent in order to enrich yourself has always been and will always be unethical, no matter how many people make their living that way.
One more thing I'm thinking about: sometimes people compare this kind of A/B test to, for example, a supermarket rearranging items on the shelves to maximize purchases. I think there are important differences, although there are still some ethical concerns. With a supermarket you don't need to interact with or even know about the existence of the shoppers; you can just arrange the store and then monitor sales. More importantly, (I hope) supermarkets aren't running these tests with the expectation of harming anyone. The equivalent of the Facebook study would be putting the unhealthiest products in convenient places and then monitoring customers to see if they gain weight.
I guess you really don't like the advertisement and UX design industry, then! I mean, there is an endless stream of human experimentation...
What should we do about it? I personally think it is better to run the experiments in public so we understand them, but I get why you might disagree. But I would like to know what you think the policy should be.
Yes, I strongly dislike the advertising industry. UX designers I don't have a problem with - they often use informed consent ("check out our new design!"), and as far as I know they don't usually run intentionally harmful experiments. I'm not sure how they could; Facebook is in a fairly unique position there, as it's so embedded in some people's lives.
I'm not sure there's really a policy solution, I think any law that made what Facebook does illegal would also make a million other legitimate activities illegal. It's good that Facebook was shamed into apologizing and creating some kind of internal review board, society at large did a great job making that happen, and we should keep that up. Maybe journals can institute better policies about not publishing unethical studies so they don't legitimize and encourage this behavior.
Note that it's not a choice between "Facebook does everything in secret" or "Facebook publishes all its research". Even if there had been no consequences for the emotional contagion study, Facebook would still keep most research internal, just like the tobacco industry buried cancer studies or the oil industry buried climate change research.
(And everyone agreed to the terms of service)