With that in mind, I think it's important that our social media analytics publicly account for such emotional manipulation.
It would be wise to start disclosing these emotional influences, so that users and admins alike can be aware of them (and acknowledge that it's a real part of our systems).
Users aren't fed every post by every one of their friends -- users would be overwhelmed with posts flying by so fast they couldn't keep up -- so it's no secret FB tweaks its feed algorithm to keep users engaged. And experimenting with network effects is what social networks do.
Every time you see a picture of someone you like, it's as if a little shot of dopamine goes off in your head. Facebook wants to optimize those dopamine shots to continually bump engagement and create an experience where everyone habitually checks their feed every 10 minutes.
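To make the point concrete, here's a minimal sketch of what "tweaking the feed to optimize engagement" could look like. Everything here is invented for illustration -- the weights, field names, and scoring formula are assumptions, not Facebook's actual ranking system:

```python
# Hypothetical sketch: a feed ranker trading off author affinity,
# predicted emotional response, and recency. All weights and field
# names are invented; this is not any real platform's algorithm.

def engagement_score(post, w_affinity=0.5, w_emotion=0.3, w_recency=0.2):
    """Combine signals (each assumed to be in [0, 1]) into one ranking score."""
    return (w_affinity * post["affinity"]            # how often you interact with the author
            + w_emotion * post["emotional_valence"]  # predicted emotional pull of the post
            + w_recency * post["recency"])           # newer posts score higher

def rank_feed(posts):
    # Highest-scoring posts go first; most users never scroll to the rest.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"id": "a", "affinity": 0.9, "emotional_valence": 0.2, "recency": 0.5},
    {"id": "b", "affinity": 0.4, "emotional_valence": 0.9, "recency": 0.9},
])
```

The point is that once a scalar like `emotional_valence` enters the score, nudging users' emotions stops being a side effect and becomes a tunable parameter -- which is exactly what the disclosure argument below is about.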
Exactly why I'm trying to articulate the importance of giving public-facing language to such persuasive technologies, by actually having formal public markers for the emotions being manipulated.
(e.g. an Apple ad with asterisks linking to emotional-signaling research and analytics, to help users determine deeper interests [this ad attracts people who feel X, Y, Z based on...]).
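One way to picture such a formal marker is as a small machine-readable disclosure attached to the ad, which the UI could render as the asterisk/footnote. This is purely a hypothetical sketch -- every field name and the URL are invented:

```python
# Hypothetical disclosure marker for a piece of persuasive content.
# All fields and the URL are invented examples, not any real schema.
import json

disclosure = {
    "content_id": "ad-123",
    "persuasion_techniques": ["emotional_targeting"],
    "targeted_emotions": ["nostalgia", "belonging"],
    "research_link": "https://example.org/emotional-signaling-study",  # placeholder
}

# Serialized form the client could fetch and render behind an asterisk.
marker = json.dumps(disclosure)
```

A standardized format like this would be what lets both users and admins "be sensitive" to the influence, rather than leaving disclosure to ad copy.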
Pretty much every study? Most of the important psychological studies tend to follow a pattern where scientists tell the participants that they're doing X (e.g. measuring their control of body parts when drunk) while actually measuring a completely different Y (half of the group were given fake, non-alcoholic drinks; the goal of the study was to check how much of drunken behavior comes from you knowing that you drink alcohol).
You can't get good results if your subjects know what you're testing for.
Since those participants did give consent to physically being _in_ a study, that's voluntary.
If social norms within the test population (e.g. graduate students) are that pretty much every study tells participants they are doing X, but they end up doing Y, then people are clearly agreeing (consent, free will, expectation based on norms) that they won't know. The norms are a form of disclosure.
What we have here are invisible, silent experiments that violate precedents and norms, without consent.