
Isn't this what Sacha Baron Cohen asked for in his speech? That there be state-mandated rules on what's fake vs. not. I don't think this issue has any good solutions - damned if you do, damned if you don't. The only 'okay' way to solve this is to crowd-source a variety of opinions on the issue and show what else people are saying about it - not as the ultimate truth, but to broaden people's perspective by showing that there are different ways to look at the same issue.


I've always wondered how Wikipedia has solved this... Isn't it essentially the same problem?


Wikipedia built a culture in which editors who lean towards truthful, informative contributions are encouraged, and a userbase that values that. This is only possible for a non-profit with strong voluntary and academic participation. Nobody will dedicate quality time to a closed, for-profit network like Facebook.

But I feel that the bigger problem is that people want fake news. Most people have a hard time living their everyday lives; fake news that supports their choices and viewpoints is a release, and it makes them feel good. And networks do a good job of surfacing similar content once they gauge one's interests.


The sheer volume of user-generated traffic on Facebook suggests that plenty of people are willing to dedicate quality time to it.



It's not really solved on Wikipedia. Wikipedia fights happen all the time in controversial topics, and it's essentially solved by techno-theocracy; moderators show up and lock a contentious topic against vandalism when it's too heated, and whatever the current state of the edit war is gets frozen.


Wikipedia just has far fewer active users than Facebook, as in, users actively creating content. The amount of content on Facebook is also orders of magnitude greater than on Wikipedia, and the friction in creating content is much lower on Facebook.


Wikipedia publishes a lot of incorrect information.

The fact that you can source a piece of information doesn't make it true.


They haven't. Wikipedia is essentially a battleground when it comes to recent topics that are politicized. Sometimes an admin will lock editing of the page, and whatever version was up at the time stays there.


They haven’t solved it! A group has wrested control, and Wikipedia represents their vision of the truth. This is fine when talking about objective things, like whether Trump was at the White House for Thanksgiving in 2019. It is not fine for subjective things, which cover WAY more of the world than people want to admit.

We eat food for breakfast.

Is this statement true or false? You could mark it either way, and both answers could be supported by rational thought! If you don’t first come up with a framework for answering, and then evaluate the question within the context of that agreed-upon framework, the truthiness of the statement will remain subjective.


But then it bursts people’s political and opinion bubbles, which is something Facebook might not want.

Keeping people in bubbles is how Facebook keeps people browsing and wasting their time on Facebook. Because people like their bubbles.


I don't think you can regulate opinions.

But what you can do quite easily is disallow statements that are factually wrong.

That would solve at least a part of the problem.

If I say things that have already been proven wrong, then I can't post them in ads.


> things that have already been proven wrong

Proven by whom?



