Facebook has taken a lot of heat recently for promoting fake or misleading news. The debate was sparked by politics, with many users wondering whether fake stories had any influence on the U.S. election. CEO Mark Zuckerberg has promised to overhaul Facebook's news algorithms, which classify stories as legitimate or fake, in the hope that strengthening them will weed out false or misleading stories before they become popular. He also argued that fake stories account for less than 1% of content on Facebook. What wasn't clear was whether he meant total content on the site or content in news feeds, where such stories seem to make up a larger share.

Facebook users are reporting on Twitter that surveys have started to appear underneath news stories in their feeds. These surveys appear to be a small-scale test of a new community reporting method aimed at catching stories after they have been published. By asking users specific questions, such as whether a title uses misleading language or withholds key details of the story, Facebook is trusting its users to be the judge.
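
To make that idea concrete, here's a rough sketch of how crowdsourced flagging could work behind the scenes. The answer labels, thresholds, and function names are invented for illustration; Facebook hasn't published how (or whether) it scores these survey responses.

```python
# Hypothetical aggregation of crowdsourced survey answers into a flag decision.
# Labels and thresholds are made up for illustration, not Facebook's actual system.
from collections import Counter


def misleading_score(responses):
    """Return the fraction of respondents who called a story misleading.

    `responses` is a list of answers such as "misleading" or "not_misleading".
    """
    if not responses:
        return 0.0
    counts = Counter(responses)
    return counts["misleading"] / len(responses)


def should_flag(responses, threshold=0.6, min_responses=20):
    """Flag a story only once enough users have weighed in and most say it's misleading."""
    return len(responses) >= min_responses and misleading_score(responses) >= threshold


# Example: 18 of 25 respondents call the headline misleading -> flagged for review.
answers = ["misleading"] * 18 + ["not_misleading"] * 7
print(should_flag(answers))  # True
```

Requiring a minimum number of responses before acting is one simple way such a system could guard against a handful of users burying a story they merely disagree with.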

Facebook made (real) news itself recently when it fired its existing news team and created a task force to fight these fake stories. The task force would use linguists and large-scale data analysis to stop clickbait and other fake stories. Even with dozens of people on the team, it was clear that this approach wasn't going to work by itself.
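
For a sense of what "linguists and large-scale data analysis" might mean in practice, here's a toy heuristic that scores a headline on a few surface-level clickbait cues. The phrase list and scoring are invented for demonstration and are not Facebook's actual method.

```python
# Illustrative clickbait-headline heuristic using shallow linguistic signals.
# The phrase list and scoring are hypothetical, purely for demonstration.
import re

CLICKBAIT_PHRASES = [
    "you won't believe", "what happened next", "this one trick",
    "doctors hate", "will shock you",
]


def clickbait_signals(title):
    """Count simple surface cues often associated with clickbait headlines."""
    lowered = title.lower()
    signals = 0
    signals += sum(phrase in lowered for phrase in CLICKBAIT_PHRASES)
    signals += lowered.count("!")                        # excessive punctuation
    signals += 1 if re.match(r"^\d+\s", lowered) else 0  # listicle-style "10 ways..."
    return signals


print(clickbait_signals("You won't believe what happened next!"))  # 3
```

A real pipeline would pair signals like these with large-scale data on how stories spread, which is exactly why a rules-only approach was never going to be enough on its own.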

The survey approach is clearly different from a team of news experts at Facebook deciding what's real and what's not, but it could be one part of a multi-step plan. Zuckerberg's mission to clean up his site will no doubt take time and many different strategies, and there will still be users who believe the fake stories, so any fix has to account for them. Hopefully a team of fact-checkers and linguists, a beefed-up algorithm, and crowdsourced flagging can help make this issue yesterday's news.