Facebook News Survey Raises Eyebrows

Facebook has put out a news survey, and many are questioning its effectiveness. The survey, meant to learn which news publishers Facebook users trust, includes only two questions. Because of this, some wonder how accurate it will be and how useful it can really be to Facebook.

The two questions, according to the BBC, ask users about specific websites. Users are first asked if they recognize the chosen websites. Then they’re asked to use a scale to represent how much they trust those websites. The scale has five options, starting with “not at all” and ending with “entirely.”
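To make the format concrete, here is a minimal sketch (in Python) of how a response to such a two-question survey might be represented. Facebook has not published an implementation, and the field names and the three middle scale labels below are assumptions; only the "not at all" and "entirely" endpoints are confirmed.

```python
# A minimal sketch of the two-question trust survey as described by the BBC.
# Field names and the middle scale labels are assumptions, not Facebook's.

from dataclasses import dataclass

TRUST_SCALE = ["not at all", "barely", "somewhat", "a lot", "entirely"]  # middle labels assumed

@dataclass
class SurveyResponse:
    domain: str        # the news website shown to the user
    recognized: bool   # question 1: do you recognize this website?
    trust_level: int   # question 2: index 0-4 into TRUST_SCALE

    def trust_label(self) -> str:
        return TRUST_SCALE[self.trust_level]

# Example: a user recognizes a site and places it in the middle of the scale.
response = SurveyResponse(domain="example-news.com", recognized=True, trust_level=2)
print(response.trust_label())  # -> "somewhat"
```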

Facebook has taken many steps in recent months specifically to combat false news stories taking hold on their platform. For instance, in November they announced a tool that would let users see if they’d interacted with Russian propaganda sites. This tool went live soon after, and it can be found by visiting the Facebook Help Center.

They’ve also tried various ways of helping users spot fake news. For a while, Facebook used Disputed Flags to show users that other sources had disputed certain content. Then Facebook switched to using Related Articles to help identify false news stories. The Related Articles that appear underneath content can offer different perspectives on a story, including third-party fact-checking articles.

Though people are questioning the value of a survey with only two questions, Facebook has said there’s a reason for its simplicity. They’ve said surveys can sometimes be confusing and “bias signal,” and they’ve noted that the information gathered is only applied to “publishers for which we have enough data.” Naturally, Facebook will not be basing their decisions solely on this news survey; they’ll be working with various other data as well.
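As a rough illustration of that "enough data" caveat, the sketch below aggregates trust ratings only for publishers that cross a minimum-response threshold. The threshold value and function names are invented for this example; Facebook has not disclosed its actual cutoff or method.

```python
# Hypothetical illustration of aggregating survey trust ratings only for
# publishers with enough responses. MIN_RESPONSES is an assumed placeholder.

from collections import defaultdict

MIN_RESPONSES = 1000  # assumed, not a published figure

def aggregate_trust(responses):
    """responses: iterable of (domain, trust_level) pairs, trust_level in 0-4."""
    by_domain = defaultdict(list)
    for domain, trust_level in responses:
        by_domain[domain].append(trust_level)

    scores = {}
    for domain, levels in by_domain.items():
        if len(levels) >= MIN_RESPONSES:  # skip publishers without enough data
            scores[domain] = sum(levels) / len(levels)
    return scores  # per Facebook, such scores would be combined with other signals
```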

Facebook’s news trust survey may be a cause for concern for users who see it as too simple, but Facebook seems to feel it will be a useful way to prioritize news articles from sources that can be trusted. As it’s implemented, we’ll see how useful it really is alongside Facebook’s other features and updates meant to stop the spread of false news.

Facebook Creating Tool to Address Russian Activity

Facebook will soon release a tool that lets you see whether you interacted with any Russia-run Facebook pages during the 2016 election. Not too long ago, The Guardian reported that 126 million Americans probably saw Russia-backed pages that were working to interfere with the most recent presidential election. Facebook thinks there were likely 120 fake pages working to influence the political atmosphere from 2015 to 2017.

Since all this news came out about the political trolling that happened on Facebook, many people have surely wondered whether they fell victim to any of these fake pages. Many people interacted with the fake pages with no idea what the underlying agendas really were. Facebook’s upcoming tool aims to answer that question.

In a press release titled “Continuing Transparency on Russian Activity,” published on November 22, Facebook wrote that they were working on a “portal” that would help people figure out how they interacted with the Russian Internet Research Agency on both Facebook and Instagram. Specifically, it will show whether a user liked or followed any of these Russia-run pages from 2015 to 2017. This tool will be released before the year ends.

The screenshot in Facebook’s press release shows a section of the Facebook Help Center that features the soon-to-be-released tool: a list of pages (examples include “Being Patriotic” and “Blacktivist”) along with the pages’ profile photos. Next to each page is a “Like/Follow Date” that shows when the user interacted with the page.
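Based only on that screenshot, each entry in the portal appears to pair a page with the date it was liked or followed. The sketch below models such a record; the field names and sample dates are purely illustrative assumptions.

```python
# Hypothetical model of the records shown in the Help Center screenshot:
# a page name plus the date the user liked or followed it. All field names
# and sample dates below are illustrative assumptions.

from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class PageInteraction:
    page_name: str                    # e.g. "Being Patriotic", "Blacktivist"
    like_follow_date: Optional[date]  # None if the user never liked/followed it

history = [
    PageInteraction("Being Patriotic", date(2016, 9, 14)),  # made-up example date
    PageInteraction("Blacktivist", None),
]

for item in history:
    when = item.like_follow_date.isoformat() if item.like_follow_date else "no like/follow"
    print(f"{item.page_name}: {when}")
```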

Of course, so far it looks like the tool will only show whether you liked or followed a page. Based on what has been released, it doesn’t appear to show whether you simply liked or commented on an individual post. These pages could have influenced many people who never followed them but saw their posts shared by friends and then interacted with those posts.

This portal, though, will at least show whether a user was following or supporting a page that Facebook says “tried to sow division and mistrust” over the course of the last election. It’s important to know whether we have seen and trusted these pages, because it may help us learn how to spot misleading news and posts in the future. Facebook and other online sites can be powerful tools in the wrong hands, and Facebook’s upcoming portal in the Help Center may be able to show us just how influential these political pages have been.