Facebook – News Trust Survey


As most users will know, Facebook has faced considerable criticism regarding the spread of misinformation as well as the prevalence of fake news items on the site. In an attempt to counteract and correct these, the CEO (Chief Executive Officer – Mark Zuckerberg) announced changes are to be made to the news feed algorithm utilised to determine what is shown to users: see Facebook – News Feed Changes.


The Survey

To help with the above, Facebook also decided to survey users regarding which news sources they consider to be trustworthy. The survey consists of two questions:

Question 1: ‘Do you recognise the following websites?’

Q1 – Answer options: ‘Yes’ or ‘No’.

Note: This question principally asks if the user has heard of particular publications.

Question 2: ‘How much do you trust each of these domains?’

Q2 – Answer options: ‘Entirely’; ‘A Lot’; ‘Somewhat’; ‘Barely’; ‘Not at All’.

Participation is ONLY BY INVITATION. It is also understood that, for now at least, the survey is limited to users within the USA, though it may be extended to other territories in the future.

Survey Criticism

Some have criticised and expressed concern regarding the nature of the survey and its overall potential effectiveness.

Concern 1: Several have expressed concern about the brevity of the survey, stating that trust in news is a far more complicated matter than can be dealt with by two short questions.

Response 1: Facebook acknowledge the issue but have stated it would be too complicated to have a long, detailed list of questions, which would probably confuse more than help. In addition, the survey results will be considered in conjunction with reader habits, e.g. what they usually read. (Facebook already has a considerable database of each user's habits.)

Concern 2: Some professionals have also commented that the matter would be better dealt with by surveying ‘experts’ rather than the general user.

Response 2: Facebook management considered the option of approaching professionals and experts for their observations but ultimately decided this would probably not provide ‘objective’ results. (Some critics said this was just because Facebook did not want to pay for experts’ time and services, though Facebook have argued against this.)

Note: Mark Zuckerberg also stated Facebook was ‘not comfortable deciding for itself whether a news outlet is reliable.’

Concern 3: Results could be distorted by those who have partisan viewpoints, e.g. left-wing v right-wing, etc.

Response 3: No specific response has been made. However, common sense would indicate that, though there will of course be those who favour a preferred site or publication, this is unlikely to be on such a scale as to overly distort the survey results.

Undoubtedly, there will be those who have other concerns or criticisms. The above are simply some of those that were immediately and widely expressed.

Survey Result Integration

These trust surveys are not carried out in isolation: they have been designed to form part of an overall reassessment of the news feed algorithms. The aim is to enable Facebook to provide a far more relevant service. The overall intent is to help decide whether news may be considered trustworthy, informative, relevant to people’s local community, etc. (As already stated above, survey results are to be considered in conjunction with users’ reading habits.)
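As an illustration only, the sketch below shows one simple way responses on the five-point trust scale could be combined with reading-habit data to produce a per-source weighting. Facebook has not published its actual method, so the score mapping, the blending ratio and the function name here are purely hypothetical assumptions.

```python
# Hypothetical sketch only: combine survey trust ratings with reading habits
# to weight a news source. This does not reflect Facebook's real method.

# Map the survey's five answer options to numeric scores (assumed values).
TRUST_SCORES = {
    "Entirely": 1.0,
    "A Lot": 0.75,
    "Somewhat": 0.5,
    "Barely": 0.25,
    "Not at All": 0.0,
}

def source_weight(survey_answers, read_share, blend=0.7):
    """Blend average survey trust with how widely the source is actually read.

    survey_answers: list of answer strings from Question 2
    read_share: fraction (0-1) of users who regularly read the source
    blend: assumed weighting between the two signals
    """
    if not survey_answers:
        return read_share  # no survey data: fall back to habits alone
    avg_trust = sum(TRUST_SCORES[a] for a in survey_answers) / len(survey_answers)
    return blend * avg_trust + (1 - blend) * read_share

# Example: a source rated mostly "A Lot" that 40% of respondents read regularly.
print(source_weight(["A Lot", "A Lot", "Somewhat"], read_share=0.4))
```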

Note: News trustworthiness is a very real issue for users, e.g. a recent survey in the UK (United Kingdom) indicated ONLY 24% of users trust news and information available through social media.

Of course, despite all their efforts, Facebook, or any other social media service, cannot guarantee that, even though a source/site/publication has been marked as trustworthy by users, the news presented will be accurate. It would be unrealistic, and outside the service’s mandate, to expect them to check the validity of each and every news item shared.

Facebook have also stated the algorithm changes will not reduce the amount of news users see but will change the balance, giving preference to news from sources determined to be trustworthy.

Those Surveyed

The purpose of this article is not only to inform readers about the survey, its content and its purpose, but also to request that it be taken seriously by those who receive an invitation to participate (as previously mentioned, participation is only by direct invitation). Interviewees are asked to:

  • carefully consider each website/publication/news source they are asked about;
  • seriously consider/determine how much they trust information from them;
  • honestly give due thought and weight to the answers they give.

Though, as already indicated, there can be no guarantee as to the overall accuracy and trustworthiness of news sources, interviewees’ responses will have a considerable impact upon how Facebook organises the algorithms it utilises for news feed content.

Conclusion

Though constant changes can be, and are, frustrating, users should be grateful Facebook is taking the issues of misinformation and fake news seriously, and that it is involving users in the process.

There may be merit to the concerns some have expressed, and perhaps there are other ways the matter could be dealt with, but, if nothing else, this is a start. Undoubtedly, if the problems continue, which regrettably is likely considering the prevalence of cyber criminals and hackers these days, Facebook management will build upon the changes currently being incorporated.

Those who are invited to participate in the survey are asked to carefully and seriously consider the questions and their responses. The information they provide will have an impact upon all Facebook users.

It may help to read this article in conjunction with Facebook – News Feed Changes.

