Guidance: Opinion polls, surveys, questionnaires, votes and 'straw polls'

Key points

  • The BBC rarely commissions voting intention polls.
  • Commissioning an opinion poll on politics or any matter of public policy must be referred in advance to Chief Adviser, Politics. Consultation with the Political Research Unit is advised in most cases.
  • Before proposing the commissioning of an opinion poll, programme-makers should weigh up several factors, including whether it would tell us anything new, what they would do if the results contradict a preconceived “narrative”, whether it is possible to track a trend, what other factors might skew the results and whether there might be any reputational damage from the BBC doing a poll on a given subject.
  • On matters of public policy, political or industrial controversy, or on ‘controversial subjects’ in any other area, polls should normally, in the UK, be commissioned using members of the British Polling Council.
  • A large sample does not make up for inadequate methodology.
  • The BBC never commissions voting intention polls during election campaigns.
  • Any proposal to commission a survey on a controversial subject must be referred in advance to Chief Adviser, Politics. Its credibility will depend on having spoken to a significant and agreed proportion of the whole measurable group with an approved methodology.
  • When a survey has been commissioned by an outside body with an interest in the issue, the audience should be told and we should exercise real scepticism in how we treat it.
  • Focus Groups and Panels can provide qualitative but not, generally, quantitative data. They should not usually be treated as representative.
  • Any proposal to use either focus group research or a panel on party political issues must be discussed with Chief Adviser, Politics at an early stage - before it is commissioned.
  • “Straw Polls” have no statistical or numeric value. They should only be used with an explicit reference to the audience about their limitations. They should never feature in news bulletins or be used to “gather serious information on party political support”.
  • Anyone proposing to carry out a telephone vote must submit the relevant referral form and should also read the logistical guidance on phone or SMS voting.
  • Vox pops are a tool of illustration, not a tool of research.
  • Any proposal to conduct an online vote on an issue which is political, concerns public policy or is in any way controversial must be referred to the Chief Adviser, Politics.
  • Anyone proposing to carry out an online vote must submit the relevant referral form and should also read the logistical guidance for online voting. 

Guidance in full

Introduction

BBC journalists and programme-makers routinely invest much time, effort and professional pride in ensuring the accuracy, clarity and credibility of their output. Especially when information is being summarised, the audience must be able to trust that the journalism behind what they see and hear is robust, that research is reliable and meaningful – and that the language used is both consistent and truthful.

This accuracy, clarity and credibility is as important when we report on “polls” and “surveys” as it is in the rest of our journalism. When we commission such work ourselves and invest the BBC’s authority, it is even more vital that the audience is able to trust what we are saying.

Similarly, when we invite the audience to interact with our services through voting by phone or online, especially on serious or controversial issues, it is important that we deal responsibly with their views and do not allow such votes a greater significance than they merit.

Opinion polls, surveys, questionnaires, phone and online votes are useful and fruitful ways of listening to our audiences – but we must be rigorous in applying due scepticism and in using precise language to ensure the integrity of the BBC’s journalism is not damaged.

This guidance – which should be read in conjunction with Section 10 of the BBC’s Editorial Guidelines – aims to:

  • help programme-makers and journalists using polls and surveys to do so appropriately and within the Editorial Guidelines;
  • clarify terminology and methodology;
  • promote greater consistency across the BBC in the use of polls, surveys and other attempts to gauge or illustrate opinion;
  • set out the uses and, importantly, the limits of voting and questionnaires online and of “straw polls”;
  • encourage programme-makers to think creatively about how they can include public opinion in their output without compromising journalistic standards.

Commissioning Opinion Polls

An opinion poll is normally trying to seek a representative view of the population as a whole or of a significant section (eg “under 35s”, “Londoners” or “parents”) by reaching an appropriate sample.

  • Its authority will lie in the credibility of the company used and its methodology, including how it is “weighted”;
  • Its reliability may depend, for instance, on the sample size; the complexity of the issue; and how long it has taken (an immediate poll, conducted over a day or two, is not likely to be as robust as a less topical poll carried out over several weeks);
  • A series of polls carried out over a period, using the same methodology and the same questions, is likely to be more robust, and to give more helpful information about shifting opinion, than one-off or sporadic polls or different polls using varying methodology and questions. 

An opinion poll is attempting a form of measurement – inviting the audience to draw some broader conclusions, trusting that the statistical basis is sufficiently robust for the results to have some representative value for the population (or the relevant section of it) as a whole. 

So when we commission such research ourselves and disseminate it in the name of the BBC, the science and the data, as well as the accuracy of the language, must stand up to the most searching public scrutiny. 

  • When appropriately conducted, opinion polls can add real editorial value to our output; they can be a highly creative and informative device to complement and enhance our output and may reveal opinions, policies or behaviour which shed new light on important issues;
  • However, when the main purpose for commissioning them is to draw attention to a programme – to create publicity, or, perhaps, to provide focus for an otherwise uncertain editorial theme – they are usually of less value to the audience, risking predictability and – worse – a poor use of the licence fee. 

The BBC rarely commissions polls on voting intention or other indications of party political support.

Commissioning an opinion poll on politics or any matter of public policy [1] involves a mandatory referral – in advance – to the Chief Adviser, Politics, for consultation and approval.

In most cases, there should also be consultation with the Political Research Unit regarding phrasing of questions, sample size and other technical issues, or advice on appropriate companies.

Reference to PRU and/or Chief Adviser Politics is advisable when commissioning any opinion polls – especially on potentially controversial issues.

Programme-makers should ask themselves searching questions before proposing the commissioning of an opinion poll. These might include:

  • Are the results likely to tell me something new, or are they geared towards reinforcing something I think I already know?
  • If the results are unexpected, or indicate views which run contrary to other evidence gathered for my programme – what would I do?
  • How useful is a one-off snap-shot poll on this subject? Is there a way of demonstrating a trend, a movement in opinion? Or, are there other ways of achieving the same editorial objective?
  • What about the timing of the fieldwork? Are there other factors at work, other stories in the news, which may have a short-term impact on the results?
  • How appropriate is the subject matter for a BBC-commissioned opinion poll - will the mere fact of asking these questions reflect on the BBC as a whole?
  • Are respondents likely to have sufficient knowledge/interest for the results to be meaningful?

We should take particular care in commissioning opinion polls seeking the views of children and young people:

  • there could be circumstances in which the need for parental consent may have a detrimental effect on the reliability of the results;
  • there may be occasions when we need to strike a balance between, on the one hand, caution over the reliability, knowledge or experience of respondents and, on the other, the importance of giving young people and children the opportunity to have their views reflected in our output;
  • on some issues, of particular sensitivity, we may have to accept that there is no appropriate polling methodology for children;
  • advice should always be sought from the Chief Adviser, Politics.

Polling Methods

The BBC may commission polling conducted face to face, over the telephone or online; other methodologies may be developed and this will be kept under review.  In the UK, on matters of public policy, political or industrial controversy, or on ‘controversial subjects’ in any other area, polls should normally be commissioned using members of the British Polling Council.  Outside the UK, Chief Adviser Politics and/or the Political Research Unit should be consulted over appropriate methodology or polling companies.

With any methodology it is worth remembering that sample size is no guarantee that something is representative. Tens of thousands may respond to a text vote or a questionnaire – but it will still not be robust. Unrepresentative methods of seeking opinion do not become representative because a high number respond – there is no “threshold” to legitimise them.

Reporting Opinion Polls

This guidance applies whether we are reporting on polls the BBC has itself commissioned or on polls commissioned by other organisations, especially if they are members of the British Polling Council.  We should always make clear who has carried out the poll and who has commissioned it (as well as giving information about the size and nature of the sample, the margin of error and the dates of the fieldwork).

If we have doubts about the methodology or the bona fides of the pollsters, for instance, companies which are new or based abroad, either that scepticism should be reflected – appropriately – in the way we report the results, or we should consider whether the data is sufficiently credible for inclusion in BBC output. If in doubt, seek advice from the Political Research Unit.

Any exception to the Editorial Guidelines on reporting polls – for instance, any proposal to lead a bulletin, or headline a poll – or outside what the guidelines refer to as “normally”, should be referred to the Chief Adviser, Politics.

Care should be taken in reporting a trend of opinion – not just in voting intention polls – to ensure that like is being compared with like. Advice is available from the Political Research Unit.

Even where an opinion poll has been commissioned in an appropriate way, we should take care not to use elements of the research inappropriately. For instance, taking a poll of 500 teenagers may give us robust data on the whole group – but we should not then strip out, say, all the 16-year-olds (where the sample size would be only a fraction of the whole) and imply they are similarly representative.
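The arithmetic behind this can be sketched. The figures below are a minimal illustration, assuming a simple random sample and the standard 95% margin-of-error approximation; real polls are weighted, so the numbers are indicative only – but they show how quickly a subgroup’s margin of error grows as its size shrinks.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error, in percentage points, for a
    proportion p observed in a simple random sample of size n."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

# Whole sample of 500 teenagers: roughly +/- 4.4 points.
print(round(margin_of_error(500), 1))  # 4.4
# A stripped-out subgroup of, say, 100 sixteen-year-olds
# (a hypothetical fraction of the whole): roughly +/- 9.8 points.
print(round(margin_of_error(100), 1))  # 9.8
```

Halving the sample does not halve the reliability: the margin of error grows with the square root of the shrinkage, which is why a subgroup a fifth of the size is more than twice as uncertain.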

Although the word “survey” has a slightly different and specific meaning (see below “Surveys”), it is acceptable to describe an opinion poll as a survey (though not the other way round). 

Always bear in mind that even properly conducted opinion polls by trusted companies – especially voting intention polls – can be wrong or contradicted by other evidence. When we report polls – no matter how convincing they may seem or what the attitude of the rest of the media – we should always ask how much of the rest of our story – and its prominence – depends on their accuracy and credibility. Would the scepticism we have used in both the language and the direction of our reporting read strongly enough if they turned out to be wrong or contradicted by other evidence?

When an opinion poll is commissioned by a BBC department, the onus for ensuring that it is properly reported elsewhere in the BBC, with appropriate language, rests in the first instance with the commissioning area. Press releases or copy outlining the results of the poll must abide by the same standards as programme output. Other BBC areas making use of the poll must ensure they report it without changing the meaning or extending the significance of the data.

Polls at Election Times

The BBC never commissions voting intention polls during election campaigns.

Extra care must be taken in commissioning any opinion polls on politics or public policy [1] either during election campaigns or during the period before any campaign where the political context of the election is already prominent. For instance, commissioning a poll which appears to endorse or reject a specific party’s policy on a given issue in its manifesto may open the BBC to criticism that it is intervening in a current controversy, contrary to Editorial Guidelines.

Surveys and Questionnaires

A survey – as against an opinion poll – is normally addressed to a smaller and specific group, which may be individuals, such as constituency chairmen, MPs or university vice-chancellors, or may be organisations, such as health trusts, FTSE 100 companies or local authorities. Its credibility will depend on having spoken to a significant and agreed proportion of the whole measurable group with an approved methodology – as well as on the language we use to report the results.

If the audience are told that a survey has been commissioned by the BBC, they must have confidence that it has a level of statistical credibility which justifies any claims or assumptions about how representative it may be. 

So if a survey is commissioned by the BBC on any controversial subject (not just public policy, political and industrial controversy), it must involve the following:

  • a mandatory reference to the Chief Adviser, Politics – before it is commissioned;
  • a defined and finite group whose opinions, policies or behaviours are being analysed;
  • numerical parameters agreed in advance, such as an acceptable minimum response rate;
  • an agreed methodology, for instance, in ensuring questions are worded properly and posed consistently;
  • care taken with the language in reporting the results to ensure nothing is claimed which cannot be supported by the data;
  • clear guidance to other BBC outlets (including, for instance, press releases) who may report the outcome, but must ensure that adapting the language for other audiences does not alter the meaning or inflate the claims of the original research.

If the study, research or questionnaire does not involve all of the above, then it is not a BBC survey with a numerical or statistical basis, and no such claims should be made for it.

Anyone unclear about what sort of research they are looking to commission should consult the Chief Adviser, Politics. For advice on methodology, consult the Political Research Unit, which can, sometimes, be commissioned to carry out surveys in accordance with BBC Guidelines.

So, when we distribute a questionnaire online or through a third party, or presenters invite people to ring in or text, or we try to contact as many members of an undefined group as we can – the results will, by definition, give us a “self-selecting” outcome which has no representative validity. Such a method should not be called “a BBC survey”.

However, there will be many instances where such a method is valuable and a very useful programme tool. It may produce excellent anecdotal material, potential interviewees, useful interaction with audiences and vivid illustrations of the editorial content of a programme. But as a guide to relative opinions, it will be statistically valueless and BBC programmes should not use any language which implies that the numbers involved have any significance.

One figure which should normally be used in such questionnaires is the total number of respondents. The proportions or percentages within that figure should not normally be used; neither should we use any language that implies “counting” them has meaning (eg “a majority said…”). If the issues being discussed are serious or controversial, relative figures should not be used at all.

We can, however, use language which does not imply numeric value (eg “the mood of those responding was generally hostile”).

Occasionally, the actual number responding with a particular view may have significance in itself (eg: several hundred members of the armed forces directly criticising the standard of accommodation for their families). Although there will need to be a robust verification process, such a figure may be reported where editorially justified.

Any reference to the proportions of respondents to a questionnaire must explicitly make clear that they do not have a statistical or representative value. (It is not enough – and is potentially misleading – to say, for instance, “they may not be representative”, which wrongly implies to the audience that there is some value.) Such a reference will normally only be appropriate where the issues being discussed are light-hearted and uncontroversial.

Where a survey fails to meet its pre-set criteria (eg, too low a response rate), the factual information gathered may, under many circumstances, still be used in the same way as a non-statistical questionnaire would be used. But it is NOT a BBC survey and the specific information gathered should only be used in a normal journalistic way, without referring to the numbers or proportions involved. Again, the key factor is transparency about the value and limitations of the data.

Surveys by other organisations

Other organisations often claim they have conducted a survey – or a poll – when what they actually have is a self-selecting questionnaire of some sort. The results may be interesting and newsworthy, but we should not accept press-release claims about how representative they are at face value; we should not report them in a way which leads our audience to believe they are more robust than they are. If they are of no statistical value and appear to have been promoted only to generate attention for a particular cause or publication, we should exercise real scepticism and consider not using them at all, especially when they are concerned with serious or controversial issues.

If we report “polls” and “surveys” commissioned by other organisations, either knowing their methodology is less rigorous, or unsure of its robustness, we should make that clear to the audience in the language we use to describe it, for instance, by sourcing claims and interpretations. This is particularly important in news bulletins and programmes – and for controversial subjects including politics and public policy.

If the research has been commissioned by an organisation which has a partial interest in the subject matter, we should show extra caution, even when the methodology and the company carrying it out are familiar. The audience must be told when research has been commissioned by an interested party.

We should not use language which allows the audience to assume the BBC has accepted that methodology is robust, unless it has been tested to our own standard. However, we should normally use the language of detachment, rather than doubt.

When reporting surveys – and opinion polls – we should remember that even with comparatively robust methodology, they can be wrong or contradicted by other evidence.  It is always worth applying a “common sense” test: if the results seem odd or surprising, or conflict with other evidence, or even with “gut instinct” – do not ignore those doubts. For instance: double-check the timing, the framing of the questions, the spread of locations, ages, social background or any other relevant variables.  When the nature and subject of the survey is known to respondents in advance, that may have an influence on those choosing to take part and thus impact on the results.  If possible, factor in the element of doubt, or possible explanations, to the way the survey is reported.

Beware, however, of commissioning surveys or opinion polls and then not using the results because they do not match expectations or fit a particular programme narrative. Especially in controversial areas, in politics or public policy [1], such an action could be seen by others as “covering up” results which do not seem to match a perceived “BBC view.”

Focus Groups and Panels

Focus groups and panels can provide programme-makers with qualitative research, examining opinion in more depth and often with more colour, flavour and spontaneity than conventional opinion polling or surveys. However, because they are not generally quantitative, they should not usually be regarded as representative.

We can draw a distinction between focus groups and some sorts of panels. The latter, if selected with robust criteria by a credible company and of sufficient size, may be used as a legitimate method of polling on some issues. Panels can be useful, over time, in indicating changing views, in reaching groups where conventional opinion polling has difficulties, such as children or particular religious groups, or in analysing contrasting attitudes of different groups. They should never be used to estimate party support or voting intention.

Those in the BBC commissioning panels should be aware of the impact of “conditioning” – in other words, a controlled group of individuals who are asked on a number of occasions for their views over time will, by definition, become untypical of the population as a whole, or of their own part of the population.

Focus groups do not necessarily need to be “balanced”, even if the research is about politics or public policy [1]. It may be legitimate to conduct such research into particular groups, such as “Labour voters” or “working women”. But we should be aware of the limitations of focus group research and ensure that our output does not make claims for its value beyond the particular set of people who have taken part.

Advice in this area should be sought from the Head of Political Research and, if there is a proposal to use either focus group research or a panel on party political issues, that must be discussed with the Chief Adviser, Politics “at an early stage” – before it is commissioned.

Phone-in and text votes and other forms of straw polls

“Straw polls” are using the word “poll” as in “vote” – not as in “opinion poll”. In other words, a “straw poll” is NOT some sort of opinion poll which is unrepresentative – it is an actual vote based on an unrepresentative group, such as a studio audience, listeners to a phone-in programme or text voters. The term “straw poll” is widely misunderstood and should normally be avoided in output.

Better to be explicit – phone-vote, text-vote – with a clear caveat about the meaning:

Straw polls have no statistical or numeric value. They should only be used with an explicit reference making it clear to the audience that they are not representative or “scientific” (this may often be in the context of “this is just a bit of fun” or an alternative, appropriate phrase).

With that warning:

  • the results can be given within the context of the programme concerned in terms of actual numbers or (depending on the total numbers involved) percentages;
  • programmes should not “seek publicity” for the results of such straw polls outside the specific output areas in which they were conducted;
  • it cannot normally even be said that text votes represent the audience of the programme – merely those who chose to participate;
  • a large response to a straw poll does not make it representative;
  • straw polls should not feature in news bulletins;
  • when straw polls are carried out on the same subject at different times, the results must not be presented in a way which would indicate a trend;
  • straw polls, phone-in or text votes should never be used to “gather serious information on party political support.”

We should be particularly careful about using text or phone-in votes on those controversial issues which are vulnerable to highly organised pressure groups. Their ability to influence the outcome – even when we make it clear such votes are not representative – has the potential to damage the BBC.

Anyone proposing to carry out a telephone vote must submit the relevant referral form and should also read the logistical guidance on phone or SMS voting.

Studio Audiences

Even the most carefully selected studio audiences are not “representative”. Straw polls – or, more usefully, a “show of hands” – of studio audiences should state explicitly that they have no wider statistical or representative value.

Vox Pops

It is important to remember that vox pops are a tool of illustration, not a tool of research. That must be reflected in the language we use to describe them.

Avoid terminology such as: “We’ve been out on the streets to find out what the people of Manchester think about this...”

More appropriate would be: “Here’s what some passing Mancunians thought about this...”

We should think carefully about whether the subject matter is appropriate for vox pops and how asking the question itself – perhaps in the street, without warning – might reflect on the BBC.

We should also think about which people are being approached and why – and how, in a public place, that might be perceived. On politics and other matters of public policy [1], vox pops can be used to illustrate a range of views or – occasionally - a single view. We can either use a spread of opinions, reflecting different strands of argument, or, where clearly signposted, present a proportionate reflection of those whose opinions we have sought. Either way, we must not imply the samples are representative and we should be explicit in describing their purpose and limitations.

Online Voting (political and public policy)

Conducting a vote online has the same statistical value as holding a “straw poll” (though it should never be described simply as a “poll”). It is not representative and must be couched – explicitly - in terms of having no scientific value or of being “a bit of fun” or similar phrase. It is not “indicative”, neither is it sufficient to say that “it may not represent public opinion”. It categorically does not represent public opinion – at best, it may coincide with it.

Results of an online vote may not be reported beyond the programme area or site which initiates it.

Online votes are particularly vulnerable to campaigns, lobby groups and individuals who seek to organise mass or multi voting. For that reason, some highly controversial issues are not, normally, suitable for online voting as the risk of being hijacked is too great.

Any proposal to conduct an online vote on an issue which is political, concerns public policy [1] or is in any way controversial must be referred to the Chief Adviser, Politics.

Anyone proposing to carry out any online vote must submit the relevant referral form and should also read the guidance for audience interactivity.

[1] In this guidance, “public policy” should be defined as any issue which falls within the remit of government, local government or other public bodies, such as health, education, crime, constitutional affairs, foreign affairs, economic policy etc. If in doubt – refer to the Chief Adviser, Politics. 

Last updated July 2019
