Benjamin Disraeli once said that "There are three kinds of lies: lies, damned lies, and statistics." As this quote suggests, one must be cautious when dealing with statistics, as they can often be accidentally or deliberately misleading. This is especially true of statistics based on surveys, as surveys are particularly prone to problems of poor design. Let's have a look at some of the ways a survey can be poorly designed:
The questions or the answer choices can be biased by the type of language used. For example, imagine a survey which asked "Do you support Australia's participation in anti-terrorist operations overseas?".
If we wanted responders to answer yes to this question, we might reword it as "Do you support Australia's participation in an international alliance to prevent the spread of terrorism?", which carries a more positive connotation. Alternatively, we could ask "Do you support Australia's decision to risk the lives of our soldiers by invading foreign countries?", which carries a more negative connotation.
Whilst die-hard supporters of intervention or non-intervention will be unlikely to be swayed by such tricks, people who are "on the fence" or have no real opinion on the matter may be influenced by the way a question is asked, particularly when the only possible answers are "yes" or "no". With our results, we could then argue that Australians do/do not support participation in anti-terrorist operations overseas, and publish news articles or political statements using our "evidence".
Write your own survey question with a neutral basis.
Now change this question to have a positive bias (try to get your respondents to answer yes).
Now change this question to have a negative bias (try to get your respondents to answer no).
Test out each version of your survey question with a different group of students, and see if the wording actually does change how people answer.
Questions may be phrased in a way that makes it difficult to understand what they are trying to ask.
A particularly common case is the "double negative", for example "Do you disagree with the position that Australia does not need high speed fibre optic internet?". Someone reading this quickly might not spot the double negative, and could answer in a way that doesn't reflect their actual opinion.
Ambiguity in a question can also confuse respondents. For example, a question might ask "Do you think primary school students having access to a smart phone is a good thing?". Some people may interpret this as meaning that the students would be able to use their parents' phones, whilst others may interpret this as meaning that the students would own their own phones.
Have a go at writing your own double negative question.
Now try writing an ambiguous question, and describe the two (or more) ways in which people might interpret it.
Double-barrelled questions ask two things at once. For example, "Do you support school students spending less time at school and doing more homework?". Some people might be happy with less time at school, but against more homework.
Write your own double-barrelled question, and then show the proper way to do it by separating it into two separate questions.
As surveys are usually completed on a voluntary basis, it is important not to ask questions which are too sensitive, or else to add a "prefer not to answer" option, so that the respondent does not throw out the survey entirely to avoid answering. Questions relating to race, religion, income, drug and alcohol use, medical conditions and sexuality should all be dealt with in a sensitive manner. For example, you may have noticed that government forms which ask "Are you Aboriginal or Torres Strait Islander?" always include a "prefer not to answer" option.
The reasons why people would be concerned about answering such questions vary greatly according to the type of survey and the context in which it is asked.
Have a talk amongst yourselves about some scenarios where people may find answering survey questions problematic.
In case you're thinking that no one would be foolish enough to make mistakes like these when designing a real survey, take a look at this survey which the National Rifle Association sent to American politicians.
What examples of problems with the survey design can you find?
Why might the survey questions be asked in this way?
How could you rewrite them?
So far we've seen that statistics produced from surveys can be seriously flawed, even to the extent that they can be used to deliberately lie. So what should you do when you see a newspaper article that says, for example, "74% of Australians believe that a carbon tax is a bad idea"? Well, the ideal would of course be to try to find a copy of the survey itself online, which you can then check yourself for poor survey design. However, it would be very difficult to do this for every statistic you see.
A good shortcut is to check who conducted the survey. Well-funded, politically neutral organisations are usually fairly good at designing surveys which are well made and free from bias. If the above statistic came from the Australian Climate Council, one could be fairly confident in its findings. On the other hand, if the statistic came from the Australian Coal Association, one would have to be a lot more suspicious that deliberately poor survey design played a part in its findings.
However, it is always worth looking up the organisation, as some lobby groups choose deliberately misleading names - for example, the Global Climate Coalition was the name of a lobby group comprised mainly of oil firms seeking to prevent action on global warming. Indeed, it is standard practice for front groups to use neutral sounding names like "Institute of Research" or "Centre for ... Studies" to mask their biases.
In the end, when it comes to statistics you can never be too careful!