New Zealand
Level 7 - NCEA Level 2

Statistics in the Media - Misleading (Investigation)

Lesson

Every day we read, watch and hear media reporting on statistics. This could be in newspapers, on TV news programs, on the radio, on social media, in magazines and even through commercial advertisements within those media channels.

Sentences that start with "In a recent survey...", "Research reports indicate..." and "A study shows..." are usually reporting on a statistical study. Numbers, percentages and various measures of centre and spread are used to indicate whether something has increased or decreased, and what the reliability or confidence intervals are. They are reported using language like "more likely," "less likely," "prone to" and "susceptible to."

A lot of media reporting is designed to elicit an emotional response from you. For example, the media coverage of Alan Kurdi (a refugee child who drowned fleeing Syria in 2015, and whose body washed up on a Turkish shore) created a flood of outrage and concern around the world about the refugee crisis. The images were moving, but the headlines certainly grabbed attention. Other emotions that media reporting can evoke include amazement, disbelief and wonder.

Is bacon really a killer?

Sometimes the response might be commercially related. "Bacon causes cancer" was a newspaper headline bandied around during October 2015. Not only does this elicit an emotive response (the word cancer often does that), but it could also have a commercial outcome, as people may read only the headline and then never buy bacon again.

Incidentally, the background to this story was that Cancer Research UK published research linking pig meat to an increased risk of colorectal cancer. The report discusses how colorectal cancer is actually a relatively rare cancer, with about a 5.6% chance of developing the disease, but for those who ate bacon most days this rose to 6.6%. If we consider the mathematics here, this means that for every 100 people who stop eating bacon, only one will have avoided this already uncommon cancer. Unfortunately these statistics seemed to be missing from a lot of the media reporting on this topic. This was a clear case of the media using statistics poorly, either by using the data incorrectly or by intentionally omitting some of the facts to tell a punchier story.
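To see why the absolute numbers matter, here is a quick sketch of the arithmetic, using the two percentages quoted above and treating them as the risk for non-bacon-eaters and regular bacon-eaters respectively.

Absolute increase in risk: 6.6% - 5.6% = 1%, or about 1 extra case for every 100 regular bacon-eaters.

Relative risk: \frac{6.6}{5.6} \approx 1.18, or roughly an 18% relative increase.

Headlines tend to quote the relative figure because it sounds dramatic, while the absolute figure of about one extra case per hundred people paints a far less alarming picture.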

Another common example of misleading statistics in the media is the reporting of health claims. Diets, health products and exercise regimes all feature fairly regularly.

Among all these statistics, it's hard to believe that anyone is actually misleading us. After all, numbers don't lie, right? Well, true, but people do! You need to know ALL the information to determine the full context of a media report.

What we will do now is explore the use of surveys in articles that could be considered misleading. To first look at how the media use surveys in general, you can check out the lesson covered earlier.

Exercise 1

In the early 2000s, Colgate ran an advertising campaign that stated 4 out of 5 dentists recommended Colgate toothpaste. Wow! What a claim.

Before I give you some of the stats, what questions do you have? The key idea underpinning statistical literacy is the ability to think critically. What would you like to know to verify this claim? List as much as you can!

With a friend, classmate, peer, or your parents or carers, have a discussion about what you would like to know. What do you need to know to be satisfied that the claim is true? Do more than 80% of dentists really recommend just one brand of toothpaste out of all the possible brands?

 

Here are the stats and facts of the case!

Colgate did carry out a survey. (Good) Did you want to know whether it was a survey or a census? Maybe now we want to know who they surveyed.
The survey included dentists and dental hygienists. (Good) So not only dentists, but hygienists, who should also know what they are talking about. I wonder how they conducted the survey?
The survey was conducted via phone. (OK) Participants were cold-called, but then volunteered to stay on the phone and answer the questions. It is important to know whether those surveyed received any remuneration or reward for completing the survey. It is also important to know whether the participants were aware that the survey was for Colgate itself. So were they?
The participants did not know the survey was for Colgate. (Bad) The survey was being conducted on behalf of Colgate and was used for advertising purposes. Would the dentists have answered the same way if they had known this? Should this be something they knew about? Well, it turns out that they should have been told, because endorsements by health professionals are prohibited for products classed as “medicinal products”. So in fact none of the dentists should have answered the survey to begin with!
Did you wonder what questions were actually asked?
The survey asked participants to name several toothpastes and brands they would recommend, not just a single choice. (Bad) So we don't actually know whether dentists recommend Colgate ABOVE other brands, just that 80% of the dentists listed Colgate as one of the brands. Here lies the crux of the problem. The claim "More than 80% of dentists recommend Colgate" suggests that 80% of dentists recommend Colgate OVER other brands, when this is not what was asked at all, as the sketch below illustrates.
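To see how this wording trick works, consider a small hypothetical example (the numbers below are invented purely for illustration; they are not from the Colgate survey). Suppose 100 dentists are each asked to name any brands they are happy to recommend, and each names 4 of the 5 brands on the market. That gives 100 \times 4 = 400 recommendations in total, so on average each brand is named by \frac{400}{5} = 80 dentists. In that scenario every brand could truthfully advertise that "4 out of 5 dentists recommend" it, even though no brand is preferred over any other.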

So, is it misleading then?

VERY misleading. In fact, it was so misleading that in 2007 the Advertising Standards Authority in the UK made Colgate remove the advertisement, and Colgate was cited for breaches of the advertising standards.

Key Points!

Key points we just picked up here - we need to know:

  • who was surveyed,
  • how many were surveyed,
  • whether the sample was representative,
  • what questions were asked in the survey, and
  • how the answers were interpreted. 

 

Exercise 2

9 out of 10 Americans don’t get all the nutrients they need from what they eat, and, in fact, are missing out on important vitamins and minerals

This was a claim made by the nutritional supplement company Centrum in the late 90s. The full-page advertisement referred to a survey. Well, this is a good start; there must be some data to support the statement above, then.

Questions - we should have lots! Write down or discuss what you would need to know to verify this amazing claim. Obviously, as an advertisement it appears effective: imagine that many people being vitamin deficient; then maybe a supplement is indeed the answer.

The stats and facts!

The data this was based on was collected between 1976 and 1980. (Bad) But wasn't the advertisement in the 90s? Yes. So is the data still relevant? Maybe, maybe not. Certainly something to consider.
The results indicated that 9% of participants "remembered eating the recommended daily allowance of fruits and vegetables on the particular day covered by the survey". (Hmm) A few points here: we are relying on what participants could remember, and they would need to know the recommended daily allowance of fruit and vegetables to judge whether they had eaten that amount. Do you know the daily allowance? What would you answer if you had to answer that question right now?
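Incidentally, the "9 out of 10" figure in the headline presumably comes from simple arithmetic on this single survey question: 100% - 9% = 91%, which is roughly \frac{9}{10}. So about 9 out of 10 participants did not recall eating the recommended amount on that one day, which is a much weaker statement than 9 out of 10 Americans missing out on important vitamins and minerals.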

 Does this survey question actually support the claims being made? 

Not eating the recommended intake on a particular day doesn't prove that those people were in fact vitamin deficient. They could have perfectly healthy levels of vitamins and minerals in their system and simply not have eaten the recommended amount on that day. Maybe they did the day before, or the day after?

So, is it misleading then?

I think it can be considered misleading. They are using a survey that was old and unrelated to vitamin deficiency to support the push to buy a product that fixes a deficiency you may or may not have. (Of course, the small print does detail the need to get specific advice if you think you are deficient.)

Key Points!

Key points we just picked up here - we need to know:

  • when the survey was conducted,
  • how many were surveyed,
  • whether the sample was representative, and
  • what questions were asked in the survey.  

 

Exercise 3

In March 2005 the American Medical Association reported an alarming rate of unhealthy behaviours, such as binge drinking and unprotected sex, among college women during spring break. It was based on a survey of "a random sample of 644" women and stated a margin of error of \pm 4%.

All sounds fairly reasonable so far.  Do you have questions already?  Any doubts yet?  

I would like to know how random the sample was. Where did they find these women to question? And is 644 enough people to survey if you want to generalise to the whole population of college women?
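As a rough check on that last question, here is a sketch of how a \pm 4% margin of error relates to a sample of 644, assuming a simple random sample and a 95% confidence level (assumptions of mine; the report does not state them). For a proportion p, the margin of error is approximately 1.96\sqrt{\frac{p(1-p)}{n}}, which is largest when p = 0.5, giving 1.96\sqrt{\frac{0.5 \times 0.5}{644}} \approx 0.039, or about 4%. So 644 genuinely random respondents would indeed give a margin of error of about \pm 4%; the sample size itself is not the obvious problem, but how the sample was obtained certainly is.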

It turns out that the sample was not random at all. The survey was conducted only with women who volunteered to come forward and answer questions. They were from one college, and it turns out that only \frac{1}{4} of these women had even taken a spring break trip.

So, is it misleading then?

VERY misleading. There isn't really any data that suggests an "alarming rate", although even the use of the word alarming is subjective. We might consider 10% alarming, but the AMA might consider 2% alarming. There is no random sample and it is not a reasonable cross-section of college women on spring break.

Key Points!

Key points we just picked up here -

  • getting an objective idea about subjective words is difficult,
  • how the sample is obtained is important, as it affects the reliability of the claims being made.  

 

Summary

When being critical of what you read, see or hear, try asking yourself the following simple questions:

Author - Who is writing the article and what are they trying to achieve?

Audience - Who is the intended audience, and are you part of that audience? Would the intended audience react the same way as you, or differently?

Purpose - Why is the author writing to that audience? Is it for an emotional response, a commercial outcome, a financial reward, or government funding for more research?

Reliability - Does the study encompass the necessary statistical elements to be considered reliable?  

 

Outcomes

S7-3

S7-3 Evaluate statistically based reports: A) interpreting risk and relative risk; B) identifying sampling and possible non-sampling errors in surveys, including polls

91266

Evaluate a statistically based report
