‘Drop in’ session draws 11 questions from RSOs

12.00 | 20 November 2012 | 2 comments

The joint RoSPA and Road Safety GB ‘drop in’ session on evaluation attracted 11 questions from road safety professionals. Thank you to everyone who participated – we hope you found it useful.

Lindsey Brough, RoSPA’s research and evaluation officer, is happy to answer any questions you may have and can be contacted on 0121 248 2149. Lindsey will also be attending the MAST Conference on 5 February 2013 at Dunchurch Park, Rugby, where she will be delighted to discuss evaluation in more detail.

Q1: How can we avoid asking leading or misleading questions in surveys/questionnaires?

Make sure that your questions do not suggest an answer to your participants, e.g. ‘Do you think that this presentation is a good way to influence young people?’. Always ask someone outside your project team to check your questions first. Also, offer the same number of positive and negative response options, i.e. strongly agree, agree, neutral, disagree, strongly disagree, but NOT: very strongly agree, strongly agree, agree, disagree, strongly disagree.

Some examples of leading (suggestive) questions:
• Do you agree this workshop was enjoyable?
• How strongly do you agree that drink-driving is socially unacceptable?
• Evidence shows that using a hand-held mobile phone while driving can cause accidents. Do you think the legal penalty for mobile phone use should be increased?
• The Government is looking at increasing the legal penalty for hand-held mobile phone use while driving. How serious an issue do you think hand-held mobile phone use is for road safety?

This is a real question forwarded to me recently:
• How hopeful are you that road casualties will reduce in the future?
Very hopeful, hopeful, not very hopeful, don’t know

Firstly, there is an unequal number of positive and negative response options, i.e. two possible ‘hopeful’ responses and only one possible ‘not very hopeful’ response.

I would argue that this question leads the reader to assume that road casualties can be expected to reduce in the future. A more neutral, and therefore less leading, question would be something like:
• What do you think will happen to the rate of road casualties by 2014?
With response options of: Stay about the same, Reduce a lot, Reduce slightly, Increase a lot, Increase slightly (this is an example only!)
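To make the balance check concrete, here is a tiny illustrative sketch (the polarity labels attached to each option are my own assumptions) that simply counts the positive and negative options in a scale:

```python
# Illustrative sketch only: checking that a response scale offers the same
# number of positive and negative options.

def is_balanced(options):
    """options: list of (label, polarity) where polarity is +1, -1 or 0."""
    positives = sum(1 for _, p in options if p == +1)
    negatives = sum(1 for _, p in options if p == -1)
    return positives == negatives

# The 'hopeful' scale from the question: two positive options, one negative.
hopeful_scale = [("Very hopeful", +1), ("Hopeful", +1),
                 ("Not very hopeful", -1), ("Don't know", 0)]

# The suggested alternative: two 'reduce' and two 'increase' options.
neutral_scale = [("Stay about the same", 0),
                 ("Reduce a lot", -1), ("Reduce slightly", -1),
                 ("Increase a lot", +1), ("Increase slightly", +1)]

print(is_balanced(hopeful_scale))   # False - the scale leans positive
print(is_balanced(neutral_scale))   # True  - equal options each way
```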

Q2: As the objectives of interventions across the country are likely to be very similar would it be a good idea if a common survey questionnaire was developed for all organisations to use? This would also allow comparisons on the effectiveness of activities. And maybe the development of a single piece of software that could then analyse all questionnaires?

This is a question often asked!

As you know, the survey questions depend on your intervention’s specific objectives. If interventions across the country had the same key objectives then yes, a standard or pro-forma questionnaire could be developed for everyone delivering that intervention to use. This assumes that the intervention does not vary depending on local circumstances. If there are interventions that are common across authorities and have the same objectives then this is something we could look at developing!

However, the standard questionnaire would probably always need tweaking to suit the particular version of the intervention actually delivered. Also, the questionnaire may be slightly different depending on whether it’s completed before and then again after the intervention, or in a post-then-pre format.

As for software to analyse the questionnaires, there are different packages already available, for example SurveyMonkey (just an example). Some packages collect the data via online surveys and do basic analysis of the data ready to print off.

Does this help?

Q3: The advantages of scientific trials are obvious but these have never been used in road safety. There are almost always far more sites that qualify for intervention than there are resources available to treat all of them, so sites are available for ‘control’. What are the disadvantages of scientific trials that are preventing their use?

Scientific trials have in fact been used in road safety, but not commonly, due to the main disadvantages of financial cost, expertise, time and administrative resource. Also, very large sample sizes are typically required in both the ‘experiment’ and ‘control’ groups, so this naturally limits their application in social interventions. However, I would encourage the use of randomised controlled trials (RCTs) in large-scale evaluations.
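The sample-size point can be made concrete with a standard two-proportion power calculation. This is a sketch only, using Python’s standard library, and every figure (baseline rate, effect size) is a hypothetical assumption, not data from any real trial:

```python
# Illustrative sketch: approximate sample size per arm needed for an RCT
# to detect a modest change in a proportion. All rates are hypothetical.
import math
from statistics import NormalDist

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Approximate sample size per arm to detect a change from p1 to p2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_b = NormalDist().inv_cdf(power)           # desired statistical power
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# Hypothetical example: detecting a drop in the proportion reporting a risky
# behaviour from 10% to 8% needs thousands of participants in each arm.
print(n_per_group(0.10, 0.08))
```

Even this modest effect size pushes each arm into the thousands, which is why RCTs tend to be reserved for large-scale evaluations.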

Q4: Advertising campaigns are expensive, and adding a thorough evaluation can increase these costs significantly. In the current climate we may have to choose between the full campaign we want to run with little or no evaluation, and a lower weight (and potentially less effective) campaign with evaluation. What would your advice be?

If a campaign is expensive then I would have to question running it without collecting scientific evidence of its effectiveness. At the very least, a prospective cost-benefit analysis should be conducted before the campaign is agreed (what are the likely costs and benefits?). Without evaluation there can be little accountability for the money spent on the campaign. I would advise running a pilot campaign with evaluation before deciding whether or not to roll it out more widely. Thanks for the question!
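A prospective cost-benefit analysis can be very simple arithmetic. The sketch below is illustrative only: every figure in it is a made-up assumption, and in practice the values would come from local casualty data and official casualty valuations.

```python
# Minimal prospective cost-benefit sketch. All figures are hypothetical
# assumptions for illustration, not real campaign or casualty data.
campaign_cost = 50_000        # assumed cost of the campaign (GBP)
baseline_casualties = 40      # assumed annual casualties in the target group
assumed_reduction = 0.05      # assumed 5% reduction attributable to the campaign
value_per_casualty = 60_000   # assumed monetary value of one prevented casualty

casualties_prevented = baseline_casualties * assumed_reduction
expected_benefit = casualties_prevented * value_per_casualty
benefit_cost_ratio = expected_benefit / campaign_cost

print(f"Casualties prevented: {casualties_prevented:.1f}")
print(f"Benefit-cost ratio: {benefit_cost_ratio:.2f}")
```

A ratio above 1 suggests the campaign is worth piloting; a ratio below 1 is a prompt to rethink before committing the budget.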

Q5: Some course evaluation takes place at the end of a training day when really all the participants want to do is fill the sheet in as quickly as possible and get home. Asking participants to return feedback at a later stage has resulted in a lower response rate. Do you have any suggestions?

Good question! This is the catch-22 situation we all face. If the day allows it, build in extra time at the end of the training for the evaluation/feedback so that it is part of the training day, rather than squeezing it in at the end. That way the day still finishes at a reasonable time and the evaluation shouldn’t be too rushed by those wanting to go home. For those who do take the form home to return later: include an email address so they can scan it and email it back, ensure the return postal address is on the form, stress how helpful it is to have the forms returned, and say how the feedback will be used.

Q6: Should we use the same questionnaire to go back after say 3 or 6 months to assess knowledge retention or attitude change or should we change the questions and if so, how?

Yes, use the same questionnaire so you can track changes more accurately. You may need to add an additional question to reflect longer term changes you are following up on, so for instance if you ran an event aimed at encouraging people to take up post-test rider or driver training, you would need to add a question asking about whether or not they have done so since the event. Please feel free to send your questions to me to look at.

Q7: With an ETP interventions budget of less than £3k, I should be allocating between 5% and 10% of this for evaluation. How would you suggest this is best spent, as it wouldn’t cover the cost of a thorough evaluation with focus groups?

It depends on your evaluation questions (e.g. how sensitive they are) and the population being evaluated. Telephone interviews, for example, may be a cost-effective alternative to focus groups. Interviews by email have also recently started to be used. You may have to prioritise just one or two interventions to evaluate so that more robust research methods can be afforded. Happy to speak more about this.

Q8: Do you have any good examples of incentives being used to encourage evaluation of initiatives within road safety?

In a recent focus group RoSPA ran with parents we offered each parent participant £10 for one hour of their time. This was well received although it should be added that the parents had minimal travel or other expenses to attend this particular focus group. In work we have done with young drivers the incentive needed to be around £30 per young driver to ensure we had sufficient numbers taking part. Incentives do help and I’m afraid it tends to be the cash options that are most effective rather than ‘in-kind’ compensation such as a free driving lesson for example. It does depend of course on how likely/inclined you think your target audience is to respond!

Q9: I found the following questions in a document about surveys by Ben Gammon, head of visitor research, Science Museum, written in 2001. To update it, what other questions should I ask, and in what order, before any survey?
As with all evaluation before you do anything else answer these questions in this order:
1 What do I want to find out? Who do you want to find this out from?
2 Why do I want to find this out?
3 How will this information be used?
4 How will I find this out – choose your methodologies (note the plural) and sample?
5 Who am I finding this out for and how shall I tell them the results?
6 How much money do I have for this project? How many staff? How much time?

That’s a very good list of questions to ask yourself before conducting any evaluation research. You could include:
•  In addition to ‘Who do you want to find this out from?’ I would ask: ‘Who are the different stakeholder groups involved in this project and how am I going to ensure they are all included in the evaluation?’ This helps to include programme designers and deliverers for example, as well as programme recipients.
•  Ethics – What are the ethical implications of my intended study?

The order is not too important as long as they are all asked, though the below seems a sensible order:
1 What do I want to find out? Who do you want to find this out from?
2 Why do I want to find this out?
3 How will this information be used?
4 How much money do I have for this project? How many staff? How much time?
5 How will I find this out – choose your methodologies (note the plural) and sample?
6 What are the ethical considerations for this study?
7 Who am I finding this out for and how shall I tell them the results?

Q10: How can I reduce bias in how I lay out my survey?

One form of bias is acquiescence bias, which is where people tend always to answer positively and select the ‘agree’ response, regardless of the question asked. This is more likely to be an issue where you have a list of statements in your survey with an agree/disagree response scale. To tackle this you could include an equal number of ‘pro’ and ‘anti’ statements (where if someone ticked ‘agree’ to all of them they would be contradicting themselves), so that any acquiescence effect is cancelled out, e.g. ‘I think the police should prosecute more people for driving up to 5mph above the 30mph speed limit’, and, ‘I think it is ok to drive up to 5mph above a 30mph speed limit’.
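When you analyse the results, the ‘anti’ statements are reverse-scored so the two directions cancel out. Here is a small sketch of that scoring step, using hypothetical responses on a 1–5 agree/disagree scale:

```python
# Sketch (hypothetical data) of reverse-scoring paired 'pro' and 'anti'
# statements so that acquiescence bias cancels out.
def combined_score(responses):
    """responses: list of (score_1_to_5, is_anti_statement)."""
    adjusted = [6 - s if is_anti else s for s, is_anti in responses]
    return sum(adjusted) / len(adjusted)

# A pure acquiescer ticks 'agree' (4) to everything, including the two
# speed-limit statements above that contradict each other.
acquiescer = [(4, False),   # 'police should prosecute more...' (pro)
              (4, True)]    # 'it is ok to drive up to 5mph above...' (anti)
print(combined_score(acquiescer))   # lands on the neutral midpoint, 3.0
```

Someone agreeing with everything ends up at the neutral midpoint rather than at the top of the scale, so blanket agreement no longer masquerades as a strong attitude.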

Q11: When interventions are made at sites on the road network, it can be very difficult to measure (rather than just estimate) other effects at the sites. Specifically, how can the ‘regression to the mean’ effect be measured?

This is not a question I can answer off the top of my head but this link may be useful: http://www.socialresearchmethods.net/kb/regrmean.php
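In the meantime, the effect itself is easy to demonstrate with a small simulation. Everything below is an illustrative assumption (identical sites, purely random year-to-year variation): because sites are selected for treatment precisely when their observed counts were high, their counts fall the next year even when nothing is done.

```python
# Simulation sketch of regression to the mean, with made-up numbers.
# Every site has the same true crash rate; only random variation differs.
import random

random.seed(1)
N_SITES = 1000
TRUE_RATE = 10   # assumed underlying annual crash count at every site
NOISE = 3        # assumed year-to-year random variation

year1 = [random.gauss(TRUE_RATE, NOISE) for _ in range(N_SITES)]
year2 = [random.gauss(TRUE_RATE, NOISE) for _ in range(N_SITES)]  # no treatment

# Select the 100 worst sites as observed in year 1 (as a scheme would).
worst = sorted(range(N_SITES), key=lambda i: year1[i], reverse=True)[:100]

before = sum(year1[i] for i in worst) / len(worst)
after = sum(year2[i] for i in worst) / len(worst)
print(f"Selected sites, year 1 mean: {before:.1f}")
print(f"Same sites, year 2 mean: {after:.1f} (no intervention applied)")
```

The year-2 mean drifts back towards the true rate with no intervention at all, which is exactly the apparent ‘improvement’ that a naive before/after comparison at treated sites would wrongly credit to the treatment.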




    Comments on answer to Q2.

    1) If this is a common question asked then maybe this is an area that many would like to see developed.

    2) Does it matter if interventions vary? I would expect that the objectives of interventions targeting, for example, young urban drivers are similar across the UK. The outcomes would be measured, not the process.

    Dave, Sheffield

    I felt that this was a very interesting and worthwhile session – congratulations Nick on yet another innovation – I hope you are already planning the next session.

    Brian, Road Safety GB
