Table 2. Appropriately presented research findings could be relevant in the policy alternatives stream

Stories are used to make research compelling
It's important for people who use evidence, to understand that evidence in and of itself is not persuasive. They have to learn to tell stories. They have to learn to translate the evidence into something that is understandable by the average legislator, average citizen. (P#02, administrator, emphasis added)
Now it helps to have, in addition to your statistical evidence, it helps to have a few anecdotes so that people can see it concretely. That's always useful. (P#04, legislator, emphasis added)
I think most legislators are reasonable people if you can try to relate to them and get them to understand “this could be me”. (P#14, administrator)
The health department would always come to me with… the research. And I always had to tell them, “Look you understand, if some jerk in the legislature has one anecdote that goes directly against this I could lose it.” The anecdote that tells you what happened to a person, only a person is just very powerful. So I used to make them go look for anecdotes on their side. The researcher thought I was a nut. I don't personally really give a crap about anecdotes. But I need one to fight politically. (P#16, legislator, emphasis added)
It's good to combine things. You know, you take data that's good data and then you back it up with a human face on it. Because then you've got the logic and compassion going for you as a part of the argument. And I think that combination is powerful. (P#17, legislator)

Numbers are not persuasive
[Legislators] tend not to want to do numbers… when I testify I watch the eyes; use numbers, and they glaze. Because if you think about the background of most [legislators], they're not science people, they're mostly non-science, non-mathematicians, non-engineers. And so when I talk about five parts per billion, they have no concept. If you say one grain of sand on the beach at Waikiki, they kind of get it. (P#02, administrator)
Most legislators don't understand cause and correlation. They don't have any clue about statistics. (P#03, legislator)

Simplified study assessment guidelines can guide decision making
[S]ometimes we answer fire with fire. We say “That's a great article, it's a great subject, we think that we would love to research this topic, or see more data and evidence on this topic when you get it in a peer reviewed journal, in a controlled study.” Sometimes we get it out of the press. Reporters typically ask “Doctor so and so says he's doing this study.” [We say that] we would like to see it in a reproducible study where all the variables are controlled, and we'll be more than happy to consider the evidence at that time; conclusions based on evidence. (P#02, administrator, emphasis added)
When you talk to [certain advocacy groups] generally their sources are themselves. That's when you know that they've cooked the data. We went through some [training that said] all you have to do is ask one question: Is this statistically significant? And the answer is no. And then once you ask that question you ask, “Well, what was your sample? Who did you talk to? Are there any other corroborating studies that don't come from [your own organisation] that show this is the case?” (P#07, legislator, emphasis added)
We've had to continually go back to, you know and each meeting we go through another set of interventions that people have come up with. I mean these families… come up with these studies where there's like five kids. They'll come to the meeting, here's this study and then we go through it. It's like, “Okay, how many children? Was there a control group?” We go through all the stuff. And it's like “Well no.” “Well no.” And it's like “Well, then we can't support it, can we?” “Well no…” You can just come back to four or five basic points. You don't need to go into a lot of scientific depth. (P#13, administrator, emphasis added)
What you've got to find is the inconsistencies in that article and rip it right in front of their eyes. That's the only thing you can do. In the past at least no matter what the administration you'd be able to go back and say “Look in this area the CDC says this is the best way to do it.” And people would just shut up at that point. But now it's harder [and] you've got to be able to attack stupid research that isn't research. (P#16, legislator)
One of our standard responses when a company comes and asks us to cover something, we ask them for any randomised controlled trials that they've done or anyone has done on the product. But generally they don't have it. So we don't cover it. (P#20, administrator, emphasis added)

Using research can make some policy alternatives more credible
They may say “Okay, this is a policy we want to adopt, but we want to bounce this off of somebody that really knows how to analyse and find evidence.” So they can say “All right, here are the reviews that we looked at, here's the policy that we're articulating. What are the weak points? What level of confidence can we have if we move forward with this?” (P#18, administrator)
[W]hen you look at a study, and you see that it has had a worthwhile number of people studied, and that it has been done in basically a double blind way. So that you know there's credibility to it. When you take a look at that you have a much better opportunity to make good decisions than if you're just shooting from the hip or getting involved emotionally. (P#01, legislator, emphasis added)
By knowing and using the scientific base you are going to be able to be more protected than if you just go out on a political limb by yourself. Even if I had enemies that are looking to tarnish our credibility, the fact that evidence for secondhand smoke exposure is pretty overwhelming almost negates whether that can grow legs and propagate. (P#09, administrator, emphasis added)