Table 1  Reflective evaluation questions

What? (What happened?)
How many survey responses did we receive?
Whose responses did we capture?
What was the quality of data collected through this survey?
What feedback did survey respondents and interviewees provide about:
  • the relevance, format and use of the report?
  • the survey?
  • supporting resources?
What were team members’ experiences of recent implementation processes?
What worked well/not so well for you in terms of refinements and modifications made?

So what? (What does it mean?)
Does the literature about presenting research to different user groups match respondent feedback?
How does feedback and observation connect with what we know from our experience of engaging stakeholders in CQI?
Based on the explicit and experiential evidence, should we be making further changes to enhance the:
  • quality of data collected?
  • processes?
  • presentation of reports?
What is the supporting evidence for a particular direction or modification?

Now what? (What to do differently?)
Do we need to promote and/or distribute reports in other ways and target particular people?
Do we need to clarify, adjust, add or delete survey questions to elicit robust data and encourage engagement?
Do we need to consider modifying the next phase, or the ESP process we use for the next dataset?
Do we need to present or explain the data differently to enhance understanding?
Do we need to modify report formats and content to make them more accessible to those targeted?
How should we prioritise these changes (eg, considering resources needed, time involved, alignment with theory)?
What is the plan of action for making changes?
How will these changes impact on the project and others involved (eg, clinical leaders and report co-authors involved in data analysis)?
CQI, continuous quality improvement; ESP, Engaging Stakeholders in Identifying Priority Evidence-Practice Gaps and Strategies for Improvement in Primary Health Care.