Measuring & Monitoring Clinical Quality: News and Updates
This one-day conference focused on measuring and monitoring clinical quality in line with the findings outlined in the 2017 Care Quality Commission publication ‘The State of Care in Acute Hospitals’.
Nancy Dixon, Healthcare Quality Consultant at Healthcare Quality Quest, opened the conference with an update on 'Measuring and monitoring the quality of your service', discussing:
- what is the standard for quality?
- key elements of clinical quality
In her presentation Nancy stated:
“The way you express a measure or a monitor depends on your purpose for using it”
“Measuring of clinical quality refers to specifying important features of quality, determining if those features are being provided by collecting data to compare actual quality with intended quality, and finding and acting on any gaps between actual care and intended care”
“Acceptability – Are patients satisfied or very satisfied with the information they are given about their treatment – so they understand the risk of the surgical procedure?”
“Appropriateness - Did patients have the indications for the interventions they had — justification for preoperative tests per NICE guidance?”
“Effectiveness - Was the ‘right’ process followed for patients — preoperative assessment for elective orthopaedic surgery?”
“If we are providing effective/timely care – where will you find this information? Should the information be there?”
“We refer to evidence in 4 parts:
- Evidence of quality of care or service
- Definitions of terms and instructions for the data collection”
“The way we work, there are 3 different standards:
- Screening – all or no patients – this will find every single time where the patient has not had the care expected
- Acceptable – what best-practice services do – looking at a service that does really well. We need to do as the best does
- Target – what we aim for – we do use these standards occasionally”
“Reviewing individual cases – are the cases flagged by the measures acceptable? The reasons may be: the measure was incomplete or poorly defined; the measure was designed with no exceptions; the data collector made a mistake; or the case is a clinical exception not previously listed”
“Exception types include: Forgotten or omitted – patient declines treatment; Rare – the exception occurs in 1 in 100,000 cases; Complex – the patient has several conditions which affect treatment; and State of the art – evidence is unavailable or in conflict”
Nancy Dixon's Biography
Nancy Dixon is a specialist in the subject of measuring and improving the quality and safety of healthcare services.
Nancy is Director of Strategic Services for Healthcare Quality Quest (HQQ), a small independent organization. She has developed and refined clinical audit, quality improvement and root cause analysis methodologies. She is the author of Getting Clinical Audit Right to Benefit Patients and Getting Quality Improvement Right to Benefit Patients, both published by HQQ. She is also the author or co-author of a number of guides produced for the Healthcare Quality Improvement Partnership, HQIP, including the Guide to Ensuring Data Quality in Clinical Audit.
Nancy teaches and consults regularly in the UK and has worked in Botswana, Holland, Italy, Saudi Arabia and Taiwan in the last two years. She is a trained psychologist with the following qualifications in healthcare quality: Certified Professional in Healthcare Quality (USA); Fellow of the National Association for Healthcare Quality (USA); and Fellow of the Chartered Quality Institute (UK).
The morning sessions continued with a presentation from Simon Swift, Managing Director of Methods Analytics, on 'Measuring & Monitoring Quality', which covered:
- prioritising and agreeing the clinical quality metrics to focus on in your organisation
- ensuring a range of measures: ensuring you are not relying on too few metrics, or focusing on overarching measures such as mortality
- what’s the difference between a metric and an indicator?
- how many indicators or metrics should an organisation focus on?
- establishing and agreeing the individual metric limits and targets
- integration into Clinical Quality Dashboards
In his presentation Simon stated:
“ONLY show me a piece of information if it is something I need to know and can act on”
“If you measure something, does it actually relate to the real world? If it doesn’t, then there is no point in doing it”
“Make sure the measures are relevant to your hospital/organisation”
“The approach that I take is: does it answer the questions on variation? Are we different? Do you know why? What does it mean? And are you comfortable being different?”
“Currently there is not enough detailed clinical data available to evaluate care delivery. But as hospitals implement EPRs, this data – omitted drugs, a full set of obs, etc. – will become available.”
“When creating dashboards you need to think about: who in your organisation can access intelligence on performance, quality, safety, efficiency, outcome…? Is there stuff in there that is pertinent to CDs, ward managers, directorate managers, NEDs, execs, unit managers…? And does it help them do the day job?”
“Design your interface so that it is engaging; if it is not, no one will use it. It’s got to be something people will choose to engage with. It has to be easy to use”
“A picture might be worth a thousand words, but a graph without any words is worthless. It is better to write a short, simple sentence than show a lot of complicated data.”
“Data, information and intelligence alone are not the answer: they help pose more meaningful and focused questions.”
“A dashboard is the tin opener for a conversation; it is not a test. Engage in the conversation.”
“Don’t argue about the data and ignore the message.”
“Don’t ignore the story because it makes you uncomfortable or contradicts your personal experience.”
“Act on any findings, follow up your actions.”
Jenny King, Chief Research Officer at Picker, continued the afternoon sessions with an update on 'Learning from what people tell us we get right: Always Events', covering:
- Importance of listening to compliments and positive experiences of care
- Always Events: learning from what people tell us we get right
- Examples of Always Events
- Understanding the success of your Always Events
Pre-conference abstract
This presentation will first look at the importance of listening to compliments and positive experiences of care, including why patient and staff feedback can be used as a catalyst for change. Next, the presentation will focus on learning from the Always Events® programme. Always Events® are aspects of the patient experience that are so important to patients and family members that health care providers must aim to perform them consistently for every individual, every time. I will discuss examples of Always Events® implemented by providers, and the tools that should be used to understand their success, such as process, outcome and balancing measures.
Jenny King's biography
Jenny is the Chief Research Officer at Picker. Jenny joined Picker in 2008 and has focused on applied social research and evaluation projects, including working on the programme to pilot and implement Always Events® across England. She is responsible for overseeing the development and co-ordination of the NHS Patient and Staff Survey Co-ordination Centres, run on behalf of the Care Quality Commission (CQC) and NHS England respectively. Jenny has a BSc in Psychology and an MSc in Forensic Psychology.
Future conferences of interest:
Consultant Job Planning
Clinical Audit for Improvement
Managing Doctors in Difficulty and Difficult Doctors
Caldicott Guardian Training Course
Developing the Role of the Physician Associate
Setting Up and Running Virtual Clinics
National PROMs Summit 2017
22 September 2017