Quality in Primary Care Open Access


Research Paper - (2005) Volume 13, Issue 2

Learning issues raised by the educational peer review of significant event analyses in general practice

Paul Bowie BA PG Dip*

Associate Adviser, Department of Postgraduate Medical Education

Steven McCoy BSc (Hons)

Undergraduate Medical Student, Faculty of Medicine

John McKay BSc (Hons) MRCGP

Associate Adviser, Department of Postgraduate Medical Education

Murray Lough MD FRCGP

Assistant Director, Department of Postgraduate Medical Education

University of Glasgow, Scotland, UK

Corresponding Author:
Paul Bowie
Associate Adviser
NHS Education for Scotland, 2 Central Quay
89 Hydepark Street, Glasgow G3 8BW, UK
Tel: +44 (0)141 223 1463
Fax: +44 (0)141 223 1401
Email: paul.bowie@nes.scot.nhs.uk

Received date: 19 November 2004; Accepted date: 19 January 2005


Abstract

Introduction Significant event analysis (SEA) is proposed as one method to improve the quality and safety of health care. General practitioners (GPs) and their teams are under pressure to provide verifiable evidence of participation in SEA from accreditation bodies and the GP appraisal system in Scotland. A peer review system, based on educational principles, was established in 1998 to provide formative feedback to participating GPs on whether their event analyses were judged to be satisfactory or unsatisfactory. Objectives To identify and classify SEA reports judged to be unsatisfactory, and to determine the types of deficiencies and learning issues raised by peer reviewers. Participants and setting GP principals in the west of Scotland region. Design Qualitative content analysis of SEA reports and peer review feedback. Results 662 SEA reports were submitted between 2000 and 2004, of which a potential educational issue was raised in 163 (25%), while a further 75 (11%) were judged to be unsatisfactory. Of the 75 unsatisfactory SEAs, 69 (92%) were classified as having a ‘negative’ impact in terms of patient care or the practice, with only one ‘positive’ event (1%) recorded and three (4%) non-significant events reported. Most events were principally categorised as issues concerned with diagnosis (16%), communication (13%) and prescribing (17%). Learning issues were raised in 67 cases (89%) with regard to the implementation of change; in 34 (45%) in understanding why the event happened; in 12 (16%) in demonstrating reflective learning; and in 11 (15%) in terms of the event description. Conclusions A potential educational issue is raised for a significant number of GPs in applying the SEA technique. This may impact negatively on the appraisal and revalidation of these doctors, as well as on improving patient care and safety.
The study has helped to define and share some of the factors and inconsistencies that may contribute to an incomplete, and therefore unsatisfactory, event analysis. If SEA is to be taken seriously as a risk and safety technique, then it is clear there must be a valid means of verifying and assuring performance in this area.

Keywords

event analysis, medical education, patient safety, peer review, significant event

Introduction

Learning from significant events and sharing good practice are key requirements in improving the quality and safety of patient care in the modern NHS.[1,2] One proposed way to assist healthcare teams to do this is significant event analysis (SEA), a qualitative method of clinical audit which has risen in both prominence and importance in the past 10 years.[3] The technique is now widely promoted as an important clinical governance tool and there are strong expectations that its application can make an important contribution to reflective learning, managing healthcare risk and enhancing patient safety.[4,5]

In NHS Scotland, this is reflected in the gaining of external quality and educational accreditation status for many general practice teams, where verifiable evidence of SEA activity is now a compulsory requirement.[6] A financial incentive for participation is also available in the new General Medical Services (GMS) contract.[7] However, arguably of greater professional importance for individual general practitioners (GPs) is that SEA is now required to be undertaken as one of the five core activities of the GP appraisal system to be completed in preparation for the regulatory process of medical revalidation.[8]

The modern expectation for SEA and the associated pressures facing GPs and their teams are driven by a number of related factors. Perhaps the main driving force can be attributed to public concerns about patient safety and quality of care issues, often manifested in high-profile media reports. The improved management of healthcare risk is now also a key clinical governance priority, as this may contribute to a decrease in serious clinical and organisational incidents, many of which are avoidable.[1,2] In addressing issues of risk and safety, the analysis of individual cases of ‘significance’ not only enables us to reflect on clinical decision making, treatment options and the personal impact of these events, but may also illuminate gaps, deficiencies or weaknesses in practice systems.[9] SEA may therefore be well suited to dealing with the daily uncertainties of general practice in terms of decision making and treatment choice, as it enables a much wider range of complex issues to be addressed, which are not necessarily covered by conventional criterion-based audit method.[10,11]

There is, however, strong evidence to suggest that a series of barriers and difficulties, including fear of litigation, lack of expertise, diminished clinical ownership, professional isolation and negative attitudes, impede healthcare practitioners in understanding and effectively applying audit methodology.[12] In recognition that practitioners may therefore require guidance and formative feedback on how to apply SEA adequately, a voluntary educational model for submitting event analysis reports for peer review has been available to all GPs in the west region of NHS Education for Scotland (NES) since 1998. Peer review in general practice has been proposed as one method of quality assuring educational and quality activities.[13]

The peer review model, which has previously been described, exists as a means of promoting SEA and acting as a proxy indicator for determining whether an event analysis has been satisfactorily undertaken or not.[14,15] Against this background, this study set out to explore the SEA educational model in greater detail by investigating, highlighting and sharing the learning issues that were raised by peer reviewers when judging SEA reports to be unsatisfactory. The main aims of this study were as follows:

• to identify those reports submitted by individual GPs that were peer reviewed as being unsatisfactory analyses of significant events

• to classify and categorise the types of significant events that were analysed unsatisfactorily as judged by peer review

• to determine the types of deficiencies identified by peer reviewers as contributing to the unsatisfactory nature of event analyses

• to identify the range and type of learning issues highlighted by external peer reviewers for consideration by submitting GPs.

Methods

Educational peer review of SEA reports

SEA reports were submitted in a simple standard format to facilitate the structured analysis of the events by GPs (see Box 1). These were screened for confidentiality issues before being independently reviewed by two experienced and informed GPs from a group of 20, using an assessment instrument developed for that purpose.[16] SEA reports considered to be unsatisfactory by one or both peers underwent a second-level assessment by two further assessors. Formative written feedback on how to improve the event analysis was then provided to the submitting GP for consideration. One session of postgraduate educational allowance (PGEA) was awarded per submission. PGEA ceased to exist in April 2004 and was replaced by a quota of quality points as part of an alternative arrangement under the new GMS contract.


Box 1: Suggested report format to facilitate a structured event analysis

For the purposes of this study we decided to focus on those SEA reports considered to be unsatisfactory after second-level assessment, i.e. those reports where at least three out of four peers were in agreement about the outcome. We felt that this would make the study more manageable and also provide more valuable insights into the reasons why event analyses were assessed as unsatisfactory.

Database survey

The NES regional database, which monitors and tracks the postgraduate educational activities of over 2000 GPs in the region, was searched in May 2004 for all SEA reports that were judged as unsatisfactory after second-level peer review. The following personal and professional data were downloaded: demographic GP data, academic and professional status of the submitting GP, year of SEA submission, and outcome of peer review.

Qualitative analysis of peer review feedback

A personal departmental file is created for every SEA report submitted by a GP. Each file contains the submitted SEA report, the related assessment schedules outlining the educational feedback from each peer, and a copy of a short report to the submitting GP detailing a summary of the feedback. The files containing those SEA reports assessed as unsatisfactory were identified and pulled for investigation.

The assessment schedules and the feedback report were subjected to content analysis during August 2004. Each of the four sections of the document was examined independently by PB and SM, and data were systematically coded and categorised. These codes were further modified by merging and linking them after joint discussion and agreement between both researchers.

Classification of significant events

The coding and classification system used was developed by adapting and combining the categorisation systems developed in four previous research studies of significant events and errors reported in general medical practice.[3,17–19] The coding system was further refined as the study progressed. An individual significant event may have been allocated a number of different codes (e.g. lack of communication and wrong drug dose prescribed). However, we only report the jointly agreed principal event code in order to convey the general ‘significance’ of the types of problems and incidents involved in the study.

Results

Seventy-five of the 662 SEA reports (11%) submitted over the four-year study period were judged to be unsatisfactory after second-level peer review (see Table 1). A total of 55 GPs submitted the 75 unsatisfactory SEA reports studied. Twenty-three were GP principals based in training practices, eight of whom were GP trainers, while the remaining 32 were principals from the non-training environment.


Table 1: A breakdown of the total number of GPs participating in SEA peer review, the number of report submissions and the outcome of the peer review process in the past four years

The principal categories and types of significant events are outlined in Table 2. Most events were classified and grouped under the following headings: general administration; communication; drug prescribing and dispensing; and investigation and results. Sixty-nine events (92%) were categorised as having a ‘negative’ connotation in terms of patient care or the conduct of the practice, while one positive event (1%) outlining an example of good practice was categorised (see Table 3).


Table 2: Principal categories of significant event


Table 3: Type of significant event (n = 75)

Each of the four areas of the SEA report format generated a number of categorical explanations as to why an event analysis may have been assessed as unsatisfactory (see Table 4). For example, in 67 cases (89%) there was a learning issue connected to the implementation of change, while in 34 instances (45%) the assessors identified a problem in the understanding or description of the reasons why an event had occurred. Randomly selected examples of the written reasons provided by peers as to why event analyses were considered unsatisfactory – in each of the four report areas – are outlined in Table 5.


Table 4: Areas of event analyses identified as unsatisfactory by peer reviewers (n = 75)


Table 5: Randomly selected examples of deficiencies identified in unsatisfactory SEA reports

Discussion

The main findings clearly show that a possible educational issue is raised in one-quarter of SEA reports submitted by GPs, while a smaller proportion of event analyses are considered to be unsatisfactory after multiple peer review. Previous studies of this model have shown similar variations in the competence of GPs in applying the SEA technique satisfactorily and in the outcome of the process. A successful peer review outcome was dependent upon the academic and professional status of submitting GPs and whether the necessary implementation of change was undertaken as part of the event analysis.[14,15]

However, the importance of all of the unsatisfactory event analyses is magnified further by the actual or potential seriousness of some of the events in question, which did lead to or could have led to patient harm, but certainly involved a failure in the care process or practice systems. This raises an important issue about the potential ability of a minority of GPs to apply the SEA technique adequately. But it also highlights the possibility that similar significant events may recur because GPs (and, conceivably, their practice teams) may not have fully understood why these events originally occurred, or they may have taken inappropriate action to prevent future recurrence. Due to the relatively small numbers involved, it is unclear whether unsatisfactory event analyses are associated with specific significant event categories or differ from those event topics considered satisfactory.

The study has helped to define some of the factors which may contribute to an incomplete and therefore an unsatisfactory event analysis. Among the reasons for event analyses being judged as unsatisfactory was the failure to fully describe or understand why the events happened, or to adequately implement change that was considered necessary to prevent the events happening again. Arguably these are the two most important areas involved in the structured analysis of a significant event. Fully understanding why an event occurred demonstrates insight into the particular area or practice system and the underlying reasons contributing to the event. Similarly, failure to adequately consider or implement change may point to an event analysis that is discursive or superficial, rather than the more investigative and rigorous approach that may be associated with a structured analysis.

The appropriate consideration or implementation of change as part of an event analysis is associated with a successful peer review.[15] We also know from previous research that there are variations in practitioners’ perceived knowledge and ability to effectively apply both SEA and criterion audit method, and that lack of expertise in these areas acts as an impediment to success. The contentious areas of dysfunctional group membership and personal relationship problems have also been cited as barriers to successfully applying audit.[20] It is highly likely that good team dynamics will be a major requirement of successful SEA, but we can only speculate that some of these barriers have been contributory factors in the unsatisfactory event analyses being performed by GPs and their teams.

The vast majority of events studied were classified as having a negative impact on patient care or the organisation of the practice. The discussion, analysis and sharing of ‘positive’ events is a philosophical cornerstone of the SEA technique, but interestingly this study provides some evidence that GPs do not appear to be submitting many examples of these for peer review. This confirms anecdotal impressions gained when observing the reports as they are submitted, which point to a very low number of positive events. Recent qualitative research (unpublished) has also highlighted reluctance amongst GPs to formally address positive significant events because they perceive problem events to have greater value in improving patient care and safety, and so they prioritise these accordingly. However, the GMS contract now directs GPs to undertake event analyses on specific topics such as terminal care and mental health issues. Arguably this may be viewed as restrictive, but it is also possible that future peer review submissions may include a greater number of good practice-type analyses in these areas. Overall, the impact and sharing of positive significant events may merit further research if GPs are to be convinced of their value in improving the care and safety of patients.

This study has a number of potential limitations. It is dependent on the content of the SEA reports being an accurate reflection of what actually happened in practice. However, this is clearly open to personal bias, recall bias and problems of interpretation and judgement by report authors. For example, the events or actions described may not have happened exactly as recounted. Conversely, learning issues identified by peer reviewers may actually have been carried out by submitting GPs, but omitted from their submitted reports.

Important evidence is now accumulating which potentially points to an education and training issue among many GPs in terms of their ability to apply the SEA technique satisfactorily. In a recent study, the reported awareness of a recent significant event and GPs’ knowledge of what constitutes a structured event analysis were shown to be variable.[21] Just over 40% of GPs reported a difficulty in determining when an event is ‘significant’. Around one-fifth agreed that they sometimes avoid dealing with events because of their complexity, while one-quarter agreed that they are uncertain how to properly analyse a significant event.[22]

The inability to apply the SEA technique satisfactorily may have important implications for practices in terms of gaining and retaining accreditation from external bodies, and optimising their income from the GMS contract. For individual GPs there may be potential repercussions with regard to providing a full portfolio of evidence to satisfy the regulatory requirements of medical revalidation, if unsatisfactory event analyses are not addressed in the appraisal system. Crucially, important opportunities to improve the quality and safety of patient care may also be missed if the technique is not undertaken effectively.

There is growing acceptance in medicine that verifiable evidence of performance will be required, especially with regard to medical revalidation, although how this is to be achieved has not yet been decided.[23] One possible method is through peer review, as peers may be well placed to make informed judgements on the professional performance of colleagues.[13,24] The current system of appraisal is promoted as a form of peer review, but may provide insufficient verification, as it is possible that inadequately trained GP appraisers will not have the requisite skills and knowledge to determine whether an event analysis requires further educational input or improvement. If SEA is to be taken seriously, then it is clear that there must be a valid means of verifying and assuring individual performance in this area.

Conclusions

The voluntary peer review of event analyses in this study has identified a number of deficiencies in the application of the SEA technique by a minority of GPs, as well as adding to the growing research evidence about the type of event analyses being addressed. Based on the learning issues raised, we would recommend that practitioners follow the general guidance outlined in Box 2 as one way of structuring an event analysis. This may minimise the chances of the event being discussed in a simple and superficial manner, without addressing the key learning issues and ensuring appropriate action is taken.


Box 2: Recommendations for facilitating the structured analysis of a significant event

SEA in primary health care is in its infancy as a risk and quality improvement technique, especially when compared with similar, more established methods applied in other industries. Because of this, inconsistencies in the skills and knowledge of practitioners, in the rigour with which the technique is applied, and in the way SEA is integrated into practice are now apparent. Further research is necessary if agreement on adopting an appropriate and consistent methodological approach to both analysing and sharing significant events is to be reached.

Conflicts of Interest

None.

References