Open Access

Suboptimal compliance with surgical safety checklists in Colorado: A prospective observational study reveals differences between surgical specialties

  • Walter L Biffl¹ (corresponding author),
  • Annalee W Gallagher¹,
  • Fredric M Pieracci¹ and
  • Crystal Berumen¹

Patient Safety in Surgery 2015, 9:5

Received: 24 November 2014

Accepted: 29 December 2014

Published: 31 January 2015



Abstract

Background

Surgical safety checklists (SSCs) are designed to improve team communication and consistency in care, ultimately avoiding complications. In Colorado, hospitals reported that use of SSCs was standard practice, but a statewide survey indicated that SSC use was inconsistent. The purpose of this project was to assess compliance with the SSC in Colorado hospitals through direct observation of the perioperative checklist process.


Methods

Ten hospitals participated in a quality improvement initiative. Trained team members recorded compliance with each of the components of the SSC. Data analysis was performed using a chi-squared test or ANOVA, depending on the number of groups compared, with p < 0.05 denoting statistical significance.


Results

Ten hospitals representing statewide diversity submitted 854 observations (median 98, range 24–106). 83% of cases were elective, 13% urgent, and 4% emergent/trauma. There was significant variation across hospitals in team introductions, cessation of activity, affirming the correct procedure, and assessing hypothermia risk, need for beta blockade, and VTE prophylaxis. Uniformly poor compliance was observed with respect to assessment of case duration, blood loss, and anesthesiologists’ concerns, and display of essential imaging. Only 71% of observers reported active participation by physicians; 9% reported that “the majority did not pay attention” and 4% reported that the team was “just going through the motions”. There were significant differences among surgical specialty groups in the majority of the elements.


Conclusions

SSCs have been implemented by the vast majority of hospitals in our state; however, compliance with SSC completion in the operating room varies widely and is generally suboptimal. Although this study was not designed to correlate SSC compliance with outcomes, there are concerns about the risk of a sentinel event or unanticipated complication resulting from poor preparation.


Keywords: Checklist, Surgery, Safety, Compliance, Observation, Implementation, Surgeons, Operating room, Sentinel events


Background

In 2009, The New England Journal of Medicine published a special article entitled “A Surgical Safety Checklist to Reduce Morbidity and Mortality in a Global Population” [1]. This study, supported by the World Health Organization (WHO), described the use of a 19-item surgical safety checklist (SSC) designed to improve team communication and consistency in care in the perioperative period. After implementation of the WHO SSC in eight diverse institutions around the globe, there were statistically significant reductions in the rates of death and complications [1]. Following this publication, use of checklists was broadly embraced and strongly encouraged by groups such as WHO and the Institute for Healthcare Improvement, to the point that use of SSCs was mandated in the UK and Canada [2-4]. In the United States, many hospitals implemented surgical checklists, often adapting the original WHO SSC to fit individual facility needs [5,6]. Momentum behind checklist initiatives has grown as accumulating literature has suggested that implementation of SSCs is associated with reductions in postoperative complications and mortality [7-10].

In 2010, the Colorado Hospital Association (CHA) partnered with the Colorado Medical Society and COPIC (the major medical liability carrier for physicians and hospitals in Colorado) to modify and standardize the WHO checklist to include other Joint Commission requirements as well as Surgical Care Improvement Project (SCIP) measures (Figure 1). In a subsequent CHA survey, over 90% of responding hospitals reported implementation of the “Colorado SSC” in their surgical areas (CHA, unpublished data). However, CHA and COPIC were concerned that the failure to consistently review all of the checklist items might put patients at risk of “Never Events” such as wrong-patient/wrong-site procedures or retained foreign bodies [11]. In fact, in spite of the promulgation of National Patient Safety Goals, these events had continued to occur in Colorado [12]. The purpose of this quality improvement project was to observe the level of compliance with discrete components of the SSC in Colorado hospitals, through direct observation of the perioperative checklist process.
Figure 1

The Colorado Hospital Association surgical safety checklist. Based on the World Health Organization checklist, it incorporates Surgical Care Improvement Project initiatives.


Methods

This observational quality improvement project was carried out between September 2012 and April 2013 at ten selected Colorado hospitals. As a quality improvement project, it was exempt from Institutional Review Board approval, and individual patients’ protected health information was never collected or transmitted. The hospitals were selected by COPIC and CHA with the goal of ensuring a representative sample of urban vs. rural and academic vs. community hospitals; in addition, some hospitals were primarily insured by COPIC, and some were not. All facilities that were invited to participate agreed to do so, and all ten completed the project.

A preliminary meeting was held in which representatives from CHA and the ten hospitals discussed the hospitals’ current surgical workflow and checklist use. There was no implementation intervention performed; this project was intended to assess existing levels of compliance. It was agreed that, for the purposes of the project, the CHA SSC elements would be used as the basis for comparison. A standardized assessment tool, the CHA “Surgical Safety Assessment” (SSA) form (Figure 2), was created for the observations. The SSA form was designed to reflect the typical surgical workflow, and thus varied slightly from the CHA SSC form. For example, the display of essential imaging and the choice of appropriate venous thromboembolism (VTE) prophylaxis are generally the responsibility of the surgeon, so these observations were grouped in the “Time Out” section, when the surgeon would be present, rather than the “Prior to Anesthesia” section, when the surgeon is not typically present.
Figure 2

The surgical safety assessment form. This observation tool was based on the Colorado Hospital Association Surgical Safety Checklist. It was modified to group items according to the perioperative workflow, for ease of completion during the process.

Once the SSA form was completed and distributed, a final workshop was held in which the observers from each site were trained to use the SSA form and to perform observations as discreetly as possible, without the knowledge of the operating team. The observer was generally a member of the operating room (OR) staff, whose presence would not raise suspicion, but who was not directly involved in that procedure. In addition to review of the safety elements, the observers recorded team participation, the role and participation of physicians, and what they perceived as barriers to consistent use. The completed assessments, which contained no protected health information or other patient-identifying information, were submitted to CHA, where information from the SSA forms was entered into a database.

Each hospital was asked to perform 100 observations. This was intended to avoid having the sample size dominated by the busiest hospitals. The hospitals were also requested to perform observations in a case mix that approximated their overall case mix (e.g., the ratio of urgent to elective cases, and a representative distribution across specialties).

Statistical analysis

All statistical analyses were performed using SAS version 9.2 (SAS Institute Inc., Cary, NC, USA). Data are expressed as number (percentage). All outcomes were dichotomous categorical variables. Cells with either missing data or “non-applicable” values were omitted from the analysis; as such, the total sample size for each independent variable analyzed differed. Analysis of independent variables was first performed across all observations and then among the following subgroups: 1) by hospital, 2) by specialty (general surgery, orthopedic surgery, neurosurgery, and other), and 3) by hospital infection rate (high vs. low, dichotomized around the median infection rate of 0.5%). Differences in categorical variables between two groups were assessed using the chi-squared test, unless expected cell counts were less than 10, in which case Fisher’s exact test was used. Differences in categorical variables among more than two groups (e.g., specialty) were assessed using ANOVA. The alpha error level was set at 0.05, with p < 0.05 being considered statistically significant.
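The two-group decision rule described above (chi-squared test unless expected cell counts are too small, in which case Fisher’s exact test) can be sketched as follows. This is a minimal illustration only: the counts are hypothetical, and scipy stands in for the SAS procedures actually used in the study.

```python
import numpy as np
from scipy import stats

def compare_compliance(table, min_expected=10):
    """Compare compliant/non-compliant counts between groups.

    Mirrors the rule described in the text: use the chi-squared test,
    but fall back to Fisher's exact test when any expected cell count
    falls below the threshold (Fisher's is applied to 2x2 tables).
    Returns the name of the test used and its p-value.
    """
    table = np.asarray(table)
    chi2, p, dof, expected = stats.chi2_contingency(table)
    if (expected < min_expected).any() and table.shape == (2, 2):
        _, p = stats.fisher_exact(table)
        return "fisher", p
    return "chi2", p

# Hypothetical counts: [compliant, non-compliant] for two hospitals
test_used, p = compare_compliance([[80, 20], [55, 45]])
print(test_used, p)
```

With large counts as above, the expected cells all exceed the threshold and the chi-squared test is used; a sparse table (e.g., `[[5, 15], [2, 18]]`) triggers the Fisher fallback.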


Results

The ten participating hospitals were diverse. Three were rural and seven were urban/suburban. Two of the ten hospitals were academic (i.e., staffed with surgical residents), and the other eight were community hospitals. Five were designated as level I or II trauma centers by the State of Colorado, and five were not. The original goal was 100 observations per site. A total of 854 observations were made (per-hospital median 98, range 24–106). All hospitals completed their observations within a 6-week time frame. Overall, 83% of cases were elective, 13% were urgent, and 4% were emergent/trauma (Figure 3). The majority of cases were performed by orthopedic (32%) and general (30%) surgeons. Otolaryngologists performed 6% and neurosurgeons 5%; 24% were done by others (obstetrician/gynecologists, oral/maxillofacial surgeons, plastic surgeons, etc.), and 2% of cases involved two or more surgical specialties (Figure 4).
Figure 3

Distribution of case type (n = 854).

Figure 4

Distribution of surgical specialty type (n = 854).

The SSA contains elements pertaining to a pre-anesthetic timeout; compliance with these elements is listed in Figure 5. There were three hospitals in which this was completed ≤5% of the time, while in two hospitals, at least some elements were reviewed >90% of the time. Overall, the components of this segment of the SSA were reviewed in fewer than 50% of patients.
Figure 5

Compliance with pre-anesthesia assessment checklist components prior to induction of anesthesia. * = significant variation across hospitals, p < 0.05.

Compliance with the pre-incision segment of the SSA was better in several elements, but not all (Figure 6). In nearly all (95-99%) cases the correct patient and procedure were verified, and the site was confirmed in 91% of relevant cases. There was significant hospital-to-hospital variation in addressing the following (overall mean in parentheses): team introductions (30%), cessation of activity (77%), assessing hypothermia risk (48%), need for beta blocker (23%), and VTE prophylaxis (65%). Uniformly poor compliance, without significant variation, was observed with respect to assessment of case duration (16%), blood loss (19%), anesthesiologists’ concerns (39%), and display of essential imaging (36%). Interestingly but perhaps not surprisingly, there were significant differences among surgical specialty groups in the majority of the elements (Figure 7). Compared with orthopedic surgeons and neurosurgeons, general surgeons were less compliant with team introductions, addressing critical steps, case duration, blood loss, anesthesia concerns, hypothermia risk, antibiotic administration, and displaying imaging. Orthopedic surgeons were less compliant with ceasing activity, and with assessing the need for beta blockers or VTE prophylaxis.
Figure 6

Compliance with pre-incision time-out checklist components prior to initiation of surgical procedure. * = significant variation across hospitals, p < 0.05.

Figure 7

Compliance with checklist components prior to initiation of surgical procedure, by procedure. * = significant variation across specialty groups, p < 0.05.

Only 71% of observers perceived active participation by physicians; 9% reported that “the majority did not pay attention” and 4% reported that the team was “just going through the motions.” Compared with orthopedic surgeons and neurosurgeons, general surgeons were more compliant with active participation.


Discussion

In this observational study, we found that although over 90% of Colorado hospitals reported utilizing checklists in the OR, compliance with the Colorado SSC is consistently inconsistent and incomplete. This is not unique to Colorado. Pickering and colleagues [3] observed 294 operations performed across five different hospitals in the United Kingdom and reported results similar to ours: although administrative audits indicated use of checklists in 95% of cases, active participation was observed in only 73%, and all information was communicated in just 55% of cases. In the Netherlands, where checklist use was mandated by the Dutch Health Care Inspectorate, the checklist was fully completed in just 39% of cases [7]. In fact, as more studies have emerged, it has become clear that checklist implementation does not equate with compliance [8]. In one prospective observational study, Levy and colleagues [13] found that, despite 100% documented completion of the preincision phase of the checklist, most of the individual elements were either not addressed as designed, or not addressed at all. Similarly, Sparks and colleagues [14] found that in spite of a high level of overall participation and completion, the accuracy was poor.

The finding of poor overall compliance was not surprising: when Colorado hospitals had initially been surveyed about checklist use by CHA, one-third had raised concerns over the degree of active physician participation, and many suggested that the checklist was not consistently used on every surgical case. The need for active participation and a cultural change is a recurring theme in the checklist implementation literature [8,15,16]. O’Connor and colleagues [17] examined the “human factors” in interviews with operating team members, and identified critical factors for improvement: 1) involvement of all the operating team members in the checklist process; 2) support from senior personnel; 3) ongoing education and training; and 4) breakdown of barriers to implementation. This project neither prescribed nor assessed implementation strategies. However, the barriers reported by the observers in the present study were consistent with these concepts.

In the present study, compliance varied across hospitals and by elements of the checklist. Variation across hospitals has been noted in other studies [3,18-20]. For example, van Schoten and colleagues [19] found that compliance in the Netherlands was inferior at academic hospitals, compared with general hospitals and teaching hospitals. They point out that the literature is mixed on whether large or small hospitals perform better [19]. In the present study, there was no statistical difference in compliance between the five highest-volume hospitals and the five lower-volume hospitals, although the dichotomy was not substantial. This may merit further study.

The pre-anesthesia timeout was performed sporadically in the OR. This may have been due to the common practice of anesthesiologists seeing patients in the preoperative holding area just moments before the patient is transported into the OR. Other elements of the pre-anesthesia assessment might be performed by the anesthesiologist but not reviewed with the nurses. In addition, the surgeon may not be present in the OR prior to induction of anesthesia, so elements might be “saved” for discussion with the surgeon. This is something to be explored in the future.

The pre-incision timeout did not have as much variation as the pre-anesthetic pause. Indeed, the culture of confirming patient/procedure/site seems to be well-ingrained. In studies directly assessing compliance, these items are consistently addressed [13,18]. On the other hand, elements that individually seem important are not routinely reviewed. From a surgeon’s perspective, this is understandable to a degree. Team introductions are not necessary if the team works together frequently; this may not have been noted in the observations. “Critical steps” may not seem necessary to review in “routine” procedures such as cholecystectomy or hernia repair. Similarly, anesthesiologists’ concerns, anticipated blood loss, case duration, display of imaging, or hypothermia may not seem relevant to brief, common procedures. However, it is important to remember that the checklist concept is designed to make a habit of having all of the team members speak up and of discussing every item, as each is individually important to surgical planning.

A noteworthy, but probably not surprising, finding of this study was that compliance with individual elements of the SSC varied by surgical specialty. This has been reported by others as well [18,19,21]. In the present study, general surgeons were less compliant with team introductions, addressing critical steps, case duration, blood loss, anesthesia concerns, hypothermia risk, antibiotic administration, and displaying imaging. There may be a variety of explanations for this. Teams that work together frequently may not feel compelled to introduce themselves; the observer may not have noted previous familiarity. We did not record details of operative procedures, so it is difficult to know whether critical steps, blood loss, hypothermia risk, or imaging were relevant: if the majority of cases were routine procedures (e.g., hernia repairs, cholecystectomies), then it could be argued that such points were not germane. On the other hand, it is not known whether unexpected occurrences arose during any of these procedures, and how they were handled. Orthopedic surgeons were less compliant with ceasing activity, and with assessing the need for beta blockers or VTE prophylaxis. Without outcomes data (wrong site procedures, perioperative myocardial infarctions, venous thromboembolic events, etc.), it is not known whether or not these failures are problematic. The fact that general surgeons appear less compliant with certain measures may simply be a reflection of case complexity. Of note, although there is variability by surgical specialty, the “nurse-driven” components of the SSC (e.g., patient and site identity, and verification of instrument sterility) did not show any variation across specialties. This suggests that the nurses are consistent in initiating the SSC review, but that review of many individual elements is left to the surgeon.
Interestingly, although the general surgeons were inferior in reviewing many checklist elements, the observers felt that the general surgeons were more compliant with active participation overall. This may indicate an opportunity for teamwork training with certain specialties/service lines.

The present study did not assess outcomes, but it is assumed that poor compliance puts patients at risk. The question is, what risk? Haynes and colleagues [1] reported that death and postoperative complications improved following implementation of SSCs. A before-and-after cohort study in the Netherlands similarly found that, after adjusting for case mix, implementation of checklists was associated with a significant reduction in mortality [7]. A subsequent review and meta-analysis, which included both the Haynes [1] and van Klei [7] studies, concluded that the evidence is “highly suggestive of a reduction in postoperative complications and mortality following implementation of the WHO SSC,” but could not be regarded as definitive in the absence of higher-quality studies [9]. Indeed, a close look at the report of Haynes et al. [1] reveals that five of the eight hospitals had no significant improvement in the measured outcomes, and that baseline death and complication rates were higher than what is reported in US hospitals. And recently, Urbach and colleagues [4] reported that large community hospitals in Ontario nearly all reported checklist compliance in 99-100% of cases, and yet there was no reduction in operative mortality or complications. It was speculated by the authors [4], and by Leape [22] in an accompanying editorial, that the failure to improve outcomes may have been related to poor compliance with checklist elements. However, without a clear link between compliance and outcomes in controlled studies, it is just as likely that the “positive” studies (e.g., Haynes et al. [1], van Klei et al. [7]) were confounded by the Hawthorne effect or other unknown factors.

One might argue that it is unrealistic to expect that completing a checklist in the OR will prevent mortality or major surgical complications. The existing literature does not indicate how often the checklist process identifies or averts an error (“near-misses”) or enhances preparation for a difficult case. On the other hand, there are some outcomes that absolutely should improve or be prevented altogether by reviewing a checklist. Appropriate verification of patient, procedure and site should completely eliminate wrong patient/wrong site procedures. Review of all SSC elements should also ensure compliance with core measures, which has very real financial repercussions for hospitals. In addition, a brief review of expected procedural steps and special equipment needs should improve OR efficiency by having necessary equipment readied in advance. Further, discussion of anticipated critical steps and blood loss may enhance the team’s preparation for dealing with challenging intraoperative events. Finally, the open discourse is recognized to improve the perception of teamwork.

A major strength of this study is that direct observations were performed surreptitiously by known members of the operating room staff, thus minimizing the Hawthorne effect. Recording compliance with all of the checklist elements is far more accurate than a simple “yes/no” recording of checklist use, or using surrogates such as core measures data. The recording of additional notations of perceived participation and teamwork, as well as barriers, has allowed us to provide feedback to hospitals that they may use to further educate their staff and improve compliance in the future. Another factor that minimized the potential Hawthorne effect was that there was no formal implementation period. Hospitals were assessed in their current state, so there was no “new change” that might have waned over time.

A limitation of the current study is that the number of urgent and emergent cases was relatively low, precluding a separate analysis of those cases. It is generally assumed that compliance is lower in such situations, so team awareness of such challenges is important. In fact, van Klei and colleagues [7] noted that compliance was poor in urgent/emergent cases and cases involving sicker patients at higher risk of dying- and there was no improvement in mortality in these cases. In the Safe Surgery Saves Lives program, although compliance was not perfect, the investigators found a significant improvement in checklist compliance in urgent cases, and a significant reduction in complications after urgent surgery [23]. Another limitation is that details of case complexity were not recorded, precluding our ability to address our hypothesis that many elements were skipped on the basis of it being a “routine” case.

Surgical outcomes were not specifically assessed in this project. Prior to 2010, COPIC collected data on sentinel events; since then, however, only claims data are available, so it is not clear whether wrong patient/wrong site/wrong procedure events were different. In addition, it was impossible to link procedures to postoperative complications or mortality due to de-identification of data. Even if such data had been available, the study was not designed to detect mortality differences and was likely underpowered to do so in the routine surgical population.

In sum, although most facilities report adoption and implementation of a comprehensive checklist, consistent adherence to, and multidisciplinary participation in, the surgical checklist remains suboptimal. These comprehensive lists include elements such as appropriate prophylactic antibiotic use prior to incision (to lessen the likelihood of surgery-associated infections) and appropriate surgical site marking, yet surgical site infections and wrong site procedures continue to be a prevalent problem in Colorado facilities. Surgical site infection (SSI) rates in Colorado facilities have remained relatively unchanged over the past two years, and individual facility rates have largely not made statistical improvements (i.e., moving from a statistically similar national comparison to a statistically better national comparison). Clearly, evaluation of implementation, consistent use, compliance, and barriers are necessary elements of further study of the SSC in Colorado hospitals.


Conclusions

In conclusion, SSCs have been implemented by the vast majority of hospitals in our state; however, compliance with SSC completion in the OR varies widely and is generally suboptimal. Although this study was not designed to correlate SSC compliance with outcomes, there are concerns about the risk of a sentinel event or unanticipated complication resulting from poor preparation. There are many intuitively clear benefits to the use of SSCs, including the absolute prevention of wrong patient/wrong site surgery; compliance with Joint Commission standards and National Hospital Inpatient Quality Measures; and improvement of efficiency by ensuring availability of important equipment and anticipation of contingencies. These outcomes are less easily measured, yet every failure is ultimately costly. Use of SSCs is important, and we feel that the focus should be on supporting local implementation efforts. A second phase of this project is under development to determine how compliance can be improved.



Abbreviations

WHO: World Health Organization
SSC: Surgical safety checklist
CHA: Colorado Hospital Association
SCIP: Surgical Care Improvement Project
SSA: Surgical safety assessment
VTE: Venous thromboembolism
OR: Operating room



Presented in part at the Annual Meeting of the Southwestern Surgical Congress, Scottsdale, AZ, April 2014.

Authors’ Affiliations

Department of Surgery, Denver Health Medical Center


References

  1. Haynes AB, Weiser TG, Berry WR, Lipsitz SR, Breizat AHS, Dellinger EP, et al. A surgical safety checklist to reduce morbidity and mortality in a global population. New Engl J Med. 2009;360:491–9.
  2. Panesar SS, Cleary K, Sheikh A, Donaldson L. The WHO checklist: a global tool to prevent errors in surgery. Patient Saf Surg. 2009;3:9.
  3. Pickering SP, Robertson ER, Griffin D, Hadi M, Morgan LJ, Catchpole KC, et al. Compliance and use of the World Health Organization checklist in UK operating theatres. Br J Surg. 2013;100:1664–70.
  4. Urbach DR, Govindarajan A, Saskin R, Wilton AS, Baxter NN. Introduction of surgical safety checklists in Ontario, Canada. New Engl J Med. 2014;370:1029–38.
  5. Hurlburt SN, Garrett J. Improving operating room safety. Patient Saf Surg. 2009;3:25.
  6. Birkmeyer JD. Strategies for improving surgical quality: checklists and beyond. New Engl J Med. 2010;363:1963–5.
  7. van Klei WA, Hoff RG, van Aarnhem EE, Simmermacher RK, Regli LP, Kappen TH, et al. Effects of the introduction of the WHO “surgical safety checklist” on in-hospital mortality: a cohort study. Ann Surg. 2012;255:44–9.
  8. Borchard A, Schwappach DLB, Barbir A, Bezzola P. A systematic review of the effectiveness, compliance, and critical factors for implementation of safety checklists in surgery. Ann Surg. 2012;256:925–33.
  9. Bergs J, Hellings J, Cleemput I, Zurel O, De Troyer V, Van Hiel M, et al. Systematic review and meta-analysis of the effect of the World Health Organization surgical safety checklist on postoperative complications. Br J Surg. 2014;101:150–8.
  10. Haugen AS, Softeland E, Almeland SK, Sevdalis N, Vonen B, Eide GE, et al. Effect of the World Health Organization checklist on patient outcomes: a stepped wedge cluster randomized controlled trial. Ann Surg. 2014. [Epub ahead of print].
  11. Lembitz A, Clarke TJ. Clarifying “never events” and introducing “always events”. Patient Saf Surg. 2009;3:26.
  12. Stahel PF, Sabel AL, Victoroff MS, Varnell J, Lembitz A, Boyle DJ, et al. Wrong-site and wrong-patient procedures in the universal protocol era: analysis of a prospective database of physician self-reported occurrences. Arch Surg. 2010;145:978–84.
  13. Levy SM, Senter CE, Hawkins RB, Zhao JY, Doody K, Kao LS, et al. Implementing a surgical checklist: more than checking a box. Surgery. 2012;152:331–6.
  14. Sparks EA, Wehbe-Janek H, Johnson RL, Smythe WR, Papaconstantinou HT. Surgical safety checklist compliance: a job done poorly! J Am Coll Surg. 2013;217:867–73.
  15. Paull DE, Mazzia LM, Izu BS, Neily J, Mills PD, Bagian JP. Predictors of successful implementation of preoperative briefings and postoperative debriefings after medical team training. Am J Surg. 2009;198:675–8.
  16. Conley DM, Singer SJ, Edmondson L, Berry WR, Gawande AA. Effective surgical safety checklist implementation. J Am Coll Surg. 2011;212:873–9.
  17. O’Connor P, Reddin C, O’Sullivan M, O’Duffy F, Keogh I. Surgical checklists: the human factor. Patient Saf Surg. 2013;7:14.
  18. Hannam JA, Glass L, Kwon J, Windsor J, Stapelberg F, Callaghan K, et al. A prospective, observational study of the effects of implementation strategy on compliance with a surgical safety checklist. BMJ Qual Saf. 2013;22:940–7.
  19. van Schoten SM, Kop V, de Blok C, Spreeuwenberg P, Groenewegen PP, Wagner C. Compliance with a time-out procedure intended to prevent wrong surgery in hospitals: results of a national patient safety programme in the Netherlands. BMJ Open. 2014;4:e005075.
  20. Russ S, Rout S, Caris J, Mansell J, Davies R, Mayer E, et al. Measuring variation in use of the WHO surgical safety checklist in the operating room: a multicenter prospective cross-sectional study. J Am Coll Surg. 2014. doi:10.1016/j.jamcollsurg.2014.09.021.
  21. Johnston FM, Tergas AI, Bennett JL, Valero V III, Morrissey CK, Fader AN, et al. Measuring briefing and checklist compliance in surgery: a tool for quality improvement. Am J Med Qual. 2014;29:491–8.
  22. Leape LL. The checklist conundrum. New Engl J Med. 2014;370:1063–4.
  23. Weiser TG, Haynes AB, Dziekan G, Berry WR, Lipsitz SR, Gawande AA. Effect of a 19-item surgical safety checklist during urgent operations in a global patient population. Ann Surg. 2010;251:976–80.


© Biffl et al.; licensee BioMed Central. 2015

This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated.