Quality and safety improvements are crucial in today’s healthcare landscape.[1, 2] Quality healthcare is about delivering services that boost desired health results and align with current medical knowledge.[3] The landmark Institute of Medicine (IOM) report, To Err Is Human,[4] highlighted that system flaws, not individual errors, are the main cause of medical mistakes. Inefficient processes, varied patient cases, insurance complexities, and differences in provider expertise all contribute to healthcare’s complexity. The IOM urged the healthcare industry to aim higher, proposing six key goals: effective, safe, patient-centered, timely, efficient, and equitable care.[2] Effectiveness and safety are often measured by assessing if healthcare providers use processes known to achieve desired outcomes and avoid harmful ones. The aim of quality measurement is to evaluate healthcare’s impact on outcomes and its adherence to evidence-based processes and patient preferences.
Since system failures are the root of errors,[5] process-improvement methods are vital for identifying inefficiencies, ineffective care, and preventable errors. These methods help drive system-level changes. Each technique involves evaluating performance and using findings to guide improvements. This article explores quality improvement strategies and tools – Failure Modes and Effects Analysis (FMEA), Plan-Do-Study-Act (PDSA), Six Sigma, Lean, and Root Cause Analysis (RCA) – all valuable in enhancing healthcare quality and safety, particularly through patient-centered plan-of-care tools for improving clinical outcomes.
Measures and Benchmarks for Quality Improvement
Measuring improvement is essential to confirm that quality efforts (1) positively change primary outcomes, (2) avoid unintended negative impacts elsewhere, and (3) effectively bring processes to acceptable standards.[6] The logic behind quality measurement is that good performance indicates good practice, and comparing performance across providers encourages advancement. Healthcare systems have seen a surge in performance measurement and reporting in recent years.[1, 7–9] Public quality reporting can pinpoint areas for improvement and set benchmarks at various levels.[10, 11] However, some providers are wary of publicly shared comparative data.[12] Consumers, another target audience, sometimes struggle to understand these reports, limiting their use for informed decisions about quality care.[13–15]
The complexity of healthcare, service delivery unpredictability, and the specialized yet interdependent nature of clinicians and systems[16–19] complicate quality measurement. One major challenge is the variability in attributing outcomes due to high-level clinical reasoning, judgment, problem-solving, and practical experience.[20–22] Another challenge is judging whether a near miss could have caused harm, or whether an adverse event was an isolated occurrence or is likely to recur.[23]
Organizations like the Agency for Healthcare Research and Quality (AHRQ), the National Quality Forum, and The Joint Commission advocate using reliable measures of quality and patient safety to drive healthcare improvement. AHRQ’s National Quality Measures Clearinghouse (http://www.qualitymeasures.ahrq.gov) and the National Quality Forum’s website (http://www.qualityforum.org) offer many useful measures applicable across different care settings and processes. These measures are developed through rigorous processes, including assessing scientific evidence, evaluating measure validity and reliability, determining optimal measure use (e.g., risk adjustment), and real-world testing.[24, 25]
Quality and safety measures track the progress of improvement initiatives against external benchmarks. Benchmarking in healthcare is a continuous, collaborative process of comparing key work process results with top performers[26] to evaluate organizational performance. Two benchmarking types are relevant: Internal benchmarking identifies best practices within an organization, compares practices internally, and tracks practice changes over time. Data can be plotted on control charts with statistical limits. However, it may not reflect external best practices. Competitive or external benchmarking uses comparative data across organizations to assess performance and identify successful improvements elsewhere. National reports from AHRQ[1, 9] and proprietary benchmarking groups offer comparative data.
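The internal benchmarking described above, in which data are plotted on control charts with statistical limits, can be illustrated with a short sketch. The monthly infection rates below are hypothetical, and the 3-sigma limits are the conventional Shewhart default rather than a rule stated in this chapter.

```python
import statistics

# Hypothetical monthly central-line infection rates (per 1,000 line-days)
# used to illustrate internal benchmarking on a Shewhart-style control chart.
rates = [2.1, 1.8, 2.4, 2.0, 2.6, 1.9, 2.2, 2.3, 1.7, 2.5, 2.0, 2.1]

mean = statistics.mean(rates)
sd = statistics.stdev(rates)

# Conventional 3-sigma statistical control limits.
upper_limit = mean + 3 * sd
lower_limit = max(0.0, mean - 3 * sd)  # a rate cannot be negative

# Flag any month outside the limits as a signal of special-cause variation.
signals = [(month, r) for month, r in enumerate(rates, start=1)
           if r > upper_limit or r < lower_limit]

print(f"mean={mean:.2f}, UCL={upper_limit:.2f}, LCL={lower_limit:.2f}")
print("special-cause signals:", signals)
```

With these illustrative data, every month falls within the limits, so the variation would be treated as common-cause; a point outside the limits would prompt investigation rather than routine tracking.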
Quality Improvement Strategies in Healthcare
Over four decades ago, Donabedian[27] suggested assessing healthcare quality through structure, processes, and outcomes. Structure measures evaluate resource availability and quality (e.g., insurance coverage, hospital bed capacity, trained nurses). Process measures assess service delivery by clinicians (e.g., diabetes care guidelines). Outcome measures reflect the final health result, influenced by environmental and behavioral factors (e.g., mortality, patient satisfaction, health status).
Two decades later, healthcare adopted techniques from Deming’s work[28] in post-WWII Japan’s manufacturing revival. Deming, the father of Total Quality Management (TQM), emphasized “constancy of purpose” and systematic process analysis. TQM is an organizational approach integrating management, teamwork, defined processes, systems thinking, and change to foster improvement. It emphasizes organization-wide commitment to quality for optimal results.[29]
Continuous Quality Improvement (CQI) is often used interchangeably with TQM in healthcare. CQI aims to improve clinical practice[30] by assuming improvement opportunities exist in every process and instance.[31] In-hospital Quality Assurance (QA) programs often focus on regulatory or accreditation issues, like documentation checks and credentialing reviews.[32] Clinical Practice Improvement (CPI),[33] a clinician-led approach, comprehensively understands healthcare delivery complexity using teams, defining purpose, data collection, and translating findings into practice changes. Management and clinician commitment are crucial for successful change implementation.[34–36] Management support, clear communication, and staff empowerment are also essential in quality improvement.[37]
Over the last 20 years, quality improvement methods emphasize “identifying processes with suboptimal outcomes, measuring key attributes, devising new approaches, integrating redesigns, and reassessing performance.”[38] Besides TQM, other strategies include ISO 9000, Zero Defects, Six Sigma, Baldrige, and Lean Production.[6, 39, 40]
Quality improvement is defined as “systematic, data-driven activities for immediate healthcare delivery improvement in specific settings.”[41] A quality improvement strategy is “any intervention to reduce quality gaps for typical patients.”[38] Shojania et al.[38] developed a taxonomy of quality improvement strategies (Table 1), suggesting strategy choice depends on the project nature. AHRQ’s quality tools (www.qualitytools.ahrq.gov) and patient safety (www.patientsafety.gov) websites offer further resources.
Quality improvement projects differ from research: projects address specific problems with rapid changes and strategy adoption, while research seeks generalizable results.[6] Reinhardt and Ray[42] differentiate them by: (1) QI applies research to practice, research develops new interventions; (2) QI poses no participant risk, research might; (3) QI’s audience is the organization, research is broadly applicable; (4) QI data is organization-specific, research data is multi-organizational.
Limited scientific literature has hindered quality improvement adoption in healthcare,[43, 44] but rigorous studies are emerging. A QI project becomes more research-like when it changes practice, affects patient outcomes, uses randomization, and poses risks for generalizability.[45–47] Regardless of project type, human subjects must be protected with respect, informed consent, and scientific value.[41, 46, 48]
Plan-Do-Study-Act (PDSA) Cycle
The Plan-Do-Study-Act (PDSA) model is valuable for quality improvement projects aiming for positive changes in healthcare processes and outcomes. Widely used for rapid cycle improvement,[31, 49] PDSA’s strength lies in its cyclical approach to change and assessment. Small, frequent PDSA cycles are more effective than large, slow ones[50] for testing changes before system-wide implementation.[31, 51]
PDSA aims to link process changes (behaviors and capabilities) to outcomes. Langley et al.[51] suggest asking three questions before PDSA cycles: (1) Project goal? (2) How to measure goal achievement? (3) Actions to reach the goal? PDSA starts by defining the problem, potential changes, a change plan, stakeholders, impact measurements, and target areas. Change is implemented, and data is gathered. Implementation results are assessed against key success/failure indicators. Finally, action is taken: implement the change or restart the process.[51]
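As an illustration only, the iterative loop described above can be sketched in code. Everything here is hypothetical (the measure, the target, the assumed 10% effect of each change); the point is the cycle structure: plan and test a small change, study the measured result against the goal, then act by adopting the change or carrying the learning into the next cycle.

```python
# Illustrative PDSA loop; names, data, and effect sizes are invented.
def run_pdsa(baseline: float, target: float, changes: list[str]) -> list[dict]:
    cycles = []
    current = baseline
    for change in changes:
        # Plan: define the change and the predicted effect (stubbed here).
        # Do: implement on a small scale and collect data (simulated below:
        # pretend each change yields a 10% improvement).
        measured = current * 0.9
        # Study: compare the measured result with the goal.
        met_goal = measured <= target
        # Act: adopt the change if the goal is met, else run another cycle.
        cycles.append({"change": change, "result": measured,
                       "met_goal": met_goal})
        current = measured
        if met_goal:
            break
    return cycles

# e.g., reducing a hypothetical medication-error rate from 5.0 to <= 4.0
history = run_pdsa(5.0, 4.0, ["standardize order form",
                              "add pharmacist check",
                              "barcode verification"])
for cycle in history:
    print(cycle)
```

In a real project the “do” step would be actual data collection, and an unsuccessful cycle would revise the plan rather than simply proceed to the next candidate change.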
Six Sigma Methodology
Six Sigma, a business strategy, improves, designs, and monitors processes to minimize waste, optimize satisfaction, and enhance financial stability.[52] Process performance (capability) measures improvement by comparing baseline capability to post-improvement capability after piloting solutions.[53] Six Sigma uses two main methods. One inspects process outcomes, counts defects, calculates defect rates per million, and converts this to a σ (sigma) metric. This applies to pre- and post-analytic processes. The second method predicts process performance by calculating a σ metric from tolerance limits and observed process variation, suitable for analytic processes where precision and accuracy are experimentally determined.
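The two sigma-metric calculations described above can be sketched as follows. The defect counts and tolerance values are illustrative assumptions; the 1.5-sigma long-term shift is a widespread Six Sigma convention rather than something specified in this chapter, and the tolerance-based formula shown is a simplified version that ignores process bias.

```python
from statistics import NormalDist

def sigma_from_dpmo(defects: int, units: int, opportunities: int = 1) -> float:
    """Method 1: convert an observed defect rate (defects per million
    opportunities, DPMO) to a sigma metric, applying the conventional
    1.5-sigma long-term shift."""
    dpmo = defects / (units * opportunities) * 1_000_000
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5

def sigma_from_tolerance(tolerance_limit: float, mean: float,
                         sd: float) -> float:
    """Method 2 (simplified): sigma metric for an analytic process --
    how many observed standard deviations fit between the process mean
    and the nearest tolerance limit."""
    return abs(tolerance_limit - mean) / sd

# Hypothetical example: 2 mislabeled specimens per 1,000 accessions.
print(f"outcome-based sigma: {sigma_from_dpmo(2, 1000):.2f}")

# Hypothetical example: tolerance limit 110 around a target of 100,
# observed SD of 2 -> (110 - 100) / 2 = 5 sigma.
print(f"capability-based sigma: {sigma_from_tolerance(110, 100, 2):.2f}")
```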
Six Sigma employs a structured, rigorous five-phase process: Define, Measure, Analyze, Improve, and Control (DMAIC).[53, 54] It starts by defining the project, reviewing data, and setting expectations. Then, performance standards and objectives are set, and variability sources are identified. As the project is implemented, data assesses process improvement. Validated measures determine new process capability to support analysis.
Six Sigma and PDSA are related. DMAIC builds on Shewhart’s Plan, Do, Check, Act cycle.[55] PDSA’s plan phase aligns with Six Sigma’s define core processes, customers, and requirements. PDSA’s do phase relates to Six Sigma’s measure performance. PDSA’s study phase connects to Six Sigma’s analyze. And PDSA’s act phase corresponds to Six Sigma’s improve and integrate.[56]
Lean Production System / Toyota Production System
Applying the Toyota Production System, known for car manufacturing,[57] led to the Lean Production System or Lean methodology. Lean overlaps with Six Sigma but focuses on customer needs and improving processes by removing non-value-added activities (waste). Lean maximizes value-added activities in optimal sequence for continuous operations.[58] It relies on root cause analysis to investigate errors, improve quality, and prevent recurrence.
Physicians, nurses, and managers are enhancing patient care and cutting costs in labs, pharmacies,[59–61] and blood banks[61] using Toyota Production System principles. Reviews of Toyota Production System projects show healthcare organizations improving safety and quality by defining problems, using root cause analysis, setting goals, clarifying responsibilities, and simplifying workflows.[59, 60] Spear noted that the Toyota Production System clarifies “which patient gets which procedure (output); who does which aspect of the job (responsibility); exactly which signals indicate work start (connection); and precisely how each step is carried out.”[60]
Successful Lean application in healthcare involves eliminating unnecessary daily activities from “overcomplicated processes, workarounds, and rework,”[59] engaging frontline staff, and rigorously tracking problems throughout problem-solving.
Root Cause Analysis (RCA)
Root Cause Analysis (RCA), widely used in engineering,[62] is a formal investigation and problem-solving method focused on identifying the underlying causes of adverse events and of events that were intercepted before causing harm (near misses). The Joint Commission mandates RCA for all sentinel events, requiring action plans to reduce future risk and to monitor the effectiveness of improvements.[64]
RCA is used to identify trends and assess risk when human error is suspected,[65] on the assumption that system factors, rather than individual ones, are usually the root cause.[2, 4] The Critical Incident Technique is similar, collecting information after an event on its causes and the actions taken.[63]
RCA is a reactive, retrospective assessment outlining event sequences, causal factors, and root causes.[66] A multidisciplinary RCA team enhances finding validity.[67] Aggregate RCA, used by the VA, efficiently assesses trends through simultaneous RCAs, rather than in-depth case assessments.[68]
Using a qualitative approach, RCA uncovers error causes by examining enabling factors (e.g., lack of training), latent conditions (e.g., not checking patient ID bands), and situational factors (e.g., similar patient names) contributing to adverse events (e.g., medication errors). Investigators ask key questions: what happened, why, proximate factors, why those factors, and underlying systems/processes. This identifies safety barrier failures and problem causes for future prevention. Considering immediate pre-event factors and broader contributing factors is crucial.[68]
Traditional RCA’s final step is developing system and process improvement recommendations based on findings.[68] Literature reviews suggest RCA alone may not improve patient safety significantly.[69] The VA uses aggregate RCA, examining multiple cases simultaneously for specific event categories.[68, 70]
Given event diversity and numerous root causes, differentiating system from process factors without blaming individuals is vital. Errors are rarely due to irresponsibility or intent,[71] as supported by the IOM.[4, 72] Categorizing individual errors (e.g., TERCAP’s “lack of attentiveness, inappropriate judgment”)[73] may divert focus from system and process factors modifiable through interventions. Most individual factors can be addressed through training and error-preventing systems.
Failure Modes and Effects Analysis (FMEA)
Errors are inevitable and unpredictable. Failure Modes and Effects Analysis (FMEA) is a proactive technique to identify and eliminate potential failures, problems, and errors in systems or processes before they occur.[74–76] Developed for the military and used by NASA, FMEA predicts and evaluates potential failures and hazards, proactively identifying steps to reduce future failures.[77] FMEA aims to prevent errors by identifying potential failure modes, estimating their probability and consequences, and acting to prevent them. In healthcare, FMEA focuses on care systems and uses multidisciplinary teams for quality improvement.
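The estimation of probability and consequences that FMEA relies on is conventionally captured in a Risk Priority Number (RPN): each failure mode is rated on 1–10 scales for severity, occurrence, and detectability, and the product ranks which modes to address first. The failure modes and scores below are hypothetical, and the 1–10 scales are the classic FMEA convention, not values given in this chapter.

```python
# Illustrative FMEA scoring sketch for a medication-ordering process.
failure_modes = [
    # (description, severity, occurrence, detection), each rated 1-10
    ("wrong patient selected in order entry", 9, 3, 4),
    ("dose calculation error",                8, 4, 3),
    ("allergy history not checked",           9, 2, 5),
    ("illegible handwritten order",           6, 5, 2),
]

# Risk Priority Number = severity x occurrence x detection.
scored = [(desc, s * o * d) for desc, s, o, d in failure_modes]

# Highest RPN first: the team's priority list for preventive action.
for desc, rpn in sorted(scored, key=lambda x: x[1], reverse=True):
    print(f"RPN {rpn:3d}  {desc}")
```

A multidisciplinary team would debate each rating; the value of the exercise lies as much in that structured discussion as in the resulting numbers.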
FMEA can evaluate alternative processes and monitor change over time using objective measures of process effectiveness. In 2001, The Joint Commission began requiring accredited organizations to conduct proactive risk management annually, identifying system weaknesses and adopting changes to minimize patient harm.[78]
Health Failure Modes and Effects Analysis (HFMEA)
Developed by the VA National Center for Patient Safety, Health Failure Modes and Effects Analysis (HFMEA) is a risk assessment tool with five steps: (1) define the topic; (2) assemble a team; (3) process map with numbered steps and substeps; (4) hazard analysis (failure modes, scoring, decision tree);[79] and (5) develop actions and outcomes. Hazard analysis lists possible failure modes for each process, determines action necessity, and lists causes. Post-hazard analysis, actions and outcome measures are considered, including elimination/control strategies and responsibilities.[79]
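The hazard analysis step (step 4) can be sketched with the VA's 4×4 hazard scoring matrix: severity (1 = minor … 4 = catastrophic) multiplied by probability (1 = remote … 4 = frequent), with higher-scoring failure modes passed to the decision tree. The failure modes, scores, and the commonly cited threshold of 8 used below are illustrative assumptions, not values from this chapter.

```python
# Sketch of HFMEA hazard scoring for a hypothetical medication process.
HAZARD_THRESHOLD = 8  # assumed cutoff for proceeding to the decision tree

failure_modes = [
    # (description, severity 1-4, probability 1-4)
    ("look-alike drug pulled from stock", 4, 2),   # catastrophic, uncommon
    ("pump programmed with wrong rate",   3, 3),   # major, occasional
    ("label smudged during transport",    2, 2),   # moderate, uncommon
]

analysis = []
for name, severity, probability in failure_modes:
    score = severity * probability          # hazard score, 1-16
    needs_decision_tree = score >= HAZARD_THRESHOLD
    analysis.append((name, score, needs_decision_tree))
    print(f"{name}: hazard score {score} -> "
          f"{'decision tree' if needs_decision_tree else 'stop'}")
```

For the modes that proceed, the decision tree then asks about criticality, existing control measures, and detectability before the team commits to actions and outcome measures.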
Research Evidence on Quality Improvement Implementation
An analysis of fifty studies and quality improvement projects revealed findings categorized by quality method (FMEA, RCA, Six Sigma, Lean, PDSA). Common themes emerged: (1) requirements for implementing quality improvement strategies, (2) lessons from evaluating change intervention impacts, and (3) knowledge about quality improvement tool use in healthcare.
Requirements for Implementing Quality Improvement Strategies
Strong leadership support,[80–83] involvement,[81, 84] consistent commitment to CQI,[85, 86] and visibility,[87] both written and physical,[86] are crucial for significant changes. Hospital board commitment is also vital.[86, 88] Resource demands necessitate senior leadership to: (1) ensure financial resources[87–89] for training and technology;[90, 91] (2) enable key players’ time for change processes,[85, 88, 89] providing support;[90] (3) allow sufficient project time;[86, 92] and (4) emphasize safety as a priority, reinforcing expectations.[87] Leaders must understand high-level decisions’ impact on workflows and staff time[88] and incorporate QI into leadership development.[88] Leadership must prioritize patient safety in meetings and strategies,[85, 86] create annual safety goals, and be accountable for patient safety outcomes.[85]
Despite leadership commitment, organizational hesitation may arise from past failed change efforts,[93] lack of commitment,[94] poor relationships, and ineffective communication.[89] Overcoming barriers requires embracing change,[95] culture change,[90] and institutionalizing safety and QI cultures. Non-punitive cultures take time,[61, 90] sometimes involving legal departments to shift focus to systems, not individuals.[96] Staff feel more comfortable with improvement, especially with cost savings and job security assurances despite efficiencies.[84]
Improvement processes need to engage[97] all stakeholders, demonstrating QI resource investment can be recouped through efficiency and fewer adverse events.[86] Stakeholders prioritize safe practices,[86, 98] develop solutions addressing interdisciplinary communication and teamwork (crucial for safety culture), and build on others’ successes.[86] Successful collaboratives involve stakeholders in subject choice, objectives, roles, motivation, and data analysis.[86] Different stakeholder perspectives must be considered.[97] Variation in opinions is expected,[99] and buy-in can be challenging, requiring early stakeholder involvement, feedback solicitation,[100] and support for critical changes.[101]
Communication and information sharing are vital for specifying QI purpose and strategy,[101] developing open communication across disciplines and levels, allowing voicing concerns, including patients and families, ensuring team integration and shared responsibility, sharing RCA lessons, and celebrating successes.[85] However, staff may resist system changes based on data despite efforts to inform them.[89]
Successful strategies rely on motivated[80] and empowered teams. Multidisciplinary teams reviewing data and leading change offer many advantages.[91] Teams must include the right staff,[91, 92] peers,[102] stakeholders (managers to staff), and have senior leadership support.[85, 86] Key stakeholders (nurses, physicians) must be involved[81] and supported to champion[103] and problem-solve within departments for interventions to succeed. Implementing initiatives requires significant changes in daily work,[86] necessitating considering frontline staff attitudes and willingness to improve.[59, 88, 104]
Other key factors include adaptable protocols based on patient needs[93] and unit experience/culture.[88] Defining and testing different approaches is vital.[81] Mechanisms facilitating buy-in include highlighting error types and causes,[102] involving staff in work assessment for waste,[59] providing insights into feasibility and measurable impact,[105] and presenting evidence-based changes.[100] Physicians are crucial leaders[106] and active participants,[86] especially when their behaviors create inefficiencies.[84] Physician champions can promote patient safety.[85]
Team leaders and composition are also important. Leaders emphasizing relationship building are necessary for team success.[83, 93] Dedicated leaders with significant project time are needed.[84] Team co-chairs in one project included a physician and administrator.[83] Visible champions enhance initiative visibility.[100] Multidisciplinary teams need to understand QI steps and error opportunities to prioritize critical improvements and reduce analysis subjectivity. Team diversity allows step identification from different perspectives, barrier anticipation, idea generation, and team building discussions.[100, 107] FMEA/HFMEA minimizes biases by leveraging team diversity and structuring goals.[107, 108]
Teams need preparation and ongoing education, debriefings, problem reviews, and monitoring/feedback opportunities.[84, 92, 95] Staff[80, 95, 101, 104] and leadership[80] education on problems, QI tools, planned changes, and project updates are key.[92] Training is ongoing,[91] focusing on skill deficits[82] and adapting to project implementation insights.[109] Senior staff training should not be overlooked.[105] Consultants can provide advanced QI technique knowledge.[106] Hospital-community interface models coupled with education programs are also beneficial.[97]
Teamwork improves interdepartmental relationships.[89] Team building,[110] rapid-cycle (PDSA) model use, frequent meetings, and monthly outcome data monitoring are essential.[86] Effective teamwork, communication, information transfer, coordination, and culture changes are crucial for team effectiveness.[86] However, competing workloads can hinder team member engagement.[97] Understanding each other’s roles is an important outcome, fostering continued practice development.[97] Team motivation comes from progress sharing, success celebration, and achievement recognition.[87]
Teams increase knowledge scope, improve communication, and facilitate problem learning.[111] They are proactive,[91] integrating technical processes and organizational relationships,[83] and collaborate to understand situations, define problems, pathways, tasks, and develop action plans.[59] However, teamwork can be difficult and time-consuming,[111] with consensus delayed by conflicting preferences.[97] Team members must learn group dynamics, peer confrontation, conflict resolution, and addressing detrimental behaviors.[111]
Lessons Learned from Evaluating Change Interventions
Successful QI initiatives simplify,[96, 104] standardize,[104] stratify effects, improve communication, support communication against authority gradients,[96] use defaults properly, automate cautiously,[96] use affordance and natural mapping (easy right actions), respect vigilance limits,[96] and encourage near miss reporting.[96] Policy and procedure revision and standardization effectively make new processes easier and reduce human error from limited vigilance.[78, 80–82, 90–92, 94, 96, 102, 103, 113, 114]
Simplification and standardization act as a forcing function, reducing reliance on individual decision-making. Standardized medication ordering and administration protocols[78, 87, 101, 103, 106–109, 114–116] improved patient outcomes and nurse efficiency and effectiveness.[103, 106, 108, 109, 114–116] One initiative used standardized blood product ordering forms.[94] Standardized pain metrics and assessments improved pain management in four initiatives.[80, 93, 100, 117]
Information technology can implement checks, defaults, and automation to enhance quality and reduce errors, embedding forcing functions to prevent errors.[96, 106] Necessary redundancy, like double-checking, mitigates human error by engaging two skilled practitioners[61, 101] and successfully reducing dosing errors.[78] IT successfully (1) reduces human error through automation;[61] (2) standardizes medication concentrations,[78] dosing calculations,[115, 116] protocols,[101] and order clarity;[116] (3) assists care with alerts and reminders; (4) improves medication safety (barcoding, CPOE); (5) tracks performance via databases. Workflow and procedures must adapt to technology.[78] Technology investment shows organizational commitment to improvement,[85] but resource lack for data collection hindered analysis in two initiatives.[93, 97]
Data and information are needed to understand error root causes,[99] adverse event magnitude,[106] track performance,[84, 118] and assess initiative impact.[61] Near miss and error reporting must be encouraged.[96] Error reporting is generally low and culture-dependent,[106] and can be biased, tainting results.[102] Organizations not prioritizing safety cultures may underreport errors (see Chapter 35). Data analysis is critical, and staff may need training on effective analysis and display.[106] Transparent feedback processes[39] and reporting[82] bring patient safety to the forefront.[107] Data absence, whether unreported or uncollected, hinders statistical analysis[115] and cost-benefit assessment.[108] Multi-organizational collaboration should have common databases.[98]
Measures and benchmarks enhance data understanding. Repeated measurements monitor progress,[118] but require clear success metrics.[83] Measures can engage clinicians, especially physicians. Objective, broader measures mark progress and provide a “call to action” and celebration.[106] Demonstrating the link between care process changes and outcomes is crucial when using process measures.[61]
Multiple measures and better documentation enhance patient outcome assessment.[93] Hospital administrators should encourage initiative evaluations focusing on patient outcomes, satisfaction, and cost-effectiveness.[114] Outcome assessment is improved by realistic goals (not 100% change)[119] and comparing results to benchmarks.[61, 88]
Initiative cost is a key factor, even when adverse effects necessitate rapid change.[106] Feasible changes with minimal practice disruption are important.[99] Replication potential is also crucial.[99] Standardizing processes improves replication chances but incurs costs.[106] Faster problem resolution facilitates system-wide replication.[84, 106] Low-cost, effective recommendations are implemented quickly.[93, 107] Some investigators claimed cost and length-of-stay reductions[103] but provided no supporting data. Change costs can be recouped through ROI or reduced liability from patient risk reduction.[61]
Staff education is critical. Pain management initiatives showed staff education on guidelines improved understanding, assessment, documentation, satisfaction, and pain management.[80, 93] IV site care and central line assessment education improved satisfaction and reduced complications and costs.[109]
Despite these benefits, implementation is not without challenges.
Despite challenges, perseverance is vital. New processes can be difficult to introduce,[84, 100] but quality improvement is rewarding.[84] QI is time-consuming, tedious, resource-intensive,[94] and involves trial and error.[91] Celebrating victories is important.[84]
Sustaining changes post-implementation is crucial.[105] QI should be integral to organization-wide improvement. Factors for success include bedside-friendly changes,[82] simple communication,[88] project visibility,[100] safety culture, and infrastructure strengthening.[121] Opposing views exist on spreading specific change steps versus adapting best practices.[106, 121] Generating enthusiasm for change via collaboration[103] and healthy competition is important. Collaboratives encourage evidence-based practice, rapid-cycle improvement, and consensus on better practices.[86, 98]
Knowledge About Quality Improvement Tools in Healthcare
QI tools defining and assessing healthcare problems help prioritize quality and safety issues[99] and focus on systems,[98] not individuals. Tools address errors, costs,[88] and change provider practices.[117] Many initiatives used multiple tools, starting with RCA then using Six Sigma, Lean, or PDSA for process changes. Pretesting/pilot testing was common.[92, 99] Investigators reported advantages of specific tools:
Root Cause Analysis (RCA): Useful for assessing errors/incidents, differentiating active and latent errors, identifying policy/procedure changes, and suggesting system changes, including risk communication improvement.[82, 96, 102, 105]
Six Sigma/Toyota Production System: Successfully decreased defects/variations,[59, 61, 81] operating costs,[81] and improved outcomes across settings and processes.[61, 88] Six Sigma clearly differentiates variation causes and process outcome measures.[61] It makes workarounds difficult by targeting pre-implementation process root causes.[59, 88] Teams improve implementation and results with experience.[84] Effective use requires leadership time and resource commitment for safety, cost reduction, and job satisfaction.[84] Six Sigma is valuable for problem-solving, clear communication, guiding implementation, and producing objective results.[59]
Plan-Do-Study-Act (PDSA): Used by most initiatives for gradual implementation and iterative improvement. PDSA’s rapid-cycle approach pilots processes, examines results, problem-solves, adjusts, and repeats cycles. Small, rapid cycles are more successful, allowing early adjustments[80] and avoiding distraction by details.[87, 119, 122] PDSA success is enhanced by training, baseline feedback,[118] regular meetings,[120] and collaboration with stakeholders, including patients and families,[80] for common goals.[87] Conversely, some teams struggled with rapid-cycle change, data collection, and run charts,[86] and simple PDSA rules may be more effective in complex systems.[93]
Failure Modes and Effects Analysis (FMEA): Used to prevent events and improve care quality.[123] FMEA prospectively identifies potential failures[94] for assessing process characterization,[115] and retrospectively characterizes process safety, learning from staff perspectives.[94] Process flowcharts focus teams and ensure alignment.[94] FMEA data prioritizes improvement strategies, benchmarks efforts,[116] educates and rationalizes practice change diffusion,[115] and enhances team’s ability to facilitate change across services and departments.[124] FMEA facilitates systematic error management, crucial in complex processes and settings, and relies on multidisciplinary approaches, incident reporting, decision support, standardized terminology, and caregiver education.[116]
Health Failure Modes and Effects Analysis (HFMEA): Provides detailed analysis of smaller and larger processes, yielding specific recommendations. HFMEA is a valid proactive risk analysis tool, thoroughly analyzing vulnerabilities (failure modes) before adverse events.[108] It identifies the multifactorial nature of errors[108] and potential risk,[111] but is time-consuming.[107] HFMEA minimizes group biases through multidisciplinary teams[78, 108, 115] and facilitates teamwork with step-by-step processes,[107] but requires paradigm shifts for many.[111]
Evidence-Based Practice Implications for Patient-Centered Care
Several themes emerged from successful quality improvement initiatives, offering guidance for nurses’ QI efforts. The strength of these implications lies in the methodological rigor and generalizability of the assessed strategies and projects:
- Leadership Commitment and Support: Strong leadership commitment and support are paramount. Leaders must empower staff, actively participate, and continuously drive quality improvement. Without senior leadership commitment, even well-intentioned projects are at risk. Champions for QI need to be throughout the organization, especially in leadership roles and on teams.
- Culture of Safety and Improvement: Fostering a culture of safety and improvement that rewards progress is essential. This culture should support a quality infrastructure with resources and human capital for successful QI.
- Stakeholder Involvement: Quality improvement teams must involve the right stakeholders, ensuring diverse perspectives and buy-in.
- Multidisciplinary Teams and Strategies: Healthcare complexity necessitates multidisciplinary teams and strategies. Teams from different units need to collaborate closely using communication strategies (meetings, calls, listservs) and trained facilitators.
- Problem and Root Cause Understanding: Teams and stakeholders must understand the problem and its root causes, agreeing on a clear, universally accepted metric. This agreement is as crucial as data validity.
- Methodologically Sound Approach: Use a proven, methodologically sound approach, focusing on clear models and processes, not just QI jargon. Many tools are interrelated, and using one tool alone is insufficient.
- Standardized Care Processes: Standardizing care processes and ensuring adherence enhances efficiency, effectiveness, and organizational/patient outcomes.
- Evidence-Based Practice Integration: Evidence-based practice facilitates ongoing quality improvement efforts, ensuring interventions are grounded in best practices.
- Flexible Implementation Plans: Implementation plans must be flexible to adapt to necessary changes as they arise during the QI process.
- Multiple Improvement Purposes: QI efforts can serve multiple purposes, including redesigning processes for efficiency, improving satisfaction, enhancing patient outcomes, and improving organizational climate.
- Appropriate Technology Use: When implemented thoughtfully, technology can improve team function and collaboration, reduce human error, and enhance patient safety.
- Sufficient Resource Allocation: Efforts require sufficient resources, including protected staff time, to ensure proper implementation and data collection.
- Continuous Data Collection and Analysis: Continuously collect and analyze data, communicating results on critical indicators across the organization. The goal is to use findings to assess performance and identify further improvement areas.
- Time for Change: Change takes time, so perseverance and sustained focus are essential for successful and lasting quality improvements.
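The continuous data collection recommended above needs a way to separate routine variation from real signals; Shewhart-style control charts[154] serve that purpose. A minimal individuals-chart (XmR) sketch, assuming the conventional 3-sigma limits and hypothetical weekly error rates:

```python
import statistics

def control_limits(samples: list[float]) -> tuple[float, float, float]:
    """Individuals-chart limits: mean +/- 3 * sigma-hat, where sigma-hat is
    estimated from the average moving range (d2 = 1.128 for subgroups of 2)."""
    center = statistics.fmean(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma_hat = statistics.fmean(moving_ranges) / 1.128
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

# Hypothetical weekly medication-error rates per 1,000 doses
weekly_rates = [4.1, 3.8, 4.4, 3.9, 4.2, 4.0, 3.7, 4.3]
lcl, center, ucl = control_limits(weekly_rates)

# Points outside the limits signal special-cause variation to investigate
signals = [x for x in weekly_rates if not lcl <= x <= ucl]
print(f"center={center:.2f}, limits=({lcl:.2f}, {ucl:.2f}), signals={signals}")
```

Charting a critical indicator this way supports both goals named above: assessing ongoing performance and flagging where further improvement work is needed.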
Research Implications for Future Quality Improvement
Assessing quality improvement in healthcare is dynamic and complex. The knowledge base is growing slowly, partly due to ongoing debates about whether QI initiatives constitute research requiring methodological rigor for publication. While QI methods have been used since Donabedian’s 1966 publication,[27] Six Sigma and similar methodologies have been applied and published in healthcare QI only more recently, often focusing on single system components, which hinders organizational learning and generalizability. Despite the importance of QI, many organizational efforts remain unpublished, and not all necessarily warrant peer-reviewed publication. Researchers, leaders, and clinicians need to define which QI efforts are generalizable and publishable to advance QI knowledge.
While many projects mentioned clinical, functional, satisfaction, and change readiness outcomes, cost and utilization outcomes are also important in QI, especially with variation. Key unanswered questions include:
- How can QI efforts successfully address needs of patients, insurers, regulators, and staff?
- What is the best method to prioritize improvements and balance stakeholder needs?
- What variation threshold is needed for consistently desired results?
- How can bottom-up practice change succeed without senior leadership or supportive organizational culture?
Researchers should use conceptual models to guide QI initiatives and research, leveraging the tools discussed. To generalize findings, larger sample sizes via multi-organizational collaboration are needed. Understanding which tools work best, alone or combined, is crucial. Mixed methods, including non-research methods, may better capture QI complexity. We lack knowledge on how tailoring implementation interventions affects process and patient outcomes, and the most effective cross-intervention steps. Finally, we need to understand which strategies work for whom, in what contexts, why they work or fail in different settings, and the mechanisms driving strategy effectiveness.
Conclusions
Regardless of the specific method (TQM, CQI) or tool (FMEA, Six Sigma), effective quality improvement is a dynamic process often combining multiple tools. Success requires five key elements: fostering a change and safety culture, clarifying problem understanding, involving stakeholders, testing change strategies, and continuous performance monitoring and reporting to sustain change. Ultimately, patient-centered plan-of-care tools for improving clinical outcomes are integral to this dynamic process, ensuring that quality improvement efforts are directly beneficial to those receiving care.
Search Strategy
PubMed and CINAHL were searched from 1997 to present for quality improvement efforts. Keywords: “Failure Modes and Effects Analysis/FMEA,” “Root Cause Analysis/RCA,” “Six Sigma,” “Toyota Production System/Lean,” and “Plan Do Study Act/PDSA.” 438 articles were retrieved. Inclusion criteria: nursing involvement, projects/research using FMEA, RCA, Six Sigma, Lean, or PDSA, qualitative/quantitative analyses, and reported patient outcomes. Exclusions: no nursing involvement, insufficient process/outcome information, indirect nursing involvement, developing country settings. Findings were grouped into common QI themes.
References
- National Healthcare Quality Report. Rockville, MD: Agency for Healthcare Research and Quality; 2006. [Accessed March 16, 2008]. http://www.ahrq.gov/qual/nhqr06/nhqr06.htm.
- Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press; 2001. pp. 164–80. [PubMed: 25057539]
- Lohr KN, Schroeder SA. A strategy for quality assurance in Medicare. N Engl J Med. 1990;322:1161–71. [PubMed: 2406600]
- Institute of Medicine. To err is human: building a safer health system. Washington, DC: National Academy Press; 1999.
- McNally MK, Page MA, Sunderland VB. Failure mode and effects analysis in improving a drug distribution system. Am J Health Syst Pharm. 1997;54:171–7. [PubMed: 9117805]
- Varkey P, Peller K, Resar RK. Basics of quality improvement in health care. Mayo Clin Proc. 2007;82(6):735–9. [PubMed: 17550754]
- Marshall M, Shekelle P, Davies H, et al. Public reporting on quality in the United States and the United Kingdom. Health Aff. 2003;22(3):134–48. [PubMed: 12757278]
- Loeb J. The current state of performance measurement in healthcare. Int J Qual Health Care. 2004;16(Suppl 1):i5–9. [PubMed: 15059982]
- National Healthcare Disparities Report. Rockville, MD: Agency for Healthcare Research and Quality; 2006. [Accessed March 16, 2008]. Available at: http://www.ahrq.gov/qual/nhdr06/nhdr06.htm.
- Schoen C, Davis K, How SKH, et al. U.S. health system performance: a national scorecard. Health Affairs. 2006:w457–75. [PubMed: 16987933]
- Wakefield DS, Hendryx MS, Uden-Holman T, et al. Comparing providers’ performance: problems in making the ‘report card’ analogy fit. J Healthc Qual. 1996;18(6):4–10. [PubMed: 10162089]
- Marshall M, Shekelle PG, Leatherman S, et al. The public release of performance data: what do we expect to gain, a review of the evidence. JAMA. 2000;283:1866–74. [PubMed: 10770149]
- Schneider EC, Lieberman T. Publicly disclosed information about the quality of health care: response of the U.S. public. Qual Health Care. 2001;10:96–103. [PMC free article: PMC1757976] [PubMed: 11389318]
- Hibbard JH, Harris-Kojetin L, Mullin P, et al. Increasing the impact of health plan report cards by addressing consumers’ concerns. Health Affairs. 2000 Sept/Oct;19:138–43. [PubMed: 10992661]
- Bentley JM, Nash DB. How Pennsylvania hospitals have responded to publicly released reports on coronary artery bypass graft surgery. Jt Comm J Qual Improv. 1998;24(1):40–9. [PubMed: 9494873]
- Ferlie E, Fitzgerald L, Wood M, et al. The nonspread of innovations: the mediating role of professionals. Acad Manage J. 2005;48(1):117–34.
- Glouberman S, Mintzberg H. Managing the care of health and the cure of disease– part I: differentiation. Health Care Manage Rev. 2001;26(1):56–9. [PubMed: 11233354]
- Degeling P, Kennedy J, Hill M. Mediating the cultural boundaries between medicine, nursing and management—the central challenge in hospital reform. Health Serv Manage Res. 2001;14(1):36–48. [PubMed: 11246783]
- Gaba DM. Structural and organizational issues in patient safety: a comparison of health care to other high-hazard industries. Calif Manage Rev. 2000;43(1):83–102.
- Lee JL, Chang ML, Pearson ML, et al. Does what nurses do affect clinical outcomes for hospitalized patients? A review of the literature. Health Serv Res. 1999;29(11):39–45. [PMC free article: PMC1089070] [PubMed: 10591270]
- Taylor C. Problem solving in clinical nursing practice. J Adv Nurs. 1997;26:329–36. [PubMed: 9292367]
- Benner P. From novice to expert: power and excellence in nursing practice. Menlo Park, CA: Addison-Wesley Publishing Company; 1984.
- March JG, Sproull LS, Tamuz M. Learning from samples of one or fewer. Organization Science. 1991;2(1):1–13.
- McGlynn EA, Asch SM. Developing a clinical performance measure. Am J Prev Med. 1998;14(3s):14–21. [PubMed: 9566932]
- McGlynn EA. Choosing and evaluating clinical performance measures. Jt Comm J Qual Improv. 1998;24(9):470–9. [PubMed: 9770637]
- Gift RG, Mosel D. Benchmarking in health care. Chicago, IL: American Hospital Publishing, Inc.; 1994. p. 5.
- Donabedian A. Evaluating quality of medical care. Milbank Q. 1966;44:166–206. [PubMed: 5338568]
- Deming WE. Out of the Crisis. Cambridge, MA: Massachusetts Institute of Technology Center for Advanced Engineering Study; 1986.
- Berwick DM, Godfrey AB, Roessner J. Curing health care. San Francisco, CA: Jossey-Bass; 2002.
- Wallin L, Bostrom AM, Wikblad K, et al. Sustainability in changing clinical practice promotes evidence-based nursing care. J Adv Nurs. 2003;41(5):509–18. [PubMed: 12603576]
- Berwick DM. Developing and testing changes in delivery of care. Ann Intern Med. 1998;128:651–6. [PubMed: 9537939]
- Chassin MR. Quality of Care–part 3: Improving the quality of care. N Engl J Med. 1996:1060–3. [PubMed: 8793935]
- Horn SD, Hickey JV, Carrol TL, et al. Can evidence-based medicine and outcomes research contribute to error reduction? In: Rosenthal MM, Sutcliffe KN, editors. Medical error: what do we know? What do we do? San Francisco, CA: Jossey-Bass; 2002. pp. 157–73.
- Joss R. What makes for successful TQM in the NHS? Int J Health Care Qual Assur. 1994;7(7):4–9. [PubMed: 10140850]
- Nwabueze U, Kanji GK. The implementation of total quality management in the NHS: how to avoid failure. Total Quality Management. 1997;8(5):265–80.
- Jackson S. Successfully implementing total quality management tools within healthcare: what are the key actions? Int J Health Care Qual Assur. 2001;14(4):157–63.
- Rago WV. Struggles in transformation: a study in TQM, leadership and organizational culture in a government agency. Public Adm Rev. 1996;56(3)
- Shojania KG, McDonald KM, Wachter RM, et al. Closing the quality gap: a critical analysis of quality improvement strategies, Volume 1–Series Overview and Methodology Technical Review 9 (Contract No 290-02-0017 to the Stanford University–UCSF Evidence-based Practice Center). Rockville, MD: Agency for Healthcare Research and Quality; Aug, 2004. AHRQ Publication No. 04-0051–1. [PubMed: 20734525]
- Furman C, Caplan R. Applying the Toyota production system: using a patient safety alert system to reduce error. Jt Comm J Qual Patient Saf. 2007;33(7):376–86. [PubMed: 17711139]
- Womack JP, Jones DT. Lean thinking. New York: Simon and Schuster; 1996.
- Lynn J, Baily MA, Bottrell M, et al. The ethics of using quality improvement methods in health care. Ann Intern Med. 2007;146:666–73. [PubMed: 17438310]
- Reinhardt AC, Ray LN. Differentiating quality improvement from research. Appl Nurs Res. 2003;16(1):2–8. [PubMed: 12624857]
- Blumenthal D, Kilo CM. A report card on continuous quality improvement. Milbank Q. 1998;76(4):625–48. [PMC free article: PMC2751093] [PubMed: 9879305]
- Shortell SM, Bennet CL, Byck GR. Assessing the impact of continuous quality improvement on clinical practice: what it will take to accelerate progress. Milbank Q. 1998;76(4):593–624. [PMC free article: PMC2751103] [PubMed: 9879304]
- Lynn J. When does quality improvement count as research? Human subject protection and theories of knowledge. Qual Saf Health Care. 2004;13:67–70. [PMC free article: PMC1758070] [PubMed: 14757803]
- Bellin E, Dubler NN. The quality improvement-research divide and the need for external oversight. Am J Public Health. 2001;91:1512–7. [PMC free article: PMC1446813] [PubMed: 11527790]
- Choo V. Thin line between research and audit. Lancet. 1998;352:1481–6. [PubMed: 9717915]
- Harrington L. Quality improvement, research, and the institutional review board. J Healthc Qual. 2007;29(3):4–9. [PubMed: 17708327]
- Berwick DM. Eleven worthy aims for clinical leadership of health care reform. JAMA. 1994;272(10):797–802. [PubMed: 8078145]
- Berwick DM. Improvement, trust, and the healthcare workforce. Qual Saf Health Care. 2003;12:2–6. [PMC free article: PMC1758027] [PubMed: 14645761]
- Langley JG, Nolan KM, Nolan TW, et al. The improvement guide: a practical approach to enhancing organizational performance. New York: Jossey-Bass; 1996.
- Pande PS, Newman RP, Cavanagh RR. The Six Sigma way. New York: McGraw-Hill; 2000.
- Barry R, Murcko AC, Brubaker CE. The Six Sigma book for healthcare: improving outcomes by reducing errors. Chicago, IL: Health Administration Press; 2003.
- Lanham B, Maxson-Cooper P. Is Six Sigma the answer for nursing to reduce medical errors and enhance patient safety? Nurs Econ. 2003;21(1):39–41. [PubMed: 12632719]
- Shewhart WA. Statistical method from the viewpoint of quality control. Washington, DC: U.S. Department of Agriculture; 1986. p. 45.
- Pande PS, Newman RP, Cavanagh RR. The Six Sigma way: team fieldbook. New York: McGraw-Hill; 2002.
- Sahney VK. Generating management research on improving quality. Health Care Manage Rev. 2003;28(4):335–47. [PubMed: 14682675]
- Endsley S, Magill MK, Godfrey MM. Creating a lean practice. Fam Pract Manag. 2006;13:34–8. [PubMed: 16671348]
- Printezis A, Gopalakrishnan M. Current pulse: can a production system reduce medical errors in health care? Q Manage Health Care. 2007;16(3):226–38. [PubMed: 17627218]
- Spear SJ. Fixing health care from the inside, today. Harv Bus Rev. 2005;83(9):78–91. 158. [PubMed: 16171213]
- Johnstone PA, Hendrickson JA, Dernbach AJ, et al. Ancillary services in the health care industry: is Six Sigma reasonable? Q Manage Health Care. 2003;12(1):53–63. [PubMed: 12593375]
- Reason J. Human Error. New York: Cambridge University Press; 1990.
- Kemppainen JK. The critical incident technique and nursing care quality research. J Adv Nurs. 2000;32(5):1264–71. [PubMed: 11115012]
- Joint Commission. 2003 hospital accreditation standards. Oakbrook Terrace, IL: Joint Commission Resources; 2003.
- Bogner M. Human Error in Medicine. Hillsdale, NJ: Lawrence Erlbaum Associates; 1994.
- Rooney JJ, Vanden Heuvel LN. Root cause analysis for beginners. Qual Progress. 2004 July; [Accessed on January 5, 2008]; Available at: www.asq.org.
- Giacomini MK, Cook DJ. Users’ guides to the medical literature: XXIII. Qualitative research in health care. Are the results of the study valid? Evidence-Based Medicine Working Group. JAMA. 2000;284:357–62. [PubMed: 10891968]
- Joint Commission. Using aggregate root cause analysis to improve patient safety. Jt Comm J Qual Patient Saf. 2003;29(8):434–9. [PubMed: 12953608]
- Wald H, Shojania K. Root cause analysis. In: Shojania K, Duncan B, McDonald KM, et al., editors. Making health care safer: a critical analysis of patient safety practices. Evidence Report/Technology Assessment No. 43. Rockville, MD: AHRQ; 2001. AHRQ Publication Number: 01–058. [PMC free article: PMC4781305] [PubMed: 11510252]
- Bagian JP, Gosbee J, Lee CZ, et al. The Veterans Affairs root cause analysis system in action. Jt Comm J Qual Improv. 2002;28(10):531–45. [PubMed: 12369156]
- Leape LL. Error in medicine. JAMA. 1994;272:1851–7. [PubMed: 7503827]
- Institute of Medicine. Keeping Patients Safe: Transforming the Work Environment of Nurses. Washington, DC: National Academy Press; 2004.
- Benner P, Sheets V, Uris P, et al. Individual, practice, and system causes of errors in nursing: a taxonomy. JONA. 2002;32(10):509–23. [PubMed: 12394596]
- Spath PL, Hickey P. Home study programme: using failure mode and effects analysis to improve patient safety. AORN J. 2003;78:16–21. [PubMed: 12885066]
- Croteau RJ, Schyve PM. Proactively error-proofing health care processes. In: Spath PL, editor. Error reduction in health care: a systems approach to improving patient safety. Chicago, IL: AHA Press; 2000. pp. 179–98.
- Williams E, Talley R. The use of failure mode effect and criticality analysis in a medication error subcommittee. Hosp Pharm. 1994;29:331–6. 339. [PubMed: 10133462]
- Reiling GJ, Knutzen BL, Stoecklein M. FMEA–the cure for medical errors. Qual Progress. 2003;36(8):67–71.
- Adachi W, Lodolce AE. Use of failure mode and effects analysis in improving safety of IV drug administration. Am J Health Syst Pharm. 2005;62:917–20. [PubMed: 15851497]
- DeRosier J, Stalhandske E, Bagian JP, et al. Using health care failure mode and effect analysis: the VA National Center for Patient Safety’s Prospective Risk Analysis System. J Qual Improv. 2002;28(5):248–67. [PubMed: 12053459]
- Buhr GT, White HK. Management in the nursing home: a pilot study. J Am Med Dir Assoc. 2006;7:246–53. [PubMed: 16698513]
- Guinane CS, Davis NH. The science of Six Sigma in hospitals. Am Heart Hosp J. 2004 Winter;:42–8. [PubMed: 15604839]
- Mills PD, Neily J, Luan D, et al. Using aggregate root cause analysis to reduce falls and related injuries. Jt Comm J Qual Patient Saf. 2005;31(1):21–31. [PubMed: 15691207]
- Pronovost PJ, Morlock L, Davis RO, et al. Using online and offline change models to improve ICU access and revenues. J Qual Improv. 2000;26(1):5–17. [PubMed: 10677818]
- Thompson J, Wieck KL, Warner A. What perioperative and emerging workforce nurses want in a manager. AORN J. 2003;78(2):246–9, 258, passim. [PubMed: 12940425]
- Willeumier D. Advocate health care: a systemwide approach to quality and safety. Jt Comm J Qual Patient Saf. 2004;30(10):559–66. [PubMed: 15518360]
- Leape LL, Rogers G, Hanna D, et al. Developing and implementing new safe practices: voluntary adoption through statewide collaboratives. Qual Saf Health Care. 2006;15:289–95. [PMC free article: PMC2564013] [PubMed: 16885255]
- Smith DS, Haig K. Reduction of adverse drug events and medication errors in a community hospital setting. Nurs Clin North Am. 2005;40(1):25–32. [PubMed: 15733944]
- Jimmerson C, Weber D, Sobek DK. Reducing waste and errors: piloting lean principles at Intermountain Healthcare. J Qual Patient Saf. 2005;31(5):249–57. [PubMed: 15960015]
- Docimo AB, Pronovost PJ, Davis RO, et al. Using the online and offline change model to improve efficiency for fast-track patients in an emergency department. J Qual Improv. 2000;26(9):503–14. [PubMed: 10983291]
- Gowdy M, Godfrey S. Using tools to assess and prevent inpatient falls. Jt Comm J Qual Patient Saf. 2003;29(7):363–8. [PubMed: 12856558]
- Germaine J. Six Sigma plan delivers stellar results. Mater Manag Health Care. 2007:20–6. [PubMed: 17506407]
- Semple D, Dalessio L. Improving telemetry alarm response to noncritical alarms using a failure mode and effects analysis. J Healthc Qual. 2004;26(5):Web Exclusive: W5-13–W5-19.
- Erdek MA, Pronovost PJ. Improving assessment and treatment of pain in the critically ill. Int J Qual Health Care. 2004;16(1):59–64. [PubMed: 15020561]
- Burgmeier J. Failure mode and effect analysis: an application in reducing risk in blood transfusion. J Qual Improv. 2002;28(6):331–9. [PubMed: 12066625]
- Mutter M. One hospital’s journey toward reducing medication errors. Jt Comm J Qual Patient Saf. 2003;29(6):279–88. [PubMed: 14564746]
- Rex JH, Turnbull JE, Allen SJ, et al. Systematic root cause analysis of adverse drug events in a tertiary referral hospital. J Qual Improv. 2000;26(10):563–75. [PubMed: 11042820]
- Bolch D, Johnston JB, Giles LC, et al. Hospital to home: an integrated approach to discharge planning in a rural South Australian town. Aust J Rural Health. 2005;13:91–6. [PubMed: 15804332]
- Horbar JD, Plsek PE, Leahy K. NIC/Q 2000: establishing habits for improvement in neonatal intensive care units. Pediatrics. 2003;111:e397–410. [PubMed: 12671159]
- Singh R, Singh A, Servoss JT, et al. Prioritizing threats to patient safety in rural primary care. Inform Prim Care. 2007;15(4):221–9.
- Dunbar AE, Sharek PJ, Mickas NA, et al. Implementation and case-study results of potentially better practices to improve pain management of neonates. Pediatrics. 2006;118(Supplement 2):S87–94. [PubMed: 17079628]
- Weir VL. Best-practice protocols: preventing adverse drug events. Nurs Manage. 2005;36(9):24–30. [PubMed: 16155492]
- Plews-Ogan ML, Nadkarni MM, Forren S, et al. Patient safety in the ambulatory setting. A clinician-based approach. J Gen Intern Med. 2004;19(7):719–25. [PMC free article: PMC1492477] [PubMed: 15209584]
- Baird RW. Quality improvement efforts in the intensive care unit: development of a new heparin protocol. BUMC Proceedings. 2001;14:294–6. [PMC free article: PMC1305833] [PubMed: 16369633]
- Luther KM, Maguire L, Mazabob J, et al. Engaging nurses in patient safety. Crit Care Nurs Clin N Am. 2002;14(4):341–6. [PubMed: 12400624]
- Middleton S, Chapman B, Griffiths R, et al. Reviewing recommendations of root cause analyses. Aust Health Rev. 2007;31(2):288–95. [PubMed: 17470051]
- Farbstein K, Clough J. Improving medication safety across a multihospital system. J Qual Improv. 2001;27(3):123–37. [PubMed: 11242719]
- Esmail R, Cummings C, Dersch D, et al. Using healthcare failure mode and effect analysis tool to review the process of ordering and administrating potassium chloride and potassium phosphate. Healthc Q. 2005;8:73–80. [PubMed: 16334076]
- van Tilburg CM, Leistikow IP, Rademaker CMA, et al. Health care failure mode and effect analysis: a useful proactive risk analysis in a pediatric oncology ward. Qual Saf Health Care. 2006;15:58–64. [PMC free article: PMC2564000] [PubMed: 16456212]
- Eisenberg P, Painer JD. Intravascular therapy process improvement in a multihospital system: don’t get stuck with substandard care. Clin Nurse Spec. 2002:182–6. [PubMed: 12172487]
- Singh R, Servoss T, Kalsman M, et al. Estimating impacts on safety caused by the introduction of electronic medical records in primary care. Inform Prim Care. 2004;12:235–41. [PubMed: 15808025]
- Papastrat K, Wallace S. Teaching baccalaureate nursing students to prevent medication errors using a problem-based learning approach. J Nurs Educ. 2003;42(10):459–64. [PubMed: 14577733]
- Berwick DM. Continuous improvement as an ideal in health care. N Engl J Med. 1989;320(1):53–6. [PubMed: 2909878]
- Pexton C, Young D. Reducing surgical site infections through Six Sigma and change management. Patient Safety Qual Healthc [e-Newsletter]. 2004. [Accessed November 14, 2007]. Available at: www.psqh.com/julsep04/pextonyoung.html.
- Salvador A, Davies B, Fung KFK, et al. Program evaluation of hospital-based antenatal home care for high-risk women. Hosp Q. 2003;6(3):67–73. [PubMed: 12846147]
- Apkon M, Leonard J, Probst L, et al. Design of a safer approach to intravenous drug infusions: failure mode and effects analysis. Qual Saf Health Care. 2004;13:265–71. [PMC free article: PMC1743853] [PubMed: 15289629]
- Kim GR, Chen AR, Arceci RJ, et al. Computerized order entry and failure modes and effects analysis. Arch Pediatr Adolesc Med. 2006;160:495–8. [PubMed: 16651491]
- Horner JK, Hanson LC, Wood D, et al. Using quality improvement to address pain management practices in nursing homes. J Pain Symptom Manage. 2005;30(3):271–7. [PubMed: 16183011]
- van Tiel FH, Elenbaas TW, Voskuilen BM, et al. Plan-do-study-act cycles as an instrument for improvement of compliance with infection control measures in care of patients after cardiothoracic surgery. J Hosp Infect. 2006;62:64–70. [PubMed: 16309783]
- Dodds S, Chamberlain C, Williamson GR, et al. Modernising chronic obstructive pulmonary disease admissions to improve patient care: local outcomes from implementing the Ideal Design of Emergency Access project. Accid Emerg Nurs. 2006 Jul;14(3):141–7. [PubMed: 16762552]
- Warburton RN, Parke B, Church W, et al. Identification of seniors at risk: process evaluation of a screening and referral program for patients aged > 75 in a community hospital emergency department. Int J Health Care Qual Assur. 2004;17(6):339–48. [PubMed: 15552389]
- Nowinski CV, Mullner RM. Patient safety: solutions in managed care organizations? Q Manage Health Care. 2006;15(3):130–6. [PubMed: 16849984]
- Wojciechowski E, Cichowski K. A case review: designing a new patient education system. The Internet J Adv Nurs Practice. 2007;8(2)
- Gering J, Schmitt B, Coe A, et al. Taking a patient safety approach to an integration of two hospitals. Jt Comm J Qual Patient Saf. 2005;31(5):258–66. [PubMed: 15960016]
- Day S, Dalto J, Fox J, et al. Failure mode and effects analysis as a performance improvement tool in trauma. J Trauma Nurs. 2006;13(3):111–7. [PubMed: 17052091]
- Johnson T, Currie G, Keill P, et al. New York-Presbyterian hospital: translating innovation into practice. Jt Comm J Qual Patient Saf. 2005;31(10):554–60. [PubMed: 16294667]
- Aldarrab A. Application of lean Six Sigma for patients presenting with ST-elevation myocardial infarction: the Hamilton Health Sciences experience. Healthc Q. 2006;9(1):56–60. [PubMed: 16548435]