PDSA stands for Plan, Do, Study, Act. It’s a four-step cycle that healthcare organizations use to test and improve processes, from reducing hospital wait times to cutting down on medication errors. Rooted in the work of statistician and management consultant W. Edwards Deming and popularized in healthcare by the Institute for Healthcare Improvement (IHI), the PDSA cycle is essentially the scientific method adapted for real-world problem solving. Instead of overhauling an entire system at once, teams make small, deliberate changes, measure what happens, and refine their approach before scaling up.
Where PDSA Came From
W. Edwards Deming was an American engineer and statistician who spent decades consulting on process improvement, most famously in Japanese manufacturing after World War II. Shortly before his death in 1993, he formalized the PDSA model. The IHI then adapted it specifically for healthcare settings, embedding it within a broader framework called the Model for Improvement.
The Model for Improvement pairs the PDSA cycle with three guiding questions: What are we trying to accomplish? How will we know that a change is an improvement? What change can we make that will result in improvement? These questions set the direction. The PDSA cycle is the engine that tests whether a proposed change actually works.
The Four Stages Explained
Plan
This is where a team identifies a specific problem and designs a small test to address it. “Small” is key. The goal is not to redesign an entire department’s workflow in one pass. A good Plan phase defines what change will be tested, who will be involved, what data will be collected, and what the team predicts will happen. For example, a hospital noticing delays in patient discharge might plan to test a new checklist process with just five patients on a single ward over one week.
Do
The team carries out the test on that small scale and documents what happens, including anything unexpected. This is a trial run, not a full rollout. Staff record observations as they go: Did the checklist take too long to fill out? Did a step not make sense? Did patients move through faster than before? The emphasis is on learning, not perfection.
Study
After the test, the team examines the results and compares them against their predictions. This is where data analysis happens, often using visual tools like run charts or control charts to spot trends. Did the change produce the expected improvement? Were there unintended side effects? What surprised the team? The Study phase is what separates PDSA from simple trial and error. It forces a structured look at the evidence before deciding what to do next.
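The run-chart analysis used in the Study phase can be sketched in a few lines of code. A standard run-chart rule flags a "shift" when six or more consecutive points fall on the same side of the median (points exactly on the median are skipped). The function name and the delay data below are illustrative, not drawn from any particular study.

```python
import statistics

def detect_shift(values, run_length=6):
    """Flag a shift: run_length or more consecutive points on one side of the median.

    Points exactly on the median are skipped, per common run-chart convention.
    """
    median = statistics.median(values)
    run, side = 0, 0
    for v in values:
        if v == median:
            continue  # a point on the median neither extends nor breaks a run
        s = 1 if v > median else -1
        run = run + 1 if s == side else 1
        side = s
        if run >= run_length:
            return True
    return False

# Illustrative daily discharge delays (hours) before and after a checklist test.
delays = [5.1, 4.8, 5.3, 4.9, 5.0, 5.2, 3.9, 3.7, 3.8, 3.6, 3.5, 3.4]
print(detect_shift(delays))  # prints True: a sustained run signals non-random change
```

A rule like this matters because it separates genuine improvement from ordinary day-to-day variation: a run chart that merely wiggles around its median tells the team their change probably did nothing.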
Act
Based on what the team learned, they choose one of three paths: adopt the change (it worked), adapt it (it partly worked but needs tweaking), or abandon it (it didn’t work or caused new problems). If the change needs tweaking, the team loops back to Plan and runs another cycle with adjustments. If it worked well at small scale, they might run the next cycle with more patients, more staff, or across an additional unit, gradually expanding the scope with each pass.
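The four stages and the adopt/adapt/abandon decision can be sketched as a simple loop. Everything here (the `PDSACycle` record, the `study` helper, the example values) is a hypothetical illustration of the structure, not part of any standard QI tooling.

```python
from dataclasses import dataclass, field

@dataclass
class PDSACycle:
    change: str        # Plan: the one narrow change under test
    prediction: str    # Plan: what the team expects to happen
    scale: int         # Plan: how many patients/staff in this test
    observations: list = field(default_factory=list)  # Do: what actually happened

def study(cycle, met_prediction, side_effects):
    """Study the results against the prediction, then choose a path (Act)."""
    if met_prediction and not side_effects:
        return "adopt"    # it worked: expand the scale in the next cycle
    if met_prediction:
        return "adapt"    # partly worked: tweak it and loop back to Plan
    return "abandon"      # it didn't work: test a different change

cycle = PDSACycle(
    change="discharge checklist",
    prediction="patients discharged within 24h of being medically fit",
    scale=5,  # start small: five patients on one ward
)
cycle.observations.append("4 of 5 discharged on time; one form field confused staff")
print(study(cycle, met_prediction=True, side_effects=True))  # prints "adapt"
```

The point of the sketch is the loop: "adapt" sends the team back to Plan with a modified change, while "adopt" restarts the cycle at a larger scale.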
How PDSA Cycles Scale Up
A single PDSA cycle rarely solves a complex problem. The method is designed to be iterative: you might start with as little as one patient on one day, learn from the results, and progressively widen the test. The IHI describes this as testing a change in the “real-world setting,” which means teams work within their actual clinical environment rather than in a controlled simulation.
This incremental approach serves a practical purpose. Healthcare systems are complex, and changes that look good on paper can fail unpredictably when they collide with actual workflows, staffing patterns, or patient needs. By testing small and ramping up, teams catch problems early when they’re cheap and easy to fix, rather than after a hospital-wide rollout.
Real-World Results
One well-documented example involved a medical specialties department at a tertiary care center in Saudi Arabia that used PDSA cycles to tackle delayed patient discharges. Despite admissions increasing year over year from 2016 to 2018, the average length of stay dropped from 9.16 days to 7.47 days. Readmission and mortality rates also decreased after the intervention launched in 2017, suggesting the faster discharges weren’t coming at the cost of care quality.
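As a quick check on those figures, the drop from 9.16 to 7.47 days amounts to a relative reduction of roughly 18 percent:

```python
# Average length of stay (days) before and after, per the study's reported figures.
before, after = 9.16, 7.47
reduction = (before - after) / before
print(f"{reduction:.1%}")  # prints "18.4%"
```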
In another case, a comprehensive cancer center applied PDSA cycles to address overcrowded palliative care beds. The original average length of stay for palliative care patients was 28 days. After iterative testing and stakeholder engagement across multiple service tiers, that number fell to 10.8 days by 2018 and held at 10.1 days in 2019. Bed overcapacity dropped from 35% to 13.8% in a single year. The team used each PDSA cycle to test simplified pilots, then built a detailed business case before broader rollout.
Common Pitfalls
PDSA is conceptually simple, which can be deceptive. Research on implementation barriers reveals several recurring problems that trip teams up.
The most common mistake is going too big too fast. Teams sometimes try to redesign an entire process in a single cycle rather than testing one narrow change. This defeats the purpose of the method. Large-scale changes are harder to measure, harder to attribute to specific factors, and harder to reverse if they don’t work. Smaller, focused tests produce cleaner data and faster learning.
Bureaucratic friction is another frequent barrier. Healthcare organizations often require formal approval for new forms, protocols, or workflow changes. These approval processes can take weeks or months, while PDSA cycles are designed to move in days or weeks. When a team needs sign-off from multiple committees before testing a simple checklist, the pace of iteration stalls. Some organizations address this by giving quality improvement teams explicit authority to run small-scale tests without full institutional approval, reserving formal processes for the broader rollout stage.
Staff understanding is a subtler challenge. Some team members find the PDSA framework too abstract or “big picture,” making it unclear what they’re supposed to do differently on a given shift. Successful implementations tend to involve hands-on coaching and very concrete cycle goals, like “test this new handoff form with the next three patients and note any confusion.”
PDSA vs. PDCA
You may encounter the term PDCA, which stands for Plan, Do, Check, Act. This version emerged from Deming’s earlier teaching in Japan and is widely used in manufacturing and business settings. The two terms are used interchangeably in most healthcare literature, and in practice the difference is minimal. The shift from “Check” to “Study” reflects a subtle philosophical point: “Study” implies deeper analysis and learning from results, while “Check” can suggest a simpler pass/fail inspection. For practical purposes, the methodology is the same.