A utilization rate measures how much of an available resource is actually being used, expressed as a percentage. The basic formula is simple: divide the amount of a resource in use by the total amount available, then multiply by 100. While the concept stays the same, utilization rate means something quite different depending on whether you’re talking about employee productivity, credit cards, hospital beds, or factory output.
The Basic Formula
At its core, every utilization rate works the same way:
Utilization rate = (amount used / total available) × 100
If a consultant has 40 hours available in a week and spends 30 of them on billable client work, their utilization rate is 75%. If a hospital has 200 beds and 150 are occupied, its bed utilization rate is 75%. The math is identical. What changes is what you’re measuring and what counts as a “good” number.
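To make the arithmetic concrete, here is a minimal Python sketch of that formula; the function name and the sample calls are illustrative, not drawn from any particular library.

```python
def utilization_rate(amount_used: float, total_available: float) -> float:
    """Return utilization as a percentage: (used / available) * 100."""
    if total_available <= 0:
        raise ValueError("total_available must be positive")
    return amount_used / total_available * 100

# Both examples from the text work out to the same rate:
print(utilization_rate(30, 40))    # consultant: 75.0
print(utilization_rate(150, 200))  # hospital beds: 75.0
```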
Billable Hours in Professional Services
In consulting, law, accounting, and similar fields, utilization rate tracks how much of an employee’s available time goes toward billable client work versus internal tasks like meetings, training, or administrative work. This is one of the most common contexts where people encounter the term.
Target utilization rates vary significantly by role. People doing hands-on client work, like consultants and freelancers, typically aim for 75% to 80% billable utilization. Managers who split time between client relationships and internal oversight target 35% to 50%. Sales and administrative staff may be below 10%, since their work rarely gets billed directly to clients. Non-billable work like internal meetings, professional development, and business planning should ideally take up no more than 20% to 25% of a client-facing employee’s time.
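As a rough illustration of how those benchmarks might be applied in practice, the sketch below compares an employee's billable percentage against a target range. The role keys and ranges simply restate the figures above; everything else is hypothetical.

```python
# Illustrative target ranges per role, restating the benchmarks above.
TARGET_RANGES = {
    "consultant": (75, 80),
    "manager": (35, 50),
    "sales_admin": (0, 10),
}

def check_utilization(role: str, billable_hours: float, available_hours: float) -> str:
    rate = billable_hours / available_hours * 100
    low, high = TARGET_RANGES[role]
    if rate < low:
        return f"{rate:.1f}% - below the {low}-{high}% target"
    if rate > high:
        return f"{rate:.1f}% - above target; watch for burnout"
    return f"{rate:.1f}% - within the {low}-{high}% target"

print(check_utilization("consultant", 30, 40))  # 75.0% - within the 75-80% target
```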
A utilization rate that’s too low means the company is paying people who aren’t generating revenue. But pushing it too high squeezes out time for the internal work that keeps a business running, and it’s a reliable path to burnout.
Credit Utilization and Your Credit Score
In personal finance, your credit utilization rate is the percentage of your available revolving credit that you’re currently using. If you have credit cards with a combined limit of $10,000 and carry $2,000 in balances, your utilization rate is 20%.
This number matters more than most people realize. It’s one of the most influential factors in your credit score, and lower is better. People with exceptional credit scores (800 to 850) carry an average utilization of just 7.1%, according to Experian data from the third quarter of 2024. Those with poor scores (300 to 579) average 80.7%. The national average sits at 29%.
There’s no single cutoff where your score suddenly drops, but 30% is the point where the negative effect becomes more pronounced. The ideal range is single digits. One quirk worth knowing: 0% utilization is actually slightly worse than 1%, because lenders want to see that you’re actively using credit responsibly, not just holding dormant accounts.
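A short sketch, assuming only card balances and limits as inputs, shows how overall utilization is computed and how the thresholds above might be flagged:

```python
def credit_utilization(balances: list[float], limits: list[float]) -> float:
    """Overall utilization: total balances / total limits, as a percentage."""
    return sum(balances) / sum(limits) * 100

rate = credit_utilization(balances=[1200, 800], limits=[6000, 4000])
print(f"{rate:.1f}%")  # 20.0%

# Thresholds from the text: single digits ideal, effects more
# pronounced past 30%, and 0% slightly worse than light active use.
if rate == 0:
    print("Dormant accounts: slightly worse than a small active balance")
elif rate < 10:
    print("Ideal range")
elif rate <= 30:
    print("Acceptable, but lower is better")
else:
    print("Past 30%: the negative effect grows more pronounced")
```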
Factory and Industrial Capacity
The Federal Reserve tracks capacity utilization for manufacturing, mining, and utilities across the United States. In this context, the rate measures how much of a plant’s sustainable maximum output is actually being produced. “Sustainable maximum” factors in a realistic work schedule, normal downtime, and typical availability of materials.
From 1972 to 2024, the average capacity utilization rate across all U.S. industry was 79.5%. Manufacturing specifically averaged 78.2%. No broad industry category has ever hit 100%, and manufacturing has only exceeded 90% during wartime. An 85% rate in industrial production signals tight conditions, meaning factories are running close to their limits and may struggle to ramp up further without new investment.
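The subtlety in this formula is the denominator: capacity means the sustainable maximum, not the theoretical one. A sketch with hypothetical plant numbers shows how the choice of denominator changes the picture:

```python
# Hypothetical plant figures. The theoretical max assumes round-the-clock
# perfect operation; the sustainable max deducts a realistic work
# schedule, normal downtime, and typical material availability.
theoretical_max_units = 10_000  # units per month (hypothetical)
sustainable_max_units = 8_000   # units per month (hypothetical)
actual_output_units = 6_800

naive_rate = actual_output_units / theoretical_max_units * 100
fed_style_rate = actual_output_units / sustainable_max_units * 100
print(f"{naive_rate:.0f}%")      # 68% - looks comfortable
print(f"{fed_style_rate:.0f}%")  # 85% - tight by the Fed's yardstick
```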
Hospital Bed Occupancy
Hospitals track bed utilization (often called occupancy rate) to balance two competing pressures: making efficient use of expensive infrastructure and keeping enough open beds to handle surges in demand. From 2009 to 2019, average U.S. hospital occupancy ran between 63% and 66%. In the year following the end of the COVID-19 public health emergency (May 2023 through April 2024), that average climbed to 75.3%, with enormous state-level variation. Wyoming averaged 43% while Rhode Island hit 88%.
A widely cited benchmark holds that 85% occupancy represents a functional bed shortage, the point where hospitals can no longer reliably absorb unexpected patient surges. Recent research suggests that benchmark may actually be too generous for many settings. Smaller wards and certain specialties hit capacity problems well below 85%, because they have less ability to absorb the natural day-to-day variation in how many patients show up and how long they stay. At current trends, the U.S. could reach that 85% threshold nationally by around 2032, driven largely by an aging population.
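To see why the benchmark bites harder in small units, a toy calculation of surge headroom helps; the bed counts here are hypothetical.

```python
def free_beds(total_beds: int, occupancy_pct: float) -> int:
    """Beds left to absorb a surge at a given occupancy percentage."""
    return total_beds - round(total_beds * occupancy_pct / 100)

# At the 85% benchmark, a 200-bed hospital still has 30 beds of
# headroom, but a 40-bed ward has just 6 - one reason smaller wards
# hit capacity problems at lower occupancy levels.
print(free_beds(200, 85))  # 30
print(free_beds(40, 85))   # 6
```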
Medical Equipment Utilization
Expensive diagnostic machines like MRI and CT scanners have their own utilization tracking, usually measured in hours of operation per week. Higher utilization spreads the fixed cost of the equipment across more procedures, lowering the per-scan expense. Average utilization for imaging equipment nationwide is about 25 hours per week, roughly 48% to 54% of available time. Non-rural facilities average around 56%.
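Converting hours per week into a percentage requires assuming an availability window; the text's figures imply a window of roughly 46 to 52 hours. The sketch below assumes 50 hours, and the cost inputs are entirely hypothetical:

```python
HOURS_IN_USE = 25      # national average from the text
AVAILABLE_HOURS = 50   # assumed weekly availability window (hypothetical)

rate = HOURS_IN_USE / AVAILABLE_HOURS * 100
print(f"{rate:.0f}%")  # 50% - within the 48-54% range cited above

# Per-scan cost falls as utilization rises (hypothetical cost inputs):
weekly_fixed_cost = 30_000  # ownership + staffing, per week (hypothetical)
scans_per_hour = 2
for hours in (25, 40):
    per_scan = weekly_fixed_cost / (hours * scans_per_hour)
    print(f"{hours} h/week -> ${per_scan:.0f} per scan")
# 25 h/week -> $600 per scan; 40 h/week -> $375 per scan
```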
Medicare policy has pushed facilities toward higher utilization. Congress initially proposed a 90% utilization target for reimbursement calculations but settled on 75%, still well above the 62.5% that high-performing hospitals were actually averaging in 2009. The gap between policy targets and real-world usage reflects the practical difficulty of keeping expensive machines running constantly when patient demand fluctuates throughout the day and week.
Why the “Right” Rate Varies So Much
A 75% utilization rate might be excellent for a consultant, concerning for a hospital, and perfectly normal for a factory. The ideal number depends entirely on what’s being measured and what happens when you push too high. In professional services, over-utilization burns out employees. In hospitals, it means there are no beds left when an ambulance arrives. In manufacturing, it signals inflation risk because producers can’t easily increase supply.
The common thread is that 100% utilization is almost never the goal. Every system needs slack, whether that’s unscheduled hours for a knowledge worker, empty hospital beds for emergencies, or idle factory capacity that can ramp up when demand spikes. The utilization rate’s real value is in revealing where that slack exists and whether there’s enough of it.

