A typical home or small-shop welder draws between 2,000 and 12,000 watts during active welding, depending on the process, amperage setting, and whether it runs on 120V or 240V power. In practical terms, an hour of continuous arc time costs roughly $0.50 to $2.00 in electricity at average U.S. residential rates, and real sessions with less arc-on time cost less, while industrial setups can consume far more.
Power Draw by Welding Process
The electricity a welder consumes comes down to a simple formula: volts multiplied by amps equals watts. A 120V machine pulling 20 amps uses 2,400 watts. A 240V machine pulling 40 amps uses 9,600 watts. The welding process you’re using determines where you fall in that range.
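The volts-times-amps arithmetic above can be sketched in a few lines of Python (the function name is mine; the voltages and currents are the examples from the text):

```python
def input_watts(volts, amps):
    """Input power in watts: volts multiplied by amps."""
    return volts * amps

print(input_watts(120, 20))  # 2400 W for a 120V machine at 20 A
print(input_watts(240, 40))  # 9600 W for a 240V machine at 40 A
```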
MIG welding is the most common process for home shops and light fabrication. A mid-range MIG welder, such as a 180-amp unit, draws up to 20 amps on a 230V circuit (roughly 4,600 watts) depending on your settings, with a peak input capacity of around 5,250 watts. Small 120V MIG machines for thin sheet metal pull considerably less, typically 2,000 to 3,500 watts.
Stick (SMAW) welding generally falls in a similar range for hobby-level machines but scales up quickly. A 180-amp stick welder runs comparable wattage to MIG at the same amperage. Larger 250-amp units used for structural work or thicker plate can pull 8,000 to 12,000 watts.
TIG welding operates across the widest amperage range. The same 180-amp multiprocess machine runs TIG from as low as 10 amps (welding thin aluminum or stainless) up to 165 amps. At the low end, power consumption drops to just a few hundred watts. At full output, it’s comparable to MIG.
Why You’re Not Welding 100% of the Time
Your welder doesn’t draw peak wattage continuously. Duty cycle, the percentage of a 10-minute window during which you can actually weld before the machine needs to cool down, dramatically reduces real-world electricity use. A machine rated at 60% duty cycle at 300 amps can weld for six minutes out of every ten before it hits its thermal limit. For the other four minutes, it sits idle while you grind, reposition, or swap parts.
For hobbyists and most shop fabricators, actual arc-on time is even lower than the duty cycle suggests. Between setup, fit-up, tacking, adjusting settings, and cooling, many welders only strike an arc for 10 to 20 minutes out of every hour. That means your real electricity consumption is a fraction of the machine’s peak rating.
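To see how much arc-on time matters, here is a minimal sketch (the 4,500-watt draw is an example value consistent with the MIG figures above; the function name is mine):

```python
def energy_kwh(watts, arc_minutes):
    """Energy actually consumed, given real arc-on minutes."""
    return watts * (arc_minutes / 60) / 1000

# A 4,500 W machine with 15 minutes of arc time in an hour:
print(round(energy_kwh(4500, 15), 2))  # 1.12 kWh (rounded)
# versus a full hour of continuous arc time:
print(round(energy_kwh(4500, 60), 2))  # 4.5 kWh
```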
Inverter vs. Transformer Efficiency
The type of welder you own makes a meaningful difference in how much electricity gets wasted as heat. Modern inverter-based welders convert 85 to 90% of the electricity they pull from the wall into actual welding output. Older transformer-based machines convert only 60 to 70%. That gap means a transformer welder doing the same job as an inverter unit could use 25 to 40% more electricity to produce identical results.
If you’re buying a new welder and electricity costs matter to you, inverter machines pay for their efficiency over time, especially in a production environment. They’re also lighter and offer more precise arc control, which is why they’ve largely replaced transformers in shops that aren’t running legacy equipment.
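The efficiency gap can be expressed directly: divide the welding output you need by the machine's conversion efficiency to get the power drawn from the wall. A rough sketch using mid-range figures from above (87% inverter, 65% transformer; the output wattage and function name are assumptions for illustration):

```python
def wall_draw_watts(output_watts, efficiency):
    """Power pulled from the wall to deliver a given welding output."""
    return output_watts / efficiency

output = 4000  # watts of actual welding output (example value)
inverter = wall_draw_watts(output, 0.87)     # mid-range inverter efficiency
transformer = wall_draw_watts(output, 0.65)  # mid-range transformer efficiency
print(round(transformer / inverter - 1, 2))  # 0.34, i.e. ~34% more draw
```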
Calculating Your Actual Cost
The formula is straightforward: multiply your machine’s wattage by your actual welding hours, convert to kilowatt-hours, then multiply by your electricity rate.
Here’s a realistic example. Say you’re running a 240V MIG welder at moderate settings, pulling about 4,500 watts. You weld for 20 minutes out of a one-hour session (the rest is prep, grinding, and cooling). That’s 0.33 hours of arc time. Your consumption is 4,500 watts × 0.33 hours = 1,485 watt-hours, or about 1.5 kWh. At the 2025 U.S. average residential rate of 17.3 cents per kWh, that hour in the shop costs you roughly $0.26 in electricity.
Crank the numbers up for heavier work. A 250-amp stick welder pulling 10,000 watts with 30 minutes of arc time in an hour uses 5 kWh, costing about $0.87 at residential rates. Even a full day of heavy welding rarely exceeds $5 to $10 in electricity for a home shop, which is why material costs and consumables dwarf the power bill for most hobbyists.
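The two worked examples above can be reproduced with a short script (the wattages, arc times, and 17.3¢/kWh rate are the figures from the text; the helper name is mine):

```python
def session_cost(watts, arc_hours, rate_per_kwh):
    """Electricity used and its cost: kWh consumed times the rate."""
    kwh = watts * arc_hours / 1000
    return kwh, kwh * rate_per_kwh

RATE = 0.173  # 2025 U.S. average residential rate, $/kWh

kwh, cost = session_cost(4500, 0.33, RATE)   # MIG, 20 min arc time
print(f"{kwh:.2f} kWh, ${cost:.2f}")         # about 1.5 kWh, $0.26
kwh, cost = session_cost(10000, 0.5, RATE)   # 250 A stick, 30 min arc time
print(f"{kwh:.2f} kWh, ${cost:.2f}")         # 5 kWh, about $0.87
```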
Industrial and Robotic Welding
Production welding is a different story. Robotic welding cells in manufacturing environments can consume 50 to 80 kWh per hour when running at high speeds on heavy materials, based on energy analysis of automated welding stations using industrial power sources. At the average U.S. industrial electricity rate of 8.6 cents per kWh, that translates to $4 to $7 per hour per cell. Facilities running dozens of cells across multiple shifts see welding become a significant line item in their energy budget.
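The same arithmetic scales directly to a production cell (the 50 to 80 kWh per hour range and the 8.6¢ industrial rate are from the figures above; the helper name is mine):

```python
def cell_cost_per_hour(kwh_per_hour, rate_per_kwh=0.086):
    """Hourly electricity cost of one robotic welding cell."""
    return kwh_per_hour * rate_per_kwh

for load in (50, 80):  # kWh per hour at high-speed production
    print(f"{load} kWh/h costs about ${cell_cost_per_hour(load):.2f} per hour")
```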
These industrial systems also run at much higher duty cycles than manual welding. Robotic arms don’t stop to reposition or take breaks, so arc-on time can reach 80 to 90% in well-designed production lines. That sustained draw is what pushes consumption so high compared to manual shop welding.
Circuit and Outlet Requirements
Before worrying about electricity cost, make sure your electrical system can actually deliver what your welder needs. Small 120V welders run on a standard 15 or 20-amp household outlet, though they’ll often trip a 15-amp breaker under load. Dedicated 20-amp circuits are safer for these machines.
Most 240V welders need a dedicated circuit with a NEMA 6-50 outlet, which delivers 240 volts at up to 50 amps. This outlet uses two hot wires and a ground wire, protected by a 50-amp double-pole breaker. It’s the same type of plug used for many electric vehicle chargers and heavy-duty power tools. If you’re setting up a garage or shop for welding, having an electrician install a 50-amp 240V circuit gives you enough headroom for virtually any single-phase welder up to about 250 amps of output.
Running a welder on an undersized circuit doesn’t just trip breakers. It can cause voltage drops that hurt weld quality, damage the machine’s electronics, or create a fire hazard from overheated wiring. Match your circuit to the welder’s input requirements listed on the nameplate, not just its output amperage.