What Is a Short in C? Integer Type Explained

A short in C is an integer data type that typically occupies 2 bytes (16 bits) of memory and stores whole numbers from -32,768 to 32,767. It’s the smallest multi-byte integer type in C, designed for situations where you need to save memory and your values fit within that range.

Size and Value Range

A short occupies 2 bytes (16 bits) on virtually all modern systems. This holds true on both 32-bit and 64-bit architectures, unlike int and long, whose sizes can vary across platforms. The C standard guarantees a minimum range of -32,767 to +32,767, but on nearly every system you’ll encounter today, the actual range is -32,768 to 32,767.

You can declare a short in several equivalent ways:

  • short x;
  • short int x;
  • signed short int x;

All three mean the same thing. Most programmers just write short.

Signed vs. Unsigned Short

By default, a short is signed, meaning it can hold negative values. If you only need positive numbers, you can declare an unsigned short, which shifts the entire range upward: 0 to 65,535. You get double the positive range because the bit patterns that would otherwise represent negative values are reused for larger positive values instead.

The exact limits for your compiler are defined as constants in <limits.h>: SHRT_MIN and SHRT_MAX for signed, USHRT_MAX for unsigned. You can print these values to confirm what your system supports.

When to Use a Short

A short saves memory compared to an int (which is typically 4 bytes). This matters when you’re storing large arrays of small values, working in embedded systems with limited RAM, or dealing with binary file formats or network protocols that specify 16-bit fields. For general-purpose variables in everyday code, though, int is usually the better choice because of how C handles arithmetic (more on that below).

Integer Promotion During Arithmetic

One of the most important things to know about short is that C automatically promotes it to int whenever you use it in an expression. If you add two short variables together, the compiler converts both to int before doing the math, and the result is an int.

This means you can’t actually perform arithmetic “in” the short type. The promotion happens silently, and the result needs to be stored back into a short if that’s what you want. If the result exceeds the short range, it gets truncated, which can produce unexpected values. The general rule is simple: any integer type smaller than int gets promoted to int before operations are performed on it.

This is why using short for loop counters or temporary calculations doesn’t actually speed anything up. The CPU works with int-sized values regardless.

Printing and Reading Short Values

To print a short with printf, you use the h length modifier before the format specifier:

  • %hd for a signed short in decimal
  • %hu for an unsigned short in decimal
  • %hx for an unsigned short in hexadecimal

The same modifiers apply when reading input with scanf. In practice, printf will often work fine with just %d because of integer promotion (the short gets promoted to int when passed to the function). But scanf writes directly to a memory address, so using the wrong format specifier there can corrupt adjacent memory. Always use %hd with scanf when reading into a short.

Overflow Behavior

What happens when a short exceeds its maximum value depends on context. Because of integer promotion, adding 1 to a short holding 32,767 actually happens in int arithmetic, producing 32,768 as a perfectly valid int. The trouble comes when you store that back into a short.

For unsigned types, wraparound is well-defined: an unsigned short at 65,535 wraps to 0 when you add 1. For signed types, two distinct things can go wrong. Overflow during the int arithmetic itself is undefined behavior, meaning the compiler is free to do anything; compilers can and do optimize on the assumption that signed overflow never happens, which can introduce subtle bugs. Converting an out-of-range result back into a short, by contrast, is implementation-defined: on practically every system it wraps (so storing 32,768 yields -32,768), but you shouldn’t rely on that being portable.

Short vs. Other Integer Types

  • char: 1 byte. Whether plain char is signed (-128 to 127) or unsigned (0 to 255) is implementation-defined; it’s usually used for characters rather than numbers.
  • short: 2 bytes, range of -32,768 to 32,767.
  • int: Typically 4 bytes, range of roughly -2.1 billion to 2.1 billion. The default choice for integer arithmetic.
  • long: At least 4 bytes (8 bytes on most 64-bit Unix systems).

The C standard only guarantees that short is at least 16 bits and no larger than int. On every mainstream compiler today, short is exactly 16 bits, making it one of the most consistent types across platforms. If you need a guaranteed 16-bit integer regardless of platform, you can also use int16_t from <stdint.h>, which makes your intent explicit.