All sound begins with vibration. Every sound you’ve ever heard, from a whisper to a thunderclap, started with an object vibrating and pushing against the particles around it. That push creates a wave of energy that travels through a medium like air, water, or solid material until it reaches your ears. Without vibration, there is no sound.
How Vibrations Become Sound Waves
When an object vibrates, it doesn’t launch something physical toward your ear. Instead, it nudges the air particles immediately next to it. Those particles bump into their neighbors, which bump into their neighbors, and so on. The energy passes from particle to particle in a chain reaction, much like a ripple moving through a line of dominoes. The particles themselves barely move from their original positions. What travels is the energy.
A useful way to picture this is a slinky stretched across a table. If you push one end forward and pull it back, a pulse travels down the length of the slinky. Each coil pushes or pulls on the next one, passing the disturbance along without any single coil traveling the full distance. Sound works the same way through air molecules.
Compressions and Rarefactions
Sound waves are longitudinal waves, meaning the particles move back and forth in the same direction the wave is traveling (unlike ocean waves, where water moves up and down while the wave moves forward). This back-and-forth motion creates two alternating zones. A compression is where particles are squeezed closest together, and a rarefaction is where they’re spread farthest apart. These alternating regions of high and low pressure ripple outward from the source, and that pattern of pressure changes is what your ear ultimately detects as sound.
Why Sound Needs a Medium
Because sound depends on particles bumping into each other, it cannot travel through a vacuum. In the emptiness of space, there are no air molecules, no water molecules, no particles of any kind to carry the vibration forward. This is why the classic sci-fi scene of an explosion roaring through space is pure fiction. No medium, no sound.
The type of medium matters enormously for how fast sound travels. In air at room temperature, sound moves at roughly 343 meters per second (about 767 miles per hour). In water, where molecules are packed more tightly, it jumps to around 1,480 meters per second. In steel, where atoms are locked in a rigid structure and transfer energy very efficiently, sound races along at approximately 5,120 meters per second. It’s the stiffness of the material that drives this, not the density: stiffer materials snap back faster when compressed, passing the disturbance along more quickly, while higher density on its own actually slows the wave down.
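These speed differences have practical consequences. A short Python sketch, using the approximate speeds quoted above, shows how long sound takes to cross one kilometer in each medium:

```python
# Approximate speed of sound in different media, in meters per second
# (the figures quoted in this section).
SPEEDS_M_PER_S = {
    "air (room temperature)": 343,
    "water": 1480,
    "steel": 5120,
}

def travel_time_ms(distance_m: float, speed_m_per_s: float) -> float:
    """Time in milliseconds for sound to cover distance_m at the given speed."""
    return distance_m / speed_m_per_s * 1000

# How long does sound take to cross one kilometer in each medium?
for medium, speed in SPEEDS_M_PER_S.items():
    print(f"{medium}: {travel_time_ms(1000, speed):.1f} ms")
```

Sound in steel covers the kilometer roughly fifteen times faster than sound in air, which is why pressing an ear to a rail reveals an approaching train long before it can be heard through the air.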
Frequency, Pitch, and What You Hear
Not all vibrations sound the same, and the main reason is frequency: how many times per second the source vibrates. Frequency is measured in hertz (Hz), where one hertz equals one complete vibration cycle per second. A guitar string vibrating 440 times per second produces the note A above middle C. Your brain interprets higher frequencies as higher-pitched sounds and lower frequencies as lower-pitched ones.
Human hearing spans roughly 20 Hz to 20,000 Hz. Sounds below 20 Hz are classified as infrasound. You can’t hear them in the traditional sense, but they can be felt as physical pressure or rumbling. Elephants and some weather phenomena produce infrasound. Sounds above 20,000 Hz are ultrasound, used by bats for navigation and by medical imaging equipment. The 20 to 200 Hz range is considered low-frequency sound, covering the deep bass you feel in your chest at a concert.
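The bands described above translate into a simple classification by frequency. This helper uses the cutoffs from this section; the exact boundaries vary from person to person, so the numbers are approximate:

```python
def classify_frequency(hz: float) -> str:
    """Place a frequency into the bands described above (cutoffs approximate)."""
    if hz < 20:
        return "infrasound"     # felt as pressure or rumble rather than heard
    if hz <= 200:
        return "low-frequency"  # deep bass, still within human hearing
    if hz <= 20_000:
        return "audible"        # the rest of the normal human hearing range
    return "ultrasound"         # bat navigation, medical imaging

print(classify_frequency(10))      # infrasound
print(classify_frequency(50))      # low-frequency
print(classify_frequency(440))     # audible
print(classify_frequency(40_000))  # ultrasound
```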
Frequency also explains why certain notes sound pleasant together. Two sounds whose frequencies form a 2:1 ratio are separated by an octave, a pairing the human ear finds naturally harmonious. A note at 512 Hz played alongside one at 256 Hz is a textbook example.
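Because an octave is a 2:1 frequency ratio, moving up or down by octaves is just repeated doubling or halving. A minimal sketch:

```python
def octave_shift(freq_hz: float, octaves: int) -> float:
    """Shift a frequency by a whole number of octaves (2:1 ratio per octave).

    Positive octaves go up (doubling), negative go down (halving).
    """
    return freq_hz * 2 ** octaves

# The textbook pair from above: 256 Hz shifted up one octave gives 512 Hz.
print(octave_shift(256.0, 1))   # 512.0
# A at 440 Hz, two octaves down, lands at 110 Hz (the open A string on a guitar).
print(octave_shift(440.0, -2))  # 110.0
```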
Amplitude, Loudness, and the Decibel Scale
If frequency determines pitch, amplitude determines loudness. Amplitude is the size of the pressure variation in the wave. A gently plucked guitar string creates small pressure fluctuations and a quiet sound. Pluck it hard, and the string swings further from its resting position, creating larger pressure swings and a louder sound. More amplitude means more energy carried by the wave.
Loudness is measured in decibels (dB), and the scale is logarithmic rather than linear. This means that a 10 dB increase doesn’t represent a small bump in intensity. It represents a tenfold increase in sound energy. The scale works this way because human hearing itself responds logarithmically. The softest sound you can detect and the loudest sound you can tolerate differ by a factor of roughly one trillion in actual energy. A linear scale would be unwieldy, so the decibel system compresses that enormous range into manageable numbers, typically 0 dB (the threshold of hearing) to around 120-130 dB (the threshold of pain).
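The relationship sketched above is captured by the standard formula dB = 10 · log₁₀(I / I₀), where I is the sound’s intensity and I₀ is the reference intensity at the threshold of hearing (conventionally 10⁻¹² watts per square meter). A short sketch of the conversion in both directions:

```python
import math

I0 = 1e-12  # reference intensity at the threshold of hearing, in W/m^2

def intensity_to_db(intensity_w_m2: float) -> float:
    """Convert a sound intensity to decibels relative to the hearing threshold."""
    return 10 * math.log10(intensity_w_m2 / I0)

def db_to_intensity(db: float) -> float:
    """Invert the conversion: decibels back to intensity in W/m^2."""
    return I0 * 10 ** (db / 10)

print(intensity_to_db(I0))   # 0.0 -> the threshold of hearing
print(intensity_to_db(1.0))  # ~120 dB -> roughly the threshold of pain
# A 10 dB step is a tenfold jump in energy:
print(db_to_intensity(70) / db_to_intensity(60))  # ~10
```

Note how an intensity one trillion (10¹²) times the threshold maps to just 120 dB, exactly the compression of range the section describes.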
Why a Violin Doesn’t Sound Like a Flute
When a violin plays an A at 440 Hz, it doesn’t produce a single pure frequency. It simultaneously generates sound at 880 Hz, 1,320 Hz, 1,760 Hz, and so on. These additional frequencies are called harmonics, and they’re whole-number multiples of the fundamental frequency. A flute playing the same A at 440 Hz also produces harmonics, but the relative loudness of each harmonic is different. This mix is what gives each instrument its distinct character, known as timbre.
The French mathematician Joseph Fourier showed that any complex sound wave, no matter how irregular it looks, can be broken down into a combination of simple sine waves at different frequencies and amplitudes. This principle applies to everything: musical instruments, the human voice, traffic noise. The fundamental frequency determines the pitch you perceive, and the blend of harmonics on top of it determines the color or texture of the sound. Remarkably, the fundamental frequency doesn’t even need to be the loudest component for your brain to identify the pitch correctly. It doesn’t need to be present at all. Your auditory system fills in the gap, perceiving the pitch of a “missing fundamental” based on the pattern of harmonics.
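Fourier’s idea can be demonstrated in a few lines: build a complex wave by summing sines at 440 Hz and its harmonics, then recover each component’s strength by correlating the wave against a pure sine and cosine at that frequency (in effect, a single bin of a discrete Fourier transform). The harmonic amplitudes below are invented for illustration, not measured from any real instrument:

```python
import math

SAMPLE_RATE = 8000   # samples per second
N = 8000             # one second of audio

# Invented amplitudes for a 440 Hz fundamental and its first three harmonics.
harmonics = {440.0: 1.0, 880.0: 0.5, 1320.0: 0.3, 1760.0: 0.1}

# Sum the sine components into one complex waveform.
wave = [
    sum(a * math.sin(2 * math.pi * f * n / SAMPLE_RATE) for f, a in harmonics.items())
    for n in range(N)
]

def component_amplitude(wave, freq_hz):
    """Recover the amplitude of one sine component by correlation (a one-bin DFT)."""
    s = sum(x * math.sin(2 * math.pi * freq_hz * n / SAMPLE_RATE)
            for n, x in enumerate(wave))
    c = sum(x * math.cos(2 * math.pi * freq_hz * n / SAMPLE_RATE)
            for n, x in enumerate(wave))
    return 2 * math.hypot(s, c) / len(wave)

# The analysis recovers each harmonic's amplitude; a frequency that was never
# mixed in (660 Hz) comes back with amplitude near zero.
for f in (440.0, 880.0, 1320.0, 1760.0, 660.0):
    print(f"{f:6.0f} Hz: amplitude = {component_amplitude(wave, f):.2f}")
```

The same correlation trick, applied across many frequencies at once, is what a spectrum analyzer does when it displays the harmonic fingerprint that distinguishes a violin from a flute.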
How Your Ear Converts Vibrations to Signals
The final step in the chain from vibration to perception happens inside your inner ear, in a snail-shaped structure called the cochlea. Sound waves enter the ear canal, vibrate the eardrum, pass through three tiny bones in the middle ear, and arrive at the cochlea as mechanical vibrations in fluid. Lining the inside of the cochlea are thousands of specialized sensory cells topped with tiny hair-like projections.
When sound vibrations ripple through the cochlear fluid, these projections tilt. Tiny filaments connecting the tips of adjacent projections, called tip links, stretch and physically pull open channels in the cell membrane. Charged particles flow in, generating an electrical signal. This conversion from mechanical movement to electrical impulse happens in as little as 10 microseconds, fast enough to faithfully capture high-frequency sounds and help your brain pinpoint where a sound is coming from.
That electrical signal triggers the release of chemical messengers at the base of the cell, which activate the auditory nerve. From there, the signal travels to the brain, where it’s interpreted as the sound you consciously hear. Every sound you experience, from a bird chirping to a symphony orchestra, traces back to this same chain: something vibrates, particles carry the energy, and your inner ear translates the mechanical wave into the electrical language of the nervous system.