A star is an immense, luminous celestial body whose temperature dictates its appearance, lifespan, and energy output. The temperature within these spheres of plasma is not uniform; it spans an extraordinary range from millions of degrees in the core to thousands of degrees at the visible surface. Understanding a star’s temperature requires looking at the source of its heat and the light it emits.
The Source of Stellar Heat
The extreme temperatures found deep inside a star are a direct consequence of the star’s own immense gravitational force. The sheer mass of the star compresses the material in its center to such an extent that the pressure and temperature skyrocket. This crushing environment forces hydrogen atoms into a state of plasma, where the electrons are stripped away from the nuclei.
This intense pressure and heat create the conditions necessary for nuclear fusion, the engine that powers the star. Specifically, in stars like our Sun, the core reaches temperatures of approximately 15 million Kelvin, allowing hydrogen nuclei (protons) to fuse together to form helium nuclei. This process, known as the proton-proton chain reaction, converts a tiny amount of mass into an enormous burst of energy according to Einstein’s mass-energy equivalence, E = mc².
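To make that mass-energy bookkeeping concrete, here is a minimal Python sketch (the constants are standard CODATA values) that computes the energy released when four protons fuse into one helium-4 nucleus; roughly 0.7 percent of the input mass is converted to energy. It ignores the positrons and neutrinos also produced along the way, so the figure is slightly below the full 26.7 MeV yield of the complete chain.

```python
# Energy released by fusing four protons into one helium-4 nucleus,
# applying E = mc^2 to the mass deficit.

C = 2.99792458e8            # speed of light, m/s
M_PROTON = 1.67262192e-27   # proton mass, kg (CODATA)
M_HELIUM4 = 6.6446573e-27   # helium-4 nucleus (alpha particle) mass, kg
J_PER_MEV = 1.602176634e-13 # joules per MeV

mass_in = 4 * M_PROTON              # mass entering the reaction
mass_deficit = mass_in - M_HELIUM4  # mass that vanishes (~0.7%)
energy = mass_deficit * C**2        # E = mc^2

print(f"mass converted: {mass_deficit / mass_in:.3%} of the input")
print(f"energy released: {energy:.3e} J (~{energy / J_PER_MEV:.1f} MeV)")
```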
The energy released by fusion works its way outward, counteracting the inward force of gravity and maintaining the star’s structure. This constant energy generation sustains the star’s internal heat, which is then radiated away from the star’s surface.
Measuring Heat by Starlight
While the core’s temperature must be calculated indirectly, a star’s surface temperature is determined by analyzing the light it emits. Astronomers treat stars as approximate blackbody radiators, which means the spectrum of light they produce is directly related to their temperature. As an object heats up, the wavelength at which it emits the most light shifts toward the shorter, more energetic end of the spectrum, a relationship known as Wien’s displacement law.
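Wien’s law takes the simple form λ_max = b / T, where b ≈ 2.898 × 10⁻³ m·K. A minimal Python sketch of the peak shift (the sample temperatures are illustrative):

```python
# Wien's displacement law: a blackbody at temperature T peaks at
# wavelength lambda_max = b / T.

WIEN_B = 2.897771955e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temperature_k: float) -> float:
    """Peak emission wavelength in nanometers for a blackbody at T kelvin."""
    return WIEN_B / temperature_k * 1e9

# Visible light spans roughly 380-750 nm; peaks outside that band
# still shape the star's apparent color through the visible tail.
for temp in (3_500, 5_800, 10_000, 30_000):
    print(f"{temp:>6} K -> peak at {peak_wavelength_nm(temp):7.1f} nm")
```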
This relationship explains why blue stars are much hotter than red stars. Cooler stars, with surface temperatures below about 3,500 Kelvin, primarily radiate light at longer, less energetic wavelengths, giving them a reddish or orange appearance. As a star’s temperature increases, the peak emission wavelength shifts through yellow and green and eventually into the blue and ultraviolet ranges.
A star with a surface temperature of about 6,000 Kelvin, like our Sun, appears yellow because its peak emission is in the green-yellow part of the spectrum. The hottest stars, with temperatures exceeding 30,000 Kelvin, emit a disproportionate amount of their energy in the blue and ultraviolet, causing them to glow with a brilliant blue-white color. By measuring a star’s brightness through specific color filters and taking the difference between those measurements, astronomers obtain a color index (such as B−V) from which they can accurately derive the star’s surface temperature.
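One widely used empirical conversion from the B−V color index to surface temperature is Ballesteros’ blackbody formula; the sketch below applies it. The B−V values for the sample stars are approximate, and because the formula is a pure blackbody fit, it is least accurate for the very hottest and coolest stars.

```python
# Estimate surface temperature from the B-V color index using
# Ballesteros' empirical blackbody formula (Ballesteros 2012).

def temperature_from_bv(b_minus_v: float) -> float:
    """Surface temperature in kelvin from the B-V color index."""
    return 4600.0 * (1.0 / (0.92 * b_minus_v + 1.7)
                     + 1.0 / (0.92 * b_minus_v + 0.62))

# Approximate B-V values: blue-white stars sit near zero or below,
# red stars are strongly positive.
for name, bv in (("Sirius", 0.00), ("Sun", 0.65), ("Betelgeuse", 1.85)):
    print(f"{name:<10} B-V = {bv:+.2f} -> ~{temperature_from_bv(bv):,.0f} K")
```

For the Sun (B−V ≈ 0.65), the formula returns about 5,778 K, in good agreement with the measured value.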
The Stellar Classification System
Astronomers use the spectral classification system, designated by the letters O, B, A, F, G, K, and M, to categorize stars based on their surface temperature. This sequence is arranged from hottest (O-type) to coolest (M-type), with each letter representing a specific temperature range. Within each main class, a star is further subdivided into ten subclasses, numbered 0 to 9, allowing finer temperature distinctions within each letter.
The O-type stars are the most massive and hottest, exhibiting surface temperatures above 30,000 Kelvin and shining with a blue-violet light. Moving down the sequence, B-type stars range from 10,000 to 30,000 Kelvin, and A-type stars, which include Sirius, fall between 7,500 and 10,000 Kelvin, glowing white or blue-white.
Stars in the middle of the sequence, like F-type (6,000 to 7,500 K) and G-type (5,000 to 6,000 K), are yellow or yellow-white. Our Sun is a G2-type star, placing it near the middle of the spectral sequence. K-type stars (3,500 to 5,000 K) appear orange, while M-type stars, the most common in the galaxy, are the coolest, with surface temperatures below 3,500 Kelvin, and appear red.
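These boundaries lend themselves to a simple lookup. The sketch below is illustrative Python that maps a surface temperature to its spectral class using the ranges quoted in this section:

```python
# Map a surface temperature to a spectral class using the boundaries
# quoted above (O, B, A, F, G, K, M from hottest to coolest).

SPECTRAL_CLASSES = [  # (lower bound in kelvin, class letter)
    (30_000, "O"),
    (10_000, "B"),
    (7_500, "A"),
    (6_000, "F"),
    (5_000, "G"),
    (3_500, "K"),
    (0, "M"),
]

def spectral_class(temperature_k: float) -> str:
    """Return the spectral class letter for a given surface temperature."""
    for lower_bound, letter in SPECTRAL_CLASSES:
        if temperature_k >= lower_bound:
            return letter
    raise ValueError("temperature must be non-negative")

print(spectral_class(5_800))   # G  (the Sun, ~5,800 K)
print(spectral_class(9_940))   # A  (Sirius)
print(spectral_class(3_300))   # M  (a cool red dwarf)
```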

