What Is a BSI Sensor? Back-Side Illumination Explained

A BSI (back-side illuminated) sensor is a type of image sensor where the light-sensitive layer faces incoming light directly, without any wiring blocking the path. This design captures significantly more light than older sensor architectures, which is why it has become the standard in smartphone cameras and is increasingly common in dedicated cameras too.

How a BSI Sensor Works

Every digital image sensor contains two essential layers: the photodiodes that detect light and the metal wiring that carries electrical signals. In a traditional front-side illuminated (FSI) sensor, light has to pass through multiple layers of metal wiring before it reaches the photodiodes. That wiring blocks and scatters some of the incoming photons before they ever get detected.

A BSI sensor solves this by flipping the silicon substrate upside down. The wiring isn't removed; the chip is simply oriented so the wiring layer sits behind the photodiodes, and light enters through the thinned back of the silicon instead. Photons therefore hit the light-detecting silicon directly, with no wiring or other obstructions in the way. The result is a sensor that captures more of the available light from any given scene, producing cleaner images with less noise.

Why BSI Sensors Capture Better Images

The practical difference comes down to how efficiently a sensor converts photons into usable image data. With the wiring out of the way, each pixel on a BSI sensor collects a larger share of the light that falls on it. Sony’s original announcement of its back-illuminated technology in 2008 claimed nearly double the sensitivity compared to conventional designs, with lower noise as well.

This matters most in two scenarios: low-light shooting and small pixels. When you’re photographing in dim conditions, every photon counts. A sensor that wastes fewer photons at the wiring stage produces brighter, cleaner images at the same exposure settings. And when pixels are physically tiny (as they are on smartphone sensors, where millions of pixels are packed onto a chip smaller than a fingernail), the wiring on an FSI sensor blocks a proportionally larger share of each pixel’s surface area. BSI removes that bottleneck entirely.
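The pixel-size argument can be made concrete with a toy model. The numbers below are illustrative assumptions, not specifications of any real sensor: we pretend a fixed-width wiring border occupies the edge of each FSI pixel's front surface and compute how much of the pixel's area remains open to light.

```python
# Illustrative model (assumed numbers, not real sensor specs): estimate
# what fraction of an FSI pixel's area stays usable for light collection
# when a fixed-width wiring border occupies part of the front surface.

def fill_factor(pixel_pitch_um: float, wiring_border_um: float) -> float:
    """Fraction of the pixel area left open after subtracting a wiring
    border of fixed width on every side (FSI-style obstruction).
    A BSI pixel has no such border, so its fill factor is ~1.0."""
    open_side = pixel_pitch_um - 2 * wiring_border_um
    if open_side <= 0:
        return 0.0
    return (open_side / pixel_pitch_um) ** 2

# Assume a 0.25 µm wiring border (hypothetical figure) on two pixel sizes.
large_pixel = fill_factor(6.0, 0.25)   # ballpark full-frame pixel pitch
small_pixel = fill_factor(1.0, 0.25)   # ballpark smartphone pixel pitch

print(f"6.0 µm FSI pixel fill factor: {large_pixel:.2f}")  # ~0.84
print(f"1.0 µm FSI pixel fill factor: {small_pixel:.2f}")  # 0.25
```

The same border width that costs a large pixel about 16% of its area costs a tiny smartphone pixel 75%, which is why BSI mattered first and most in phones.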

BSI designs also tend to read out faster. Because the wiring layer no longer has to stay out of the light path, it can be laid out more freely and densely, which speeds up signal transfer. Faster readout makes fully electronic shutters practical, improves autofocus response, and enables higher burst shooting rates with continuous autofocus tracking.

Why Smartphones Depend on BSI

BSI technology is arguably the single biggest reason smartphone cameras have improved so dramatically over the past decade. Phone manufacturers face a constant tension: users want better image quality (which generally means larger sensors) but won't accept thicker phones. BSI sensors helped resolve this by allowing the lens to sit closer to the sensor surface, enabling larger sensors without adding thickness to the phone body.

Smartphone camera sensors have been steadily growing in size even as phones stay slim. Without BSI architecture, this wouldn’t have been feasible. The combination of larger sensors and more efficient light collection is a major part of what has closed the image quality gap between phones and dedicated cameras.

BSI in Dedicated Cameras

BSI isn’t limited to phones. Many modern mirrorless cameras use BSI CMOS sensors, where the benefits shift from saving space to maximizing image quality. In larger sensors with bigger pixels, the advantage over FSI is less dramatic than in tiny smartphone chips, but improved light collection still translates to cleaner high-ISO performance and better dynamic range. The faster readout speeds also reduce rolling shutter distortion during video recording and fast action photography.
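The rolling-shutter benefit is easy to estimate with back-of-envelope arithmetic: with an electronic shutter, rows are read one after another, so a moving subject shifts position between the first and last rows being read. The readout times and subject speed below are assumptions for illustration, not measurements of any particular camera.

```python
# Back-of-envelope rolling-shutter skew. Rows are read sequentially, so
# a moving subject is displaced between the top row's readout and the
# bottom row's. All timings here are assumed, not measured.

def skew_pixels(readout_time_s: float, subject_speed_px_per_s: float) -> float:
    """Displacement (in pixels) of the subject over one full sensor readout."""
    return readout_time_s * subject_speed_px_per_s

subject_speed = 2000.0                       # subject crossing at 2000 px/s
slow = skew_pixels(1 / 30, subject_speed)    # ~33 ms readout: ~66.7 px skew
fast = skew_pixels(1 / 250, subject_speed)   # ~4 ms readout: 8.0 px skew
print(f"slow readout: {slow:.1f} px skew, fast readout: {fast:.1f} px skew")
```

Cutting readout time by roughly 8x cuts the visible skew by the same factor, which is why fast-readout sensors make electronic shutters usable for action.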

Stacked BSI: The Next Step

Stacked CMOS sensors build on the BSI concept by going further with the layering. Instead of just flipping the wiring behind the photodiodes, a stacked design also integrates the image signal processor and ultra-fast memory directly into the same chip, layered beneath the sensor. This makes readout speeds even faster, which improves autofocus performance and reduces the visual artifacts that come from reading an image line by line.

Stacked BSI sensors appear in flagship smartphones and high-end mirrorless cameras. They’re more expensive to manufacture, but the performance gains in speed and processing power are substantial.

How BSI Sensors Are Made

Manufacturing a BSI sensor is more complex than building a traditional one. The process requires thinning the silicon wafer after the circuits are built, because you need light to reach the photodiodes from the back side through a very thin layer of silicon. In practice, this means mechanically grinding the wafer down and then using a chemical etching process to remove additional material and release the stress caused by grinding.
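A quick sketch shows why the remaining silicon must be so thin: the fraction of light absorbed in a layer of thickness d follows the Beer-Lambert law, 1 - exp(-alpha * d), where alpha is silicon's wavelength-dependent absorption coefficient. The coefficients below are rough room-temperature values used purely for illustration.

```python
# Beer-Lambert sketch of light absorption in a thin silicon layer.
# Absorption coefficients are approximate room-temperature values for
# crystalline silicon, included only for illustration.
import math

ALPHA_PER_CM = {            # approximate absorption coefficients (1/cm)
    "blue (450 nm)":  2.5e4,
    "green (550 nm)": 7.0e3,
    "red (650 nm)":   2.5e3,
}

def absorbed_fraction(alpha_per_cm: float, thickness_um: float) -> float:
    """Fraction of incident light absorbed in a silicon layer of the
    given thickness, per the Beer-Lambert law: 1 - exp(-alpha * d)."""
    d_cm = thickness_um * 1e-4   # micrometers -> centimeters
    return 1.0 - math.exp(-alpha_per_cm * d_cm)

# A few micrometers of silicon is enough to absorb most visible light.
for color, alpha in ALPHA_PER_CM.items():
    print(f"{color}: {absorbed_fraction(alpha, 3.0):.0%} absorbed in 3 µm")
```

With these representative coefficients, a 3 µm layer absorbs nearly all blue light and a majority of red, so a layer of only a few micrometers can do the job; a much thicker one would be unnecessary, and light entering from the back must not be absorbed before reaching the photodiode region.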

This thinning step is delicate. Mechanical grinding can create micro-damage and stress in the silicon that may cause chips to crack during later manufacturing stages, so manufacturers use a secondary etching process to strip away the damaged surface layer. In the finished sensor, the silicon remaining over the photodiodes is only a few micrometers thick, a small fraction of the width of a human hair. Getting this consistently right at scale was the main reason BSI sensors took years to move from concept to commercial product.

A Brief Timeline

Sony announced its back-illuminated CMOS image sensor technology in June 2008 and shipped the first consumer cameras using it, the TX1 and WX1 compacts, in 2009 under the “Exmor R” branding. Within a few years, BSI became standard in smartphone image sensors. Today, virtually every phone camera and a growing number of mirrorless cameras use some form of BSI or stacked BSI architecture. Traditional front-side illuminated sensors are increasingly limited to lower-cost applications where maximum light sensitivity isn’t critical.