What Is a Sequential Circuit and How Does It Work?

A sequential circuit is a digital circuit whose output depends not only on its current inputs but also on its history of past inputs. This is what separates it from the other major category of digital circuits, combinational circuits, which produce outputs based purely on whatever inputs are present right now. The key difference is memory: a sequential circuit can remember what happened before and use that stored information to determine what it does next.

How Sequential Circuits Store Information

At its core, a sequential circuit is really just a combinational circuit with a feedback loop. Some of the circuit’s outputs are routed back into it as inputs, creating a loop that allows information to persist over time. This feedback is what gives the circuit its memory. The outputs that feed back are called “present state” variables because they represent the circuit’s current condition, while the new values heading into the feedback path are called “next state” variables because they determine what state the circuit will enter next.

If a circuit’s memory path has, say, 3 digital lines running through it, those lines can store 2³ (eight) different binary patterns. That means the circuit has eight possible internal states. Each state represents a unique configuration of stored values, and the circuit transitions between these states based on a combination of its current inputs and whichever state it’s already in.
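The feedback idea can be sketched in a few lines of Python. This is a simulation, not hardware: the `next_state` function stands in for the combinational logic, and the function name and the choice of "add modulo 8" are illustrative assumptions.

```python
# Sketch: a sequential circuit as combinational logic plus feedback.
# A 3-bit state can take 2**3 = 8 values (0 through 7).

def next_state(present_state: int, inp: int) -> int:
    """Combinational 'next state' logic: here, just add the input modulo 8."""
    return (present_state + inp) % 8

state = 0  # present state, held by the feedback path
for inp in [1, 1, 3, 1]:
    # Feedback: this cycle's next state becomes the next cycle's present state.
    state = next_state(state, inp)

print(state)  # 0 -> 1 -> 2 -> 5 -> 6
```

The loop makes the feedback explicit: the output of the state logic is routed back in as an input on the next pass.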

Latches and Flip-Flops: The Building Blocks

The memory inside a sequential circuit comes from small storage elements called latches and flip-flops. A latch is the simpler of the two. It provides a basic form of memory by holding a value, but it is level-sensitive: it responds to input changes at any time while it is enabled. That flexibility can be a problem when you need multiple storage elements to update in a coordinated way.
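A classic latch is the SR (set-reset) latch, built from two cross-coupled NOR gates. The sketch below simulates that feedback loop; the settling loop and function names are modeling choices for illustration, not part of any real hardware description.

```python
# Sketch of an SR (set-reset) latch built from two cross-coupled NOR gates.

def nor(a: int, b: int) -> int:
    return 0 if (a or b) else 1

def sr_latch(s: int, r: int, q: int, qn: int):
    """Iterate the feedback loop until the outputs settle."""
    for _ in range(4):  # a few passes are enough for stable inputs
        q, qn = nor(r, qn), nor(s, q)  # each gate's output feeds the other
    return q, qn

q, qn = sr_latch(s=1, r=0, q=0, qn=1)   # set: q becomes 1
q, qn = sr_latch(s=0, r=0, q=q, qn=qn)  # hold: no inputs, feedback keeps q at 1
print(q, qn)  # 1 0
```

The "hold" call is the key point: with both inputs at 0, the feedback alone preserves the stored bit.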

Flip-flops solve this by adding a clock signal. A flip-flop only updates its stored value at a precise moment, typically the rising or falling edge of the clock. This narrow update window means you can synchronize dozens or thousands of flip-flops so they all change state at exactly the same time. Think of the clock as a metronome keeping all the parts of the circuit in step.
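Edge-triggering can be modeled by remembering the previous clock level and updating only on a 0-to-1 transition. The class below is a minimal sketch of a rising-edge D flip-flop; the class and method names are assumptions for illustration.

```python
# Sketch of a rising-edge-triggered D flip-flop: the stored value only
# changes when the clock goes from 0 to 1.

class DFlipFlop:
    def __init__(self):
        self.q = 0           # stored value
        self._prev_clk = 0   # last clock level seen

    def tick(self, clk: int, d: int) -> int:
        if clk == 1 and self._prev_clk == 0:  # rising edge detected
            self.q = d
        self._prev_clk = clk
        return self.q

ff = DFlipFlop()
ff.tick(clk=0, d=1)  # no edge: q stays 0
ff.tick(clk=1, d=1)  # rising edge: q captures 1
ff.tick(clk=1, d=0)  # clock held high, d changes: q is still 1
print(ff.q)  # 1
```

The last call shows the point of edge-triggering: once the edge has passed, later input changes are ignored until the next edge.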

For a flip-flop to work reliably, the input data must be stable for a short period before the clock edge (called setup time) and remain stable for a short period after it (called hold time). If data is still changing during either window, the flip-flop can enter an unpredictable state called metastability, where it might settle on the correct value or the wrong one. Designing circuits to meet these timing requirements is a fundamental part of digital engineering.
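A rough sketch of the setup/hold rule as a timing check, with the clock edge at t = 0. The window values here are made up for illustration; real values come from a device's datasheet.

```python
# Sketch of a setup/hold check: given when the data line last changed
# relative to a clock edge at t=0 (negative = before the edge), flag a
# potential metastability risk. Window values are illustrative only.

SETUP_NS = 0.5  # data must be stable this long *before* the edge
HOLD_NS = 0.3   # ...and stay stable this long *after* it

def violates_timing(data_change_ns: float) -> bool:
    """True if a data transition falls inside the setup/hold window."""
    return -SETUP_NS < data_change_ns < HOLD_NS

print(violates_timing(-1.0))  # False: changed well before the edge
print(violates_timing(-0.2))  # True: inside the setup window
print(violates_timing(0.1))   # True: inside the hold window
```

Static timing analysis tools perform essentially this check, path by path, across an entire design.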

Synchronous vs. Asynchronous Circuits

Sequential circuits come in two broad flavors depending on how they handle timing. Synchronous sequential circuits use a single clock signal that governs when every storage element in the circuit updates. All state changes happen on the same clock edge, which makes the circuit’s behavior predictable and easier to design around. The tradeoff is speed: the clock period must be long enough for signals to propagate through the slowest path between storage elements, and distributing the clock across the entire circuit takes time of its own, which limits how fast the system can operate.

Asynchronous sequential circuits have no clock at all. Instead, state changes happen directly in response to input changes, whenever they arrive. This makes asynchronous circuits potentially faster, since they don’t wait for a clock tick. But the lack of synchronization introduces a serious risk called a race condition. If two input signals arrive at slightly different times, the circuit may process them in the wrong order and land in an incorrect state. Designing reliable asynchronous circuits is significantly harder, which is why the vast majority of digital systems use synchronous designs.

Moore and Mealy State Machines

Engineers use two formal models to describe how sequential circuits behave. In a Moore machine, the outputs depend only on the current state. Whenever the circuit transitions to a new state on a clock edge, the output changes accordingly. This makes outputs synchronous with the clock, which simplifies timing analysis.

In a Mealy machine, the outputs depend on both the current state and the current inputs. Because the inputs can change at any moment, the outputs of a Mealy machine can change between clock edges, making them asynchronous in nature. Mealy machines often require fewer states than an equivalent Moore machine to accomplish the same task, since the inputs directly influence the output without waiting for a full state transition.

Both models are equally capable of representing any sequential behavior. The choice between them typically comes down to design priorities: Moore machines are easier to debug and test, while Mealy machines can respond to inputs faster.

Representing Circuit Behavior

Two tools are commonly used to map out what a sequential circuit does. A state diagram uses circles to represent states and arrows to represent transitions between them. Each arrow is labeled with the input that causes the transition and the output it produces. You can trace through a state diagram by following the arrows to see exactly how the circuit responds to any sequence of inputs.

A state table presents the same information in a grid format. The rows list every possible current state, the columns list every possible input combination, and each cell contains the next state and output. State tables, state diagrams, and the circuit itself all describe the same behavior in different forms. Engineers often start with one representation and convert to the others during the design process.
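A state table maps naturally onto a dictionary keyed by (current state, input), with each cell holding the next state and output. The toggle-style machine below is an illustrative assumption, chosen only to show the lookup-and-trace pattern.

```python
# Sketch: a state table as a dict. Keys are (current_state, input);
# values are (next_state, output). The machine itself is illustrative.

state_table = {
    ("OFF", 0): ("OFF", 0),
    ("OFF", 1): ("ON", 1),
    ("ON", 0):  ("ON", 1),
    ("ON", 1):  ("OFF", 0),
}

state, outputs = "OFF", []
for inp in [1, 0, 1, 1]:         # an input sequence to trace
    state, out = state_table[(state, inp)]
    outputs.append(out)
print(state, outputs)  # ON [1, 1, 0, 1]
```

Tracing the table row by row like this is exactly the manual exercise of following arrows around a state diagram.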

Common Applications

Sequential circuits are everywhere in digital electronics. Registers, which are collections of two or more flip-flops working in parallel, store multi-bit values like a byte of data. Every processor uses registers to hold instructions, addresses, and intermediate results during computation.
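A register can be modeled as a bank of flip-flops that all capture their input bits on the same clock edge. The class below is a minimal sketch; the names and the integer-to-bits packing are modeling choices.

```python
# Sketch: an 8-bit register as eight D flip-flops loading in parallel.

class Register:
    def __init__(self, width: int = 8):
        self.bits = [0] * width  # one stored bit per flip-flop

    def clock_edge(self, value: int):
        """On a rising edge, every flip-flop captures its input bit at once."""
        self.bits = [(value >> i) & 1 for i in range(len(self.bits))]

    def read(self) -> int:
        return sum(b << i for i, b in enumerate(self.bits))

reg = Register()
reg.clock_edge(0xA5)    # load a whole byte in parallel
print(hex(reg.read()))  # 0xa5
```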

Counters are sequential circuits that increment (or decrement) their stored value on every clock cycle. A simple counter cycles through values like 0, 1, 2, 3 and so on, and typically includes an enable signal to pause counting and a reset signal to return to zero. Counters show up in everything from digital clocks to memory address generators.
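A counter with the enable and reset behavior described above can be sketched as follows; the signal names and 4-bit width are assumptions for illustration.

```python
# Sketch of a 4-bit up counter with an enable input and synchronous reset.

class Counter:
    def __init__(self, width: int = 4):
        self.value = 0
        self.width = width

    def clock_edge(self, enable: bool = True, reset: bool = False):
        if reset:
            self.value = 0  # reset wins over counting
        elif enable:
            self.value = (self.value + 1) % (2 ** self.width)  # wrap at 16

ctr = Counter()
for _ in range(5):
    ctr.clock_edge()              # count: 1, 2, 3, 4, 5
ctr.clock_edge(enable=False)      # enable low: counting pauses
paused_value = ctr.value          # still 5
ctr.clock_edge(reset=True)        # back to zero
print(paused_value, ctr.value)    # 5 0
```

The modulo keeps the behavior faithful to hardware: a real 4-bit counter wraps from 15 back to 0.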

Shift registers move stored data one position left or right on each clock cycle. They’re used in serial communication, where data arrives one bit at a time and needs to be assembled into a full word, and in signal processing applications. More advanced shift registers can shift in either direction, load new values in parallel, or reset, with mode-select inputs determining which operation happens on each clock edge.
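The serial-to-parallel use case looks like this in a sketch; shifting in the most significant bit first is an assumption for illustration.

```python
# Sketch: a serial-in shift register assembling a byte one bit per clock,
# most significant bit first.

def shift_in(register: list, bit: int) -> list:
    """Shift every bit left one position; the new bit enters on the right."""
    return register[1:] + [bit]

reg = [0] * 8
for bit in [1, 0, 1, 1, 0, 0, 1, 0]:  # serial data arriving over 8 clocks
    reg = shift_in(reg, bit)

word = int("".join(map(str, reg)), 2)  # read all 8 bits out in parallel
print(hex(word))  # 0xb2
```

After eight clocks the full word is available in parallel, which is exactly the serial-to-parallel conversion a UART receiver performs.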

At a larger scale, RAM itself is built from sequential circuit principles. Each static RAM (SRAM) cell stores a bit using feedback between cross-coupled gates, and the memory system as a whole relies on clocked control signals to read and write data reliably.

How FPGAs Implement Sequential Logic

Modern programmable chips called FPGAs (field-programmable gate arrays) implement sequential circuits using configurable logic blocks, or CLBs. Each CLB typically contains small lookup tables for combinational logic along with edge-triggered flip-flops for storage. The flip-flops can be configured to trigger on either the rising or falling clock edge and include enable and reset signals for additional control.

When implementing a state machine on an FPGA, engineers choose how to encode the states. Binary encoding uses the fewest flip-flops but requires more complex combinational logic. One-hot encoding assigns one flip-flop per state, using more flip-flops but simplifying the logic between them. Since FPGAs have flip-flops in abundance, one-hot encoding is generally preferred because it produces simpler, faster designs. For particularly complex state machines, modern FPGAs also include embedded memory blocks that can store state transition information directly, avoiding the need to build the transition logic out of individual gates at all.
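The two encodings can be compared concretely. For a hypothetical four-state machine (the state names below are made up), binary encoding needs two flip-flops while one-hot needs four:

```python
# Sketch: binary vs. one-hot encodings for a 4-state machine.
# ceil(log2(4)) = 2 flip-flops for binary; one flip-flop per state for one-hot.

states = ["IDLE", "LOAD", "RUN", "DONE"]

binary = {s: format(i, "02b") for i, s in enumerate(states)}        # 2 bits
one_hot = {s: format(1 << i, "04b") for i, s in enumerate(states)}  # 4 bits

print(binary)   # IDLE->'00', LOAD->'01', RUN->'10', DONE->'11'
print(one_hot)  # IDLE->'0001', LOAD->'0010', RUN->'0100', DONE->'1000'
```

With one-hot, checking "are we in RUN?" reads a single flip-flop, which is why the next-state logic tends to be shallower and faster.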