Haptics is the science of creating touch sensations through technology. Derived from the Greek “haptesthai,” meaning “to touch,” the field covers any system that lets you feel something that isn’t physically there, whether that’s the click of a virtual button on a touchscreen, the recoil of a gun in a video game, or the resistance a surgeon feels when a robot arm presses against tissue. The global haptics market is projected to reach $11.27 billion in 2026, growing at nearly 10% per year, driven largely by gaming, smartphones, and the expanding world of connected devices.
Two Types of Touch Feedback
Haptic technology works by mimicking two distinct channels your body uses to sense the physical world. The first is tactile feedback: sensations on the surface of your skin like pressure, vibration, and texture. The second is kinesthetic feedback: the deeper sense of force and motion that comes from your muscles, tendons, and joints. When you grip a heavy suitcase, your skin feels the handle pressing into your palm (tactile), while your arm muscles register the weight pulling downward (kinesthetic).
Most consumer devices focus on tactile feedback. The vibration in your phone when you type on a glass keyboard is tactile. But more advanced systems, like the force-feedback steering wheels used in racing simulators, combine both types. They push back against your hands to recreate the resistance of turning real tires on pavement.
How Your Skin Actually Senses Touch
To understand why haptic technology works, it helps to know what it’s imitating. Your skin contains four main types of mechanoreceptors, specialized pressure sensors, each tuned to different signals. Some sit near the surface and detect fine details like edges and textures. Others are buried deeper and pick up vibration or skin stretching. On hairless skin like your fingertips and palms, the sensors responsible for detecting light touch and object handling are the most densely packed, which is why your fingers are so much more sensitive than, say, your back.
Some of these sensors respond quickly and then go quiet (which is why you stop noticing the feeling of your shirt after a few minutes). Others fire continuously as long as pressure is applied. Haptic engineers exploit these differences. A quick, sharp vibration pulse activates the fast-responding sensors and feels like a tap. A sustained, low-frequency rumble engages the slower ones and feels like steady pressure.
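The tap-versus-pressure distinction above can be sketched as two drive waveforms. This is a minimal illustration; the sample rate, frequencies, and amplitudes are assumed values for demonstration, not figures from any particular device.

```python
import math

SAMPLE_RATE = 8000  # assumed actuator drive rate, in samples per second

def tap(duration_s=0.01, freq_hz=250.0):
    """Short, sharp burst aimed at fast-responding sensors: feels like a click."""
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE) for i in range(n)]

def rumble(duration_s=0.5, freq_hz=30.0):
    """Sustained low-frequency drive aimed at the slower sensors: feels like steady pressure."""
    n = int(SAMPLE_RATE * duration_s)
    return [0.8 * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE) for i in range(n)]
```

The two signals differ only in frequency and duration, yet they target different receptor populations and so produce qualitatively different sensations.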
The Hardware Behind the Feeling
The simplest haptic devices use a small motor with an off-center weight that spins to create vibration. This is the technology behind the basic buzz in older phones and game controllers. It’s cheap and reliable, but it can only produce one kind of sensation: a generic rumble.
More advanced systems use voice-coil actuators, similar in design to a tiny speaker. Instead of producing sound you hear, they produce precise vibrations you feel. The PS5’s DualSense controller uses this approach with highly programmable voice-coil actuators that can simulate everything from raindrops to the gritty drag of driving through mud. Compared to older rumble motors, these actuators produce both more powerful and more nuanced feedback.
At the high end, piezoelectric actuators use crystals that change shape when voltage is applied. They respond in under one millisecond and can operate across a frequency range of 1 to 1,000 Hz, allowing designers to create highly detailed vibration profiles. That speed matters because even small delays between what you see on screen and what you feel in your hand break the illusion of realism.
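A designer exploiting that wide band might sweep a piezo actuator across its full frequency range to profile how each frequency feels. The sketch below generates such a sweep; the sample rate and sweep bounds are illustrative assumptions.

```python
import math

def sweep(duration_s=1.0, f_start=1.0, f_end=1000.0, sample_rate=44100):
    """Linear frequency sweep across the actuator's usable band.
    Accumulating phase sample-by-sample keeps the waveform continuous
    as the instantaneous frequency rises."""
    n = int(duration_s * sample_rate)
    samples, phase = [], 0.0
    for i in range(n):
        f = f_start + (f_end - f_start) * (i / n)  # current frequency
        phase += 2 * math.pi * f / sample_rate     # advance by one sample's worth of phase
        samples.append(math.sin(phase))
    return samples
```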
Haptics in Phones and Touchscreens
Every time you feel a subtle click while typing on a smartphone, that’s haptics at work. Modern phones use linear actuators to produce short, crisp taps timed to each keypress, creating the illusion of pressing a physical button on a flat glass surface. This same technology powers the “clicks” you feel when adjusting a volume slider or long-pressing an app icon.
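At its core, this kind of system maps UI events to short pulse parameters for the actuator. The table below is a hypothetical illustration of that mapping, not any vendor’s actual API; real mobile haptic APIs expose similar amplitude and duration knobs.

```python
# Hypothetical event-to-pulse table: amplitude is normalized 0-1,
# duration is in milliseconds.
CLICK_EFFECTS = {
    "keypress":    {"amplitude": 0.6, "duration_ms": 8},
    "slider_tick": {"amplitude": 0.3, "duration_ms": 5},
    "long_press":  {"amplitude": 1.0, "duration_ms": 15},
}

def pulse_for(event):
    """Look up the pulse the actuator should play for a UI event."""
    return CLICK_EFFECTS.get(event)
```

The short durations matter: a pulse under roughly 10 ms reads as a crisp tap, while longer pulses start to feel like a buzz.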
The shift from physical buttons to touchscreens in cars has made haptics especially important for driving safety. Flat screens require you to look at what you’re pressing, which pulls your eyes off the road. One divided-attention study found that when drivers performed touchscreen tasks using haptic-only feedback (feeling their way through controls rather than looking), their visual attention on a simultaneous search task improved significantly compared to using visual or combined feedback. In practical terms, haptic confirmation lets you adjust the climate control or change a radio station by feel, the way you once could with physical knobs.
Gaming and Virtual Reality
Gaming is where most people first encounter advanced haptics. The PS5’s DualSense controller is a good example of how far the technology has come. Rather than a uniform buzz, it delivers micro-sensations that correspond to what’s happening on screen. Pulling a bowstring feels different from revving an engine, and walking on sand feels different from walking on metal. Microsoft’s Xbox controllers offer a simpler version of this through impulse triggers, which embed small rumble motors in each trigger for localized vibration.
In VR headsets and gloves, haptics become even more critical. When you reach out and “grab” a virtual object, haptic gloves can stiffen against your fingers to simulate the object’s shape and resistance. Without this feedback, VR feels like waving your hands through air. With it, your brain begins to treat virtual objects as real.
Surgical Robots and Medical Training
One of the most consequential applications of haptics is in robot-assisted surgery. When a surgeon operates through a robotic system, they’re sitting at a console several feet (or sometimes miles) from the patient, controlling robotic arms with hand movements. The goal is “transparency,” where the surgeon feels as though their own hands are touching the patient rather than operating a remote mechanism.
This requires sensors on the robotic instruments that measure the forces applied to tissue, then motors in the hand controls that recreate those forces for the surgeon. Research on these systems consistently shows that haptic feedback reduces unintentional injuries during delicate tasks like dissection. In one study using a modified da Vinci surgical system, direct force feedback outperformed visual-only force displays during tissue examination. Other research found that force feedback reduces the amount of force surgeons accidentally apply to tissue, lowering the risk of damage. Interestingly, while both trained surgeons and non-surgeons benefited from haptic feedback in terms of safety, only the trained surgeons maintained their speed while doing so.
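The force-reflection loop described above reduces to a few steps: measure, scale, clamp, replay. The sketch below shows the middle of that loop; the scaling factor and safety limit are illustrative assumptions.

```python
def reflected_force(sensed_n, scale=1.0, limit_n=5.0):
    """Scale the contact force (in newtons) measured at the instrument tip,
    then clamp it so a sensor spike can never slam the surgeon's hand controller."""
    f = scale * sensed_n
    return max(-limit_n, min(limit_n, f))
```

The clamp is a safety feature: a glitching force sensor should degrade the illusion of transparency, not injure the operator’s hand.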
Some systems also use “virtual fixtures,” software-generated boundaries that push back against the surgeon’s hand to prevent the instrument from entering a dangerous zone. Think of it like invisible guardrails that the surgeon can feel but not see.
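A virtual fixture can be modeled as a one-sided spring: zero force while the instrument stays in the safe region, and a restoring push that grows with penetration into the forbidden zone. The stiffness value below is an illustrative assumption.

```python
def fixture_force(penetration_mm, stiffness_n_per_mm=0.5):
    """One-sided spring: no force while the tool is in the safe region,
    a push back toward the boundary proportional to how far it has crossed."""
    if penetration_mm <= 0.0:   # tool has not crossed the boundary
        return 0.0
    return -stiffness_n_per_mm * penetration_mm  # negative = push back out
```

Because the force ramps up smoothly rather than appearing as a hard wall, the surgeon feels the guardrail before fully hitting it.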
Accessibility and Navigation Aids
For people with visual impairments, haptics can replace visual information with touch-based cues. Researchers have developed wearable belts equipped with an array of small vibrating actuators positioned around the user’s abdomen. These belts connect to obstacle-detection systems and vibrate in specific patterns to indicate where nearby objects are, how close they are, and which direction to move. A buzz on your left hip might mean “obstacle to your left,” while increasing intensity means “it’s getting closer.”
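The belt’s core logic is a mapping from an obstacle’s direction to an actuator position, and from its distance to a vibration intensity. The actuator count and maximum sensing range below are assumed values for illustration.

```python
NUM_ACTUATORS = 8  # assumed: actuators evenly spaced around the waist

def belt_command(bearing_deg, distance_m, max_range_m=3.0):
    """Map an obstacle's bearing (0 = straight ahead, clockwise) to the
    nearest actuator index, and its distance to an intensity in [0, 1]
    where closer obstacles vibrate more strongly."""
    idx = round((bearing_deg % 360) / (360 / NUM_ACTUATORS)) % NUM_ACTUATORS
    intensity = max(0.0, min(1.0, 1.0 - distance_m / max_range_m))
    return idx, intensity
```

So an obstacle a meter and a half away at 90 degrees would drive the actuator on the right hip at half intensity, and grow stronger as the user approaches it.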
These systems are still largely in the research phase, but early results from studies combining haptic belts with audio cues in virtual environments show promise for navigation training. The key advantage of haptic feedback over audio-only systems is that it doesn’t compete with environmental sounds the user needs to hear, like traffic or conversation.
Touch Without Contact
Perhaps the most striking recent development is mid-air haptics, which creates the sensation of touch without any wearable device at all. Arrays of ultrasonic transducers focus sound waves onto a precise point on your skin. The sound is far above the range of human hearing, but when the waves converge on your hand, they create tiny vibrations in the skin tissue. The sensation has been described as feeling like gentle, pressurized airflow on your palm.
By modulating the ultrasound at around 200 Hz, the frequency your skin’s pressure sensors are most sensitive to, engineers can create shapes and textures you can feel floating in the air. Synchronizing multiple transducers with precise timing allows for multiple focal points, so you can feel what seems like a three-dimensional object or a row of buttons hovering in front of you. This technology is being explored for everything from public kiosks (where touchscreens spread germs) to car dashboards and interactive museum exhibits.
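The focusing step comes down to timing: each transducer must fire with a phase offset that compensates for its distance to the focal point, so all waves arrive in phase. The sketch below assumes a one-dimensional array on the x-axis, a 40 kHz carrier (a common ultrasonic transducer frequency), and the speed of sound in air; a real system would then modulate the focal point’s amplitude at around 200 Hz to make it feelable.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
CARRIER_HZ = 40_000      # assumed ultrasonic carrier frequency

def phase_delays(transducer_xs, focal_point):
    """Per-transducer phase offsets (radians) so all waves arrive at the
    focal point in phase. Transducers sit on the x-axis at height z=0;
    the focal point is (x, z) in metres."""
    fx, fz = focal_point
    wavelength = SPEED_OF_SOUND / CARRIER_HZ
    dists = [math.hypot(fx - x, fz) for x in transducer_xs]
    ref = max(dists)  # the farthest element fires with zero offset
    return [(2 * math.pi * (ref - d) / wavelength) % (2 * math.pi) for d in dists]
```

For a focal point centered above the array, the outer elements get identical offsets by symmetry, while the center element, being closest, is delayed the most so its wave doesn’t arrive early.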