What Will Phones Look Like in 10 Years: AI, AR, and More

Phones in 10 years will still be recognizable as phones, but they’ll be thinner, smarter, and far more connected to the world around you. The biggest shifts won’t come from a single breakthrough but from several technologies maturing at once: flexible displays, on-device AI powerful enough to replace many apps, solid-state batteries with two to three times the energy density of today’s, and AR glasses that turn your phone into a background engine rather than something you stare at all day.

Screens That Bend, Roll, and Stretch

The rigid glass slab has dominated phone design for over 15 years, but foldable phones have already cracked the door open to something different. The next step is rollable and stretchable displays: screens that expand from a compact form into a tablet-sized surface when you need it and tuck back down when you don’t. Samsung and LG have shown working prototypes of rollable screens that pull out from one side of the device like a scroll, and the underlying OLED technology is advancing quickly enough that these designs should move from novelty to mainstream within the decade.

What this means in practice: your phone might sit comfortably in a front pocket at 4 inches, then stretch to 7 or 8 inches for reading, video, or productivity. The phone and the tablet merge into one device. The tradeoff today is durability, since flexible screens scratch and crease more easily than glass. That’s where new materials come in.

Surfaces That Repair Themselves

Self-healing polymers are already proven in lab settings. These materials contain tiny capsules or chemical networks that activate when scratched, flowing into the damage and bonding back together. Current prototypes can heal surface scratches at room temperature in roughly 10 to 48 hours depending on the depth and size of the mark. Within a decade, phone coatings could recover from everyday pocket scratches overnight, reducing the need for screen protectors and bulky cases. Full recovery from deeper scrapes currently takes one to seven days in lab conditions, so the technology isn’t magic, but for the minor abrasions most people deal with, it could make a real difference.

AI That Runs Entirely on Your Device

Today’s phones already have dedicated AI chips, called neural processing units, that handle tasks like Face ID, photo enhancement, and voice recognition. The pace of improvement is staggering. Apple’s current Neural Engine runs at 35 trillion operations per second. Qualcomm’s latest laptop chip hits 45 trillion, and Intel has announced plans for chips reaching 74 trillion operations per second, a fivefold jump from their 2024 hardware.

Scale that trajectory out to 2035 and your phone’s AI chip should be powerful enough to handle complex tasks that currently require cloud servers. Think real-time video translation, where someone speaking Mandarin appears to be speaking English with matched lip movements, or AI assistants that genuinely understand context and can draft emails, negotiate appointments, or summarize a 90-minute meeting in seconds, all without sending your data anywhere. That last part matters. On-device processing means your conversations, photos, and health data stay on your hardware, which changes the privacy equation significantly.
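To make "scale that trajectory out" concrete, here is a back-of-envelope compound-growth projection. The starting figure of 45 TOPS comes from the chip numbers above; the 30% annual growth rate is purely an assumption for illustration, not a published roadmap:

```python
# Illustrative projection of on-device AI throughput.
# Starting point (~45 TOPS in 2025) is from the article's chip figures;
# the 30%/year growth rate is an ASSUMPTION, not vendor guidance.
def project_tops(start_tops: float, annual_growth: float, years: int) -> float:
    """Apply compound annual growth to NPU throughput (TOPS)."""
    return start_tops * (1 + annual_growth) ** years

tops_2035 = project_tops(45, 0.30, 10)
print(f"Projected 2035 throughput: ~{tops_2035:.0f} TOPS")  # ~620 TOPS
```

Even at that modest assumed rate, a 2035 phone chip lands at roughly 14 times today's throughput, which is the scale at which cloud-class models start fitting on a handset.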

The practical effect is that many standalone apps become unnecessary. Instead of downloading a translation app, a photo editor, a writing assistant, and a fitness tracker, a single AI layer handles all of it, adapting to what you need in the moment.

Batteries That Last for Days

Solid-state batteries replace the liquid electrolyte in today’s lithium-ion cells with a solid material, which allows them to store two to three times more energy in the same amount of space. Industry forecasts place broader consumer availability, including phones and other electronics, between 2028 and 2030, with costs dropping to compete with current lithium-ion by that window.

For you, this likely means a phone that lasts two full days of heavy use on a single charge, or three days of moderate use. It also means faster charging, since solid-state cells can tolerate higher charging rates without degrading as quickly. Combined with more efficient AI chips and better display technology, the daily charging ritual could become a twice-a-week habit.
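The "two full days" claim above is easy to sanity-check with rough arithmetic. All inputs here are assumptions for illustration: a typical phone battery today holds about 17 Wh, and 0.9 W is a guessed average draw under heavy use:

```python
# Rough battery-life arithmetic (illustrative; every input is an assumption).
current_wh = 17.0            # typical phone cell today, ~4,400 mAh at 3.85 V
density_multiplier = 2.5     # midpoint of the 2-3x solid-state density claim
avg_draw_w = 0.9             # assumed average draw under heavy use

future_wh = current_wh * density_multiplier
hours = future_wh / avg_draw_w
print(f"~{future_wh:.0f} Wh pack -> ~{hours / 24:.1f} days at {avg_draw_w} W")
```

Under those assumptions, a same-sized pack lands at roughly 42 Wh and just under two days of heavy use, which lines up with the estimate above.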

Charging Without Plugging In

Wireless charging today requires you to set your phone on a pad. Spatial wireless charging, the kind that works across a room, is further behind but progressing. Stanford researchers have demonstrated a system that transmits 10 watts across 2 to 3 feet at 92% efficiency, with performance holding steady regardless of distance within a 6-foot range. Ten watts is enough to slowly charge a phone, and the efficiency is surprisingly close to wired charging.
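A quick calculation shows what those demo numbers mean in practice. The 10 W and 92% figures are from the Stanford demonstration mentioned above; the 17 Wh battery size is an assumed typical value:

```python
# Time-to-charge estimate for spatial wireless charging (illustrative).
# 10 W at 92% efficiency are the cited demo figures; the 17 Wh battery
# capacity is an ASSUMPTION for a typical phone.
transmit_w = 10.0
efficiency = 0.92
battery_wh = 17.0

delivered_w = transmit_w * efficiency       # ~9.2 W reaches the receiver
full_charge_h = battery_wh / delivered_w    # ignores charge-curve taper
print(f"{delivered_w:.1f} W delivered -> ~{full_charge_h:.1f} h to full")
```

Just under two hours for a full charge is slow by wired standards, but for passive top-ups while you sit in a cafe, it is more than enough.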

By 2035, it’s plausible that restaurants, offices, and airports will have spatial chargers embedded in furniture and walls, topping off your battery whenever you’re nearby. Charging could become as invisible as connecting to coffee-shop Wi-Fi is today. The technology exists now in lab form; the remaining hurdles are standardization, safety certification, and buildout, which makes this an infrastructure problem more than a physics one.

AR Glasses Change What a Phone Is For

Perhaps the biggest shift in how phones look in 10 years is that you’ll look at them less. AR glasses are converging on a form factor that resembles regular eyewear, powered by advances in holographic waveguides and microLED displays that project bright, high-resolution images onto transparent lenses without blocking your view. Miniaturized processors and 5G (and eventually 6G) networks offload the heavy computation to the cloud or to the phone in your pocket, keeping the glasses light and cool enough to wear all day.

Advanced sensors including depth cameras and LiDAR, combined with AI-powered computer vision, let these glasses understand the physical world in real time: recognizing objects, mapping rooms, and responding to hand gestures. Your phone becomes the processing hub and cellular radio that powers the glasses, while notifications, navigation, and video calls float in your field of vision. You pull the phone out for tasks that need a bigger canvas or precise input, but the default interaction shifts to your face.

This doesn’t mean phones disappear. It means they evolve into something more like a pocket server, still essential, but no longer the primary screen for every interaction.

6G Connectivity

5G is still rolling out in much of the world, but the specifications for 6G are already being drafted under the international framework known as IMT-2030. The targets are dramatic: peak data rates of 1 terabit per second, roughly 50 times 5G’s 20-gigabit-per-second spec ceiling, and latency as low as 100 microseconds, which is fast enough to make holographic video calls and real-time remote device control feel instantaneous.
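Peak rates are lab ceilings rather than everyday speeds, but they set the scale. A quick comparison of transfer times, using an assumed 50 GB file (roughly a 4K feature film):

```python
# What a 1 Tbps peak rate means for transfer time (illustrative only;
# peak rates are theoretical maxima, and the 50 GB file size is assumed).
def transfer_seconds(size_gb: float, rate_gbps: float) -> float:
    """Seconds to move size_gb gigabytes at rate_gbps gigabits/second."""
    return size_gb * 8 / rate_gbps

movie_gb = 50
print(f"5G peak (20 Gbps):      {transfer_seconds(movie_gb, 20):.0f} s")
print(f"6G target (1,000 Gbps): {transfer_seconds(movie_gb, 1000):.1f} s")
```

At the 5G spec peak, the film takes 20 seconds; at the 6G target, under half a second. That headroom is what lets AR glasses treat a remote edge server as if it were local silicon.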

For everyday use, 6G won’t just mean faster downloads. It enables the kind of continuous, low-latency connection that AR glasses and on-device AI need to work seamlessly. When your glasses need to render a complex 3D overlay but your phone’s processor is busy, 6G can bounce that task to an edge server and return the result before you perceive any delay. It’s the connective tissue that ties together all the other advances.

Brain Control Is Still a Long Way Off

Whenever people imagine future phones, brain-computer interfaces come up. The honest answer: non-invasive versions, the kind that read brain signals through sensors on your head rather than implants, still suffer from weak, noisy signals that require expensive amplification and sophisticated processing. Current non-invasive systems can do basic tasks like moving a cursor or selecting letters on a screen, but they’re slow, inconsistent, and require bulky headsets.

Companies are working on commercial BCI devices, and the technology holds real promise for people with physical disabilities. But for controlling a phone with your thoughts while walking down the street, the signal quality and miniaturization challenges remain largely unsolved. Privacy and security concerns add another layer of complexity. Ten years from now, you might see early consumer BCI accessories for specific tasks like meditation tracking or simple hands-free commands, but swiping and tapping aren’t going anywhere yet.

What the 2035 Phone Looks Like

Putting it all together: the phone you carry in 2035 will likely be a thin, flexible device with a screen that expands when you need more space and contracts for portability. Its surface will shrug off minor scratches overnight. A powerful AI chip will handle translation, writing, photo editing, and personal assistance without an internet connection. A solid-state battery will last two to three days between charges, and you may not need to think about charging at all if spatial wireless chargers become common in public spaces.

The biggest change might be behavioral rather than physical. With lightweight AR glasses handling notifications, navigation, and quick interactions, you’ll reach for your phone less often. It becomes the engine rather than the interface, always working but rarely demanding your attention. The glass rectangle isn’t going away, but its role in your daily life is about to get a lot quieter.