Deep Tech

What the Hesai–Keeta Drone Partnership Reveals About Scaling Urban Drone Delivery

Sensing technology is helping drone delivery services move from trial phases to routine daily operations.

Updated

January 23, 2026 10:41 AM

A quadcopter drone with package attached. PHOTO: FREEPIK

A new partnership between Hesai Technology, a LiDAR solutions company, and Keeta Drone, an urban delivery platform under Meituan, offers a glimpse into how drone delivery is moving from experimentation to real-world scale.

Under the collaboration, Hesai will supply solid-state LiDAR sensors for Keeta’s next-generation delivery drones. The goal is to make everyday drone deliveries more reliable as they move from trials to routine operations. Keeta Drone operates in a challenging space: low-altitude urban airspace. Its drones deliver food, medicine and emergency supplies across cities such as Beijing, Shanghai, Hong Kong and Dubai. With more than 740,000 deliveries completed across 65 routes, the company is no longer testing the concept; it is scaling it. For that scale to work, drones must be able to navigate crowded environments filled with buildings, trees, power lines and unpredictable conditions. This is where Hesai’s technology comes in.

Hesai’s solid-state LiDAR is integrated into Keeta's latest long-range delivery drones. LiDAR stands for Light Detection and Ranging. In simple terms, it is a sensing technology that helps machines understand their surroundings by sending out laser pulses and measuring how they bounce back. Unlike GPS, LiDAR does not rely on satellites to determine position. Instead, it gives drones a direct sense of their surroundings, helping them spot small but critical obstacles like wires or tree branches.
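The time-of-flight principle described above can be sketched in a few lines of Python. This is a toy illustration of how any LiDAR sensor converts an echo time into a distance, not Hesai's actual firmware or API; the names and numbers are purely illustrative.

```python
# Toy illustration of LiDAR time-of-flight ranging: the sensor emits a
# laser pulse, times how long the reflection takes to return, and
# converts half the round trip into a distance.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # metres per second, in vacuum

def distance_from_echo(round_trip_seconds: float) -> float:
    """Distance to an obstacle, given the pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path length.
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse that returns after about 100 nanoseconds hit something
# roughly 15 metres away.
print(round(distance_from_echo(100e-9), 2))  # ~14.99 metres
```

A real sensor performs this measurement millions of times per second across a grid of directions, producing the dense 3D "point cloud" that lets a drone distinguish a power line from empty air.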

In a recent demonstration, Keeta Drone completed a nighttime flight using LiDAR-based navigation alone, without relying on cameras or satellite positioning. This shows how the technology can support stable operations even when visibility is poor or GPS signals are limited.

The LiDAR system used in these drones is Hesai’s second-generation solid-state model known as FTX. Compared with earlier versions, the sensor offers higher resolution while being smaller and lighter—important considerations for airborne systems where weight and space are limited. The updated design also reduces integration complexity, making it easier to incorporate into commercial drone platforms. Large-scale production of the sensor is expected to begin in 2026.

From Hesai’s perspective, delivery drones are one of several forms robots are expected to take in the coming decades. Industry forecasts suggest robots will increasingly appear in many roles from industrial systems to service applications, with drones becoming a familiar part of urban infrastructure rather than a novelty.

For Keeta Drone, this improves safety and reliability. And for the broader industry, it signals that drone logistics is entering a more mature phase—one defined less by experimentation and more by dependable execution. Taken together, the partnership highlights a practical evolution in drone delivery.

As cities grow more complex, the question is no longer whether drones can fly but whether they can do so reliably, safely and at scale. At its core, this partnership is not about drones or sensors as products. It is about what it takes to make a complex system work quietly in real cities. As drone delivery moves out of pilot zones and into everyday use, reliability matters more than novelty.


Artificial Intelligence

Can a Toy Teach a Child to Read Like a Human Would? Inside the Rise of AI Reading Companions

A closer look at how reading, conversation, and AI are being combined

Updated

February 7, 2026 2:18 PM

Assorted plush character toys piled inside a glass claw machine. PHOTO: ADOBE STOCK

In the past, “educational toys” usually meant flashcards, prerecorded stories or apps that asked children to tap a screen. ChooChoo takes a different approach. It is designed not to talk at children, but to talk with them.

ChooChoo is an AI-powered interactive reading companion built for children aged three to six. Instead of playing stories passively, it engages kids in conversation while reading. It asks questions, reacts to answers, introduces new words in context and adjusts the story flow based on how the child responds. The goal is not entertainment alone, but language development through dialogue.

That idea is rooted in research, not novelty. ChooChoo is inspired by dialogic reading methods from Yale’s early childhood language development work, which show that children learn language faster when stories become two-way conversations rather than one-way narration. Used consistently, this approach has been shown to improve vocabulary, comprehension and confidence within weeks.

The project was created by Dr. Diana Zhu, who holds a PhD from Yale and focused her work on how children acquire language. Her aim with ChooChoo was to turn academic insight into something practical and warm enough to live in a child’s room. The result is a device that listens, responds and adapts instead of simply playing content on command.

What makes this possible is not just AI, but where that AI runs.

Unlike many smart toys that rely heavily on the cloud, ChooChoo is built on RiseLink’s edge AI platform. That means much of the intelligence happens directly on the device itself rather than being sent back and forth to remote servers. This design choice has three major implications.

First, it reduces delay. Conversations feel natural because the toy can respond almost instantly. Second, it lowers power consumption, allowing the device to stay “always on” without draining the battery quickly. Third, it improves privacy. Sensitive interactions are processed locally instead of being continuously streamed online.

RiseLink’s hardware, including its ultra-low-power AI system-on-chip designs, is already used at large scale in consumer electronics. The company ships hundreds of millions of connected chips every year and works with global brands like LG, Samsung, Midea and Hisense. In ChooChoo’s case, that same industrial-grade reliability is being applied to a child’s learning environment.

The result is a toy that behaves less like a gadget and more like a conversational partner. It engages children in back-and-forth discussion during stories, introduces new vocabulary in natural context, pays attention to comprehension and emotional language, and adjusts its pace and tone based on each child’s interests and progress. Parents can also view progress through an optional app that shows what words their child has learned and how the system is adjusting over time.

What matters here is not that ChooChoo is “smart,” but that it reflects a shift in how technology enters early education. Instead of replacing teachers or parents, tools like this are designed to support human interaction by modeling it. The emphasis is on listening, responding and encouraging curiosity rather than testing or drilling.

That same philosophy is starting to shape the future of companion robots more broadly. As edge AI improves and hardware becomes smaller and more energy efficient, we are likely to see more devices that live alongside people instead of in front of them. Not just toys, but helpers, tutors and assistants that operate quietly in the background, responding when needed and staying out of the way when not.

In that sense, ChooChoo is less about novelty and more about direction. It shows what happens when AI is designed not for spectacle, but for presence. Not for control, but for conversation.

If companion robots become part of daily life in the coming years, their success may depend less on how powerful they are and more on how well they understand when to speak, when to listen and how to grow with the people who use them.