CES 2026 and the move toward wearable robots you don’t wear all day
Updated
January 28, 2026 5:53 PM

The π6 exoskeleton from VIGX. PHOTO: VIGX
CES 2026 highlighted how robotics is taking many different forms. VIGX, a wearable robotics company, used the event to introduce the π6, a portable exoskeleton robot designed to be carried and worn only when needed. Unveiled in Las Vegas, the device reflects a broader shift at CES toward robots that move with people rather than staying fixed in industrial or clinical settings.
Exoskeletons have existed for years, most commonly in controlled environments such as factories, rehabilitation facilities and specialised research settings. In these contexts, they have tended to be large, fixed systems intended for long sessions of supervised use rather than something a person could deploy on their own.
Against that backdrop, the π6 explores a more personal and flexible approach to assistance. Instead of treating an exoskeleton as permanent equipment, it is designed to be something users carry with them and wear only when a task or situation calls for extra support.
The π6 weighs 1.9 kilograms and folds down to a size that fits into a bag. When worn, it sits around the waist and legs, providing mechanical assistance during activities such as walking, climbing or extended movement. Rather than altering how people move, the system adds controlled rotational force at key joints to reduce physical strain over time.
According to the company, the device delivers up to 800 watts of peak power and 16 Nm of torque. In practical terms, this means the system is designed to help users sustain effort for longer periods, especially during physically demanding activities, by easing the body’s load rather than pushing it beyond normal limits.
The π6 is designed to support users weighing between 45 kilograms and 120 kilograms and is intended for intermittent use. This reinforces its role as a wearable companion — something taken out when needed and set aside when not — rather than a device meant to be worn continuously.
Another aspect of the system is how it responds to different environments. Using onboard sensors and processing, the exoskeleton can detect changes such as slopes or uneven ground and adjust the level of assistance accordingly. This reduces the need for manual adjustments and helps maintain a consistent walking experience across varied terrain, with software fine-tuning how assistance is applied rather than directing movement itself.
The hardware design follows a similar logic. The power belt contains a detachable battery, allowing users to remove or swap it without handling the entire system. This keeps the wearable components lighter and makes the exoskeleton easier to transport. The battery can also be used as a general power source for small electronic devices, adding a layer of practicality beyond the exoskeleton’s core function.
VIGX frames its work around accessibility rather than industrial automation. “To empower ordinary people,” said founder Bob Yu, explaining why the company chose to focus on exoskeleton robotics. “VIGX is dedicated to expanding the physical limits of humans, enabling deeper outdoor adventures, making running and cycling easier and more enjoyable and allowing people to sustain their outdoor pursuits regardless of age.”
Placed within the wider context of CES, the π6 sits alongside a growing number of portable robots and wearable systems that prioritise convenience, mobility and personal use. By reducing the physical and practical barriers to wearing an exoskeleton, VIGX is testing whether assistive robotics can move beyond niche environments and into everyday life. If that experiment succeeds, wearable robots may become less about dramatic augmentation and more about quiet support — present when needed and easy to put away when not.
A closer look at how reading, conversation, and AI are being combined
Updated
February 7, 2026 2:18 PM

Assorted plush character toys piled inside a glass claw machine. PHOTO: ADOBE STOCK
In the past, “educational toys” usually meant flashcards, prerecorded stories or apps that asked children to tap a screen. ChooChoo takes a different approach. It is designed not to talk at children, but to talk with them.
ChooChoo is an AI-powered interactive reading companion built for children aged three to six. Instead of playing stories passively, it engages kids in conversation while reading. It asks questions, reacts to answers, introduces new words in context and adjusts the story flow based on how the child responds. The goal is not entertainment alone, but language development through dialogue.
That idea is rooted in research, not novelty. ChooChoo is inspired by dialogic reading methods from Yale’s early childhood language development work, which show that children learn language faster when stories become two-way conversations rather than one-way narration. Used consistently, this approach has been shown to improve vocabulary, comprehension and confidence within weeks.
The project was created by Dr. Diana Zhu, who holds a PhD from Yale and focused her work on how children acquire language. Her aim with ChooChoo was to turn academic insight into something practical and warm enough to live in a child’s room. The result is a device that listens, responds and adapts instead of simply playing content on command.
What makes this possible is not just AI, but where that AI runs.
Unlike many smart toys that rely heavily on the cloud, ChooChoo is built on RiseLink’s edge AI platform. That means much of the intelligence happens directly on the device itself rather than being sent back and forth to remote servers. This design choice has three major implications.
First, it reduces delay. Conversations feel natural because the toy can respond almost instantly. Second, it lowers power consumption, allowing the device to stay “always on” without draining the battery quickly. Third, it improves privacy. Sensitive interactions are processed locally instead of being continuously streamed online.
RiseLink’s hardware, including its ultra-low-power AI system-on-chip designs, is already used at large scale in consumer electronics. The company ships hundreds of millions of connected chips every year and works with global brands like LG, Samsung, Midea and Hisense. In ChooChoo’s case, that same industrial-grade reliability is being applied to a child’s learning environment.
The result is a toy that behaves less like a gadget and more like a conversational partner. It engages children in back-and-forth discussion during stories, introduces new vocabulary in natural context, pays attention to comprehension and emotional language and adjusts its pace and tone based on each child’s interests and progress. Parents can also view progress through an optional app that shows what words their child has learned and how the system is adjusting over time.
What matters here is not that ChooChoo is “smart,” but that it reflects a shift in how technology enters early education. Instead of replacing teachers or parents, tools like this are designed to support human interaction by modeling it. The emphasis is on listening, responding and encouraging curiosity rather than testing or drilling.
That same philosophy is starting to shape the future of companion robots more broadly. As edge AI improves and hardware becomes smaller and more energy efficient, we are likely to see more devices that live alongside people instead of in front of them. Not just toys, but helpers, tutors and assistants that operate quietly in the background, responding when needed and staying out of the way when not.
In that sense, ChooChoo is less about novelty and more about direction. It shows what happens when AI is designed not for spectacle, but for presence. Not for control, but for conversation.
If companion robots become part of daily life in the coming years, their success may depend less on how powerful they are and more on how well they understand when to speak, when to listen and how to grow with the people who use them.