Major Forums & Conferences

How CES 2026 Reframed the Role of Robots

Examining how robots are moving from demonstrations to daily use.

Updated

January 8, 2026 6:22 PM

An industrial robotic arm capable of autonomous welding. PHOTO: ADOBE STOCK

CES 2026 did not frame robotics as a distant future or a technological spectacle. Instead, it highlighted machines designed for the slow, practical work of fitting into human systems. Across the show floor, robots were no longer performing for attention but being shaped by real-world constraints—space, safety, fatigue and repetition.

They appeared in factories, homes, emergency settings and industrial sites, each responding to a specific kind of human limitation. Together, these four robots reveal how robotics is being redefined: not as a replacement for people, but as infrastructure that quietly takes on the work people are least suited to carry out alone.

1. Hyundai’s Atlas: From lab to factory

Hyundai Motor unveiled its electric humanoid robot, Atlas, during a media day on January 5, 2026, at the Mandalay Bay Convention Center in Las Vegas as part of CES 2026. Developed with Boston Dynamics, Hyundai’s U.S.-based robotics subsidiary, Atlas was presented in two forms: a research prototype and a commercial model designed for real factory environments.

Shown under the theme “AI Robotics, Beyond the Lab to Life: Partnering Human Progress,” Atlas is designed to work alongside humans rather than replace them. The premise is straightforward—robots take on physically demanding and repetitive tasks such as sorting and assembly, while people focus on work requiring judgment, creativity and decision-making.

Built for industrial use, the commercial version of Atlas is designed to adapt quickly, with Hyundai stating it can learn new tasks within a day. Its adult-sized humanoid form features 56 degrees of freedom, enabling flexible, human-like movement. Tactile sensors in its hands and a 360-degree vision system support spatial awareness and precise operation.

Atlas is also engineered for demanding conditions. It can lift up to 50 kilograms, operate in temperatures ranging from –20°C to 40°C and is waterproof, making it suitable for challenging factory settings.

Looking ahead, Hyundai expects Atlas to begin with parts sorting and sequencing by 2028, move into assembly by 2030 and later take on precision tasks that require sustained physical effort and focus.

2. Widemount’s Smart Firefighting Robot: Built for hazard zones

Widemount’s Smart Firefighting Robot is designed to operate in environments that are difficult and dangerous for humans to enter. Developed by Widemount Dynamics, a spinout from the Hong Kong Polytechnic University, the robot is built to support emergency teams during fires, particularly in enclosed and smoke-filled spaces.

The robot can move through buildings and industrial facilities even when visibility is near zero. Rather than relying on cameras or GPS, it uses radar-based mapping to understand its surroundings and determine a safe path forward. This allows it to continue operating when smoke, heat or debris would normally restrict access.

As it approaches a fire, the robot analyses the burning object. Its onboard AI helps identify the material involved and selects an appropriate extinguishing method. Sensors simultaneously assess flame intensity and send real-time updates to command centres, giving responders clearer situational awareness.

When actively fighting a fire, the robot can aim directly at the source and deploy extinguishing agents autonomously. The system continuously adjusts its actions based on incoming sensor data, reducing the need for constant human intervention during high-risk situations.
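The loop described above, sense the fire, classify the material, select an agent and modulate the response, can be sketched in a few lines. This is purely illustrative: Widemount has not published an API, so every name, class and scale here is an assumption.

```python
from dataclasses import dataclass

# Illustrative sketch of the sense -> classify -> act loop described above.
# All names and the 0..1 intensity scale are hypothetical assumptions.

@dataclass
class SensorFrame:
    radar_map: list           # occupancy grid built from radar returns
    flame_intensity: float    # assumed normalised 0..1
    material_signature: str   # label from the onboard material classifier

AGENT_FOR_MATERIAL = {
    "ordinary_combustible": "water",
    "flammable_liquid": "foam",
    "electrical": "co2",
}

def choose_agent(frame: SensorFrame) -> str:
    """Pick an extinguishing agent based on the classified material."""
    return AGENT_FOR_MATERIAL.get(frame.material_signature, "dry_chemical")

def control_step(frame: SensorFrame) -> dict:
    """One iteration of the autonomous loop: classify, then scale discharge."""
    agent = choose_agent(frame)
    # Throttle the discharge rate with measured flame intensity
    rate = min(1.0, 0.2 + 0.8 * frame.flame_intensity)
    return {"agent": agent, "discharge_rate": rate}
```

The point of the sketch is the feedback structure: each sensor frame re-selects the agent and re-scales the response, which is what lets the real system adjust without constant human input.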

3. LG Electronics’ LG CLOiD: Automation for domestic spaces

At CES 2026, LG Electronics offered a glimpse into how household work could gradually shift from people to machines. The company introduced LG CLOiD, an AI-powered home robot designed to manage everyday chores by working directly with connected appliances within LG’s ThinQ ecosystem.

Designed for indoor living spaces, CLOiD features a compact upper body with two articulated arms, a head unit and a wheeled base that enables steady movement across floors. Its torso can tilt to adjust height, allowing it to reach items placed low or on kitchen counters. The arms and hands are built for careful handling, enabling the robot to grip common household objects rather than heavy tools. The head also functions as a mobile control unit, housing cameras, sensors, a display and voice interaction capabilities for communication and monitoring.

In practice, CLOiD acts as a task coordinator. It can retrieve items from appliances, operate ovens and washing machines and manage laundry cycles from start to finish, including folding and stacking clothes. By connecting multiple devices through the ThinQ system, the robot turns separate appliances into a single, coordinated workflow.

These capabilities are supported by LG’s Physical AI system. CLOiD uses vision to recognise objects and interpret its surroundings, language processing to understand instructions and action control to execute tasks step by step. Together, these systems allow the robot to follow routines, respond to user input and adjust task execution over time.
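The vision, language and action stages LG describes form a classic perception-language-action pipeline. The minimal sketch below shows how the three stages hand off to one another; the function names and command format are invented for illustration, since LG has not published the Physical AI interfaces.

```python
# Hypothetical sketch of a perception-language-action pipeline of the kind
# LG describes for CLOiD. Stage names and data shapes are assumptions.

def perceive(scene: list[str]) -> set[str]:
    """Vision stage: return the set of recognised objects (stubbed)."""
    return set(scene)

def interpret(instruction: str) -> tuple[str, str]:
    """Language stage: extract a (verb, object) pair from a simple command."""
    verb, _, obj = instruction.partition(" ")
    return verb, obj

def plan(command: tuple[str, str], visible: set[str]) -> list[str]:
    """Action stage: expand the command into ordered, executable steps."""
    verb, obj = command
    if obj not in visible:
        return [f"search for {obj}"]
    return [f"navigate to {obj}", f"grasp {obj}", f"{verb} {obj}"]

steps = plan(interpret("fold towel"), perceive(["towel", "cup"]))
```

Splitting the pipeline this way is what lets each stage improve independently: a better vision model changes `perceive` without touching how instructions are parsed or how steps are sequenced.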

4. Doosan Robotics’ Scan & Go: Automation at an industrial scale

At CES 2026, Doosan Robotics introduced Scan & Go, an AI-driven robotic system designed to automate large-scale surface repair and inspection. The solution targets environments with complex, irregular surfaces that are difficult to pre-program, such as aircraft structures, wind turbine blades and large industrial installations.

Scan & Go operates by scanning surfaces on site and building an understanding of their shape in real time. Instead of relying on detailed digital models or manual coding, the system plans its movements based on live data. This enables it to adapt to variations in size, curvature and surface condition without extensive setup.

The underlying technology combines 3D sensing with AI-based motion planning. The system interprets surface data, generates tool paths and refines its actions as work progresses. In practical terms, this reduces manual intervention while maintaining consistency across large work areas.

By handling surface preparation and inspection tasks that are time-consuming and physically demanding, Scan & Go is positioned as a support tool for industrial teams operating at scale.

A shift from demonstration to deployment

Taken together, these robots signal a clear shift in how machines are being designed and deployed. Across factories, homes, emergency sites and industrial infrastructure, robotics is moving beyond demonstrations and into practical roles that support human work.

The unifying theme is not replacement, but relief—robots taking on tasks that are repetitive, hazardous or physically demanding. CES 2026 suggests that robotics is evolving from spectacle to utility, with a growing focus on systems that adapt to real environments, respond to genuine constraints and integrate into everyday workflows.



Artificial Intelligence

SK Telecom Unveils A.X K1: Why Korea’s First 500B-Scale Sovereign AI Model Matters

How Korea is trying to take control of its AI future.

Updated

January 13, 2026 10:56 AM

SK Telecom Headquarters in Seoul, South Korea. PHOTO: ADOBE STOCK

SK Telecom, South Korea’s largest mobile operator, has unveiled A.X K1, a hyperscale artificial intelligence model with 519 billion parameters. The model sits at the center of a government-backed effort to build advanced AI systems and domestic AI infrastructure within Korea. This comes at a time when companies in the United States and China largely dominate the development of the most powerful large language models.

Rather than framing A.X K1 as just another large language model, SK Telecom is positioning it as part of a broader push to build sovereign AI capacity from the ground up. The model is being developed as part of the Korean government’s Sovereign AI Foundation Model project, which aims to ensure that core AI systems are built, trained and operated within the country. In simple terms, the initiative focuses on reducing reliance on foreign AI platforms and cloud-based AI infrastructure, while giving Korea more control over how artificial intelligence is developed and deployed at scale.

One of the gaps this approach is trying to address is how AI knowledge flows across a national ecosystem. Today, the most powerful AI foundation models are often closed, expensive and concentrated within a small number of global technology companies. A.X K1 is designed to function as a “teacher model,” meaning it can transfer its capabilities to smaller, more specialized AI systems. This allows developers, enterprises and public institutions to build tailored AI tools without starting from scratch or depending entirely on overseas AI providers.
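The "teacher model" idea is usually implemented as knowledge distillation: a smaller student model is trained to match the large model's full output distribution rather than just its top answer. The toy sketch below shows the core loss; it is pure-Python and schematic, and does not reflect A.X K1's actual training setup.

```python
import math

# Toy knowledge-distillation loss: cross-entropy between the teacher's and
# student's temperature-softened output distributions. Schematic only.

def softmax(logits: list[float], temperature: float = 1.0) -> list[float]:
    """Numerically stable softmax over temperature-scaled logits."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits: list[float],
                      student_logits: list[float],
                      temperature: float = 2.0) -> float:
    """Cross-entropy of the student's soft outputs against the teacher's."""
    t = softmax(teacher_logits, temperature)
    s = softmax(student_logits, temperature)
    return -sum(ti * math.log(si) for ti, si in zip(t, s))

# A student whose outputs track the teacher's incurs a lower loss than one
# that disagrees; minimising this loss transfers the teacher's behaviour.
loss_far = distillation_loss([4.0, 1.0, 0.5], [0.0, 2.0, 1.0])
loss_near = distillation_loss([4.0, 1.0, 0.5], [3.9, 1.1, 0.4])
```

The softened distribution carries more information than a single label, which is why one large domestic teacher can seed many smaller specialised systems without each of them retraining from scratch.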

That distinction matters because most real-world applications of artificial intelligence do not require massive models operating independently. They require focused, reliable AI systems designed for specific use cases such as customer service, enterprise search, manufacturing automation or mobility. By anchoring those systems to a large, domestically developed foundation model, SK Telecom and its partners are aiming to create a more resilient and self-sustaining AI ecosystem.

The effort also reflects a shift in how AI is being positioned for everyday use. SK Telecom plans to connect A.X K1 to services that already reach millions of users, including its AI assistant platform A., which operates across phone calls, messaging, web services and mobile applications. The broader goal is to make advanced AI feel less like a distant research asset and more like an embedded digital infrastructure that supports daily interactions.

This approach extends beyond consumer-facing services. Members of the SKT consortium are testing how the hyperscale AI model can support industrial and enterprise applications, including manufacturing systems, game development, robotics and autonomous technologies. The underlying logic is that national competitiveness in artificial intelligence now depends not only on model performance, but on whether those models can be deployed, adapted and validated in real-world environments.

There is also a hardware dimension to the project. Operating an AI model at the 500-billion-parameter scale places heavy demands on computing infrastructure, particularly memory performance and communication between processors. A.X K1 is being used to test and validate Korea’s semiconductor and AI chip capabilities under real workloads, linking large-scale AI software development directly to domestic semiconductor innovation.

The initiative brings together technology companies, universities and research institutions, including Krafton, KAIST and Seoul National University. Each contributes specialized expertise ranging from data validation and multimodal AI research to system scalability. More than 20 institutions have already expressed interest in testing and deploying the model, reinforcing the idea that A.X K1 is being treated as shared national AI infrastructure rather than a closed commercial product.

Looking ahead, SK Telecom plans to release A.X K1 as open-source AI software, alongside APIs and portions of the training data. If fully implemented, the move could lower barriers for developers, startups and researchers across Korea’s AI ecosystem, enabling them to build on top of a large-scale foundation model without incurring the cost and complexity of developing one independently.