How Korea is trying to take control of its AI future.
Updated
December 30, 2025 1:38 PM

SK Telecom Headquarters in Seoul, South Korea. PHOTO: ADOBE STOCK
SK Telecom, South Korea’s largest mobile operator, has unveiled A.X K1, a hyperscale artificial intelligence model with 519 billion parameters. The model sits at the center of a government-backed effort to build advanced AI systems and domestic AI infrastructure within Korea. This comes at a time when companies in the United States and China largely dominate the development of the most powerful large language models.
Rather than framing A.X K1 as just another large language model, SK Telecom is positioning it as part of a broader push to build sovereign AI capacity from the ground up. The model is being developed as part of the Korean government’s Sovereign AI Foundation Model project, which aims to ensure that core AI systems are built, trained and operated within the country. In simple terms, the initiative focuses on reducing reliance on foreign AI platforms and cloud-based AI infrastructure, while giving Korea more control over how artificial intelligence is developed and deployed at scale.
One of the gaps this approach is trying to address is how AI knowledge flows across a national ecosystem. Today, the most powerful AI foundation models are often closed, expensive and concentrated within a small number of global technology companies. A.X K1 is designed to function as a “teacher model,” meaning it can transfer its capabilities to smaller, more specialized AI systems. This allows developers, enterprises and public institutions to build tailored AI tools without starting from scratch or depending entirely on overseas AI providers.
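The "teacher" role typically relies on knowledge distillation: a smaller student model is trained to reproduce the larger model's outputs alongside the ground-truth labels, inheriting much of its capability at a fraction of the size. The sketch below illustrates that general technique in PyTorch; it is not SK Telecom's actual pipeline, and the models, temperature and loss weighting are placeholders.

```python
# Minimal sketch of knowledge distillation, the general technique behind
# "teacher" foundation models. Not SK Telecom's pipeline; the models,
# temperature and loss weighting are illustrative placeholders.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft loss (match the teacher's output distribution)
    with a hard loss (match the ground-truth labels)."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        soft_targets,
        reduction="batchmean",
    ) * (temperature ** 2)
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

# Toy usage: a large "teacher" guides a much smaller "student".
teacher = torch.nn.Linear(128, 10)   # stand-in for the large foundation model
student = torch.nn.Linear(128, 10)   # stand-in for a small specialized model
x = torch.randn(32, 128)
labels = torch.randint(0, 10, (32,))

with torch.no_grad():
    teacher_logits = teacher(x)      # teacher outputs are fixed targets
student_logits = student(x)
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```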
That distinction matters because most real-world applications of artificial intelligence do not require massive models operating independently. They require focused, reliable AI systems designed for specific use cases such as customer service, enterprise search, manufacturing automation or mobility. By anchoring those systems to a large, domestically developed foundation model, SK Telecom and its partners are aiming to create a more resilient and self-sustaining AI ecosystem.
The effort also reflects a shift in how AI is being positioned for everyday use. SK Telecom plans to connect A.X K1 to services that already reach millions of users, including its AI assistant platform A., which operates across phone calls, messaging, web services and mobile applications. The broader goal is to make advanced AI feel less like a distant research asset and more like an embedded digital infrastructure that supports daily interactions.
This approach extends beyond consumer-facing services. Members of the SKT consortium are testing how the hyperscale AI model can support industrial and enterprise applications, including manufacturing systems, game development, robotics and autonomous technologies. The underlying logic is that national competitiveness in artificial intelligence now depends not only on model performance, but on whether those models can be deployed, adapted and validated in real-world environments.
There is also a hardware dimension to the project. Operating an AI model at the 500-billion-parameter scale places heavy demands on computing infrastructure, particularly memory performance and communication between processors. A.X K1 is being used to test and validate Korea’s semiconductor and AI chip capabilities under real workloads, linking large-scale AI software development directly to domestic semiconductor innovation.
The initiative brings together technology companies, universities and research institutions, including Krafton, KAIST and Seoul National University. Each contributes specialized expertise ranging from data validation and multimodal AI research to system scalability. More than 20 institutions have already expressed interest in testing and deploying the model, reinforcing the idea that A.X K1 is being treated as shared national AI infrastructure rather than a closed commercial product.
Looking ahead, SK Telecom plans to release A.X K1 as open-source AI software, alongside APIs and portions of the training data. If fully implemented, the move could lower barriers for developers, startups and researchers across Korea’s AI ecosystem, enabling them to build on top of a large-scale foundation model without incurring the cost and complexity of developing one independently.
Where smarter storage meets smarter logistics.
Updated
December 16, 2025 3:29 PM
Kioxia's flagship building at Yokohama Technology Campus. PHOTO: KIOXIA
E-commerce keeps growing, and with it the number of products moving through warehouses every day. Items vary more than ever: different shapes, seasonal packaging, limited editions and constantly updated designs. At the same time, many logistics centers are dealing with labour shortages and rising pressure to automate.
But today’s image-recognition AI isn’t built for this level of change. Most systems rely on deep-learning models that need to be adjusted or retrained whenever new products appear. Every update — whether it’s a new item or a packaging change — adds extra time, energy use and operational cost. And for warehouses handling huge product catalogs, these retraining cycles can slow everything down.
KIOXIA, a company known for its memory and storage technologies, is working on a different approach. In a new collaboration with Tsubakimoto Chain and EAGLYS, the team has developed an AI-based image recognition system that is designed to adapt more easily as product lines grow and shift. The idea is to help logistics sites automatically identify items moving through their workflows without constantly reworking the core AI model.
At the center of the system is KIOXIA’s AiSAQ software paired with its Memory-Centric AI technology. Instead of retraining the model each time new products appear, the system stores new product data — images, labels and feature information — directly in high-capacity storage. This allows warehouses to add new items quickly without altering the original AI model.
Because storing more data can lead to longer search times, the system also indexes the stored product information and transfers the index into SSD storage. This makes it easier for the AI to retrieve relevant features fast, using a Retrieval-Augmented Generation–style method adapted for image recognition.
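As a rough illustration of how retrieval-based recognition sidesteps retraining, the sketch below stores each product's feature vector with its label and identifies new images by nearest-neighbour lookup. This is the general idea only, not KIOXIA's AiSAQ implementation; a production system would keep the index on SSD and use approximate nearest-neighbour search, and the names and dimensions here are placeholders.

```python
# Minimal sketch of retrieval-based recognition: register new products by
# storing their feature vectors, then classify camera frames by nearest-
# neighbour lookup instead of retraining the model. Illustrative only.
import numpy as np

class ProductIndex:
    def __init__(self, dim):
        self.dim = dim
        self.features = np.empty((0, dim), dtype=np.float32)
        self.labels = []

    def add_product(self, feature, label):
        """Register a new item by storing its feature vector -- no retraining."""
        feature = np.asarray(feature, dtype=np.float32).reshape(1, self.dim)
        self.features = np.vstack([self.features, feature])
        self.labels.append(label)

    def identify(self, query, k=1):
        """Return the labels of the k closest stored products."""
        query = np.asarray(query, dtype=np.float32)
        dists = np.linalg.norm(self.features - query, axis=1)
        nearest = np.argsort(dists)[:k]
        return [self.labels[i] for i in nearest]

# Toy usage with random vectors standing in for image embeddings.
rng = np.random.default_rng(0)
index = ProductIndex(dim=512)
index.add_product(rng.normal(size=512), "cereal_box_v1")
index.add_product(rng.normal(size=512), "limited_edition_mug")

query = rng.normal(size=512)          # embedding of a camera frame
print(index.identify(query, k=1))
```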
The collaboration will be showcased at the 2025 International Robot Exhibition in Tokyo. Visitors will see the system classify items in real time as they move along a conveyor, drawing on stored product features to identify them instantly. The demonstration aims to illustrate how logistics sites can handle continuously changing inventories with greater accuracy and reduced friction.
Overall, as logistics networks become increasingly busy and product lines evolve faster than ever, this memory-driven approach provides a practical way to keep automation adaptable and less fragile.