Deep Tech

How a South Korean University Team Is Turning Industrial Air Into Power

A turbine-inspired generator shows how overlooked industrial airflow could quietly become a new source of usable power

Updated

February 3, 2026 11:23 AM

Campus building of Chung-Ang University. PHOTO: CHUNG-ANG UNIVERSITY

Compressed air is used across factories, data centers and industrial plants to move materials, cool systems and power tools. Once it has done that job, the air is usually released — and its remaining energy goes unused.

That everyday waste is what caught the attention of a research team at Chung-Ang University in South Korea. They are investigating how this overlooked airflow can be harnessed to generate electricity instead of disappearing into the background.

Most of the world’s power today comes from systems like turbines, which turn moving fluids into electricity, or solar cells, which convert sunlight into electricity. The Chung-Ang team has built a device that uses compressed air to generate electricity without relying on traditional blades or sunlight.

At the center of the work is a simple question: what happens when high-pressure air spins through a specially shaped device at very high speed? The answer lies in the air itself. The researchers found that tiny particles naturally present in the air carry an electric charge. When that air moves rapidly across certain surfaces, it can transfer charge without physical contact. This creates electricity through a process known as the “particulate static effect.”

To use that effect, the team designed a generator based on a Tesla turbine. Unlike conventional turbines with blades, a Tesla turbine uses smooth rotating disks and relies on the viscosity of air to create motion. Compressed air enters the device, spins the disks at high speed and triggers charge buildup on specially layered surfaces inside.
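To get a feel for how smooth disks can be driven by viscosity alone, the rough sketch below estimates the shear force an air film exerts on one disk face and the mechanical power that would correspond to. Every number in it is an illustrative assumption chosen for this sketch, not a measurement or specification of the Chung-Ang device.

```python
# Rough, order-of-magnitude sketch of the viscous (bladeless) drive in a
# Tesla-style disk turbine. All values below are illustrative assumptions,
# not data from the Chung-Ang generator.
import math

mu_air = 1.8e-5     # dynamic viscosity of air at room temperature, Pa*s
gap = 0.5e-3        # assumed gap between air film and disk face, m
r_outer = 0.05      # assumed disk outer radius, m
r_inner = 0.01      # assumed disk inner (exhaust) radius, m
slip_speed = 30.0   # assumed tangential speed of air relative to the disk, m/s
rpm = 20_000        # assumed rotor speed

# Shear stress from the air film dragging on the disk face: tau = mu * dv/dy
tau = mu_air * slip_speed / gap

# Integrate r * tau over the annular face to get the torque on one face:
# torque = (2*pi*tau/3) * (r_outer^3 - r_inner^3)
torque_one_face = (2 * math.pi * tau / 3) * (r_outer**3 - r_inner**3)

omega = rpm * 2 * math.pi / 60            # rotor angular speed, rad/s
power_one_face = torque_one_face * omega  # mechanical power picked up per face, W

print(f"shear stress:         {tau:.2f} Pa")
print(f"torque per disk face: {torque_one_face * 1e3:.3f} mN*m")
print(f"power per disk face:  {power_one_face:.2f} W")
```

Even with these made-up numbers, the point of the sketch is visible: a thin film of fast-moving air can drag a smooth disk hard enough to matter, and stacking many disks multiplies the effect without any blades.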

What makes this approach different is that the system does not depend on friction between parts rubbing together. Instead, the charge comes from particles in the air interacting with the surfaces as they move past. This reduces wear and allows the generator to operate at very high speeds. And those speeds translate into real output.

In lab tests, the device generated a substantial electrical output. The researchers also showed that this energy could be used in practical ways. It ran small electronic devices, helped pull moisture from the air and removed dust particles from its surroundings.

The problem this research addresses is straightforward: compressed air is already everywhere in industry, but its leftover energy is usually ignored. This system is designed to capture part of that unused motion and convert it into electricity without adding complex equipment or major safety risks.

Earlier methods of harvesting static electricity from particles showed promise, but they came with dangers. Uncontrolled discharge could cause sparks or even ignition. By using a sealed, turbine-based structure, the Chung-Ang University team offers a safer and more stable way to apply the same physical effect.

The technology is still in the research stage, but its direction is easy to see. It points toward a future where energy is not only generated in power plants or stored in batteries, but also recovered from everyday industrial processes.

Keep Reading

Artificial Intelligence

Can a Toy Teach a Child to Read Like a Human Would? Inside the Rise of AI Reading Companions

A closer look at how reading, conversation, and AI are being combined

Updated

January 22, 2026 11:46 AM

Assorted plush character toys piled inside a glass claw machine. PHOTO: ADOBE STOCK

In the past, “educational toys” usually meant flashcards, prerecorded stories or apps that asked children to tap a screen. ChooChoo takes a different approach. It is designed not to talk at children, but to talk with them.

ChooChoo is an AI-powered interactive reading companion built for children aged three to six. Instead of playing stories passively, it engages kids in conversation while reading. It asks questions, reacts to answers, introduces new words in context and adjusts the story flow based on how the child responds. The goal is not entertainment alone, but language development through dialogue.

That idea is rooted in research, not novelty. ChooChoo is inspired by dialogic reading methods from Yale’s early childhood language development work, which show that children learn language faster when stories become two-way conversations rather than one-way narration. Used consistently, this approach has been shown to improve vocabulary, comprehension and confidence within weeks.

The project was created by Dr. Diana Zhu, who holds a PhD from Yale and focused her work on how children acquire language. Her aim with ChooChoo was to turn academic insight into something practical and warm enough to live in a child’s room. The result is a device that listens, responds and adapts instead of simply playing content on command.

What makes this possible is not just AI, but where that AI runs.

Unlike many smart toys that rely heavily on the cloud, ChooChoo is built on RiseLink’s edge AI platform. That means much of the intelligence happens directly on the device itself rather than being sent back and forth to remote servers. This design choice has three major implications.

First, it reduces delay. Conversations feel natural because the toy can respond almost instantly. Second, it lowers power consumption, allowing the device to stay “always on” without draining the battery quickly. Third, it improves privacy. Sensitive interactions are processed locally instead of being continuously streamed online.
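As a loose illustration of the first point, the toy latency budget below compares a cloud round trip with fully local processing. Every figure is a generic assumption made for this sketch, not a measurement of ChooChoo or RiseLink hardware.

```python
# Toy latency budget: cloud round trip vs. on-device inference.
# All stage timings are generic illustrative assumptions, not measurements
# of ChooChoo or RiseLink hardware.

def total_latency_ms(stages: dict) -> int:
    """Sum a pipeline's stage timings (name -> milliseconds)."""
    return sum(stages.values())

cloud_pipeline = {
    "capture_and_encode_audio": 40,
    "uplink_to_server": 80,         # assumed home Wi-Fi plus WAN hop
    "server_speech_and_nlu": 150,
    "downlink_response": 80,
    "synthesize_and_play": 60,
}

edge_pipeline = {
    "capture_and_encode_audio": 40,
    "on_device_speech_and_nlu": 120,  # slower chip, but no network hops
    "synthesize_and_play": 60,
}

print(f"cloud round trip: ~{total_latency_ms(cloud_pipeline)} ms")
print(f"on-device:        ~{total_latency_ms(edge_pipeline)} ms")
```

The on-device path is not faster because its chip is more powerful; it is faster because two network hops and their variability simply disappear, which is also what keeps the audio on the device in the first place.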

RiseLink’s hardware, including its ultra-low-power AI system-on-chip designs, is already used at large scale in consumer electronics. The company ships hundreds of millions of connected chips every year and works with global brands like LG, Samsung, Midea and Hisense. In ChooChoo’s case, that same industrial-grade reliability is being applied to a child’s learning environment.

The result is a toy that behaves less like a gadget and more like a conversational partner. It engages children in back-and-forth discussion during stories, introduces new vocabulary in natural context, pays attention to comprehension and emotional language, and adjusts its pace and tone based on each child’s interests and progress. Parents can also view progress through an optional app that shows what words their child has learned and how the system is adjusting over time.

What matters here is not that ChooChoo is “smart,” but that it reflects a shift in how technology enters early education. Instead of replacing teachers or parents, tools like this are designed to support human interaction by modeling it. The emphasis is on listening, responding and encouraging curiosity rather than testing or drilling.

That same philosophy is starting to shape the future of companion robots more broadly. As edge AI improves and hardware becomes smaller and more energy efficient, we are likely to see more devices that live alongside people instead of in front of them. Not just toys, but helpers, tutors and assistants that operate quietly in the background, responding when needed and staying out of the way when not.

In that sense, ChooChoo is less about novelty and more about direction. It shows what happens when AI is designed not for spectacle, but for presence. Not for control, but for conversation.

If companion robots become part of daily life in the coming years, their success may depend less on how powerful they are and more on how well they understand when to speak, when to listen and how to grow with the people who use them.