When farm challenges grow, smart tools need to grow with them.
Updated
November 27, 2025 3:26 PM

A drone spraying water over an agricultural field. PHOTO: FREEPIK
Farms today are under pressure. Fields are getting bigger, workers are harder to find and many jobs still rely on long hours of manual labor. XAG’s new P150 Max agricultural drone is designed for exactly this reality. Instead of replacing farmers, it takes over the heavy, repetitive fieldwork that slows them down, making farm operations more efficient and more precise.
The P150 Max is built around one simple idea: a single machine that can handle multiple farming tasks. Most farm drones focus only on spraying or mapping, but this one is fully modular. With a quick switch of attachments, it can spray crops, spread seeds or fertilizer, map fields or transport supplies. This flexibility helps farmers keep up with changing tasks throughout the day without needing different machines, improving both productivity and cost-efficiency.
A key challenge in agriculture is that fields are rarely smooth or predictable. Tractors can get stuck, smaller drones can’t carry much and some areas—like orchards or hilly plots—are simply hard to reach. The P150 Max fills that gap with an 80-kilogram payload and fast flight speed, letting it cover more ground per trip. Fewer takeoffs mean less downtime and more work completed before weather or daylight cuts operations short.
When it’s time to spray, the drone uses a smart spraying system that allows farmers to adjust droplet size based on the crop’s needs. This matters because precise spraying reduces waste and improves targeting. With an output of up to 46 liters per minute, the drone can serve both large open fields and dense orchards where consistent coverage is traditionally difficult.
The spreading system applies the same logic. Instead of dropping seeds or fertilizer unevenly, its vertical spreading mechanism distributes material in a consistent pattern and resists wind drift. That means uniform application across irregular or hard-to-reach land, an ongoing challenge for modern farms aiming for higher yields and better resource use.
Another everyday issue for farmers is understanding and surveying the land before working on it. The P150 Max helps here with a built-in mapping tool that covers up to 20 hectares per flight and instantly converts the images into detailed maps. With AI detecting obstacles like trees or irrigation lines, the drone can plan safe and efficient autonomous routes, reducing manual planning time.
Beyond spraying and spreading, the drone can transport tools, produce and farm supplies using a sling attachment. This is particularly helpful after heavy rain, when vehicles cannot easily move across muddy or flooded fields.
Under all these functions is XAG’s upgraded flight control system, which provides centimeter-level accuracy even when network signals are weak. Integrated sensors—including 4D radar and a wide-angle camera—help the drone recognize hazards such as poles and wires. Farmers can manage all operations through the XAG One app or a handheld controller, both of which automatically generate the best route based on field shape and terrain.
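XAG has not detailed how its route planner works, but the basic idea behind this kind of coverage planning is simple: given a field boundary and the drone's working swath width, the software lays out parallel back-and-forth passes that blanket the area. The Python sketch below is a minimal, hypothetical illustration of that pattern for a plain rectangular field; the function name and parameters are assumptions made for this example, not part of XAG's software.

```python
from typing import List, Tuple

def boustrophedon_route(width_m: float, length_m: float,
                        swath_m: float) -> List[Tuple[float, float]]:
    """Generate waypoints for simple back-and-forth coverage of a
    rectangular field (a common pattern for spraying and spreading routes).

    width_m  -- field width across the passes, in meters
    length_m -- length of each pass, in meters
    swath_m  -- effective working width of one pass, in meters
    """
    waypoints = []
    x = swath_m / 2                      # center the first pass half a swath in
    heading_up = True
    while x <= width_m:
        start_y, end_y = (0.0, length_m) if heading_up else (length_m, 0.0)
        waypoints.append((x, start_y))   # enter the pass
        waypoints.append((x, end_y))     # fly to the far edge
        x += swath_m                     # shift over by one swath width
        heading_up = not heading_up      # alternate direction each pass
    return waypoints

# Example: a 100 m x 300 m plot covered with a 7 m spray swath
route = boustrophedon_route(100, 300, 7)
print(f"{len(route) // 2} passes, {len(route)} waypoints")
```

Real planners also have to handle irregular boundaries, obstacles and terrain, which is where the onboard mapping and AI obstacle detection described above come in.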
Because field days can stretch from early morning to dusk, downtime matters: the fast-charging battery system can recharge in about seven minutes using a dedicated kit, supporting near-continuous drone use throughout the day with minimal interruptions.
After years of testing, the XAG P150 Max is essentially an effort to make practical, scalable farm automation more accessible. By combining spraying, spreading, mapping and transport into one heavy-duty platform, it offers a way to ease labor shortages while keeping operations efficient and sustainable. Instead of focusing on one task, the drone aims to take over the time-consuming physical work so farmers can focus on decisions, planning and crop management.
At under US$1,000, Hypernova isn’t just eyewear—it’s Meta’s push to make AR feel ordinary.

Closeup of the Ray-Ban logo and the built-in ultra-wide 12 MP camera on a pair of new Ray-Ban Meta Wayfarer smart glasses. PHOTO: ADOBE STOCK
Meta is preparing to launch its next big wearable: the Hypernova smart glasses. Unlike earlier experiments such as the Ray-Ban Stories, these new glasses promise more advanced features at a price point under US$1,000. With a launch set for September 17 at Meta’s annual Connect conference, the Hypernova is already drawing attention for blending design, technology and accessibility.
In this article, let’s take a closer look at Hypernova’s design, features, pricing and the challenges Meta faces as it tries to bring smart glasses into everyday life.
Meta’s earlier Ray-Ban glasses offered cameras and audio but no display. Hypernova changes that: The glasses will ship with a built-in micro-display, giving wearers quick access to maps, messages, notifications and even Meta’s AI assistant. It’s a step toward everyday AR that feels useful and natural, not experimental.
Perhaps most importantly, the price makes them attainable. While early estimates placed the cost above US$1,000, Meta has committed to a launch price of around US$800. That’s still premium, but it moves AR smart glasses into reach for more consumers.
Hypernova weighs about 70 grams, roughly 20 grams heavier than the Ray-Ban Meta models. The extra weight likely comes from new components such as the display and additional sensors.
To keep the glasses stylish, Meta continues its partnership with EssilorLuxottica, the company behind Ray-Ban and Prada eyewear. Thicker frames—especially Prada’s designs—help hide the hardware like chips, microphones and batteries without making the glasses look oversized.
The glasses stick close to the classic Ray-Ban silhouette but feature slightly bulkier arms. On the left side, a touch-sensitive bar lets users control functions with taps and swipes. For example, a two-finger tap can trigger a photo or start video recording.
Hypernova introduces something the earlier Ray-Ban glasses never had: a display built right into the lens. In the bottom-right corner of the right lens, a small micro-screen uses waveguide optics to project a digital overlay with about a 20° field of view. This means you can glance at turn-by-turn directions, check a notification or quickly consult Meta’s AI assistant without pulling out your phone. It’s discreet, practical and a major step up from the older models, which were limited to capturing photos and videos, handling calls and playing music via speakers.
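To get a sense of what a roughly 20° field of view means in practice, a little geometry helps: an overlay with field of view θ perceived at distance d spans about 2·d·tan(θ/2). The snippet below works that out for an assumed virtual-image distance of one meter; that distance is an illustrative assumption, not a published specification.

```python
import math

def overlay_width(fov_deg: float, distance_m: float) -> float:
    """Apparent width of a display overlay with the given field of view,
    as seen at the given perceived (virtual image) distance."""
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

# A ~20 degree field of view perceived at roughly 1 m (assumed distance)
print(f"{overlay_width(20, 1.0):.2f} m")   # ~0.35 m, about tablet-sized
```

At that distance the overlay would appear roughly 35 centimeters wide, enough for a notification or a turn arrow, but nowhere near a fully immersive view.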
Alongside the glasses comes the Ceres wristband, a companion device powered by electromyography (EMG). The band picks up the tiny electrical signals in your wrist and fingers, translating them into commands. A pinch might let you select something, a wrist flick could scroll a page, and a swipe could move between screens. The idea is to avoid clunky buttons or having to talk to your glasses in public. Meta has also been experimenting with handwriting recognition through the band, though it’s not clear if that feature will be ready in time for launch.
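Meta has not published how the Ceres software exposes these gestures, but the interaction described here boils down to mapping classified gestures to interface actions. The sketch below illustrates that idea with hypothetical gesture labels and handlers; none of the names correspond to a real Meta API.

```python
from typing import Callable, Dict

Handler = Callable[[], None]

# Hypothetical gesture labels a wristband classifier might emit after
# decoding EMG signals from the wrist and fingers.
def select_item() -> None:
    print("select: activate the highlighted item")

def scroll_page() -> None:
    print("scroll: move the page content")

def switch_screen() -> None:
    print("switch: jump to the next screen")

GESTURE_ACTIONS: Dict[str, Handler] = {
    "pinch": select_item,        # thumb-to-finger pinch -> select
    "wrist_flick": scroll_page,  # quick wrist flick -> scroll
    "swipe": switch_screen,      # lateral swipe -> change screens
}

def dispatch(gesture: str) -> None:
    """Route a classified gesture to its UI action, ignoring unknown labels."""
    action = GESTURE_ACTIONS.get(gesture)
    if action:
        action()

for g in ["pinch", "wrist_flick", "swipe", "unknown"]:
    dispatch(g)
```

In practice the hard part is the classification itself, reliably turning noisy EMG signals into those labels across different wrists and fit conditions, which is exactly the issue raised later in this article.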
Meta doesn’t just want Hypernova to be useful—it wants it to be fun. Code found in leaked firmware revealed a small game called Hypertrail. It looks to borrow ideas from the 1981 arcade shooter Galaga, letting wearers play a simple, retro-inspired game right through their glasses. It’s not the main attraction, but it shows Meta is trying to make Hypernova feel more like a playful everyday gadget rather than just a piece of serious tech.
Hypernova runs on a customized version of Android and pairs with smartphones through the Meta View app. Out of the box, it should support the basics: calls, music and message notifications. Leaks suggest several apps will come preinstalled, including Camera, Gallery, Maps, WhatsApp, Messenger and Meta AI. A Qualcomm processor powers the whole setup, helping it run smoothly while keeping energy demands reasonable.
Meta is also trying to bring in outside developers. In August 2025, CNBC reported that the company invited third-party developers—especially in generative AI—to build experimental apps for Hypernova and the Ceres wristband. The Meta Connect 2025 agenda even highlights sessions on a new smart glasses SDK and toolkit. The push shows Meta’s interest in making Hypernova more than just a device; it wants a broader platform with apps that go beyond its own first-party software.
During development, Hypernova was rumored to cost as much as US$1,400. By pricing it around US$800, Meta signals that it wants adoption more than profit. The company is keeping production limited (around 150,000 units), showing it sees this as a market test rather than a mass rollout. Still, the sub-US$1,000 price tag makes advanced AR far more accessible than before.
Despite its promise, Hypernova may still face hurdles. The Ceres wristband can struggle if worn loosely, and some testers have reported problems depending on which arm it is worn on, or even when wearing long sleeves. In short, getting EMG input right for everyone will be critical.
Privacy is another major concern. In past experiments, researchers hacked Ray-Ban Meta glasses to run facial recognition, instantly identifying strangers and pulling up personal information. Meta has added safeguards, such as a recording indicator light, but critics argue these measures are too easy to ignore. Moreover, data captured by smart glasses can feed into AI training, raising questions about consent and surveillance.
The Meta Hypernova smart glasses mark a turning point in wearable tech. They’re lighter and more stylish than bulky AR headsets, while offering real-world features like navigation, messaging and hands-free control. At under US$1,000, they aim to make AR glasses more than a luxury gadget—they’re a step toward everyday use.
Whether Hypernova succeeds will depend on how well it balances style, usability and privacy. But one thing is clear: Meta is betting that always-on, glanceable AR can move from science fiction to daily life.