Health & Biotech

How a Teen-Founded Startup Is Using AI to Reinvent Pesticide Discovery

Bindwell is testing a simple idea: use AI to design smarter, more targeted pesticides built for today’s farming challenges.

Updated

January 8, 2026 6:33 PM

Researcher tending seedlings in a laboratory environment. PHOTO: FREEPIK

Bindwell, a San Francisco–based ag-tech startup using AI to design new pesticide molecules, has raised US$6 million in seed funding, co-led by General Catalyst and A Capital, with participation from SV Angel and Y Combinator founder Paul Graham. The round will help the company expand its lab in San Carlos, hire more technical talent and advance its first pesticide candidates toward validation.  

Even as pesticide use has doubled over the last 30 years, farmers still lose up to 40% of global crops to pests and disease. The core issue is resistance: pests are adapting faster than the industry can update its tools. As a result, farmers often rely on larger amounts of the same outdated chemicals, even as those chemicals deliver diminishing returns.

Meanwhile, innovation in the agrochemical sector has slowed, leaving the industry struggling to keep up with rapidly evolving pests. This is the gap Bindwell is targeting. Instead of updating old chemicals, the company uses AI to find completely new compounds designed for today’s pests and farming conditions.  

This vision is made even more striking by the people leading it. Bindwell was founded by 18-year-old Tyler Rose and 19-year-old Navvye Anand, who met at the Wolfram Summer Research Program in 2023. Both had deep ties to agriculture, Rose in China and Anand in India, and had seen up close how pest outbreaks and chemical dependence burdened farmers.

Filling the gap in today’s pesticide pipeline, Bindwell created an AI system that can design and evaluate new molecules long before they reach the lab. It starts with Foldwell, the company’s protein-structure model, which maps the shapes of pest proteins so scientists know where a molecule should bind. Then comes PLAPT, which can scan every known synthesized compound in a few hours and predict which ones are most likely to bind the target. For biopesticides, the team uses APPT, a model tuned to spot protein–protein interactions and shown to outperform existing tools on industry benchmarks.
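The article does not describe how these models are wired together, but the overall flow can be sketched. The short Python example below is a toy illustration of a structure-then-screening pipeline under that assumption; the function names, scoring logic, and placeholder data are all hypothetical stand-ins for models like Foldwell and PLAPT, not their actual implementations.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    smiles: str       # compound structure in SMILES notation
    affinity: float   # predicted binding score (toy units)

def predict_target_structure(sequence: str) -> str:
    """Placeholder for a protein-structure model such as Foldwell.
    Here it only returns a dummy structure identifier."""
    return f"structure-of-{len(sequence)}-residues"

def predict_affinity(structure: str, smiles: str) -> float:
    """Placeholder for an affinity model such as PLAPT.
    The score is derived from the SMILES length, purely for illustration."""
    return float(len(smiles) % 10)

def screen_library(target_sequence: str, library: list[str], top_k: int = 3) -> list[Candidate]:
    """Rank a compound library against one pest protein target."""
    structure = predict_target_structure(target_sequence)
    scored = [Candidate(s, predict_affinity(structure, s)) for s in library]
    return sorted(scored, key=lambda c: c.affinity, reverse=True)[:top_k]

if __name__ == "__main__":
    hits = screen_library("MKTAYIAKQR", ["CCO", "CC(=O)Oc1ccccc1C(=O)O", "c1ccccc1"])
    for hit in hits:
        print(hit)
```

In a real pipeline, the placeholder functions would call trained models, and the ranked hits would feed into downstream safety, selectivity and environmental screens before anything reaches the lab.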

Bindwell isn’t selling AI tools. Instead, the company develops the molecules itself and licenses them to major agrochemical players. Owning the full discovery process lets the team bake in safety, selectivity and environmental considerations from day one. It also allows Bindwell to plug directly into the pipelines that produce commercial pesticides — just with a fundamentally different engine powering the science.

The team is now testing its first AI-generated candidates in its San Carlos lab and is in early talks with established pesticide manufacturers about potential licensing deals. For Rose and Anand, the long-term vision is simple: create pest control that works without repeating the mistakes of the last half-century. As they put it, the goal is not to escalate chemical use but to design molecules that are more precise, less harmful and resilient against resistance from the start.

Keep Reading

Artificial Intelligence

X-Humanoid Introduces Tien Kung 3.0 as Deployment Challenges Persist in Humanoid Robotics

A closer look at the tech, AI, and open ecosystem behind Tien Kung 3.0’s real-world push

Updated

February 18, 2026 8:03 PM

Humanoid robots working in a warehouse. PHOTO: ADOBE STOCK

Humanoid robotics has advanced quickly in recent years. Machines can now walk, balance, and interact with their surroundings in ways that once seemed out of reach. Yet most deployments remain limited. Many robots perform well in controlled settings but struggle in real-world environments. Integration is often complex, hardware interfaces are closed, software tools are fragmented, and scaling across industries remains difficult.

Against this backdrop, X-Humanoid has introduced its latest general-purpose platform, Embodied Tien Kung 3.0. The company positions it not simply as another humanoid robot, but as a system designed to address the practical barriers that have slowed adoption, with a focus on openness and usability.

At the hardware level, Embodied Tien Kung 3.0 is built for mobility, strength, and stability. It is equipped with high-torque integrated joints that provide strong limb force for high-load applications. The company says it is the first full-size humanoid robot to achieve whole-body, high-dynamic motion control integrated with tactile interaction. In practice, this means the robot is designed to maintain balance and execute dynamic movements even in uneven or cluttered environments. It can clear one-meter obstacles, perform consecutive high-dynamic maneuvers, and carry out actions such as kneeling, bending, and turning with coordinated whole-body control.

Precision is also a focus. Through multi-degree-of-freedom limb coordination and calibrated joint linkage, the system is designed to achieve millimeter-level operational accuracy. This level of control is intended to support industrial-grade tasks that require consistent performance and minimal error across changing conditions.

But hardware is only part of the equation. The company pairs the robot with its proprietary Wise KaiWu general-purpose embodied AI platform. This system supports perception, reasoning, and real-time control through what the company describes as a coordinated “brain–cerebellum” architecture. It establishes a continuous perception–decision–execution loop, allowing the robot to operate with greater autonomy and reduced reliance on remote control.

For higher-level cognition, Wise KaiWu incorporates components such as a world model and vision-language models (VLMs) to interpret visual scenes, understand language instructions, and break complex objectives into structured steps. For real-time execution, a vision-language-action (VLA) model and a full autonomous navigation system manage obstacle avoidance and precise motion under variable conditions. The platform also supports multi-agent collaboration, enabling cross-platform compatibility, asynchronous task coordination, and centralized scheduling across multiple robots.
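Wise KaiWu’s internals and APIs are not published in this article, but the perception–decision–execution loop it describes maps onto a familiar control pattern. The Python sketch below is a minimal, purely illustrative version of that loop, with a slower “brain” turning an instruction into steps and a faster “cerebellum” executing each step; every name and data structure here is hypothetical.

```python
import time
from dataclasses import dataclass

@dataclass
class Observation:
    scene: str            # stand-in for camera and tactile input

@dataclass
class Plan:
    steps: list[str]      # ordered sub-tasks produced by the "brain"

def perceive() -> Observation:
    """Stand-in for the perception stack (cameras, tactile sensing, VLM scene parsing)."""
    return Observation(scene="pallet with three boxes")

def decide(obs: Observation, instruction: str) -> Plan:
    """Stand-in for the slower 'brain': break a language instruction into steps."""
    return Plan(steps=[f"locate target in: {obs.scene}", "approach", f"execute: {instruction}"])

def execute(step: str) -> None:
    """Stand-in for the fast 'cerebellum': low-level motion control for one step."""
    print(f"executing -> {step}")

def control_loop(instruction: str, cycles: int = 1) -> None:
    """Run one perception-decision-execution cycle per iteration."""
    for _ in range(cycles):
        obs = perceive()
        plan = decide(obs, instruction)
        for step in plan.steps:
            execute(step)
        time.sleep(0.1)   # placeholder for a real control period

if __name__ == "__main__":
    control_loop("move the top box to the conveyor")
```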

A central part of the platform is openness. The company states that the system is designed to address compatibility and adaptation challenges across both development and deployment layers. On the hardware side, Embodied Tien Kung 3.0 includes multiple expansion interfaces that support different end-effectors and tools, allowing faster adaptation to industrial manufacturing, specialized operations, and commercial service scenarios. On the software side, the Wise KaiWu ecosystem provides documentation, toolchains, and a low-code development environment. It supports widely adopted communication standards, including ROS2, MQTT, and TCP/IP, enabling partners to customize applications without rebuilding core systems.
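As one concrete illustration of the communication standards mentioned above, the snippet below publishes a JSON task message over MQTT using the widely used paho-mqtt client. The broker address, topic name, and message schema are invented for the example and are not documented X-Humanoid endpoints.

```python
# Requires the paho-mqtt package (pip install paho-mqtt).
import json
import paho.mqtt.publish as publish

def send_task(broker: str, topic: str, task: dict) -> None:
    """Publish a JSON task message over MQTT, one of the transports the platform supports."""
    publish.single(topic, payload=json.dumps(task), hostname=broker)

if __name__ == "__main__":
    send_task(
        broker="192.168.1.50",            # hypothetical broker address on the robot
        topic="robots/tienkung/tasks",    # hypothetical topic name
        task={"action": "navigate", "target": "loading-dock"},
    )
```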

The company also highlights its open-source approach. X-Humanoid has open-sourced key components from the Embodied Tien Kung and Wise KaiWu platforms, including the robot body architecture, motion control framework, world model, embodied VLM and cross-ontology VLA models, training toolchains, the RoboMIND dataset, and the ArtVIP simulation asset library. By opening access to these elements, the company aims to reduce development costs, lower technical barriers, and encourage broader participation from researchers, universities, and enterprises.

Embodied Tien Kung 3.0 enters a market where technical progress is visible but large-scale adoption remains uneven. The gap is not only about movement or strength. It is about integration, interoperability, and the ability to operate reliably and autonomously in everyday industrial and commercial settings. If platforms can reduce fragmentation and simplify deployment, humanoid robots may move beyond demonstrations and into sustained commercial use.

In that sense, the significance of Embodied Tien Kung 3.0 lies less in isolated technical claims and more in how its high-dynamic hardware, embodied AI system, open interfaces, and collaborative architecture are structured to work together. Whether that integrated approach can close the deployment gap will shape how quickly humanoid robotics becomes part of real-world operations.