Bindwell is testing a simple idea: use AI to design smarter, more targeted pesticides built for today’s farming challenges.
Updated November 14, 2025, 10:48 PM

Researcher tending seedlings in a laboratory environment. PHOTO: FREEPIK
Bindwell, a San Francisco–based ag-tech startup using AI to design new pesticide molecules, has raised US$6 million in seed funding, co-led by General Catalyst and A Capital, with participation from SV Angel and Y Combinator founder Paul Graham. The round will help the company expand its lab in San Carlos, hire more technical talent and advance its first pesticide candidates toward validation.
Even as pesticide use has doubled over the last 30 years, farmers still lose up to 40% of global crops to pests and disease. The core issue is resistance: pests are adapting faster than the industry can update its tools. As a result, farmers often rely on larger amounts of the same outdated chemicals, even as they deliver diminishing returns.
Meanwhile, innovation in the agrochemical sector has slowed, leaving the industry struggling to keep up with rapidly evolving pests. This is the gap Bindwell is targeting. Instead of updating old chemicals, the company uses AI to find completely new compounds designed for today’s pests and farming conditions.
This vision is made even more striking by the people leading it. Bindwell was founded by 18-year-old Tyler Rose and 19-year-old Navvye Anand, who met at the Wolfram Summer Research Program in 2023. Both had deep ties to agriculture, Rose in China and Anand in India, and saw up close how pest outbreaks and chemical dependence burdened farmers.
To fill that gap in today's pesticide pipeline, Bindwell built an AI system that designs and evaluates new molecules long before they reach the lab. It starts with Foldwell, the company's protein-structure model, which maps the shapes of pest proteins so scientists know where a molecule should bind. Next comes PLAPT, which can scan every known synthesized compound in a few hours to flag the ones most likely to work. For biopesticides, the company uses APPT, a model tuned to spot protein-to-protein interactions that has been shown to outperform existing tools on industry benchmarks.
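As a rough, hypothetical illustration of how such a screening loop fits together, the sketch below chains a structure-prediction step and an affinity-scoring step over a small compound library. Every function, name and score in it is an illustrative stand-in, not Bindwell's models or APIs.

```python
# Hypothetical three-stage in-silico screening pipeline, loosely mirroring
# the workflow described above. All functions here are illustrative stubs.
from dataclasses import dataclass


@dataclass
class Candidate:
    smiles: str       # compound structure in SMILES notation
    affinity: float   # mock predicted binding score (lower = tighter)


def predict_target_structure(sequence: str) -> str:
    """Stand-in for a protein-structure model (the Foldwell-like step)."""
    return f"structure:{sequence[:8]}"  # placeholder structure handle


def predict_affinity(structure: str, smiles: str) -> float:
    """Stand-in for a protein-ligand affinity predictor (the PLAPT-like step)."""
    return (abs(hash((structure, smiles))) % 1000) / 100.0  # mock score


def screen(sequence: str, library: list[str], top_k: int = 3) -> list[Candidate]:
    """Rank a compound library against one pest-protein target."""
    structure = predict_target_structure(sequence)
    scored = [Candidate(s, predict_affinity(structure, s)) for s in library]
    return sorted(scored, key=lambda c: c.affinity)[:top_k]


if __name__ == "__main__":
    pest_protein = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"          # toy sequence
    compound_library = ["CCO", "CC(=O)OC1=CC=CC=C1C(=O)O", "c1ccccc1"]
    for hit in screen(pest_protein, compound_library):
        print(f"{hit.smiles}\t{hit.affinity:.2f}")
```

In practice each stage would be a trained model validated against experimental data; the point of the sketch is only the shape of the pipeline, in which structure prediction narrows the target and affinity scoring ranks candidate compounds.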
Bindwell isn’t selling AI tools. Instead, the company develops the molecules itself and licenses them to major agrochemical players. Owning the full discovery process lets the team bake in safety, selectivity and environmental considerations from day one. It also allows Bindwell to plug directly into the pipelines that produce commercial pesticides — just with a fundamentally different engine powering the science.
The team is now testing its first AI-generated candidates in its San Carlos lab and is in early talks with established pesticide manufacturers about potential licensing deals. For Rose and Anand, the long-term vision is simple: create pest control that works without repeating the mistakes of the last half-century. As they put it, the goal is not to escalate chemical use but to design molecules that are more precise, less harmful and resilient against resistance from the start.
The upgraded CodeFusion Studio 2.0 simplifies how developers design, test and deploy AI on embedded systems.
Updated November 7, 2025, 9:31 PM

Illustration of CodeFusion Studio™ 2.0 showing AI, code and chip icons. PHOTO: ANALOG DEVICES, INC.
Analog Devices (ADI), a global semiconductor company, launched CodeFusion Studio™ 2.0 on November 3, 2025. The new version of its open-source development platform is designed to make it easier and faster for developers to build AI-powered embedded systems that run on ADI’s processors and microcontrollers.
“The next era of embedded intelligence requires removing friction from AI development”, said Rob Oshana, Senior Vice President of the Software and Digital Platforms group at ADI. “CodeFusion Studio 2.0 transforms the developer experience by unifying fragmented AI workflows into a seamless process, empowering developers to leverage the full potential of ADI's cutting-edge products with ease so they can focus on innovating and accelerating time to market”.
The upgraded platform introduces new tools for hardware abstraction, AI integration and automation. These help developers move more easily from early design to deployment.
CodeFusion Studio 2.0 enables complete AI workflows, allowing teams to use their own models and deploy them on everything from low-power edge devices to advanced digital signal processors (DSPs).
Built on Microsoft Visual Studio Code, the new CodeFusion Studio offers built-in checks for model compatibility, along with performance testing and optimization tools that help reduce development time. A new modular framework based on Zephyr OS also lets developers test and monitor how AI and machine learning models perform in real time, giving clearer insight into how each part of a model behaves during operation and helping fine-tune performance across different hardware setups.
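To picture the kind of real-time insight described here, the following sketch (which does not use ADI's tooling or APIs) times repeated inferences of a stand-in model and reports latency figures; `dummy_model` and its tensor sizes are assumptions made purely for the example.

```python
# Hypothetical sketch: monitoring inference latency of a stand-in model.
# The "model" is a plain matrix multiply, not ADI's framework or a real network.
import statistics
import time

import numpy as np


def dummy_model(x: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Placeholder for an embedded inference call."""
    return np.tanh(x @ weights)


def profile(runs: int = 200) -> None:
    rng = np.random.default_rng(0)
    weights = rng.standard_normal((256, 64)).astype(np.float32)
    latencies_ms = []
    for _ in range(runs):
        x = rng.standard_normal((1, 256)).astype(np.float32)
        start = time.perf_counter()
        dummy_model(x, weights)
        latencies_ms.append((time.perf_counter() - start) * 1000.0)
    latencies_ms.sort()
    print(f"median latency: {statistics.median(latencies_ms):.3f} ms")
    print(f"p95 latency:    {latencies_ms[int(0.95 * runs)]:.3f} ms")


if __name__ == "__main__":
    profile()
```

A production embedded setup would surface the same kind of numbers per model component and per target core, which is the sort of visibility the new framework is described as providing.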
The CodeFusion Studio System Planner has also been redesigned to handle more device types and complex, multi-core applications. New built-in diagnostic and debugging features, such as integrated memory analysis and visual error tracking, let developers troubleshoot problems faster and keep their systems running more efficiently.
This launch marks a deeper pivot for ADI. Long known for high-precision analog chips and converters, the company is expanding its edge-AI and software capabilities to enable what it calls Physical Intelligence — systems that can perceive, reason, and act locally.
“Companies that deliver physically aware AI solutions are poised to transform industries and create new, industry-leading opportunities. That's why we're creating an ecosystem that enables developers to optimize, deploy and evaluate AI models seamlessly on ADI hardware, even without physical access to a board”, said Paul Golding, Vice President of Edge AI and Robotics at ADI. “CodeFusion Studio 2.0 is just one step we're taking to deliver Physical Intelligence to our customers, ultimately enabling them to create systems that perceive, reason and act locally, all within the constraints of real-world physics”.