Bindwell is testing a simple idea: use AI to design smarter, more targeted pesticides built for today’s farming challenges.
Updated
January 8, 2026 6:33 PM

Researcher tending seedlings in a laboratory environment. PHOTO: FREEPIK
Bindwell, a San Francisco–based ag-tech startup using AI to design new pesticide molecules, has raised US$6 million in seed funding, co-led by General Catalyst and A Capital, with participation from SV Angel and Y Combinator founder Paul Graham. The round will help the company expand its lab in San Carlos, hire more technical talent and advance its first pesticide candidates toward validation.
Even as pesticide use has doubled over the last 30 years, up to 40% of global crop production is still lost to pests and disease. The core issue is resistance: pests are adapting faster than the industry can update its tools. As a result, farmers often rely on larger amounts of the same outdated chemicals, even as those chemicals deliver diminishing returns.
Meanwhile, innovation in the agrochemical sector has slowed, leaving the industry struggling to keep up with rapidly evolving pests. This is the gap Bindwell is targeting. Instead of updating old chemicals, the company uses AI to find completely new compounds designed for today’s pests and farming conditions.
This vision is made even more striking by the people leading it. Bindwell was founded by 18-year-old Tyler Rose and 19-year-old Navvye Anand, who met at the Wolfram Summer Research Program in 2023. Both have deep ties to agriculture, Rose in China and Anand in India, and both saw up close how pest outbreaks and chemical dependence burdened farmers.
Filling the gap in today’s pesticide pipeline, Bindwell built an AI system that can design and evaluate new molecules long before they reach the lab. It starts with Foldwell, the company’s protein-structure model, which maps the shapes of pest proteins so scientists know where a molecule should bind. Then comes PLAPT, which can scan every known synthesized compound in just a few hours and predict which ones are likely to bind the target. For biopesticides, the team uses APPT, a model tuned to spot protein–protein interactions and shown to outperform existing tools on industry benchmarks.
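To make the screening step concrete, here is a minimal sketch of how an affinity model like PLAPT could be used to rank a compound library against a pest protein. This is an illustration only: the function `predictBindingAffinity`, the `Candidate` shape, the use of SMILES strings as compound input and the higher-is-better scoring convention are all assumptions for the sketch, not Bindwell’s actual interface.

```typescript
// Hypothetical sketch of AI-driven virtual screening; not Bindwell's code.
// Assumptions: compounds are described as SMILES strings, and the injected
// predictBindingAffinity model returns a score where higher = tighter binding.
type Candidate = { id: string; smiles: string };
type Scored = Candidate & { score: number };

async function screenLibrary(
  targetProtein: string, // pest protein sequence chosen as the binding target
  library: Candidate[],
  predictBindingAffinity: (protein: string, smiles: string) => Promise<number>,
  topK = 100,
): Promise<Scored[]> {
  const scored: Scored[] = [];
  for (const candidate of library) {
    // Score each known compound against the target protein.
    const score = await predictBindingAffinity(targetProtein, candidate.smiles);
    scored.push({ ...candidate, score });
  }
  // Keep only the strongest predicted binders for lab validation.
  return scored.sort((a, b) => b.score - a.score).slice(0, topK);
}
```

The design point the article describes is exactly this filter: the model narrows an enormous compound space in hours, so that only a short list of promising candidates ever needs to be synthesized and tested in the San Carlos lab.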
Bindwell isn’t selling AI tools. Instead, the company develops the molecules itself and licenses them to major agrochemical players. Owning the full discovery process lets the team bake in safety, selectivity and environmental considerations from day one. It also allows Bindwell to plug directly into the pipelines that produce commercial pesticides — just with a fundamentally different engine powering the science.
The team is now testing its first AI-generated candidates in its San Carlos lab and is in early talks with established pesticide manufacturers about potential licensing deals. For Rose and Anand, the long-term vision is simple: create pest control that works without repeating the mistakes of the last half-century. As they put it, the goal is not to escalate chemical use but to design molecules that are more precise, less harmful and built from the start to withstand resistance.
Keep Reading
The focus is no longer just AI-generated worlds, but how those worlds become structured digital products
Updated
February 20, 2026 6:50 PM

The inside of a pair of HTC VR goggles. PHOTO: UNSPLASH
As AI tools improve, creating 3D content is becoming faster and easier. However, building that content into interactive experiences still requires time, structure and technical work. That difference between generation and execution is where HTC VIVERSE and World Labs are focusing their new collaboration.
HTC VIVERSE is a 3D content platform developed by HTC. It provides creators with tools to build, refine and publish interactive virtual environments. Meanwhile, World Labs is an AI startup founded by researcher Fei-Fei Li and a team of machine learning specialists. The company recently introduced Marble, a tool that generates full 3D environments from simple text, image or video prompts.
While Marble can quickly create a digital world, that world on its own is not yet a finished experience. It still needs structure, navigation and interaction. This is where VIVERSE fits in. By combining Marble’s world generation with VIVERSE’s building tools, creators can move from an AI-generated scene to a usable, interactive product.
In practice, the workflow has two steps. First, Marble produces the base 3D environment. Then creators bring that environment into VIVERSE, where they add game mechanics, scenes and interactive elements. In this model, AI handles the early visual creation, while the human creator defines how users explore and interact with the world.
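As a rough illustration of that two-step split, the sketch below loads a generated scene into a web viewer and layers one simple interaction on top. It uses three.js as a generic stand-in for a world-building tool, and it assumes the AI-generated world was exported as a glTF file named "marble_world.glb"; the filename, the export format and the choice of library are assumptions, as VIVERSE’s actual tooling and Marble’s export pipeline may differ.

```typescript
// Generic sketch of "generate, then make interactive"; not VIVERSE/Marble code.
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 1000);
camera.position.set(0, 1.6, 3);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// Step 1: bring the AI-generated environment in (assumed glTF export).
new GLTFLoader().load("marble_world.glb", (gltf) => scene.add(gltf.scene));

// Step 2: the human-authored layer. Here, clicking any object in the
// generated world reports it, standing in for quests, portals, etc.
const raycaster = new THREE.Raycaster();
addEventListener("click", (e) => {
  const pointer = new THREE.Vector2(
    (e.clientX / innerWidth) * 2 - 1,
    -(e.clientY / innerHeight) * 2 + 1,
  );
  raycaster.setFromCamera(pointer, camera);
  const hit = raycaster.intersectObjects(scene.children, true)[0];
  if (hit) console.log("interacted with", hit.object.name);
});

renderer.setAnimationLoop(() => renderer.render(scene, camera));
```

The split mirrors the article’s point: generation supplies the scene in one shot, while everything a user can actually do inside it still has to be authored on top.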
To demonstrate this process, the companies developed three example projects. Whiskerhill turns a Marble-generated world into a simple quest-based experience. Whiskerport connects multiple AI-generated scenes into a multi-level environment that users navigate through portals. Clockwork Conspiracy, built by VIVERSE, uses Marble’s generation system to create a more structured, multi-scene game. These projects are not just demos. They serve as proof that AI-generated worlds can evolve beyond static visuals and become interactive environments.
This matters because generative AI is often judged by how quickly it produces content. However, speed alone does not create usable products. Digital experiences still require sequencing, design decisions and user interaction. As a result, the real challenge is not generation, but integration — connecting AI output to tools that make it functional.
Seen in this context, the collaboration is less about a single product and more about workflow. VIVERSE provides a system that allows AI-generated environments to be edited and structured. World Labs provides the engine that creates those environments in the first place. Together, they are testing whether AI can fit directly into a full production pipeline rather than remain a standalone tool.
Ultimately, the collaboration reflects a broader change in creative technology. AI is no longer only producing isolated assets. It is beginning to plug into the larger process of building complete experiences. The key question is no longer how quickly a world can be generated, but how easily that world can be turned into something people can actually use and explore.