A closer look at how machine intelligence is helping doctors see cancer in an entirely new light.
Updated
November 28, 2025 4:18 PM

Serratia marcescens colonies on BTB agar medium. PHOTO: UNSPLASH
Artificial intelligence is beginning to change how scientists understand cancer at the cellular level. In a new collaboration, Bio-Techne Corporation, a global life sciences tools provider, and Nucleai, an AI company specializing in spatial biology for precision medicine, have unveiled data from the SECOMBIT clinical trial that could reshape how doctors predict cancer treatment outcomes. The results, presented at the Society for Immunotherapy of Cancer (SITC) 2025 Annual Meeting, highlight how AI-powered analysis of tumor environments can reveal which patients are more likely to benefit from specific therapies.
Conducted in collaboration with Professor Paolo Ascierto of the University of Naples Federico II and Istituto Nazionale Tumori IRCCS Fondazione Pascale, the study explores how spatial biology — the science of mapping where and how cells interact within tissue — can uncover subtle immune behaviors linked to survival in melanoma patients.
Using Bio-Techne’s COMET platform and a 28-plex multiplex immunofluorescence panel, researchers analyzed 42 pre-treatment biopsies from patients with metastatic melanoma, an advanced stage of skin cancer. Nucleai’s multimodal AI platform integrated these imaging results with pathology and clinical data to trace patterns of immune cell interactions inside tumors.
The findings revealed that therapy sequencing significantly influences immune activity and patient outcomes. Patients who received targeted therapy followed by immunotherapy showed stronger immune activation, marked by higher levels of PD-L1+ CD8 T-cells and ICOS+ CD4 T-cells. Those who began with immunotherapy benefited most when PD-1+ CD8 T-cells engaged closely with PD-L1+ CD4 T-cells along the tumor’s invasive edge. Meanwhile, in patients alternating between targeted and immune treatments, beneficial antigen-presenting cell (APC) and T-cell interactions appeared near tumor margins, whereas macrophage activity in the outer tumor environment pointed to poorer prognosis.
“This study exemplifies how our innovative spatial imaging and analysis workflow can be applied broadly to clinical research to ultimately transform clinical decision-making in immuno-oncology,” said Matt McManus, President of the Diagnostics and Spatial Biology Segment at Bio-Techne.
The collaboration between the two companies underscores how AI and high-plex imaging together can help decode complex biological systems. As Avi Veidman, CEO of Nucleai, explained, “Our multimodal spatial operating system enables integration of high-plex imaging, data and clinical information to identify predictive biomarkers in clinical settings. This collaboration shows how precision medicine products can become more accurate, explainable and differentiated when powered by high-plex spatial proteomics – not limited by low-plex or H&E data alone.”
Dr. Ascierto described the SECOMBIT trial as “a milestone in demonstrating the possible predictive power of spatial biomarkers in patients enrolled in a clinical study.”
The study’s broader message is clear: understanding where immune cells are and how they interact inside a tumor could become just as important as knowing what they are. As AI continues to map these microscopic landscapes, oncology may move closer to genuinely personalized treatment — one patient, and one immune network, at a time.
Keep Reading
Redefining sensor performance with advanced physical AI and signal processing.
Updated
December 16, 2025 3:28 PM
Robot with human features, equipped with a visual sensor. PHOTO: UNSPLASH
Atomathic, the company once known as Neural Propulsion Systems, is stepping into the spotlight with a bold claim: its new AI platforms can help machines “see the invisible.” With the commercial launch of AIDAR™ and AISIR™, the company says it is opening a new chapter for physical AI, AI sensing and advanced sensor technology across automotive, aviation, defense, robotics and semiconductor manufacturing.
The idea behind these platforms is simple yet ambitious. Machines gather enormous amounts of signal data, yet they still struggle to understand the faint, fast or hidden details that matter most when making decisions. Atomathic says its software closes that gap. By applying AI signal processing directly to raw physical signals, the company aims to help sensors pick up subtle patterns that traditional systems miss, enabling faster reactions and more confident autonomous system performance.
"To realize the promise of physical AI, machines must achieve greater autonomy, precision and real-time decision-making—and Atomathic is defining that future," said Dr. Behrooz Rezvani, Founder and CEO of Atomathic. "We make the invisible visible. Our technology fuses the rigor of mathematics with the power of AI to transform how sensors and machines interact with the world—unlocking capabilities once thought to be theoretical. What can be imagined mathematically can now be realized physically."
This technical shift is powered by Atomathic’s deeper mathematical framework. The core of its approach is a method called hyperdefinition technology, which uses the Atomic Norm and fast computational techniques to map sparse physical signals. In simple terms, it pulls clarity out of chaos, enabling ultra-high-resolution signal visualization in real time at a scale the company claims has never before been achieved in sensing.
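Atomathic has not disclosed the internals of its hyperdefinition method, so the following is only a loose illustration of the underlying idea: recovering a sparse signal from a limited number of noisy measurements. The sketch uses a classic compressed-sensing setup with plain L1-regularized least squares (ISTA) in NumPy rather than atomic-norm minimization, and every dimension and parameter below is invented for the demo.

```python
import numpy as np

# Toy sparse-recovery demo (NOT Atomathic's algorithm): a few noisy
# measurements of a signal composed of only 3 out of 256 candidate
# components, recovered with L1-regularized least squares.
rng = np.random.default_rng(0)

n_meas, n_atoms = 64, 256
A = rng.standard_normal((n_meas, n_atoms))
A /= np.linalg.norm(A, axis=0)            # unit-norm dictionary columns

x_true = np.zeros(n_atoms)                # ground truth: 3 active atoms
x_true[[20, 75, 190]] = [1.0, 0.6, 0.8]
y = A @ x_true + 0.01 * rng.standard_normal(n_meas)

# ISTA: gradient step on 0.5*||Ax - y||^2, then soft-threshold for the
# L1 penalty lam*||x||_1, which drives most coefficients exactly to zero.
lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2
x = np.zeros(n_atoms)
for _ in range(2000):
    x = x - step * (A.T @ (A @ x - y))
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)

top3 = sorted(np.argsort(np.abs(x))[-3:].tolist())
print("recovered active components:", top3)
```

In this toy problem the recovery works on a fixed grid of candidate components; atomic-norm methods pursue the same goal over a continuous parameter space, without committing to a grid, which is what permits resolution finer than any grid spacing.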
AIDAR and AISIR are already being trialed and integrated across multiple sectors, and they’re designed to work with a broad range of hardware. That hardware-agnostic design is poised to matter even more as industries shift toward richer, more detailed sensing. Analysts expect the automotive sensor market to surge in the coming years, with radar imaging, next-gen ADAS systems and high-precision machine perception playing increasingly central roles.
Atomathic’s technology comes from a tight-knit team with deep roots in mathematics, machine intelligence and AI research, drawing talent from institutions such as Caltech, UCLA, Stanford and the Technical University of Munich. After seven years of development, the company is ready to show its progress publicly, starting with demonstrations at CES 2026 in Las Vegas.
If the future of autonomy depends on machines perceiving the world with far greater fidelity, then Atomathic is betting that the next leap forward won’t come from more hardware, but from rethinking the math behind the signal, and from redefining what physical AI can do.