University of Guelph · Vector Institute

Graham Taylor

Professor and Canada Research Chair in Machine Learning. Developing fundamental deep learning methods, with biodiversity science as both motivation and a sandbox for methods at scale.


I lead the Machine Learning Research Group at the University of Guelph. We develop deep learning methods spanning generative modelling, multimodal representation learning across vision, DNA, and language, graph neural networks, and uncertainty quantification — with recurring interests in learning under limited supervision, AI for science, and the design of LLM-based agents as collaborators in research workflows.

Much of this research is motivated by the challenge of monitoring and preserving Earth's biodiversity. With biologists and ecologists, we build large-scale multimodal datasets and foundation models that integrate images, DNA barcodes, and text. I am a co-PI of BIOSCAN, a global initiative monitoring multicellular life at planetary scale, and contribute as senior personnel to the NSF/NSERC-funded Artificial Intelligence and Biodiversity Change (ABC) Global Center, an international team developing AI for ecosystem monitoring.

I co-direct the Centre for Advancing Responsible and Ethical AI (CARE-AI) at Guelph, am a Canada CIFAR AI Chair and faculty member at the Vector Institute, and serve as Academic Director of Next AI, a non-profit accelerator for AI-focused entrepreneurs.

News

Apr 2026 On 2026-04-02 I gave a Science on Tap talk at Royal City Brewery, Guelph: "From Barcode to Browser: How AI Turns Insect DNA into Biodiversity Data."
Mar 2026 On 2026-03-06 I gave a keynote talk at the ORF-RE11 Project Team Meeting.
Sep 2025 Nathan Grewal joined our group as a PhD student.
Aug 2025 On 2025-08-07 I gave a keynote talk at the BIO2 Symposium.
All news →

Research Highlights

Self-Distillation of Hidden Layers for Self-Supervised Representation Learning
Scott Lowe, Anthony Fuller, Sageev Oore, Evan Shelhamer, Graham Taylor
arXiv preprint arXiv:2603.15553, 2026
Self-distilling intermediate layer representations to improve self-supervised visual learning.
CLIBD: Bridging Vision and Genomics for Biodiversity Monitoring at Scale
ZeMing Gong, Austin Wang, Xiaoliang Huo, Joakim Bruslund Haurum, Scott Lowe, Graham Taylor, Angel Chang
International Conference on Learning Representations (ICLR), 2025
Contrastive learning across images, DNA barcodes, and taxonomic labels for zero-shot insect classification.
BarcodeMamba: State Space Models for Biodiversity Analysis
Tiancheng Gao, Graham Taylor
Neural Information Processing Systems (NeurIPS) Workshop on Foundation Models for Science, 2024
State-space models applied to DNA barcode sequences for taxonomic classification at scale.
BIOSCAN-5M: A Multimodal Dataset for Insect Biodiversity
Zahra Gharaee, Scott Lowe, ZeMing Gong, Pablo Millan Arias, Nicholas Pellegrino, Austin Wang, Joakim Bruslund Haurum, Iuliia Zarubiieva, Lila Kari, Dirk Steinke, Graham Taylor, Paul Fieguth, Angel Chang
Neural Information Processing Systems (NeurIPS) Datasets and Benchmarks Track, 2024
A multimodal dataset of over 5 million insect specimens with images, DNA barcodes, and taxonomic labels.
Selected publications →

Group

Anna Viklund
Software Engineer
Emma Boehly
Research Associate
Iuliia Eyriay
Postdoctoral Fellow & Research Manager
Michal Lisicki
PhD Student
Mohamed Mostafa
MASc Student
Nathan Grewal
PhD Student
Scott Lowe
Postdoctoral Fellow
Tiancheng Gao
MASc Student
Vivian Phung
MSc Student
Alumni & former members →

Want to join the lab?

We're a remote-friendly ML research lab that publishes openly and ships datasets and tools researchers actually use. If you want to know what it's like to work here, read on.

How We Work