Good science demands transparency, reproducibility, and rigour. The software underpinning it should be no different. In labs, hospitals, and research institutes, Rust is beginning to appear where it matters most: places where correctness and clarity aren't just nice-to-haves, but the foundations of trustworthy research.
This talk explores what it means to write scientific software that lives up to the standards we expect of science itself. We'll look at how Rust's emphasis on explicitness and safety aligns naturally with the principles of open, reproducible research, and how we can go further by treating tests as proof, documentation as methodology, and readable code as a form of scientific communication.
Drawing on examples from epidemiology, synthetic data, and biomedical infrastructure, we'll examine how to build tools that are auditable, maintainable, and built to last. We'll also reflect on how the choices we make today, in our dependencies, our environments, and our defaults, shape whether the next generation of researchers can understand, verify, and build on our work.
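One way to make "tests as proof, documentation as methodology" concrete is Rust's doc-tests: examples in a function's documentation are extracted, compiled, and run by `cargo test`, so the documentation cannot silently drift from the code. A minimal sketch, assuming a hypothetical `attack_rate` helper in a crate named `your_crate` (neither name is from the talk):

````rust
/// Proportion of an at-risk population that became cases.
///
/// This example is compiled and executed by `cargo test`, so the
/// documented behaviour is continuously verified:
///
/// ```
/// # use your_crate::attack_rate;
/// assert_eq!(attack_rate(20, 100), Some(0.2));
/// assert_eq!(attack_rate(5, 0), None); // empty population: undefined
/// ```
pub fn attack_rate(cases: u32, population: u32) -> Option<f64> {
    if population == 0 {
        None // refuse to divide by zero rather than return an infinite rate
    } else {
        Some(f64::from(cases) / f64::from(population))
    }
}

fn main() {
    // The same claims hold when called directly.
    assert_eq!(attack_rate(20, 100), Some(0.2));
    assert_eq!(attack_rate(5, 0), None);
}
````

Because the documented example is itself a test, the methodology section of the docs stays verifiably in sync with the implementation.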

I’ll share what the Rust job market really looks like in 2025 — where companies are hiring, which skills stand out, and how the recruitment process actually works behind the scenes.

This talk puts popular Rust rewrites to the test. We'll examine how these tools stack up against their battle-tested predecessors, looking at real-world performance, compilation times, binary sizes, feature completeness, and ecosystem maturity.

In this introductory talk, we will explore what it means to "Ratatuify" the Rust package manager, Cargo: that is, to give it a terminal user interface built with the Ratatui crate.

This talk explores how Rust’s type system and memory safety can be leveraged to enforce mandatory guardrails at the infrastructure level, where traditional frameworks often fall short.
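To make the idea tangible, here is a hedged sketch of one such type-level guardrail: a newtype whose only public constructor validates its input, so downstream APIs can demand already-validated data in their signatures. `SanitizedId` and `store_record` are hypothetical names for illustration, not from the talk.

```rust
/// An identifier that can only be constructed through validation.
#[derive(Debug)]
pub struct SanitizedId(String);

impl SanitizedId {
    /// Accept only non-empty, ASCII-alphanumeric identifiers.
    pub fn new(raw: &str) -> Result<Self, String> {
        if !raw.is_empty() && raw.chars().all(|c| c.is_ascii_alphanumeric()) {
            Ok(SanitizedId(raw.to_string()))
        } else {
            Err(format!("rejected identifier: {raw:?}"))
        }
    }

    pub fn as_str(&self) -> &str {
        &self.0
    }
}

/// The guardrail is mandatory: this function cannot be handed a raw,
/// unvalidated `&str`, because the type system will not allow it.
fn store_record(id: &SanitizedId) -> String {
    format!("stored record {}", id.as_str())
}

fn main() {
    let ok = SanitizedId::new("patient42").expect("valid id");
    assert_eq!(store_record(&ok), "stored record patient42");
    // Injection-style input never reaches `store_record`.
    assert!(SanitizedId::new("DROP TABLE;").is_err());
}
```

The guardrail lives in the type rather than in a framework convention, which is the distinction the abstract draws against traditional middleware-based validation.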

This talk explores building a complete self-hosted LLM stack in Rust: Paddler, a distributed load balancer for serving LLMs at scale, and Poet, a static site generator that consumes those LLMs for AI-powered content features.