
Distill: Communicating the science of machine learning

Authors

Shakir Mohamed, Helen King

As in every field of science, the importance of clear communication in machine learning research cannot be over-emphasised: it helps drive forward the state of the art by allowing the research community to share, discuss and build upon new findings.

For this reason, we at DeepMind are enthusiastic supporters of Distill, a new independent, web-based medium for clear and open - demystified - machine learning research, comprising a journal, prizes recognising outstanding work, and tools to create interactive essays.

The machine learning community has always embraced new forms of scientific communication. Today, our science and practice are communicated through papers published in traditional journals, hosted on arXiv, supported by code repositories and community-driven efforts such as JMLR and JAIR, at conferences and through surveys, posters, blog posts, videos, demos, podcasts and interviews.

In this tradition, Distill makes its own unique contribution. Drawing on modern web technologies, it provides a new way to learn and understand machine learning by promoting interactive, vivid and engaging exposition, and by recognising the invaluable contributions of those who make the time to remove the mystery - and reveal the importance - of even the most seemingly obscure results.

DeepMind is proud to be a contributing sponsor of the Distill prize, an annual prize recognising outstanding work communicating and refining ideas in machine learning, and Shakir Mohamed is a member of the journal's steering committee. Ultimately, our desire is to support fresh and diverse thinking in machine learning research - to play our part, quoting William Zinsser, in creating a community of people 'finding a common thread of humanity between themselves and their speciality and their readers'.