December 2013. A PhD student at the University of Amsterdam working under Max Welling posts a paper to arXiv: "Auto-Encoding Variational Bayes."
Diederik Kingma, who goes by the Frisian nickname Durk, has found an elegant trick. Reparameterize the random sampling in a latent variable model so the randomness is injected as an external input, and the entire system becomes differentiable. Trainable by standard gradient descent.
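A minimal numpy sketch of that reparameterization. The values and the toy objective here are illustrative, not from the paper; the paper applies the same move inside a neural-network encoder. The point is that sampling z ~ N(mu, sigma²) directly is not differentiable in mu and sigma, but writing z = mu + sigma·eps with external noise eps makes it a deterministic, differentiable function of both:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.8

# Randomness injected as an external input: eps ~ N(0, 1)
eps = rng.standard_normal(200_000)

# z is now a deterministic, differentiable function of mu and sigma
z = mu + sigma * eps

# Toy objective f(z) = z^2, so E[f(z)] = mu^2 + sigma^2.
# Pathwise (chain-rule) gradients flow through z:
#   df/dmu    = 2z * dz/dmu    = 2z
#   df/dsigma = 2z * dz/dsigma = 2z * eps
grad_mu = np.mean(2 * z)            # Monte Carlo estimate, true value 2*mu = 3.0
grad_sigma = np.mean(2 * z * eps)   # Monte Carlo estimate, true value 2*sigma = 1.6

print(grad_mu, grad_sigma)
```

Without the trick, a gradient with respect to mu would have to go through the sampling operation itself; with it, ordinary backpropagation suffices, which is what makes the whole model trainable by standard gradient descent.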
The trick fit in a few lines of algebra. It launched a field. Within two years, Kingma joined the founding team of OpenAI. His PhD, completed in 2017, earned a cum laude distinction. The first awarded by Amsterdam's computer science department in thirty years.