Durk Kingma (@dpkingma)'s Twitter Profile
Durk Kingma

@dpkingma

Deep learning, mostly generative models. Prev. @Google Brain/DeepMind, founding team @OpenAI. Inventor of the VAE, Adam optimizer, among other things. ML PhD.

ID: 25263396

Website: http://dpkingma.com · Joined: 19-03-2009 09:32:31

660 Tweets

37.37K Followers

386 Following

Durk Kingma (@dpkingma)'s Twitter Profile Photo

I found it interesting to re-read this post by Mohammed AlQuraishi from 10 years ago (2013): moalquraishi.wordpress.com/2013/03/17/wha… He was mostly right, except that the development of AI into a mass-market technology started earlier than he expected.

Yisong Yue (@yisongyue)'s Twitter Profile Photo

Congratulations to Durk Kingma and Max Welling for winning the inaugural ICLR Test of Time Award for their amazing work on Auto-Encoding Variational Bayes, the paper that proposed Variational Autoencoders! arxiv.org/abs/1312.6114

Durk Kingma (@dpkingma)'s Twitter Profile Photo

Thanks to the ICLR Award Committee! And thank you for the kind words, Max! You were the perfect Ph.D. advisor and collaborator, kind and inspiring. I really couldn't have wished for better.

Sirui Xie (@siruixie)'s Twitter Profile Photo

📢 Excited to share EM Distillation (EMD), a maximum likelihood method that distills pretrained diffusion models to one-step generators. EMD gracefully interpolates between mode-seeking and mode-covering KL to better capture the teacher's distribution. arxiv.org/abs/2405.16852

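To make the mode-seeking vs. mode-covering distinction concrete, here is a minimal numerical sketch. It is not EMD's actual objective (which operates on diffusion models, not discrete distributions); it only illustrates the two KL directions and, as a simplifying assumption, interpolates them with a plain convex combination. The function names `kl` and `interpolated_kl` are hypothetical, and the distributions are assumed to share support.

```python
import numpy as np

def kl(p, q):
    """KL(p || q) for discrete distributions; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p = 0 contribute zero
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def interpolated_kl(p, q, alpha):
    """Illustrative interpolation: alpha = 0 gives the mode-covering
    direction KL(p || q), alpha = 1 gives the mode-seeking direction
    KL(q || p). (A toy stand-in for EMD's interpolation, not the paper's.)"""
    return (1 - alpha) * kl(p, q) + alpha * kl(q, p)

# Teacher p spreads mass over two dominant modes; student q sits on one.
p = np.array([0.45, 0.45, 0.10])
q = np.array([0.90, 0.05, 0.05])

forward = kl(p, q)   # mode-covering: penalizes q missing teacher mass
reverse = kl(q, p)   # mode-seeking: penalizes q placing mass off-teacher
```

Intuitively, minimizing `forward` pushes the student to cover every teacher mode, while minimizing `reverse` lets it collapse onto one; sweeping `alpha` trades off between the two behaviors.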