Philipp Hennig (@philipphennig5) 's Twitter Profile
Philipp Hennig

@philipphennig5

Professor for the Methods of Machine Learning at the University of Tübingen.

ID: 1215966313765646336

Link: http://mml.inf.uni-tuebingen.de
Joined: 11-01-2020 11:59:31

563 Tweets

5.5K Followers

322 Following

Autonomous Vision Group (@autovisiongroup) 's Twitter Profile Photo

After 2 years of hard work by the team, we are thrilled to release scholar-inbox.com today! Scholar Inbox is a personal paper recommender which enables you to stay up-to-date with the most relevant progress by delivering personal suggestions directly to your inbox.🧵

Conor Hassan (@hassanconor) 's Twitter Profile Photo

Textbooks I want to read in the near future: 

"Understanding Deep Learning", <a href="/SimonPrinceAI/">Simon Prince</a> 
"Deep Learning", Bishop &amp; Bishop 
"Probabilistic Numerics", <a href="/PhilippHennig5/">Philipp Hennig</a> 
(1/3)
Philipp Hennig (@philipphennig5) 's Twitter Profile Photo

Still some places available for probnumschool.org 2024. Join us in Southampton in April to discuss how the algorithmic bedrock of machine learning can be adapted to quantify computational uncertainty from the ground up. Beautiful methods, amazing people. A place for you?

Michèle Finck (@finck_m) 's Twitter Profile Photo

I am organising a PhD Summer School on Artificial Intelligence and Law next May! This has been designed specifically for doctoral researchers working in this area. Travel and accommodation costs will be covered for all participants. More here: ailawinstitute.de

Michiel ☁️ (@michielstock) 's Twitter Profile Photo

Really enjoying Michael A Osborne's textbook on probabilistic numerics, a refreshing take on the field! The musings on the difference between "probabilistic" and "stochastic" are also thought-provoking (though I might not agree with everything).

AndrzejWasowski@scholar.social 🌻 🕊️ (@andrzejwasowski) 's Twitter Profile Photo

Probabilistic Numerics—Computation as Inference by Philipp Hennig

Register and find out what Philipp Hennig will present at the PICS PhD School in October in Copenhagen here: 
etaps.org/about/fopss-sc…

#pics2024 #etaps2024 #probability
<a href="/sigplan/">ACM SIGPLAN</a>  <a href="/acmsiglog/">ACM SIGLOG</a>  <a href="/ETAPSconf/">ETAPS Conferences</a> <a href="/eatcs_secretary/">EATCS</a>
Shubhendu Trivedi (@_onionesque) 's Twitter Profile Photo

Enjoyed reading this, esp. as I try to educate myself on Fourier neural operators. arxiv.org/abs/2406.05072 Quite an elegant combination/development of ideas towards quantifying the uncertainty of FNOs.

PS: I know one of the authors has a mysterious existence on twitter: Tobias Weber
Zachary Nado (@zacharynado) 's Twitter Profile Photo

"Non-diagonal preconditioning has dethroned Nesterov Adam" 🧴👑 shampoo wins, finally the community can know what we have for years! this benchmark has been 3+ years in the making (we first talked about it Google in 2021), I'm beyond psyched that it's finally yielded results!

AI at Meta (@aiatmeta) 's Twitter Profile Photo

The #AlgoPerf competition was designed to find better training algorithms to speed up neural network training across a diverse set of workloads. 🎉 We’re proud to share that teams from Meta took first place across both external tuning and self-tuning tracks!

Google AI (@googleai) 's Twitter Profile Photo

Congratulations to everyone who submitted to the MLCommons AlgoPerf training algorithms competition! We were delighted to provide compute resources for evaluating so many exciting submissions.

Philipp Hennig (@philipphennig5) 's Twitter Profile Photo

If you’re looking for cold, hard numbers to improve your deep learning training stack, Frank and the AlgoPerf Competition have got you covered. Results out now. What an incredible effort to establish standards in a domain full of black magic!

Philipp Hennig (@philipphennig5) 's Twitter Profile Photo

Could it be that, a good 50 years after Broyden, Fletcher, Goldfarb, Shanno, Davidon, and Powell, variable metric updates are making their comeback in deep learning?
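
A minimal sketch of what a variable metric (BFGS-style) update looks like, for readers who have not met the idea of Broyden, Fletcher, Goldfarb, Shanno, Davidon, and Powell: rather than preconditioning gradients with a fixed or diagonal matrix, the metric H is itself updated from observed gradient differences. The quadratic test problem, dimensions, and exact line search below are illustrative assumptions, not anything from the thread.

import numpy as np

# Ill-conditioned quadratic test objective f(x) = 0.5 * x^T A x (illustrative choice)
A = np.diag([1.0, 10.0, 100.0])
grad = lambda x: A @ x

x = np.array([1.0, 1.0, 1.0])
H = np.eye(3)                              # initial inverse-metric estimate
g = grad(x)
for _ in range(3):                         # with exact line search, BFGS needs at most n steps on a quadratic
    p = -H @ g                             # search direction in the current metric
    alpha = -(g @ p) / (p @ A @ p)         # exact line search, possible here because f is quadratic
    x_new = x + alpha * p
    g_new = grad(x_new)
    s, y = x_new - x, g_new - g            # displacement and gradient change
    rho = 1.0 / (y @ s)
    I = np.eye(3)
    # BFGS update: the metric adapts to observed curvature,
    # unlike a fixed diagonal preconditioner
    H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
    x, g = x_new, g_new

print(np.linalg.norm(grad(x)))             # gradient norm is essentially zero after three steps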

Philipp Hennig (@philipphennig5) 's Twitter Profile Photo

Despite everything, in 2024, three academic researchers can still build a self-tuning optimiser that competes with the big industry labs. Congratulations to my next-door neighbours at the ELLIS Institute Tübingen: Niccolò Ajroldi, Jonas Geiping, and Antonio Orvieto!