Weekly Reading Group

Upcoming

February 22, 2021 @ 21:00 JST

Seminar. Talk by Andrew Foong and David Burt: “On the Expressiveness of Approximate Inference in Bayesian Neural Networks”. [Paper]

February 08, 2021 @ 21:00 JST

Well-being chat.

February 01, 2021 @ 21:00 JST

Seminar. Talk by Blair Bilodeau and Jeffrey Negrea: “Relaxing the I.I.D. Assumption: Adaptively Minimax Optimal Regret via Root-Entropic Regularization”. [Paper]

January 25, 2021 @ 21:00 JST

Mutual feedback on ICML papers.

January 18, 2021 @ 21:00 JST

Catch-up meeting.

Past Meetings

December 17, 2020

Seminar (unusual day!). Talk by Erik.

November 30, 2020

Reading group. Qi Qian, Hao Li, Juhua Hu: “Efficient Kernel Transfer in Knowledge Distillation”. [Paper]

November 16, 2020

Seminar. Talk by Peter and Thomas: uncertainty estimation for Bayesian neural networks using infinite-width networks.

November 06, 2020

Pre-ICLR paper discussion.

October 30, 2020

Reading group. Ben Recht: “A Tour of Reinforcement Learning: The View from Continuous Control”. [Paper]

October 23, 2020

Teamwork review for the past four months. Siddharth’s talk on functional regularisation of the memorable past. [Paper]

October 22, 2020

Unusual day and time! Talk by Dimitri Meunier: “Meta-Learning meets Variational Inference: Learning priors with guarantees”.

October 09, 2020

Feedback discussion on the new team webpage.

October 02, 2020

Talk by Benjamin Guedj: “A (condensed) primer on PAC-Bayesian Learning”. [Slides]


September 25, 2020

Talk by François-Xavier Briol: “Stein’s Method for Computational Statistics and Machine Learning”. [Slides]

September 18, 2020

Research chat and virtual breakfast/lunch/dinner in gather.town.

September 11, 2020

Siddharth Swaroop: A survey on federated learning.

September 04, 2020

Happy Buzaaba and Fariz Ikhwantri: A survey on transfer learning. [Slides]

August 28, 2020

Lab members project overview and discussion.

August 21, 2020

Pierre’s grant presentation rehearsal and feedback.

August 14, 2020

Reading group. “Gradient descent for wide two-layer neural networks”. [Blog post]

August 07, 2020

Reading group. “Generalized Variational Inference: Three arguments for deriving new posteriors”. [Paper]

July 31, 2020

Reading group. “On the measure of intelligence”. [Paper]

July 17, 2020

Whole week: comments on ICML tutorials and talks.

July 10, 2020

Pre-ICML paper discussion.

July 03, 2020

Talk by Evgenii Egorov. “Involutive MCMC: one way to derive them all”. [Paper]

June 26, 2020

Talk by Giulia Denevi on Efficient Lifelong Learning Algorithms: Regret Bounds and Statistical Guarantees.

June 12, 2020

Internal kickoff meeting.

June 05, 2020

Talk by Peter Nickl on “Variational Bayes for Infinite Mixtures of Local Regressors with Robotics Applications”. [Thesis]

May 29, 2020

Talk by Gian Maria Marconi on Manifold Regression by Structured Prediction: methodology and applications. [Paper]

May 22, 2020

Feedback discussion on our NeurIPS submissions.

May 15, 2020

Talk by Thomas Möllenhoff on Flat Metric Minimization with Applications in Generative Modeling. [Paper]

May 08, 2020

Talk by Evgenii Egorov on MaxEntropy Pursuit Variational Inference. [Paper]

May 01, 2020

Reading Group. “Continual Deep Learning by Functional Regularisation of the Memorable Past”. [Paper]

April 24, 2020

Reading Group. Shalev-Shwartz: “Introduction to online learning”, Chapter 2 (end). [Paper]

April 17, 2020

Reading Group. Shalev-Shwartz: “Introduction to online learning”, Chapter 2. [Paper]

April 09, 2020

Reading Group. Shalev-Shwartz: “Introduction to online learning”, Chapter 1. [Paper]

April 03, 2020

Reading Group. “Maximum Entropy Principle”, Jaynes (1957). [Paper I], [Paper II]