
Hi! PARIS Reading groups

The Hi! PARIS reading groups study a topic through scientific articles, from both a theoretical and a practical point of view. They are opportunities for interaction between our corporate donors and our affiliated academic teams around selected topics of interest.

Each edition is planned as 3-4 sessions covering one topic through 3-4 research papers. Each session combines a presentation of the mathematical models and theoretical advances by a researcher with simulations in a Python notebook by an engineer.

Registration

Please register for the event using your professional email address to receive your personal conference link. Do not share your personalised link with others; it is unique to you. You will receive an email regarding your registration status.

Edition 2 - Uncertainty

Session 3/3 - Monte Carlo Variational AutoEncoders
Tuesday 12 April, 2022 – 2.00-3.30pm (Online)

Speaker
Achille Thin, École polytechnique (CMAP)

Papers
- Monte Carlo Variational AutoEncoders

Notebook
More details coming soon
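
While the notebook details are pending, here is a minimal illustrative sketch of the standard variational autoencoder that the MCVAE paper starts from; the paper's contribution is to replace the plain ELBO computed below with tighter Monte Carlo estimators of the likelihood. The architecture sizes are arbitrary choices, not taken from the paper.

```python
# Minimal sketch of a vanilla VAE (illustrative only; the MCVAE paper
# replaces the plain ELBO below with tighter Monte Carlo estimators).
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=20, h_dim=200):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, 2 * z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):  # x in [0, 1], e.g. binarised MNIST pixels
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterisation trick
        recon = F.binary_cross_entropy_with_logits(self.dec(z), x, reduction="sum")
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum()  # KL(q(z|x) || N(0, I))
        return recon + kl  # negative ELBO, to be minimised
```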

Session 2/3 - Monte Carlo Dropout
Tuesday 8 March, 2022 – 2.00-3.30pm (Online)

Speakers
Charles Ollion, École polytechnique
Sylvain Le Corff, Télécom SudParis

Papers
- Dropout as a Bayesian approximation: representing model uncertainty in deep learning

Notebook
More details coming soon
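
While the notebook details are pending, the core recipe of the paper is easy to sketch: keep dropout active at test time and average several stochastic forward passes to obtain a predictive mean together with an uncertainty estimate. The toy network below is illustrative, not the session notebook.

```python
# Minimal sketch of Monte Carlo dropout on an illustrative toy network.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 64), nn.ReLU(),
                    nn.Dropout(p=0.1), nn.Linear(64, 1))

def mc_dropout_predict(model, x, n_samples=50):
    model.train()  # .train() keeps dropout stochastic at inference time
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)  # predictive mean and spread

x = torch.linspace(-1.0, 1.0, 20).unsqueeze(1)
mean, std = mc_dropout_predict(net, x)
```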

Download documents
Coming soon

Next sessions:
Session 3/3: Monte Carlo Variational AutoEncoders – Tuesday 12 April, 2.00-3.30pm

Session 1/3 - Bayes by backprop
Tuesday 8 February, 2022 – 2.00-3.30pm (Online)

Speakers
Charles Ollion, École polytechnique
Sylvain Le Corff, Télécom SudParis

Program
Coming soon

Papers
- Weight uncertainty in neural networks

Notebook/simulations

Colab weblink
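
While the Colab link is pending, here is a minimal sketch of the idea behind "Weight uncertainty in neural networks": a linear layer with a Gaussian variational posterior over its weights, sampled via the reparameterisation trick, whose KL term is added to the usual data-fit loss. The factorised Gaussian posterior and standard normal prior below are illustrative choices.

```python
# Minimal sketch of a Bayes-by-Backprop linear layer (illustrative choices:
# fully factorised Gaussian posterior, standard normal prior on the weights).
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    def __init__(self, n_in, n_out):
        super().__init__()
        self.w_mu = nn.Parameter(torch.zeros(n_out, n_in))
        self.w_rho = nn.Parameter(torch.full((n_out, n_in), -3.0))  # sigma = softplus(rho)
        self.bias = nn.Parameter(torch.zeros(n_out))

    def forward(self, x):
        sigma = F.softplus(self.w_rho)
        w = self.w_mu + sigma * torch.randn_like(sigma)  # reparameterisation: w = mu + sigma*eps
        # KL divergence between q(w) = N(mu, sigma^2) and the prior p(w) = N(0, 1)
        self.kl = (-torch.log(sigma) + 0.5 * (sigma ** 2 + self.w_mu ** 2) - 0.5).sum()
        return F.linear(x, w, self.bias)

# Training minimises: data negative log-likelihood + the summed .kl terms,
# weighted per minibatch as in the paper.
```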

Edition 1 - Transformers

Session 3/3 - Generative models based on Transformers
Tuesday 11 January, 2022 – 2.00-3.30pm (Online)

Speakers
Charles Ollion, École polytechnique
Sylvain Le Corff, Télécom SudParis

Papers
- The Monte Carlo Transformer

Download documents
Colab: Coming soon

Session 2/3 - Transformers for time series
Tuesday 30 November, 2021 – 2.00-3.30pm (Online)

Speakers
Charles Ollion, École polytechnique
Sylvain Le Corff, Télécom SudParis

Program
- Applications of Transformer networks for time series prediction.
- Discussion of the links with recurrent networks.

Papers
- Attention Is All You Need
- Long Short-Term Memory as a Dynamically Computed Element-wise Weighted Sum

Notebook
We will showcase a Jupyter notebook in Python using PyTorch to demonstrate the basic building blocks of transformers for time series. The notebook will be made available to run either locally or on Google Colab (no installation required).
More details coming soon
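
To give an idea of what such a building block looks like, here is a minimal illustrative PyTorch sketch (not the session notebook) of a Transformer encoder used for one-step-ahead forecasting on a univariate series; the layer sizes and the learned positional encoding are arbitrary choices.

```python
# Minimal sketch: a Transformer encoder forecasting the next value of a
# univariate time series from a fixed-length window (illustrative sizes).
import torch
import torch.nn as nn

class TimeSeriesTransformer(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2, window=32):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)  # embed each scalar observation
        self.pos = nn.Parameter(torch.zeros(1, window, d_model))  # learned positional encoding
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)  # predict the next value

    def forward(self, x):  # x: (batch, window, 1)
        h = self.encoder(self.input_proj(x) + self.pos)  # (batch, window, d_model)
        return self.head(h[:, -1])  # forecast from the last position

model = TimeSeriesTransformer()
x = torch.randn(8, 32, 1)  # 8 windows of 32 time steps
print(model(x).shape)      # torch.Size([8, 1])
```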

Download documents

Next sessions (TBC)
Session 3/3: “A stochastic transformer to capture distributions and uncertainty: SMC transformers”

Session 1/3 - Introduction
Tuesday 19 October, 2021 – 2.00-3.30pm (Online)

Speakers
Charles Ollion, École polytechnique
Sylvain Le Corff, Télécom SudParis

Program
- Introduction to transformers: motivations & current uses (~15min presentation).
- Typical mathematical models for transformers (~20min; see the attention formula below).
- Diving into details: building blocks, important tricks, example code, & visualisation of typical transformers (~40min).
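
For reference, the central equation of those models is the scaled dot-product attention from Attention Is All You Need, where Q, K and V are the query, key and value matrices and d_k is the key dimension:

```latex
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right) V
```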

Papers
- Original Transformer paper: Attention Is All You Need
- One of the most widely used masked language model Transformers: BERT
- An example of a successful Transformer model applied to time-series forecasting: NeurIPS 2019 paper

Notebook
We will showcase a Jupyter notebook in Python using PyTorch to demonstrate the basic building blocks of transformers, and how to easily use large pretrained architectures with the open-source Transformers library. The notebook will be made available to run either locally or on Google Colab (no installation required).
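
As a taste of how little code the Transformers library requires, here is a minimal sketch (not the session notebook) that loads a pretrained BERT checkpoint for masked language modelling; the checkpoint name is one common example.

```python
# Minimal sketch: masked language modelling with a pretrained BERT checkpoint
# via the Hugging Face Transformers library (downloads weights on first run).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for pred in fill_mask("Paris is the [MASK] of France."):
    print(pred["token_str"], round(pred["score"], 3))
```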

Next sessions (TBC)
Session 2/3: “Transformers for time series”
Session 3/3: “A stochastic transformer to capture distributions and uncertainty: SMC transformers”

Download documents