
Hi! PARIS Reading groups

The Hi! PARIS reading groups propose to study a topic through scientific articles, from both a theoretical and a practical point of view. The reading groups are opportunities for interaction between our corporate donors and our affiliates' academic teams around selected topics of interest.

Each edition is planned as 3-4 sessions covering one topic through 3-4 research papers. Each session combines a presentation of mathematical models and theoretical advances by a researcher with simulations in a Python notebook by an engineer.

Registration

Please click here to register for the event using your professional email address to get your personal conference link. Please do not share your personalised link with others; it is unique to you. You will receive an email regarding your registration status.

Edition 1 - "Transformers"
Session 1 “Introduction”: Tuesday 19 October, 2021 – 2.00-3.30pm (Online)

Speakers
Charles Ollion, École polytechnique
Sylvain Le Corff, Télécom SudParis

Program
- Introduction to transformers: motivations & current uses (~15min presentation).
- Typical mathematical models for transformers (~20min).
- Diving into details: building blocks, important tricks, example code, & visualisation of typical transformers (~40min; see the attention sketch after this list).
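
As a preview of the building blocks discussed in the session, here is a minimal sketch of scaled dot-product attention, the core operation of the Transformer from "Attention is all you need". Tensor names and shapes are illustrative assumptions, not the session's actual notebook code.

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    # Similarity scores between queries and keys, scaled by sqrt(d_k)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        # Masked positions (e.g. padding or future tokens) are removed before the softmax
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)  # attention weights over keys
    return weights @ v, weights

# Toy example: batch of 2 sequences, length 5, model dimension 8
q = k = v = torch.randn(2, 5, 8)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)  # torch.Size([2, 5, 8]) torch.Size([2, 5, 5])
```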

Papers
- Original Transformers paper: Attention is all you need
- One of the most widely used masked language model Transformers: BERT
- An example of a successful Transformer model applied to time-series forecasting: Neurips2019Paper

Notebook
We will walk through a Jupyter notebook in Python using PyTorch, demonstrating the basic building blocks of transformers and how to use large pretrained architectures easily with the Transformers open-source library. The notebook will be made available, both to run locally and on Google Colab (no installation required).
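
As a rough illustration of this kind of usage (the model name and task below are our own assumptions, not necessarily those of the session), loading a pretrained BERT with the Transformers library looks like this:

```python
# A minimal sketch, assuming the Hugging Face `transformers` library is installed
# (pip install transformers torch); "bert-base-uncased" is an illustrative choice.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Attention is all you need.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual embeddings for each token: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```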

Next sessions (TBC)
Session 2/3: “Transformers for time series”
Session 3/3: “A stochastic transformer to capture distributions and uncertainty: SMC transformers”