Scientific Calendar Event



Starts 6 Nov 2024 14:30
Ends 8 Nov 2024 13:00
Central European Time
Giambiagi Lecture Room (Adriatico Guesthouse) and Luigi Stasi Seminar Room (Leonardo Building)

ICOMP is hosting this short course at ICTP. Taught by Cristiano De Nobili, it is designed for young researchers in applied and theoretical physics and climate science who are interested in applying large language models (LLMs) in their research.
 
Schedule and venues:
Wednesday (6/11/24): 14.00 - 18.00, Giambiagi Lecture Room (Adriatico Guesthouse)
Thursday (7/11/24): 9.00 - 13.00, Luigi Stasi Seminar Room (Leonardo Building)
Friday (8/11/24): 9.00 - 13.00, Luigi Stasi Seminar Room (Leonardo Building)
 
Short description of the course:
To warm up, we will review the basics of deep learning and PyTorch. Then the Transformer architecture and its self-attention mechanism will be introduced and coded. We will build a simple, small but complete autoregressive generative language model in the style of GPT-2. This will allow us to understand several relevant aspects of more sophisticated pre-trained LLMs, such as GPT-4, Mistral or Llama. Afterwards, we will play with open-source pre-trained LLMs and, if possible, fine-tune one of them. In the last part of the course, we will explore some emergent abilities of LLMs that are interesting also from a physical point of view, and touch upon multi-agent systems and their collective behaviour.
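As a rough illustration of the kind of component the course will code, the sketch below shows a minimal single-head causal self-attention block in PyTorch, of the sort used in GPT-2-style autoregressive models. All names, dimensions and design choices here are illustrative assumptions, not material from the course itself.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    def __init__(self, embed_dim: int, block_size: int):
        super().__init__()
        # One linear map produces queries, keys and values together.
        self.qkv = nn.Linear(embed_dim, 3 * embed_dim)
        self.proj = nn.Linear(embed_dim, embed_dim)
        # Lower-triangular mask enforces autoregressive (causal) attention.
        mask = torch.tril(torch.ones(block_size, block_size))
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape                      # batch, sequence length, embedding dim
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        att = (q @ k.transpose(-2, -1)) / (C ** 0.5)   # scaled dot-product scores
        att = att.masked_fill(self.mask[:T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        return self.proj(att @ v)              # weighted sum of values, projected back

# Example usage on random data: batch of 2 sequences, length 8, embedding size 32.
x = torch.randn(2, 8, 32)
out = CausalSelfAttention(embed_dim=32, block_size=8)(x)
print(out.shape)  # torch.Size([2, 8, 32])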
 
Those who cannot be present in person and would like to follow remotely can connect via Zoom using this link:
https://zoom.us/j/99968323117
Meeting ID: 999 6832 3117
Passcode: 101357