Seminar: Block Seminar: Neural ODE and Generative Modelling for Master's Students - Details


General Information

Course name: Seminar: Block Seminar: Neural ODE and Generative Modelling for Master's Students
Course number: MTH-1360; MTH-1410
Semester: SS 2025
Current number of participants: 1
Home institution: Mathematische Bildverarbeitung (Mathematical Image Processing)
Course type: Seminar in the category Teaching
Course takes place in person / has in-person components: Yes
Main language of instruction: English
Literature references
- An introduction to deep generative modeling, Ruthotto, Lars and Haber, Eldad, GAMM-Mitteilungen, Wiley Online Library (2021)
- A conceptual introduction to Markov chain Monte Carlo methods, Speagle, Joshua S., arXiv preprint (2019)
- Neural ordinary differential equations, Chen, Ricky TQ and Rubanova, Yulia and Bettencourt, Jesse and Duvenaud, David K., NeurIPS (2018)
- FFJORD: Free-form continuous dynamics for scalable reversible generative models, Grathwohl, Will and Chen, Ricky TQ and Bettencourt, Jesse and Sutskever, Ilya and Duvenaud, David, arXiv preprint (2018)
- Diffusion models: A comprehensive survey of methods and applications, Yang, Ling and Zhang, Zhilong and Song, Yang and Hong, Shenda and Xu, Runsheng and Zhao, Yue and Shao, Yingxia and Zhang, Wentao and Cui, Bin and Yang, Ming-Hsuan, arXiv preprint (2022)
- Applied stochastic differential equations, Särkkä, Simo and Solin, Arno, Cambridge University Press (2019)
- Generative modeling by estimating gradients of the data distribution, Song, Yang and Ermon, Stefano, NeurIPS (2019)
- Improved techniques for training score-based generative models, Song, Yang and Ermon, Stefano, NeurIPS (2020)
- Score-based generative modeling through stochastic differential equations, Song, Yang and Sohl-Dickstein, Jascha and Kingma, Diederik P and Kumar, Abhishek and Ermon, Stefano and Poole, Ben, ICLR (2021)
- Gotta go fast when generating data with score-based models, Jolicoeur-Martineau, Alexia and Li, Ke and Piché-Taillefer, Rémi and Kachman, Tal and Mitliagkas, Ioannis, arXiv preprint (2021)
- Flow matching for generative modeling, Lipman, Yaron and Chen, Ricky TQ and Ben-Hamu, Heli and Nickel, Maximilian and Le, Matt, arXiv preprint arXiv:2210.02747 (2022)

Rooms and Times

No room specified

Fields of Study

Module Assignments

Comment/Description

Score-based generative models have achieved state-of-the-art performance in numerous applications in recent years. The central idea is to gradually inject noise into the training data and then learn the reverse process in order to generate new samples. Training and sampling are decoupled: the score of the noise-perturbed data distribution is learned with noise-conditional score networks, while samples can be generated by various methods, including Langevin Monte Carlo approaches, stochastic differential equations, ordinary differential equations, and combinations thereof.
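
To make this pipeline concrete, here is a minimal sketch in PyTorch: a small noise-conditional score network trained with denoising score matching on 2-D toy data, followed by annealed Langevin sampling. Everything here (the names ScoreNet, dsm_loss, and langevin_sample, the network size, the noise ladder, and the step sizes) is an illustrative assumption, not material from the seminar or the references.

```python
import math
import torch
import torch.nn as nn

# Noise-conditional score network: maps (x, sigma) to an estimate of the
# score grad_x log p_sigma(x) of the sigma-perturbed data distribution.
class ScoreNet(nn.Module):
    def __init__(self, dim=2, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x, sigma):
        return self.net(torch.cat([x, sigma], dim=-1))

def dsm_loss(model, x, sigmas):
    """Denoising score matching over a ladder of noise levels."""
    idx = torch.randint(len(sigmas), (x.shape[0],))
    sigma = sigmas[idx].unsqueeze(-1)
    noise = torch.randn_like(x)
    x_noisy = x + sigma * noise
    # For Gaussian perturbation the DSM target is -noise / sigma;
    # weighting by sigma balances the contributions of the noise levels.
    return ((sigma * model(x_noisy, sigma) + noise) ** 2).sum(-1).mean()

@torch.no_grad()
def langevin_sample(model, sigmas, n=1000, steps=100, eps=2e-5):
    """Annealed Langevin dynamics, from the largest sigma to the smallest."""
    x = torch.randn(n, 2) * sigmas[0]
    for sigma in sigmas:
        step = eps * (sigma / sigmas[-1]) ** 2   # per-level step size
        for _ in range(steps):
            # Langevin update: x <- x + tau * score + sqrt(2 * tau) * z.
            x = (x + step * model(x, sigma.expand(n, 1))
                 + torch.sqrt(2 * step) * torch.randn_like(x))
    return x

# Toy training loop on a two-mode Gaussian mixture in 2-D.
sigmas = torch.exp(torch.linspace(math.log(1.0), math.log(0.01), 10))
model = ScoreNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2000):
    centers = (torch.randint(0, 2, (256, 1)).float() * 4.0) - 2.0
    batch = centers + 0.3 * torch.randn(256, 2)
    opt.zero_grad()
    dsm_loss(model, batch, sigmas).backward()
    opt.step()
samples = langevin_sample(model, sigmas)
```

The sigma-squared loss weighting and the sigma-proportional Langevin step sizes echo the choices popularized by Song and Ermon (2019); for image data the MLP would be replaced by a U-Net-style score network.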

This seminar will give an overview of generative models and common architectures, comparing different training objectives and sampling methods. We will examine normalizing flows, score matching, and Langevin dynamics. We will also discuss the use of stochastic differential equations (SDEs) in score-based models and conclude with comparisons to other diffusion models and potential improvements in sample generation.
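
As a complement to the training sketch above, the following sketch integrates the reverse-time variance-exploding (VE) SDE of Song et al. (2021) with the Euler-Maruyama method; the commented-out line is the corresponding probability-flow ODE step, which is where neural ODE solvers enter the picture. The function name reverse_sde_sample, the geometric sigma schedule, and the step counts are again illustrative assumptions, and a closed-form score for Gaussian data is used as a stand-in so the snippet runs on its own.

```python
import math
import torch

@torch.no_grad()
def reverse_sde_sample(score, n=1000, dim=2, steps=500,
                       sigma_min=0.01, sigma_max=2.0):
    """Euler-Maruyama discretization of the reverse-time VE SDE.

    Forward VE SDE: dx = sqrt(d[sigma^2(t)]/dt) dw with the geometric
    schedule sigma(t) = sigma_min * (sigma_max / sigma_min)^t, t in [0, 1].
    """
    ts = torch.linspace(1.0, 1e-3, steps)
    dt = ts[0] - ts[1]                       # positive step size
    x = torch.randn(n, dim) * sigma_max      # approximate prior at t = 1
    log_ratio = math.log(sigma_max / sigma_min)
    for t in ts:
        sigma = sigma_min * (sigma_max / sigma_min) ** t
        g2 = 2.0 * sigma ** 2 * log_ratio    # g(t)^2 = d sigma^2 / dt
        s = score(x, sigma.expand(n, 1))
        # One reverse-time step: x <- x + g^2 * score * dt + g * sqrt(dt) * z.
        x = x + g2 * s * dt + torch.sqrt(g2 * dt) * torch.randn_like(x)
        # Probability-flow ODE alternative (deterministic, same marginals):
        # x = x + 0.5 * g2 * s * dt
    return x

# Stand-in score so the snippet is self-contained: for data ~ N(0, I),
# the sigma-perturbed density is N(0, (1 + sigma^2) I), whose exact score
# is -x / (1 + sigma^2).
def gaussian_score(x, sigma):
    return -x / (1.0 + sigma ** 2)

samples = reverse_sde_sample(gaussian_score)  # approximately N(0, I) samples
```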