NeurIPS Fest 2024

Annual NeurIPS-preview party spotlighting Amsterdam’s finest and latest research in machine learning

November 28, 2024

16:00-19:30

Lab42, Amsterdam Science Park

The 2024 NeurIPS conference and workshops will be held in Vancouver, Canada, from 10 to 15 December. Ahead of the main event, the ELLIS unit Amsterdam will organise the NeurIPS Fest. This year, we invite paper presenters from universities across the Netherlands, and we are therefore organising the event in close collaboration with the other two ELLIS units in the country: ELLIS unit Delft and ELLIS unit Nijmegen. As in the past two years, we will celebrate the achievements of the Machine Learning ecosystem with a keynote, a poster session, and networking. Grab this fantastic opportunity to connect with like-minded individuals and explore the latest research in Machine Learning in the Netherlands! We are looking forward to seeing you at the event!

Event programme

16:00-17:00

Keynote Presentation

Christian A. Naesseth (University of Amsterdam)
Venue: L3.33 & L3.35

17:00-19:30

Poster Session

with bites & drinks
Venue: ground floor

Christian A. Naesseth

Assistant Professor of Machine Learning at the University of Amsterdam

About the keynote speaker

Christian A. Naesseth is an Assistant Professor of Machine Learning at the University of Amsterdam, a member of the Amsterdam Machine Learning Lab, the lab manager of the UvA-Bosch Delta Lab 2, and an ELLIS member.

His research interests span statistical inference, uncertainty quantification, reasoning, and machine learning, as well as their application to the sciences. He is currently working on generative modelling (diffusions, flows, AI4Science), approximate inference (variational and Monte Carlo methods), and uncertainty quantification and hypothesis testing (E-values, conformal prediction). Previously, he was a postdoctoral research scientist with David Blei at the Data Science Institute, Columbia University. He completed his PhD in Electrical Engineering at Linköping University, advised by Fredrik Lindsten and Thomas Schön.

Diffusions, flows, and other stories
Generative models have taken the world by storm. Using generative modeling, a.k.a. generative AI, we can construct probabilistic approximations to any data-generating process. In the context of text, large language models place distributions over the next token, for images it is often a distribution over pixel color values, whereas for molecules it can be a combination of atom types, positions, and various chemical features. This talk will explore some of the dominant paradigms, applications, and recent developments in generative modeling.

At the 2024 NeurIPS Conference, his lab and collaborators will present 5 accepted papers.

Posters showcased at the event

Neural Flow Diffusion Models

Space-Time Continuous PDE Forecasting using Equivariant Neural Fields

Scalable Kernel Inverse Optimization

Rethinking Knowledge Transfer in Learning Using Privileged Information

3-in-1: 2D Rotary Adaptation for Efficient Finetuning, Efficient Batching and Composability

IPO: Interpretable Prompt Optimization for Vision-Language Models

AGALE: A Graph-Aware Continual Learning Evaluation Framework

Input-to-State Stable Coupled Oscillator Networks for Closed-form Model-based Control in Latent Space

Reproducibility Study of "Robust Fair Clustering: A Novel Fairness Attack and Defense Framework"

Equivariant Neural Diffusion for Molecule Generation

PART: Self-supervised Pre-Training with Continuous Relative Transformations

VISA: Variational Inference with Sequential Sample-Average Approximations

[Re] On the Reproducibility of Post-Hoc Concept Bottleneck Models

SIGMA: Sinkhorn-Guided Masked Video Modeling

On the Reproducibility of: "Learning Perturbations to Explain Time Series Predictions"

“Studying How to Efficiently and Effectively Guide Models with Explanations” - A Reproducibility Study

Variational Flow Matching for Graph Generation

When Your AIs Deceive You: Challenges of Partial Observability in Reinforcement Learning from Human Feedback

FewViewGS: Gaussian Splatting with Few View Matching and Multi-stage Training

TVBench: Redesigning Video-Language Evaluation

GO4Align: Group Optimization for Multi-Task Alignment

No Train, all Gain: Self-Supervised Gradients Improve Deep Frozen Representations

Optimizing importance weighting in the presence of sub-population shifts

Reproducibility Study of Learning Fair Graph Representations Via Automated Data Augmentations

The NeurIPS Fest 2024 is ELLIS unit Amsterdam’s annual NeurIPS-preview party spotlighting Amsterdam’s finest and latest research in machine learning.