We propose variational multi-task learning (VMTL), a general probabilistic inference framework in which we cast multi-task learning as a variational Bayesian inference problem.
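As a rough sketch of what casting multi-task learning as variational Bayesian inference can look like in code (an assumed mean-field Gaussian posterior with a standard-normal prior, not necessarily VMTL's exact formulation; `TaskHead` and `elbo_loss` are illustrative names), each task keeps a distribution over its classifier weights and is trained with an ELBO-style objective:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TaskHead(nn.Module):
    """Mean-field Gaussian posterior over one task's classifier weights (illustrative)."""
    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(num_classes, feat_dim))
        self.log_sigma = nn.Parameter(torch.full((num_classes, feat_dim), -3.0))

    def forward(self, feats):
        # Reparameterized sample of the task-specific weights.
        w = self.mu + self.log_sigma.exp() * torch.randn_like(self.mu)
        return feats @ w.t()

    def kl_to_standard_normal(self):
        # KL( N(mu, sigma^2) || N(0, 1) ), summed over weight entries.
        var = (2 * self.log_sigma).exp()
        return 0.5 * (var + self.mu ** 2 - 1.0 - 2 * self.log_sigma).sum()

def elbo_loss(head, feats, labels, beta=1e-3):
    # Negative ELBO: task likelihood term plus KL toward the shared prior.
    logits = head(feats)
    return F.cross_entropy(logits, labels) + beta * head.kl_to_standard_normal()
```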
In this paper, we address the challenges of domain shift and uncertainty with a probabilistic framework based on variational Bayesian inference, incorporating uncertainty into the neural network weights.
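A minimal sketch of what incorporating uncertainty into network weights can look like, assuming a reparameterized Gaussian weight posterior and Monte Carlo prediction (the layer and function names here are assumptions, not the paper's implementation):

```python
import torch
import torch.nn as nn

class BayesLinear(nn.Module):
    """Linear layer with a Gaussian distribution over its weights (illustrative sketch)."""
    def __init__(self, d_in, d_out):
        super().__init__()
        self.w_mu = nn.Parameter(torch.randn(d_out, d_in) * 0.01)
        self.w_log_sigma = nn.Parameter(torch.full((d_out, d_in), -4.0))
        self.bias = nn.Parameter(torch.zeros(d_out))

    def forward(self, x):
        # Sample weights via the reparameterization trick on every forward pass.
        w = self.w_mu + self.w_log_sigma.exp() * torch.randn_like(self.w_mu)
        return x @ w.t() + self.bias

@torch.no_grad()
def predict_with_uncertainty(layer, x, num_samples=20):
    # Monte Carlo averaging over weight samples gives a predictive mean and a
    # per-class variance that can flag inputs affected by domain shift.
    probs = torch.stack([layer(x).softmax(-1) for _ in range(num_samples)])
    return probs.mean(0), probs.var(0)
```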
We propose MetaNorm, a simple yet effective meta-learning normalization approach that learns adaptive statistics for few-shot classification and domain generalization.
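As a hedged illustration of learning adaptive normalization statistics, the sketch below assumes a small inference network that predicts per-channel mean and variance from support-set features instead of using batch statistics (`AdaptiveNorm` and `infer_stats` are placeholder names, not MetaNorm's actual components):

```python
import torch
import torch.nn as nn

class AdaptiveNorm(nn.Module):
    """Normalization whose statistics are inferred from a support set
    rather than computed from the current batch (illustrative sketch)."""
    def __init__(self, num_channels, eps=1e-5):
        super().__init__()
        self.infer_stats = nn.Linear(num_channels, 2 * num_channels)  # assumed inference net
        self.gamma = nn.Parameter(torch.ones(num_channels))
        self.beta = nn.Parameter(torch.zeros(num_channels))
        self.eps = eps

    def forward(self, x, support_feats):
        # x: (N, C) query features; support_feats: (n_support, C) support features.
        stats = self.infer_stats(support_feats.mean(0))
        mu, log_var = stats.chunk(2, dim=-1)
        x_hat = (x - mu) / torch.sqrt(log_var.exp() + self.eps)
        return self.gamma * x_hat + self.beta
```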
This paper strives for repetitive activity counting in videos. Unlike existing works, which analyze only the visual content of the video, we are the first to incorporate the corresponding sound into the repetition counting process.
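Purely as an illustration of combining the two modalities, a toy late-fusion counter could pool visual and audio features and regress a non-negative count (the dimensions and architecture below are assumptions, not the paper's network):

```python
import torch
import torch.nn as nn

class AudioVisualCounter(nn.Module):
    """Toy fusion model: concatenate pooled visual and audio features
    and regress a repetition count (illustrative only)."""
    def __init__(self, vis_dim=512, aud_dim=128, hidden=256):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(vis_dim + aud_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
            nn.Softplus(),  # counts are non-negative
        )

    def forward(self, vis_feats, aud_feats):
        # vis_feats: (B, T, vis_dim) frame features; aud_feats: (B, T, aud_dim) audio features.
        fused = torch.cat([vis_feats.mean(1), aud_feats.mean(1)], dim=-1)
        return self.head(fused).squeeze(-1)
```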
We propose variational knowledge distillation (VKD), a new probabilistic inference framework for disease classification based on X-rays that leverages knowledge from electronic health records (EHRs).
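A minimal sketch of one way such distillation could be set up, assuming a latent Gaussian conditioned on the X-ray is aligned via a KL term with one conditioned on the EHR, so that at test time the image branch alone carries the distilled knowledge (all module and function names here are hypothetical, not VKD's exact objective):

```python
import torch
import torch.nn as nn
import torch.distributions as D

class LatentEncoder(nn.Module):
    """Maps input features to a diagonal Gaussian over a latent code (illustrative)."""
    def __init__(self, d_in, d_latent):
        super().__init__()
        self.net = nn.Linear(d_in, 2 * d_latent)

    def forward(self, feats):
        mu, log_sigma = self.net(feats).chunk(2, dim=-1)
        return D.Normal(mu, log_sigma.exp())

def distillation_loss(image_feats, ehr_feats, labels, img_enc, ehr_enc, classifier, beta=0.1):
    # Classify from the image-conditioned latent, while pulling it toward the
    # EHR-conditioned latent so EHR knowledge is distilled into the image branch.
    q_img = img_enc(image_feats)
    q_ehr = ehr_enc(ehr_feats)
    z = q_img.rsample()
    cls_loss = nn.functional.cross_entropy(classifier(z), labels)
    kl = D.kl_divergence(q_img, q_ehr).sum(-1).mean()
    return cls_loss + beta * kl
```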