Learning with Few Labeled Data
[Moved Online] Hot Topics: Optimal transport and applications to machine learning and statistics May 04, 2020 - May 08, 2020
Location: SLMath: Online/Virtual
information theory
thermodynamics
rate-distortion theory
transfer learning
information bottleneck
optimal transportation
The human visual system is proof that it is possible to learn new categories from extremely few samples; humans do not need a million examples to learn to distinguish a poisonous mushroom from an edible one in the wild. This ability arguably comes from having seen millions of other categories and transferring the learnt representations to the new ones. This talk will present a formal connection between machine learning and thermodynamics that characterizes the quality of learnt representations for transfer learning. We will discuss how information-theoretic functionals such as rate, distortion, and classification loss lie on a convex surface, the so-called equilibrium surface. We prescribe dynamical processes that traverse this surface under constraints, e.g., an iso-classification process that modulates rate and distortion so as to keep the classification loss unchanged. We demonstrate how such processes allow complete control over the transfer from a source dataset to a target dataset and can guarantee the performance of the final model. This talk will discuss results from https://arxiv.org/abs/1909.02729 and https://arxiv.org/abs/2002.12406.
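As background for the rate-distortion trade-off mentioned in the abstract, the sketch below (not from the talk; an illustrative classical example) plots points on the rate-distortion function of a Gaussian source, R(D) = ½ log₂(σ²/D) for D < σ². The talk's equilibrium surface generalizes this convex curve with a third axis, the classification loss. The function name and toy values are my own choices.

```python
import numpy as np

def gaussian_rate(distortion, sigma2=1.0):
    """Bits per sample needed to describe a N(0, sigma2) source at
    mean-squared error `distortion` (0 once distortion >= sigma2)."""
    return max(0.0, 0.5 * np.log2(sigma2 / distortion))

# The curve is decreasing and convex: halving the distortion costs
# one extra bit for a Gaussian source.
for d in (0.5, 0.25, 0.125):
    print(d, gaussian_rate(d))  # rates 0.5, 1.0, 1.5 bits
```

The iso-classification processes in the talk move along such a trade-off surface while holding one coordinate (the classification loss) fixed.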
H.264 Video: 928_28404_8332_Learning_with_Few_Labeled_Data.mp4
Please report video problems to itsupport@slmath.org.