Invited Talk in Workshop: 3rd Workshop on High-dimensional Learning Dynamics (HiLD)

Nathan Srebro (TTIC & University of Chicago), Is A Good Input Distribution All You Need?

Nati Srebro

Fri 18 Jul 2 p.m. PDT — 2:45 p.m. PDT

Abstract:

What functions representable by neural nets are tractably learnable? Complexity results tell us that not all of them are, and we have been on a quest to understand which subclass of functions is learnable. In this talk I will revisit and question this view, putting the emphasis on the input distribution rather than the target function, and arguing that perhaps all functions are easy to learn; it is just a matter of the input distribution. This also leads to understanding much of the current success of deep learning in terms of “Positive Distribution Shift”.
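As a hedged toy sketch of the distinction the abstract draws (my own illustration, not an experiment from the talk): a sparse parity is a standard example of a target function whose difficulty for gradient-based learning depends heavily on the input distribution. The choice of parity, the biased product distribution, the network width, and the training budget below are all assumptions made for illustration.

```python
# Hedged toy illustration (assumed setup, not from the talk): the same target
# function -- a 5-sparse parity on 50 +/-1 bits -- trained with the same
# network, optimizer, and budget, but under two input distributions.  Under
# the uniform distribution, gradient-based learning of sparse parities is
# known to be hard; under a biased product distribution the relevant
# coordinates acquire correlation with the label, which tends to help.
import torch

torch.manual_seed(0)
n_bits, relevant = 50, [0, 1, 2, 3, 4]   # target: parity of the first 5 bits

def sample(n_samples, p_plus):
    # Draw +/-1 inputs with P[x_i = +1] = p_plus, and parity labels in {+1, -1}.
    x = (torch.rand(n_samples, n_bits) < p_plus).float() * 2 - 1
    y = x[:, relevant].prod(dim=1)
    return x, y

def train_and_test(p_plus, steps=3000, lr=1e-2):
    model = torch.nn.Sequential(
        torch.nn.Linear(n_bits, 256), torch.nn.ReLU(),
        torch.nn.Linear(256, 1),
    )
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        x, y = sample(256, p_plus)
        loss = torch.nn.functional.soft_margin_loss(model(x).squeeze(-1), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    x, y = sample(10_000, p_plus)
    with torch.no_grad():
        acc = (model(x).squeeze(-1).sign() == y).float().mean().item()
    return acc

# Same function, same architecture, same training budget; only the input
# distribution changes.
print("uniform inputs (p=0.5):  test accuracy =", train_and_test(0.5))
print("biased inputs  (p=0.75): test accuracy =", train_and_test(0.75))
```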
