Abstract:

Important dividing lines in model theory (NIP and stability) coincide with notions of statistical learnability (PAC and online learning, respectively). In the discrete setting, this correspondence goes through combinatorial dimensions of families of sets, such as the VC and Littlestone dimensions, and these dimensions can be generalized to families of real-valued functions. We will explore how these generalized dimensions connect to dividing lines in continuous logic, and how randomizations in continuous logic can be used to deduce bounds for learning algorithms.
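As a minimal illustration of the combinatorial dimensions mentioned above (not part of the talk itself): the VC dimension of a family of sets, viewed as {0,1}-valued classifiers, is the size of the largest point set on which the family realizes every possible labeling. The brute-force sketch below, for threshold classifiers on a small finite domain, is illustrative only; all function and variable names are chosen here for the example.

```python
from itertools import combinations

def shatters(hypotheses, points):
    """Check whether the hypothesis class realizes every labeling of `points`."""
    achieved = {tuple(h(x) for x in points) for h in hypotheses}
    return len(achieved) == 2 ** len(points)

def vc_dimension(hypotheses, domain):
    """Largest size of a subset of `domain` shattered by `hypotheses`."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(hypotheses, s) for s in combinations(domain, k)):
            d = k
        else:
            break
    return d

# Threshold classifiers x >= t: they shatter every singleton but no pair,
# since the labeling (1, 0) on a < b is never realized.  VC dimension = 1.
domain = range(10)
thresholds = [lambda x, t=t: int(x >= t) for t in range(11)]
print(vc_dimension(thresholds, domain))  # 1
```

For real-valued function classes, the analogous quantity is a scale-sensitive "fat-shattering" dimension, which asks for labelings to be realized with a fixed margin rather than exactly.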

Joint work with Michael Benedikt.