CMSA Computer Science for Mathematicians: Computability Theory for Designing Machine Learning Algorithms

May 11, 2021, 11:30 AM-12:30 PM

This talk is about learning from informant, a formal model of binary classification. Illustrative examples are linear separators and other uniformly decidable sets of formal languages. Thanks to Gold's learning-by-enumeration technique, the learning process can be assumed to be consistent when full information is available.
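
As a rough illustration of Gold's learning-by-enumeration technique, the sketch below searches a fixed enumeration of hypotheses and outputs the first one consistent with all labeled examples received so far. The toy threshold-language hypothesis space, the data encoding, and the function names are assumptions made for this example, not notation from the talk.

```python
# Minimal sketch of learning by enumeration from an informant (Gold-style).
# The hypothesis space, data encoding, and names below are illustrative
# assumptions, not the speaker's formalism.
from typing import Callable, List, Tuple

Example = Tuple[int, bool]          # (datum, label): True iff datum is in the target language
Hypothesis = Callable[[int], bool]  # a decision procedure for one language


def consistent(h: Hypothesis, data: List[Example]) -> bool:
    """A hypothesis is consistent if it agrees with every example seen so far."""
    return all(h(x) == label for x, label in data)


def learn_by_enumeration(hypotheses: List[Hypothesis], data: List[Example]) -> int:
    """Output the index of the first enumerated hypothesis consistent with the data.
    On a uniformly decidable family this learner is consistent and stabilizes on
    a correct index once the informant has presented enough examples."""
    for i, h in enumerate(hypotheses):
        if consistent(h, data):
            return i
    raise ValueError("no consistent hypothesis in the enumeration")


# Toy uniformly decidable family: threshold languages L_k = {x : x >= k}.
hypotheses = [lambda x, k=k: x >= k for k in range(10)]
informant_data = [(5, True), (2, False), (3, True), (0, False)]  # informant for L_3
print(learn_by_enumeration(hypotheses, informant_data))          # -> 3
```

The point of the sketch is only that, with full information about all past examples, outputting the first consistent hypothesis in a fixed enumeration is always possible for uniformly decidable families.
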
The original model can be adjusted towards the setting of deep learning. We investigate the learnability of the set of half-spaces by the resulting incremental learners. Moreover, a fundamental proof technique due to Blum and Blum shows that these incremental learners have less learning power than the full-information variant. The same technique can also be used to separate consistency.
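
For intuition about the incremental setting, here is a minimal sketch of a learner for half-spaces that retains only its current weight vector and revises it from the single newest labeled example. The perceptron-style update rule, the data distribution, and all names are assumptions made for illustration, not the specific learner analyzed in the talk.

```python
# Minimal sketch of an incremental learner for half-spaces: the learner sees
# only its current hypothesis and the newest example, never past data.
# The perceptron-style update is an illustrative assumption.
import numpy as np


def incremental_update(w: np.ndarray, x: np.ndarray, label: int) -> np.ndarray:
    """Revise the current half-space hypothesis w from one example (x, label),
    with label in {+1, -1}; only (w, x, label) are available to the learner."""
    if np.sign(w @ x) != label:      # current hypothesis misclassifies x
        w = w + label * x            # move the separator towards the example
    return w


# Stream of examples from the target half-space {x : x[0] + x[1] >= 0}.
rng = np.random.default_rng(0)
w = np.zeros(2)
for _ in range(200):
    x = rng.uniform(-1.0, 1.0, size=2)
    label = 1 if x[0] + x[1] >= 0 else -1
    w = incremental_update(w, x, label)
print(w)  # after a few hundred examples, w roughly aligns with the direction (1, 1)
```
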
Finally, we present recent results towards a better understanding of (strong) non-U-shaped learning from binary-labeled input data. To separate the syntactic variant, we employ an infinite recursion theorem by Case.

Zoom: https://harvard.zoom.us/j/98231541450