CMSA Computer Science for Mathematicians: Computability Theory for Designing Machine Learning Algorithms


May 11, 2021 11:30 am - 12:30 pm
via Zoom Video Conferencing

Karen Seidel - Hasso Plattner Institute

This talk is about learning from informant, a formal model for binary classification. Illustrating examples are linear separators and other uniformly decidable classes of formal languages. Due to Gold's learning-by-enumeration technique, the learning process can be assumed consistent when full information is available.
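As a hypothetical illustration (not code from the talk), Gold's learning by enumeration can be sketched as follows: a uniformly decidable class is modeled as a list of membership predicates, and after each labeled example the learner outputs the index of the first hypothesis consistent with all data seen so far, so the learner is consistent by construction. The threshold-language example below is an assumption made for illustration.

```python
# Hypothetical sketch of Gold's learning by enumeration (not from the talk).
# A uniformly decidable class is modeled as a list of membership predicates.

def enumeration_learner(hypotheses, informant):
    """After each labeled example from the informant, yield the index of
    the first hypothesis consistent with all data seen so far.  The
    learner is consistent by construction, and if the target occurs in
    the enumeration, the guesses converge to a correct index."""
    seen = []
    for x, label in informant:
        seen.append((x, label))
        for i, h in enumerate(hypotheses):
            if all(h(y) == lab for y, lab in seen):
                yield i
                break

# Illustrative class: threshold languages L_k = {n : n >= k} over the naturals.
hypotheses = [lambda n, k=k: n >= k for k in range(10)]
# An informant for the target L_3 presents labeled examples (n, n in L_3).
informant = [(0, False), (5, True), (2, False), (3, True)]
guesses = list(enumeration_learner(hypotheses, informant))
# The sequence of guesses stabilizes on index 3, the correct hypothesis.
```

On this informant the guesses are 1, 1, 3, 3: the learner revises its hypothesis only when the current one becomes inconsistent, which is exactly why full information (positive and negative labels) makes consistency attainable.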
The original model can be adjusted towards the setting of deep learning, and we investigate the learnability of the set of half-spaces by the resulting incremental learners. By a fundamental proof technique due to Blum and Blum, these incremental learners have strictly less learning power than the full-information variant; the same technique can also be used to separate consistency.
Finally, we present recent results towards a better understanding of (strongly) non-U-shaped learning from binary labeled input data. To separate the syntactic variant, we employ an infinite recursion theorem due to Case.