# CMSA New Technologies in Mathematics Seminar: Large language models, mathematical discovery, and search in the space of strategies: an anecdote


##### Speaker:

Jordan Ellenberg *- Univ. of Wisconsin Dept. of Mathematics*

Please note special time

I spent a portion of 2023 working with a team at DeepMind on the “cap set problem” – how large can a subset of (Z/3Z)^n be that contains no three elements summing to zero? (For those not familiar with this problem, I will explain something about the role it plays in combinatorics, its history, and why number theorists care about it a lot.) By now, there are many examples of machine learning mechanisms being used to help generate interesting mathematical knowledge, and especially interesting examples. This project used a novel protocol: instead of searching directly for large cap sets, we used LLMs trained on code to search the space of short programs for those which, when executed, output large cap sets. One advantage is that a program is much more human-readable than a large collection of vectors over Z/3Z, bringing us closer to the not-very-well-defined-but-important goal of “interpretable machine learning.” I’ll talk about what succeeded in this project (more than I expected!), what didn’t, and what role I can imagine this approach to the math-ML interface playing in near-future mathematical practice.
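To make the setup concrete, here is a minimal sketch (not the code from the project) of the two ingredients the abstract describes: a checker for the cap set property in (Z/3Z)^n, and a greedy builder whose `priority` function is the kind of short program a FunSearch-style search would evolve. The constant priority function used in the example is an illustrative placeholder.

```python
from itertools import combinations, product

def is_cap_set(vectors):
    """True if no three distinct vectors in the list sum to zero mod 3
    coordinate-wise (the defining property of a cap set in (Z/3Z)^n)."""
    for a, b, c in combinations(vectors, 3):
        if all((x + y + z) % 3 == 0 for x, y, z in zip(a, b, c)):
            return False
    return True

def greedy_cap_set(n, priority):
    """Greedily build a cap set in (Z/3Z)^n, considering candidate vectors
    in decreasing order of `priority`.  Searching over the source code of
    `priority` (rather than over sets of vectors) is the protocol the
    abstract describes, sketched here under simplifying assumptions."""
    cap = []
    for v in sorted(product(range(3), repeat=n), key=priority, reverse=True):
        # Keep v only if it completes no zero-sum triple with the current set.
        if all(any((x + y + z) % 3 != 0 for x, y, z in zip(a, b, v))
               for a, b in combinations(cap, 2)):
            cap.append(v)
    return cap

# With a trivial (constant) priority, greedy already finds a maximum cap
# set for n = 2, which has size 4.
cap = greedy_cap_set(2, lambda v: 0)
```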

The paper:

https://www.nature.com/articles/s41586-023-06924-6

https://harvard.zoom.us/j/95706757940?pwd=dHhMeXBtd1BhN0RuTWNQR0xEVzJkdz09

Password: cmsa