Calendar

    September 23, 2020

    Multiplicative functions in short intervals revisited

    10:00 AM-11:00 AM
    September 23, 2020

    A few years ago, Maksym Radziwill and I showed that the average of a multiplicative function over almost all very short intervals $[x, x+h]$ is close to its average over the long interval $[x, 2x]$. This result has since found many applications.

    I will talk about recent work in which Radziwill and I revisit the problem, generalise our result to functions which vanish often, and prove a power-saving upper bound for the number of exceptional intervals (i.e. we show that there are only $O(X/h^\kappa)$ exceptional $x \in [X, 2X]$, for some $\kappa > 0$).

    We apply this result for instance to studying gaps between norm forms of an arbitrary number field.
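
    As a rough orientation (my schematic paraphrase, with normalizations simplified; the talk's precise hypotheses may differ): for a multiplicative function $f : \mathbb{N} \to [-1, 1]$ and $h = h(X) \to \infty$ arbitrarily slowly,

    $$\frac{1}{h} \sum_{x < n \le x+h} f(n) \;=\; \frac{1}{x} \sum_{x < n \le 2x} f(n) + o(1)$$

    for all $x \in [X, 2X]$ outside an exceptional set of size $o(X)$; the new result replaces $o(X)$ by the power-saving bound $O(X/h^{\kappa})$.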

    Zoom: https://harvard.zoom.us/j/96767001802

    Password: The order of the permutation group on 9 elements.

    CMSA Strongly Correlated Quantum Materials and High-Temperature Superconductors Series: Metal-to-metal quantum phase transitions not described by symmetry-breaking orders II

    10:30 AM-12:00 PM
    September 23, 2020

    In this second talk, I will focus on (nearly) solvable models of the metal-metal transition in random systems. The t-J model with random, all-to-all hopping and exchange can be mapped onto a quantum impurity model coupled self-consistently to an environment (the mapping also applies to a t-J model on a lattice in large dimensions, with random nearest-neighbor exchange). I will argue that such models exhibit metal-metal quantum phase transitions in the universality class of the SYK model, accompanied by a linear-in-$T$ resistivity arising from time reparameterization fluctuations. I will also present results of exact diagonalization of random t-J clusters, obtained recently with Henry Shackleton, Alexander Wietek, and Antoine Georges.
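
    For orientation, a schematic form of such a random t-J model (my sketch, under one common normalization convention; the talk may set things up differently) is

    $$H = \frac{1}{\sqrt{N}} \sum_{i \neq j,\, \sigma} t_{ij}\, c^{\dagger}_{i\sigma} c_{j\sigma} + \frac{1}{\sqrt{N}} \sum_{i < j} J_{ij}\, \vec{S}_i \cdot \vec{S}_j,$$

    restricted to states with no doubly occupied sites, where the $N$ sites are coupled all-to-all and the $t_{ij}$, $J_{ij}$ are independent zero-mean Gaussian random couplings.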

    Zoom: https://harvard.zoom.us/j/977347126

    CMSA New Technologies in Mathematics: Self-induced regularization from linear regression to neural networks

    3:00 PM-4:00 PM
    September 23, 2020

    Modern machine learning methods, most notably multi-layer neural networks, require fitting highly non-linear models with tens of thousands to millions of parameters. Despite this, little attention is paid to explicit regularization mechanisms that control the model's complexity. Indeed, the resulting models are often so complex that they achieve vanishing training error: they interpolate the data. Despite this, these models generalize well to unseen data: they have small test error. I will discuss several examples of this phenomenon, beginning with a simple linear regression model and ending with two-layer neural networks in the so-called lazy regime. For these examples, precise asymptotics can be determined mathematically, using tools from random matrix theory. I will try to extract a unifying picture.
    A common feature is that a complex, unregularized non-linear model becomes essentially equivalent to a simpler model which is, however, regularized in a non-trivial way.
    [Based on joint papers with Behrooz Ghorbani, Song Mei, Theodor Misiakiewicz, Feng Ruan, Youngtak Sohn, Jun Yan, and Yiqiao Zhong]
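
    The simplest instance of this kind of equivalence (my illustrative sketch, not necessarily the formulation used in the talk) already appears in linear regression: in the overparameterized case, gradient descent started at zero converges to the minimum-norm interpolator,

    $$\hat{\beta} \;=\; \arg\min \{\, \|\beta\|_2 : X\beta = y \,\} \;=\; \lim_{\lambda \to 0^+} \arg\min_{\beta} \left( \|y - X\beta\|_2^2 + \lambda \|\beta\|_2^2 \right),$$

    which is also the $\lambda \to 0^+$ limit of ridge regression; an apparently unregularized interpolating fit thus carries a ridge-like implicit regularization.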

    Zoom: https://harvard.zoom.us/j/91458092166

    Mathematics from doodling

    4:30 PM-5:30 PM
    September 23, 2020

    We’ll start with a type of doodle most of you have done since you were little, and start wondering about it. This will lead us through a number of questions I first heard long ago, wandering through many different ideas, and hopefully ending in the vicinity of some of the groundbreaking ideas of Maryam Mirzakhani. This is what mathematics, and basic science, is really about: observing something out there “in nature”, wondering about it, explaining it, realizing that it connects to other, deeper questions, and then repeating the process.

    Zoom: https://harvard.zoom.us/j/96759150216