Calendar

August 2022

  • CONFERENCE: Advances in Mathematical Physics: A Conference in Honor of Elliott H. Lieb on his 90th Birthday.
    All day
    August 1, 2022

    Advances in Mathematical Physics

    A Conference in Honor of Elliott H. Lieb on his 90th Birthday

    Dates: July 30-August 1, 2022

    Harvard University

    July 30 – 31, 2022: Hall B, Science Center, 1 Oxford Street, Cambridge, MA, 02138
    August 1, 2022: Hall C, Science Center, 1 Oxford Street, Cambridge, MA, 02138

    Register Here

    Conference Schedule

    Download PDF for a detailed schedule of lectures and events.

    Saturday, July 30, Hall B

    8:45 a.m. – 9 a.m.       Refreshments
    9 a.m. – 9:45 a.m.       Jan Philip Solovej
    9:45 a.m. – 10:30 a.m.   László Erdös
    10:30 a.m. – 11 a.m.     Tea Break
    11 a.m. – 11:45 a.m.     Robert Seiringer
    11:45 a.m. – 12:30 p.m.  Rupert Frank
    12:30 p.m. – 2 p.m.      Lunch
    2 p.m. – 2:45 p.m.       Simone Warzel
    2:45 p.m. – 3:30 p.m.    Benjamin Schlein
    3:30 p.m. – 4 p.m.       Tea Break
    4 p.m. – 4:45 p.m.       Rafael Benguria

    Sunday, July 31, Hall B

    8:45 a.m. – 9 a.m.       Refreshments
    9 a.m. – 10 a.m.         Hugo Duminil-Copin
    10 a.m. – 10:30 a.m.     Tea Break
    10:30 a.m. – 11:15 a.m.  Jürg Fröhlich
    11:15 a.m. – 12 p.m.     Bertrand Halperin
    12 p.m. – 1:30 p.m.      Lunch
    1:30 p.m. – 2:15 p.m.    Jun Yin
    2:15 p.m. – 3 p.m.       Sabine Jansen
    3 p.m. – 3:30 p.m.       Tea Break
    3:30 p.m. – 4:30 p.m.    A Review of Lieb’s Work

    Monday, August 1, Hall C

    8:45 a.m. – 9 a.m.       Refreshments
    9 a.m. – 9:45 a.m.       Yoshiko Ogata
    9:45 a.m. – 10:30 a.m.   Hal Tasaki
    10:30 a.m. – 11 a.m.     Tea Break
    11 a.m. – 11:45 a.m.     Bruno Nachtergaele
    11:45 a.m. – 12:30 p.m.  Alessandro Giuliani
    12:30 p.m. – 2 p.m.      Lunch
    2 p.m. – 2:45 p.m.       Ron Peled
    2:45 p.m. – 3:30 p.m.    Mathieu Lewin
    3:30 p.m. – 4 p.m.       Tea Break
    4 p.m. – 4:45 p.m.       Eric Carlen

    Organizers:
    Michael Aizenman, Princeton University
    Joel Lebowitz, Rutgers University
    Ruedi Seiler, Technische Universität Berlin
    Herbert Spohn, Technical University of Munich
    Horng-Tzer Yau, Harvard University
    Shing-Tung Yau, Harvard University
    Jakob Yngvason, University of Vienna

    Speakers:
    Rafael Benguria, Pontificia Universidad Catolica de Chile
    Eric Carlen, Rutgers University
    Philippe Di Francesco, University of Illinois
    Hugo Duminil-Copin, IHES
    László Erdös, Institute of Science and Technology Austria
    Rupert Frank, The California Institute of Technology
    Jürg Fröhlich, ETH Zurich
    Alessandro Giuliani, Università degli Studi Roma Tre
    Bertrand Halperin, Harvard University
    Klaus Hepp, Institute for Theoretical Physics, ETH Zurich
    Sabine Jansen, Ludwig Maximilian University of Munich
    Mathieu Lewin, Université Paris-Dauphine
    Bruno Nachtergaele, The University of California, Davis
    Yoshiko Ogata, University of Tokyo
    Ron Peled, Tel Aviv University
    Benjamin Schlein, University of Zurich
    Robert Seiringer, Institute of Science and Technology Austria
    Jan Philip Solovej, University of Copenhagen
    Hal Tasaki, Gakushuin University
    Simone Warzel, Technical University of Munich
    Jun Yin, The University of California, Los Angeles

  • CMSA EVENT: CMSA Interdisciplinary Science Seminar: Exploring and Exploiting the Universality Phenomena in High-Dimensional Estimation and Learning

    Speaker: Yue M. Lu – Harvard University

    9:00 AM-10:00 AM
    August 11, 2022
    1 Oxford Street, Cambridge, MA 02138 USA

    Universality is a fascinating high-dimensional phenomenon. It points to the existence of universal laws that govern the macroscopic behavior of wide classes of large and complex systems, despite their differences in microscopic details. The notion of universality originated in statistical mechanics, especially in the study of phase transitions. Similar phenomena have been observed in probability theory, dynamical systems, random matrix theory, and number theory.
    In this talk, I will present some recent progress in rigorously understanding and exploiting universality phenomena in the context of statistical estimation and learning on high-dimensional data. Examples include spectral methods for high-dimensional projection pursuit, statistical learning based on kernel and random feature models, and approximate message passing algorithms on highly structured, strongly correlated, and even (nearly) deterministic data matrices. Together, they demonstrate the robustness and wide applicability of the universality phenomena.
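
    As a toy illustration of such universality (a hedged sketch, not the speaker’s results; the ReLU random-feature model and all parameters here are illustrative assumptions), the Python snippet below fits ridge regression on random features and compares the test error when the inputs are Gaussian versus Rademacher (±1 entries); in high dimensions the two errors typically nearly coincide.

        # Hedged illustration of Gaussian universality in random-feature regression.
        # Not code from the talk; all parameters are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(0)
        n, d, p, lam = 2000, 400, 600, 1e-2            # samples, input dim, features, ridge

        w_star = rng.standard_normal(d) / np.sqrt(d)   # planted teacher vector
        F = rng.standard_normal((d, p)) / np.sqrt(d)   # fixed random feature matrix

        def test_error(sampler):
            """Fit ridge regression on ReLU random features and return test MSE."""
            X, X_te = sampler((n, d)), sampler((n, d))
            y, y_te = X @ w_star, X_te @ w_star
            Z, Z_te = np.maximum(X @ F, 0.0), np.maximum(X_te @ F, 0.0)
            beta = np.linalg.solve(Z.T @ Z + lam * np.eye(p), Z.T @ y)
            return float(np.mean((Z_te @ beta - y_te) ** 2))

        gaussian_err = test_error(rng.standard_normal)
        rademacher_err = test_error(lambda size: rng.choice([-1.0, 1.0], size=size))
        print(f"test MSE, Gaussian inputs:   {gaussian_err:.4f}")
        print(f"test MSE, Rademacher inputs: {rademacher_err:.4f}")  # typically very close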

     

    For more information on how to join, please see: https://cmsa.fas.harvard.edu/interdisciplinary-science-seminar/


  • SEMINARS: CMSA Quantum Matter in Mathematics and Physics: Transport in large-N critical Fermi surface

    Speaker: Haoyu Guo – Harvard

    10:00 AM-11:30 AM
    August 16, 2022

    A Fermi surface coupled to a scalar field can be described in a 1/N expansion by choosing the fermion-scalar Yukawa coupling to be random in the N-dimensional flavor space, but invariant under translations. We compute the conductivity of such a theory in two spatial dimensions for a critical scalar. We find a Drude contribution, and show that a previously proposed ω^{-2/3} contribution to the optical conductivity at frequency ω has vanishing coefficient. We also describe the influence of impurity scattering of the fermions, and find that while the self-energy resembles that of a marginal Fermi liquid, the resistivity behaves like that of a Fermi liquid. arXiv references: 2203.04990, 2207.08841
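
    Schematically (a hedged paraphrase of the statement above, not a formula quoted from the papers), the optical conductivity has the form

        \sigma(\omega) \;\approx\; \sigma_{\mathrm{Drude}}(\omega) \;+\; c\,\omega^{-2/3},

    and the result described in the abstract is that the coefficient c of the proposed ω^{-2/3} term vanishes at this order.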


    For more information on how to join, please see: https://cmsa.fas.harvard.edu/quantum-matter-seminar/

  • CMSA EVENT: CMSA Interdisciplinary Science Seminar: Scalable Dynamic Graph Algorithms

    Speaker: Quanquan Liu – Northwestern University

    9:00 AM-10:00 AM
    August 18, 2022

    The field of dynamic graph algorithms seeks to understand and compute statistics on real-world networks that undergo changes with time. Some of these networks can see up to millions of edge insertions and deletions per second. To handle such highly dynamic networks, we propose scalable and accurate graph algorithms for a variety of problems. In this talk, I will discuss new algorithms for various graph problems in the batch-dynamic model in shared-memory architectures, where updates to the graph arrive in multiple batches of one or more updates. I’ll also briefly discuss my work in other dynamic models, such as distributed dynamic models where the communication topology of the network also changes with time (ITCS 2022). In these models, I will present efficient algorithms for graph problems including k-core decomposition, low out-degree orientation, matching, triangle counting, and coloring.

    Specifically, in the batch-dynamic model where we are given a batch of B updates, I’ll discuss an efficient algorithm with O(B log^2 n) amortized work and O(log^2 n log log n) depth that maintains a (2+ε)-approximation of the k-core decomposition after each batch of updates (SPAA 2022). We also obtain new batch-dynamic algorithms for matching, triangle counting, and coloring using techniques and data structures developed in our k-core decomposition algorithm. In addition to our theoretical results, we implemented and experimentally evaluated our k-core decomposition algorithm on a 30-core machine with two-way hyper-threading on 11 graphs of varying densities and sizes. Our experiments show improvements over state-of-the-art algorithms even on machines with only 4 cores (your standard laptop). I’ll conclude with a discussion of some open questions and potential future work that these lines of research inspire.
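
    For readers new to the problem, the snippet below is a minimal static baseline (the classical sequential peeling algorithm for k-core decomposition), offered only to fix terminology; it is not the parallel batch-dynamic algorithm of the talk, and the example graph is an illustrative assumption.

        # Static k-core decomposition by repeated peeling (classical baseline).
        # The talk's contribution is a parallel batch-dynamic version; this sketch
        # only illustrates what "coreness" means on a small example graph.
        from collections import defaultdict

        def core_numbers(edges):
            """Return {vertex: coreness} for an undirected graph given as an edge list."""
            adj = defaultdict(set)
            for u, v in edges:
                adj[u].add(v)
                adj[v].add(u)
            core = {}
            remaining = {v: len(nbrs) for v, nbrs in adj.items()}  # current degrees
            k = 0
            while remaining:
                k = max(k, min(remaining.values()))
                # peel every vertex whose remaining degree is at most k
                peel = [v for v, deg in remaining.items() if deg <= k]
                while peel:
                    v = peel.pop()
                    if v not in remaining:
                        continue
                    core[v] = k
                    del remaining[v]
                    for u in adj[v]:
                        if u in remaining:
                            remaining[u] -= 1
                            if remaining[u] <= k:
                                peel.append(u)
            return core

        # Tiny example: a triangle attached to a path.
        # Vertices 1, 2, 3 get coreness 2; vertices 4, 5 get coreness 1.
        print(core_numbers([(1, 2), (2, 3), (1, 3), (3, 4), (4, 5)]))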


    For more information, please see: https://cmsa.fas.harvard.edu/interdisciplinary-science-seminar/

  • CMSA EVENT: CMSA Big Data Conference 2022
    9:00 AM-2:00 PM
    August 26, 2022

    On August 26, 2022, the CMSA will host its eighth annual Conference on Big Data. The conference will feature many speakers from the Harvard community, as well as scholars from across the globe, with talks focusing on computer science, statistics, mathematics and physics, and economics.

    The 2022 Big Data Conference will take place virtually on Zoom.

    You must register to attend. Register Here

    Organizers: 

    • Scott Duke Kominers, MBA Class of 1960 Associate Professor, Harvard Business School
    • Horng-Tzer Yau, Professor of Mathematics, Harvard University
    • Sergiy Verstyuk, CMSA, Harvard University

    Schedule:

    9:00 am    Conference Organizers: Introduction and Welcome
    9:10 am – 9:55 am    Xiaohong Chen
    Title: On ANN optimal estimation and inference for policy functionals of nonparametric conditional moment restrictions

    Abstract: Many causal/policy parameters of interest are expectation functionals of unknown infinite-dimensional structural functions identified via conditional moment restrictions. Artificial Neural Networks (ANNs) can be viewed as nonlinear sieves that can approximate complex functions of high-dimensional covariates more effectively than linear sieves. In this talk we present ANN optimal estimation and inference on policy functionals, such as average elasticities or value functions, of unknown structural functions of endogenous covariates. We provide ANN efficient estimation and optimal t-based confidence intervals for regular policy functionals such as average derivatives in nonparametric instrumental variables (NPIV) regressions. We also present ANN quasi-likelihood-ratio-based inference for possibly irregular policy functionals of general nonparametric conditional moment restrictions (such as quantile instrumental variables models or Bellman equations) for time series data. We conduct intensive Monte Carlo studies to investigate computational issues with ANN-based optimal estimation and inference in economic structural models with endogeneity. For economic data sets that do not have very high signal-to-noise ratios, there are still gaps between the theoretical advantages of ANN approximation theory and its inferential performance in finite samples.
    Some of the results are applied to efficient estimation and optimal inference for average price elasticity in consumer demand and BLP-type demand.

    The talk is based on two co-authored papers:
    (1) Efficient Estimation of Average Derivatives in NPIV Models: Simulation Comparisons of Neural Network Estimators
    (Authors: Jiafeng Chen, Xiaohong Chen and Elie Tamer)
    https://arxiv.org/abs/2110.06763

    (2) Neural Network Inference on Nonparametric Conditional Moment Restrictions with Weakly Dependent Data
    (Authors: Xiaohong Chen, Yuan Liao and Weichen Wang).
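
    In the standard NPIV special case referenced above (a hedged illustration in textbook notation, not necessarily the talk’s exact setup), the structural function h_0 and a regular policy functional such as the average derivative can be written as

        E[\,Y - h_0(X) \mid Z\,] = 0, \qquad \theta_0 = E\!\left[\partial h_0(X)/\partial x\right],

    where X is an endogenous regressor and Z an instrument; the ANN approach replaces linear sieves for h_0 with neural-network sieves when estimating θ_0.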

    10:00 am – 10:45 am    Jessica Jeffers
    Title: Labor Reactions to Credit Deterioration: Evidence from LinkedIn Activity

    Abstract: We analyze worker reactions to their firms’ credit deterioration. Using weekly networking activity on LinkedIn, we show workers initiate more connections immediately following a negative credit event, even at firms far from bankruptcy. Our results suggest that workers are driven by concerns about both unemployment and future prospects at their firm. Heightened networking activity is associated with contemporaneous and future departures, especially at financially healthy firms. Other negative events like missed earnings and equity downgrades do not trigger similar reactions. Overall, our results indicate that the build-up of connections triggered by credit deterioration represents a source of fragility for firms.

    10:50 am – 11:35 am    Miles Cranmer
    Title: Interpretable Machine Learning for Physics

    Abstract: Would Kepler have discovered his laws if machine learning had been around in 1609? Or would he have been satisfied with the accuracy of some black box regression model, leaving Newton without the inspiration to discover the law of gravitation? In this talk I will explore the compatibility of industry-oriented machine learning algorithms with discovery in the natural sciences. I will describe recent approaches developed with collaborators for addressing this, based on a strategy of “translating” neural networks into symbolic models via evolutionary algorithms. I will discuss the inner workings of the open-source symbolic regression library PySR (github.com/MilesCranmer/PySR), which forms a central part of this interpretable learning toolkit. Finally, I will present examples of how these methods have been used in the past two years in scientific discovery, and outline some current efforts.
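
    As a minimal sketch of the PySR workflow mentioned above (based on the library’s documented PySRRegressor interface; exact options and defaults may differ across versions, and the target function here is an illustrative assumption):

        # Illustrative symbolic-regression example with PySR (not code from the talk).
        # Tries to recover a simple hidden "law", y = 2.5*cos(x3) + x0**2, from samples.
        import numpy as np
        from pysr import PySRRegressor  # pip install pysr (uses a Julia backend)

        rng = np.random.default_rng(0)
        X = rng.standard_normal((200, 5))
        y = 2.5 * np.cos(X[:, 3]) + X[:, 0] ** 2

        model = PySRRegressor(
            niterations=40,                        # evolutionary search budget
            binary_operators=["+", "-", "*", "/"],
            unary_operators=["cos", "sin", "exp"],
        )
        model.fit(X, y)        # runs the evolutionary search over symbolic expressions
        print(model)           # table of discovered equations by complexity and loss
        print(model.sympy())   # best equation as a SymPy expression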

    11:40 am – 12:25 pm    Dan Roberts
    Title: A Statistical Model of Neural Scaling Laws

    Abstract: Large language models with huge numbers of parameters, trained on a near internet-sized number of tokens, have been empirically shown to obey “neural scaling laws”: their performance behaves predictably as a power law in either parameters or dataset size until bottlenecked by the other resource. To understand this better, we first identify the necessary properties allowing such scaling laws to arise, and then propose a statistical model (a joint generative data model and random feature model) that captures this neural scaling phenomenology. By solving this model using tools from random matrix theory, we gain insight into (i) the statistical structure of datasets and tasks that lead to scaling laws, (ii) how nonlinear feature maps, i.e., the role played by the deep neural network, enable scaling laws when trained on these datasets, and (iii) how such scaling laws can break down, and what their behavior is when they do. A key feature is the manner in which the power laws that occur in the statistics of natural datasets are translated into power-law scalings of the test loss, and how the finite extent of such power laws leads to both bottlenecks and breakdowns.
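
    A commonly used empirical form of such a scaling law (a hedged illustration, not the specific statistical model of the talk) writes the test loss as a function of parameter count N and dataset size D:

        L(N, D) \;\approx\; \frac{A}{N^{\alpha}} \;+\; \frac{B}{D^{\beta}} \;+\; L_{\infty},

    so that with D fixed the loss falls as a power law in N until the data term dominates (the bottleneck), and symmetrically with N fixed.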

    12:30 pm    Conference Organizers: Closing Remarks


    For more information, please see: https://cmsa.fas.harvard.edu/big-data-2022/
