Calendar
- May 2, 2023
Harvard–MIT Algebraic Geometry Seminar: Moduli spaces of cubic hypersurfaces
Science Center 507, 1 Oxford Street, Cambridge, MA 02138 USA
In this talk I will give an overview of some recent work, joint with Samuel Grushevsky, Klaus Hulek, and Radu Laza, on the geometry and topology of compactifications of the moduli spaces of cubic threefolds and cubic surfaces. A focus of the talk will be on some results regarding non-isomorphic smooth compactifications of the moduli space of cubic surfaces, showing that two natural desingularizations of the moduli space have the same cohomology and are both blow-ups of the moduli space at the same point, but are nevertheless not isomorphic, and in fact not even K-equivalent.
- May 2, 2023
Mathematical Picture Language Seminar: Hydrodynamics and Corrections to Random Matrix Universality in Quantum Chaos
Abstract TBA
The Math Picture Language seminar will be held at 9:30 a.m. Boston time.
Click the link for the Zoom link for Tuesday Mathematical Picture Language Seminars.
Recorded seminars can be viewed on the Mathematical Picture Language YouTube channel.
- May 3, 2023
CMSA Colloquium: Generative Adversarial Networks (GANs): An Analytical Perspective
CMSA, 20 Garden Street, Room G10, Cambridge, MA 02138
Generative models have attracted intense interest recently. In this talk, I will discuss one class of generative models, Generative Adversarial Networks (GANs). I will first provide a gentle review of the mathematical framework behind GANs. I will then proceed to discuss a few challenges in GANs training from an analytical perspective. I will finally report some recent progress for GANs training in terms of its stability and convergence analysis.
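For readers unfamiliar with the framework the abstract alludes to, the standard GAN formulation (not stated in the announcement, and the talk may use a variant) is the minimax problem between a generator G and a discriminator D:

```latex
\min_{G} \max_{D} \;
\mathbb{E}_{x \sim p_{\mathrm{data}}}\bigl[\log D(x)\bigr]
+ \mathbb{E}_{z \sim p_{z}}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr]
```

The analytical challenges mentioned in the abstract (stability, convergence) arise because this objective is a nonconvex-nonconcave saddle-point problem.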
- May 3, 2023
CMSA Probability Seminar: Random Neural Networks
Location TBA, 20 Garden Street, Cambridge, MA 02138
Fully connected neural networks are described by two structural parameters: a depth L and a width N. In this talk, I will present results and open questions about the asymptotic analysis of such networks with random weights and biases in the regime where N (and potentially L) is large. The first set of results is for deep linear networks, which are simply products of L random matrices of size N x N. I'll explain how the setting where the ratio L / N is fixed, with both N and L large, reveals a number of phenomena not present when only one of them is large. I will then state several results about non-linear networks in which this depth-to-width ratio L / N again plays a crucial role and gives an effective notion of depth for a random neural network.
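The deep linear networks in the abstract above can be sketched in a few lines: a minimal illustration (not from the talk) of forming the product of L independent N x N Gaussian matrices, with the conventional 1/sqrt(N) scaling assumed so each factor has entries of typical size O(1/sqrt(N)).

```python
import numpy as np

def deep_linear_network(L, N, rng):
    """Return the product W_L @ ... @ W_1 of L random N x N matrices.

    Each W_i has i.i.d. Gaussian entries scaled by 1/sqrt(N), a common
    normalization assumed here; the talk's exact convention may differ.
    """
    W = np.eye(N)
    for _ in range(L):
        W = (rng.standard_normal((N, N)) / np.sqrt(N)) @ W
    return W

rng = np.random.default_rng(0)
# A "deep" regime where L / N is of order 1, the setting the talk highlights.
W = deep_linear_network(L=32, N=64, rng=rng)
print(W.shape)  # (64, 64)
```

Varying L with N fixed (or jointly, holding L / N constant) is one way to experiment numerically with the depth-to-width phenomena described in the abstract.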