Sums of algebraic dilates
SEMINARS: HARVARD-MIT COMBINATORICS
Given a complex number $\lambda$ and a finite set $A$ of complex numbers, how small can the size of the sum of dilates $A + \lambda\cdot A$ be in terms of $|A|$? If $\lambda$ is transcendental, then $|A + \lambda\cdot A|$ grows superlinearly in $|A|$, whereas if $\lambda$ is algebraic, then $|A + \lambda\cdot A|$ grows only linearly in $|A|$. In recent years, several works have sought to prove optimal linear bounds in the algebraic case.
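As a small illustration (not part of the talk itself), consider the classical single-dilate case $\lambda = 2$ over the integers, where the sharp linear bound is $|A + 2\cdot A| \geq 3|A| - 2$ and an arithmetic progression attains it. The sketch below, with a hypothetical helper `sum_of_dilates_size`, checks this for $A = \{0, \ldots, n-1\}$:

```python
# Illustrative sketch (assumed example, not from the talk): for the
# arithmetic progression A = {0, ..., n-1}, the sum of dilates satisfies
# A + 2A = {0, ..., 3(n-1)}, so |A + 2A| = 3n - 2, matching the sharp
# linear bound |A + 2A| >= 3|A| - 2 over the integers.

def sum_of_dilates_size(A, lam):
    """Return |A + lam*A| for a finite set A of numbers."""
    return len({a + lam * b for a in A for b in A})

n = 10
A = set(range(n))
print(sum_of_dilates_size(A, 2))  # 3*n - 2 = 28
```

By contrast, for a "generic" (transcendental-like) dilation ratio, the sums $a + \lambda b$ collide far less often, which is why $|A + \lambda\cdot A|$ grows superlinearly in that regime.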
In this talk, we answer the above problem in the following general form: if $\lambda_1,\ldots,\lambda_k$ are algebraic numbers, then
$$|A+\lambda_1\cdot A+\dots+\lambda_k\cdot A|\geq H(\lambda_1,\ldots,\lambda_k)|A|-o(|A|)$$
for all finite subsets $A$ of $\mathbb{C}$, where $H(\lambda_1,\ldots,\lambda_k)$ is an explicit constant that is best possible. We will discuss the main tools used in the proof, which include a Freiman-type structure theorem for sets with small sums of dilates, and a high-dimensional notion of density which we call “lattice density”. Joint work with David Conlon.
For information about the Richard P. Stanley Seminar in Combinatorics, visit https://math.mit.edu/combin/
