Title: MetaGrad: Adapting to Easy Data in Online Sequential Prediction

Abstract: In statistical learning, the limits of minimax analysis are well understood. For example, in classification it is known from the work of Tsybakov and others that it is possible to predict much better than the minimax rate in many common cases where the data distribution is 'easy', as characterized by the so-called margin condition. Adaptive methods that automatically exploit the margin condition are possible, although they require automatic tuning of a regularization parameter, which is theoretically complicated. In contrast to statistical learning, most work in online sequential prediction is still based only on minimax analysis. I will present recent work in which we develop a theory of 'easy data' for this setting. We introduce an adaptive method called MetaGrad, which automatically determines the optimal regularization parameter from the data. MetaGrad's performance is bounded in terms of a new data-dependent measure of variance, which automatically recovers the best-known rates in all known cases and implies fast rates for a new class of cases that we characterize by an online version of the margin condition. This is joint work with Wouter Koolen and Peter Grünwald.
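To give a flavour of how an algorithm can "determine the optimal regularization parameter from the data", the sketch below runs several learners in parallel, each with its own learning rate drawn from a geometric grid, and lets a master aggregate them with exponential weights on a variance-penalized surrogate loss. This is only a simplified illustration of the multiple-learning-rates idea in the spirit of MetaGrad, not the authors' algorithm: the real method uses second-order (Newton-style) slave updates, and every class name, constant, and update rule here is an assumption made for the example.

```python
import numpy as np

class MultiEtaLearner:
    """Illustrative sketch (assumed, simplified): slaves with different
    learning rates, combined by a master with exponential weights."""

    def __init__(self, dim, diameter=1.0, grad_bound=1.0, horizon=1000):
        # Grid of learning rates eta_i = 2^{-i} / (D * G); roughly log(T) of them.
        n_slaves = int(np.ceil(np.log2(horizon))) + 1
        self.etas = np.array([2.0 ** (-i) / (diameter * grad_bound)
                              for i in range(n_slaves)])
        self.slaves = np.zeros((n_slaves, dim))   # one weight vector per slave
        self.log_weights = np.zeros(n_slaves)     # master's (uniform) log-weights
        self.diameter = diameter

    def predict(self):
        # Master prediction: convex combination of the slaves' points.
        w = np.exp(self.log_weights - self.log_weights.max())
        w /= w.sum()
        return w @ self.slaves

    def update(self, grad):
        # Linearized loss of each slave relative to the master's prediction.
        w_master = self.predict()
        lin_loss = self.slaves @ grad - w_master @ grad
        # Master: exponential-weights step on a surrogate that penalizes both the
        # linearized loss and its square (a crude data-dependent variance term).
        self.log_weights += -self.etas * lin_loss - (self.etas * lin_loss) ** 2
        # Slaves: plain projected online gradient descent, each with its own eta
        # (the actual MetaGrad slaves use a second-order update instead).
        self.slaves -= self.etas[:, None] * grad
        norms = np.linalg.norm(self.slaves, axis=1, keepdims=True)
        np.clip(norms, 1e-12, None, out=norms)
        self.slaves *= np.minimum(1.0, self.diameter / norms)
```

The point of the grid is that no single learning rate needs to be tuned in advance: on 'easy' data the master's weights concentrate on the slaves with larger learning rates, recovering fast rates, while on worst-case data they fall back to the conservative choices that give the minimax guarantee.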