Wasserstein gradient flows often arise from mean-field interactions among exchangeable particles. In many applications of interest, however, the "particles" are the edge weights of a graph whose vertex labels are exchangeable, while the edges themselves are not. Motivated by such graph optimization problems, we study the optimization of functions invariant under this different class of symmetries. Popular applications include the training of large computational graphs such as deep neural networks. This body of work shows that discrete stochastic optimization algorithms on finite graphs admit a well-defined analytical scaling limit as the size of the network grows to infinity. The limiting space is that of graphons, a notion introduced by Lovász and Szegedy to describe limits of dense graph sequences. The limiting curves are described by a novel notion of McKean-Vlasov equations on graphons, and a corresponding notion of propagation of chaos holds. In the asymptotically zero-noise case, the limit is a gradient flow on the metric space of graphons. This is an attempt to generalize the Wasserstein calculus to higher-order exchangeable structures.
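As background for the limit objects invoked above, the following is the standard definition of graphons and of the cut metric from the graph-limits literature of Lovász and Szegedy; it is included here as context and is not specific to the metric structure this body of work ultimately uses for its gradient flows. A graphon is a symmetric measurable function
\[
W : [0,1]^2 \to [0,1], \qquad W(x,y) = W(y,x),
\]
and dense graph sequences are compared through the cut norm and the cut metric
\[
\|W\|_\Box \;=\; \sup_{S,\,T \subseteq [0,1]} \left| \int_{S \times T} W(x,y)\,\mathrm{d}x\,\mathrm{d}y \right|,
\qquad
\delta_\Box(U,W) \;=\; \inf_{\varphi} \big\| U - W^{\varphi} \big\|_\Box,
\]
where the infimum runs over measure-preserving bijections \(\varphi\) of \([0,1]\) and \(W^{\varphi}(x,y) = W(\varphi(x),\varphi(y))\). The infimum over \(\varphi\) is what encodes the vertex-label exchangeability: a finite graph on \(n\) vertices embeds as the step-function graphon built from its adjacency matrix, and a sequence of dense graphs converges exactly when it is Cauchy in \(\delta_\Box\).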