Abstract: Given two discrete random variables $X$ and $Y$, how much information does $Y$ leak about $X$? An operational definition of this leakage, called maximal leakage, was given by Issa, Kamath and Wagner (2016), motivated by the setup of a guessing adversary. Maximal leakage is defined as the multiplicative increase, upon observing $Y$, in the probability of correctly guessing a randomized function of $X$, maximized over all such randomized functions. We discuss Issa et al.'s result showing that maximal leakage equals the Sibson mutual information of order infinity, which gives the latter operational significance, and we study some of its properties.
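
For concreteness, the following is a sketch of the quantities referred to above, in our own notation (which may differ slightly from that of Issa et al.): the adversary picks a randomized function $U$ of $X$ (so $U - X - Y$ forms a Markov chain) and an estimator $\hat{U}$ based on $Y$, and maximal leakage compares the resulting guessing probability with the best guess made without observing $Y$,
\[
  \mathcal{L}(X \to Y) \;\triangleq\;
  \sup_{U:\, U - X - Y - \hat{U}}
  \log \frac{\Pr\bigl(U = \hat{U}\bigr)}{\max_{u} P_U(u)} .
\]
The result discussed in this work states that this quantity equals the Sibson mutual information of order infinity,
\[
  \mathcal{L}(X \to Y) \;=\; I_{\infty}(X;Y)
  \;=\; \log \sum_{y \in \mathcal{Y}} \, \max_{x:\, P_X(x) > 0} P_{Y \mid X}(y \mid x) .
\]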