Consider a sample of size $n$ from a linear process of dimension $p$, where $n, p \to \infty$ and $p/n \to y \in [0, \infty)$. Let $\hat{\Gamma}_{u}$ be the sample autocovariance matrix of order $u$.
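For concreteness, with observations $X_{1}, \ldots, X_{n} \in \mathbb{R}^{p}$ and under one common convention (divisor $n$ and no mean correction; the exact normalization used here may differ),
\[
\hat{\Gamma}_{u} = \frac{1}{n} \sum_{t=1}^{n-u} X_{t} X_{t+u}^{*}, \qquad 0 \le u < n .
\]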
The existence of the limiting spectral distribution (LSD) of the symmetric sum $\hat{\Gamma}_{u} + \hat{\Gamma}_{u}^{*}$ is known in the literature under appropriate (strong) assumptions on the coefficient matrices. Under significantly weaker conditions, we prove, in a unified way, that the LSD of any symmetric polynomial in these matrices, such as $\hat{\Gamma}_{u} + \hat{\Gamma}_{u}^{*}$, $\hat{\Gamma}_{u}\hat{\Gamma}_{u}^{*}$, and $\hat{\Gamma}_{u}\hat{\Gamma}_{u}^{*} + \hat{\Gamma}_{k}\hat{\Gamma}_{k}^{*}$, exists.
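Here the LSD is meant in the standard sense: writing $\lambda_{1}(A_{p}), \ldots, \lambda_{p}(A_{p})$ for the eigenvalues of a $p \times p$ symmetric (or Hermitian) matrix $A_{p}$ (notation used only for this illustration), its empirical spectral distribution is
\[
F^{A_{p}}(x) = \frac{1}{p}\,\#\bigl\{ i \le p : \lambda_{i}(A_{p}) \le x \bigr\},
\]
and the LSD is the weak limit of $F^{A_{p}}$ as $p \to \infty$, when it exists.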
Our approach is through the more intuitive algebraic method of free probability, which becomes applicable after an appropriate embedding, used in conjunction with the method of moments. This allows us to provide a general description of the limits in terms of certain freely independent variables.
All previously known results follow as special cases.
We suggest statistical uses of these LSDs and related results in problems such as order determination and white noise testing.