Let $\mathbf{v}\in \mathbb{R}^{p}$ be a known vector and $\mathbf{A}\in\mathbb{R}^{p\times p}$, $\mathbf{B}\in \mathbb{R}^{n \times p}$ known matrices. In this setting, $\mathbf{A}$ is symmetric and invertible. My objective is to determine whether the optimization problem $$\min_{a > 0} \mathbf{v}^\top(a\mathbf{A}+ \mathbf{B}^\top \mathbf{B})^{-1}\mathbf{v}$$ is convex, and to reformulate it in a friendlier form.
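For intuition, here is a minimal numerical probe of the objective, assuming a random instance with $\mathbf{A}$ positive definite (a strengthening of "symmetric and invertible" that keeps $a\mathbf{A}+\mathbf{B}^\top\mathbf{B}$ invertible for every $a>0$); the sizes `p`, `n` and the midpoint-convexity test on a grid are illustrative choices, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 4, 6                        # illustrative sizes, not from the problem
M = rng.standard_normal((p, p))
A = M @ M.T + np.eye(p)            # symmetric positive definite (assumed)
B = rng.standard_normal((n, p))
v = rng.standard_normal(p)

def f(a):
    """Objective v^T (aA + B^T B)^{-1} v."""
    return v @ np.linalg.solve(a * A + B.T @ B, v)

# Midpoint-convexity check: f((x+y)/2) <= (f(x)+f(y))/2 on a grid of a > 0.
grid = np.linspace(0.1, 10.0, 200)
vals = np.array([f(a) for a in grid])
mids = np.array([f((x + y) / 2) for x, y in zip(grid[:-1], grid[1:])])
print(np.all(mids <= (vals[:-1] + vals[1:]) / 2 + 1e-12))  # expect True here
```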
The objective function can be straightforwardly rewritten as $$\mathbf{v}^\top(a\mathbf{A}+ \mathbf{B}^\top \mathbf{B})^{-1}\mathbf{v} = \mathbf{v}^\top\left(\mathbf{I}+ \frac{1}{a}\mathbf{A}^{-1}\mathbf{B}^\top \mathbf{B}\right)^{-1}(a\mathbf{A})^{-1}\mathbf{v}.$$ I was not able to proceed from here in the general case. However, if $\mathbf{A}^{-1}$ and $\mathbf{B}^\top \mathbf{B}$ commute, then, since both are symmetric, their product is also symmetric, so it admits the eigendecomposition $\mathbf{A}^{-1}\mathbf{B}^\top \mathbf{B} = \mathbf{U}\boldsymbol{\Sigma}\mathbf{U}^{\top}$, where $\mathbf{U}^\top = \mathbf{U}^{-1}$ and $\boldsymbol{\Sigma} = \operatorname{diag}(d_1, \ldots, d_p)$. In this case, $$\mathbf{v}^\top(a\mathbf{A}+ \mathbf{B}^\top \mathbf{B})^{-1}\mathbf{v} = \mathbf{v}^\top\left(\mathbf{U}\mathbf{U}^{\top}+ \frac{1}{a}\mathbf{U}\boldsymbol{\Sigma}\mathbf{U}^{\top}\right)^{-1}(a\mathbf{A})^{-1}\mathbf{v} = \frac{1}{a}\mathbf{v}^\top \mathbf{U}\left(\mathbf{I}+ \frac{1}{a}\boldsymbol{\Sigma}\right)^{-1}\mathbf{U}^{\top} \mathbf{A}^{-1}\mathbf{v}.$$
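To sanity-check this identity, here is a sketch that constructs $\mathbf{A}^{-1}$ and $\mathbf{B}^\top\mathbf{B}$ sharing an eigenbasis $\mathbf{U}$, so they commute by construction; the sizes and eigenvalue ranges are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 4
U, _ = np.linalg.qr(rng.standard_normal((p, p)))   # shared orthogonal eigenbasis
alpha = rng.uniform(0.5, 2.0, p)                   # eigenvalues of A^{-1}
beta = rng.uniform(0.0, 3.0, p)                    # eigenvalues of B^T B
A = U @ np.diag(1 / alpha) @ U.T
A_inv = U @ np.diag(alpha) @ U.T
B = np.diag(np.sqrt(beta)) @ U.T                   # gives B^T B = U diag(beta) U^T
Sigma = np.diag(alpha * beta)                      # A^{-1} B^T B = U Sigma U^T

a = 0.7
v = rng.standard_normal(p)
lhs = v @ np.linalg.solve(a * A + B.T @ B, v)
rhs = (1 / a) * v @ U @ np.linalg.inv(np.eye(p) + Sigma / a) @ U.T @ A_inv @ v
print(np.isclose(lhs, rhs))   # expect True
```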
Since $\mathbf{I}+ \frac{1}{a}\boldsymbol{\Sigma}$ is diagonal, the entries of its inverse are of the form $\frac{1}{1+ \frac{d_i}{a}}$; absorbing the $\frac{1}{a}$ factor, these become $\frac{1}{a+d_i}$, and the problem reduces to $$\min_{a > 0} \mathbf{r}^\top\mathbf{D}\mathbf{s} = \min_{a > 0} \sum_{i=1}^{p}\frac{r_i s_i}{a+d_i},$$ where $\mathbf{r}^\top = \mathbf{v}^\top \mathbf{U}$, $\mathbf{s}=\mathbf{U}^{\top} \mathbf{A}^{-1}\mathbf{v}$ and $\mathbf{D} = \operatorname{diag}\left(\frac{1}{a + d_1}, \ldots, \frac{1}{a+d_p}\right)$.
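Under the same commuting construction as above, the diagonal reformulation can also be checked numerically; `r`, `s` and `d` below mirror $\mathbf{r}$, $\mathbf{s}$ and the $d_i$:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 4
U, _ = np.linalg.qr(rng.standard_normal((p, p)))
alpha = rng.uniform(0.5, 2.0, p)
beta = rng.uniform(0.0, 3.0, p)
A = U @ np.diag(1 / alpha) @ U.T
B = np.diag(np.sqrt(beta)) @ U.T
v = rng.standard_normal(p)
d = alpha * beta                      # eigenvalues d_i of A^{-1} B^T B

r = U.T @ v                           # r^T = v^T U
s = U.T @ np.linalg.solve(A, v)       # s = U^T A^{-1} v

a = 0.7
lhs = v @ np.linalg.solve(a * A + B.T @ B, v)
rhs = np.sum(r * s / (a + d))         # r^T D s with D = diag(1/(a + d_i))
print(np.isclose(lhs, rhs))   # expect True
```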
I was wondering if I could get rid of this strong assumption (that the matrices commute). Any help would be appreciated.