Consider the following polynomials:
$$f=x^2 y +xy^2 +y^2; \qquad f_1 =y^2 -1; \quad\text{and}\quad f_2 = xy -1. $$
Dividing $f$ by the ordered pair $(f_1, f_2)$, using lex order with $x > y$, one gets
$$f=(x+1)\,f_1 + x\,f_2 + (2x+1). $$
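To make sure I copied the division correctly, I checked it with sympy (the symbol names are just mine; `reduced` performs the multivariate division algorithm):

```python
from sympy import symbols, reduced, expand

x, y = symbols('x y')

f  = x**2*y + x*y**2 + y**2
f1 = y**2 - 1
f2 = x*y - 1

# Divide f by the ordered pair (f1, f2) in lex order with x > y.
(q1, q2), r = reduced(f, [f1, f2], x, y, order='lex')
print(q1, q2, r)

# Whatever quotients the algorithm picks, the division identity must hold:
assert expand(q1*f1 + q2*f2 + r - f) == 0
```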
In lecture, the instructor found a Gröbner basis of $\langle f_1, f_2 \rangle$ by multiplying three matrices:
$$\begin{bmatrix} -1 &-1& -y \\ 0& 1& 0 \\ 0& 0& 1\end{bmatrix} \begin{bmatrix} -1 &0 \\ 0& 1\\ y& -x\end{bmatrix} \begin{bmatrix} xy -1\\ y^2-1\end{bmatrix} = \begin{bmatrix} 0 \\ y^2-1\\ x-y \end{bmatrix} $$
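I did verify that the product itself comes out as claimed (again with sympy; `applyfunc(expand)` just expands each entry), so the question is only about where the two left factors come from:

```python
from sympy import symbols, Matrix, expand

x, y = symbols('x y')

A = Matrix([[-1, -1, -y],
            [ 0,  1,  0],
            [ 0,  0,  1]])
B = Matrix([[-1,  0],
            [ 0,  1],
            [ y, -x]])
g = Matrix([[x*y - 1],
            [y**2 - 1]])

result = (A * B * g).applyfunc(expand)
print(result)  # → Matrix([[0], [y**2 - 1], [x - y]])
```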
The nonzero entries $y^2-1$ and $x-y$ are supposed to form a Gröbner basis, but I don't see how the two left-hand matrices were obtained, or why multiplying them out produces a Gröbner basis.
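I also checked that the nonzero entries do agree with what sympy computes as the reduced lex Gröbner basis of $\langle f_1, f_2 \rangle$:

```python
from sympy import symbols, groebner

x, y = symbols('x y')

# Reduced Groebner basis of <y^2 - 1, x*y - 1> in lex order with x > y.
gb = groebner([y**2 - 1, x*y - 1], x, y, order='lex')
print(list(gb))

# The basis elements match the nonzero entries of the matrix product.
assert set(gb.exprs) == {x - y, y**2 - 1}
```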
This example is from Cox, Little, and O'Shea's *Ideals, Varieties, and Algorithms*, Chapter 2, Section 3, Example 4.