By common convention, without further context $\,f(x) = x/x\,$ is not defined at $\,x = 0,\,$ since evaluating it at $\,x = 0\,$ leads to division by $\,0.\,$ However, further contextual information may reveal that this discontinuity at $\,x = 0\,$ is an artifact of the construction of $\,f.\,$ For example, suppose we obtained that expression for $\,f\,$ by solving the equation $\,x\,f = x,\,$ and we know in our context that the solution must be continuous (e.g. we may know a priori that $\,f(x)\,$ must be a polynomial in $\,x$). Dividing by $\,x\,$ yields $\,f(x) = 1\,$ for all $\,x \neq 0,\,$ and continuity then forces $\,f(0) = 1\,$ as well, so the unique solution is the constant $\,f(x) = 1.\,$ Here the discontinuity at $\,x = 0\,$ is an artifact of the algebra in the solution process: it was introduced by dividing by $\,x.\,$ In algebra, such removable singularities are often removed by cancellation, a powerful algebraic method, e.g. see this answer, where it is used to algebraically define polynomial derivatives and to obtain a slick proof of Sylvester's determinant identity.
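To make the cancellation method concrete, here is a sketch of the polynomial-derivative construction alluded to above (assuming only the Factor Theorem for polynomials). For a polynomial $\,f,\,$ the difference $\,f(x) - f(a)\,$ has $\,x = a\,$ as a root, so $\,f(x) - f(a) = (x-a)\,q(x)\,$ for a unique polynomial $\,q.\,$ The difference quotient $\,(f(x) - f(a))/(x-a)\,$ is undefined at $\,x = a,\,$ but cancelling $\,x - a\,$ removes that singularity, leaving $\,q(x),\,$ so we may define $\,f'(a) := q(a)\,$ purely algebraically. For instance, for $\,f(x) = x^2:$

$$\frac{x^2 - a^2}{x - a} \,=\, \frac{(x-a)(x+a)}{x - a} \,=\, x + a \;\leadsto\; f'(a) \,=\, q(a) \,=\, 2a.$$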
In summary, the denotation of the expression $\,x/x\,$ depends upon the ambient context: it may or may not be defined (or continuous) at $\,x = 0.\,$ Lacking any further context, the standard convention is that it is not defined at $\,x = 0.$