The Fisher information is the variance of the score. The score, in turn, is the gradient of the log-likelihood function with respect to the parameter vector.
$$\mathcal{I}(\theta) = \text{Var}\{s(\theta)\} = \mathbb{E}[(s(\theta) - \mathbb{E}[s(\theta)])^2]$$
The score is defined as $s(\theta) = \frac{\partial \log \mathcal{L}(\theta)}{\partial \theta}$, where $\mathcal{L}$ is the likelihood function.
The expected value of the score at the true parameter is $0$, so the Fisher information becomes:
$$\mathcal{I}(\theta) = \mathbb{E}[s(\theta)^2] = \mathbb{E}\left[\left(\frac{\partial}{\partial \theta} \log f(X; \theta)\right)^2 \,\middle|\, \theta\right]$$
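The fact that the score has zero mean at the true parameter follows from differentiating the normalization constraint of the pdf, assuming the support does not depend on $\theta$ and that we may differentiate under the integral sign:

$$\mathbb{E}[s(\theta)] = \int \frac{\partial \log f(x; \theta)}{\partial \theta} f(x; \theta)\, dx = \int \frac{\partial f(x; \theta)}{\partial \theta}\, dx = \frac{\partial}{\partial \theta} \int f(x; \theta)\, dx = \frac{\partial}{\partial \theta} 1 = 0$$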
Let's now find the Fisher information for the uniform distribution.
The probability density function (pdf) of the uniform distribution on $[0, \theta]$ is:
$$f(x; \theta) = \frac{1}{\theta}, \quad 0 \le x \le \theta$$
For a single observation $x \in [0, \theta]$, the likelihood function is $\mathcal{L}(\theta) = \frac{1}{\theta}$, so the log-likelihood function is $\log \mathcal{L}(\theta) = -\log(\theta)$.
Let us now express the score by taking the derivative of the log-likelihood function.
$$s(\theta) = -\frac{1}{\theta}$$
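As a quick check, the same derivative can be obtained symbolically; a minimal sketch using sympy:

```python
import sympy as sp

theta = sp.symbols("theta", positive=True)
log_lik = -sp.log(theta)            # log-likelihood for one observation
score = sp.diff(log_lik, theta)     # s(theta) = d/dtheta log L(theta)
print(score)                        # prints -1/theta
```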
Finally, let us compute the Fisher information of the uniformly distributed random variable. (Fisher information is a property of the model, not of any particular estimator.)
$$\mathcal{I}(\theta) = \int_0^{\theta} \frac{1}{\theta} \cdot \left(-\frac{1}{\theta}\right)^2 dx = \frac{1}{\theta^3} \cdot \theta = \frac{1}{\theta^2}$$
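As a numerical sanity check, the integral above can be evaluated directly. This is a minimal sketch in which $\theta = 2$ is an arbitrary illustrative value:

```python
from scipy.integrate import quad

theta = 2.0  # arbitrary illustrative value

# Integrand from above: pdf times squared score, over the support [0, theta]
integrand = lambda x: (1.0 / theta) * (-1.0 / theta) ** 2

fisher, _ = quad(integrand, 0.0, theta)
print(fisher, 1.0 / theta**2)  # both print 0.25
```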
Now that we have the Fisher information of the uniformly distributed random variable, the Cramér-Rao lower bound (CRB) for an unbiased estimator based on a single observation is $1/\mathcal{I}(\theta) = \theta^2$.
All that is left is to compute the variance of your estimator. To do so, simulate many datasets, apply the estimator to each, and take the empirical variance of the results. You can then see how close your estimator comes to the CRB.
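Here is a minimal simulation sketch of that comparison. Since I don't know which estimator you are using, it assumes for illustration the unbiased method-of-moments estimator $\hat{\theta} = 2\bar{X}$; with $n$ i.i.d. observations the Fisher information scales to $n\,\mathcal{I}(\theta)$, so the per-dataset bound becomes $\theta^2 / n$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0        # hypothetical true parameter
n_obs = 50         # observations per simulated dataset
n_rep = 20_000     # Monte Carlo replications

# Hypothetical estimator for illustration: the unbiased
# method-of-moments estimator 2 * sample mean (yours may differ).
samples = rng.uniform(0.0, theta, size=(n_rep, n_obs))
estimates = 2.0 * samples.mean(axis=1)

empirical_var = estimates.var(ddof=1)
crb = theta**2 / n_obs   # per-dataset bound: 1 / (n * I(theta))
print(f"empirical variance: {empirical_var:.5f}")
print(f"CRB:                {crb:.5f}")
# Caveat: for U(0, theta) the CRB regularity conditions fail (the
# support depends on theta), so the variance can dip below the bound.
```

One caveat worth knowing: $U(0, \theta)$ violates the regularity conditions behind the CRB (its support depends on $\theta$), so some estimators for this model can have variance below the formal bound computed here.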