I have a project with $N$ Beta-distributed random variables, each of which must be estimated from its own sample. The sample domain is $\{0.1,0.3,0.5,0.7,0.9\}$ and the samples look like the following: \begin{align*} &S_{1}=\{0.1,0.3,0.3,0.7\}\\ &S_{2}=\{0.3,0.3,0.9\}\\ &S_{3}=\{0.1,0.1,0.3,0.3,0.5,0.7,0.9\}\\ &S_{4}=\{0.3,0.5\}\\ &...\\ &S_{i}=\{0.3\}\\ &...\\ &S_{j}=\{0.1,0.1\}\\ &...\\ &S_{N-3}=\{0.5,0.5,0.7,0.7,0.7,0.7,0.9,0.9\}\\ &S_{N-2}=\{0.3,0.3,0.5\}\\ &S_{N-1}=\{0.5,0.5,0.7\}\\ &S_{N}=\{0.1,0.5,0.7,0.9,0.9,0.9,0.9\} \end{align*} Each $S_k$ is a sample of observations from a separate Beta-distributed random variable. To estimate the parameters, I use the standard estimators given in general references such as Wikipedia (writing $n$ for the size of a single sample, to avoid a clash with $N$, the number of random variables): \begin{align*} &\text{Method of moments:}\\ &\bar{x}=\frac{1}{n}\sum_{i=1}^{n}X_i\\ &\bar{v}=\frac{1}{n-1}\sum_{i=1}^{n}(X_i-\bar{x})^2\\ &\hat{\alpha}=\bar{x}\left(\frac{\bar{x}(1-\bar{x})}{\bar{v}}-1\right),\quad\text{if }\bar{v}<\bar{x}(1-\bar{x})\\ &\hat{\beta}=(1-\bar{x})\left(\frac{\bar{x}(1-\bar{x})}{\bar{v}}-1\right),\quad\text{if }\bar{v}<\bar{x}(1-\bar{x})\\[1ex] &\text{Maximum likelihood (closed-form approximation):}\\ &\hat{G}_X=\prod_{i=1}^{n}X_i^{1/n}\\ &\hat{G}_{1-X}=\prod_{i=1}^{n}(1-X_i)^{1/n}\\ &\hat{\alpha}\approx\frac{1}{2}+\frac{\hat{G}_X}{2\left(1-\hat{G}_X-\hat{G}_{1-X}\right)},\quad\text{if }\hat{\alpha}>1\\ &\hat{\beta}\approx\frac{1}{2}+\frac{\hat{G}_{1-X}}{2\left(1-\hat{G}_X-\hat{G}_{1-X}\right)},\quad\text{if }\hat{\beta}>1 \end{align*} These estimators work for most of the samples. However, in (uncommon) cases like $S_i$ or $S_j$, where all observations are equal and the sample variance is zero, both estimators break down: the method of moments divides by $\bar{v}=0$, and for a constant sample $\hat{G}_X+\hat{G}_{1-X}=1$, so the MLE approximation also has a zero denominator.
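To make the failure mode concrete, here is a minimal sketch of both estimators in plain Python (function names and the `None`-on-failure convention are my own, not from any particular library):

```python
import math

def beta_mom(sample):
    """Method-of-moments estimates (alpha, beta) for a Beta distribution.

    Returns None when the sample is too small, the sample variance is
    zero, or the moment condition var < mean*(1-mean) fails.
    """
    n = len(sample)
    if n < 2:
        return None
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    if var <= 0 or var >= mean * (1 - mean):
        return None  # includes the zero-variance case: division by var below
    common = mean * (1 - mean) / var - 1
    return mean * common, (1 - mean) * common

def beta_mle_approx(sample):
    """Closed-form approximation to the Beta MLE via geometric means.

    Returns None when the denominator 1 - G_X - G_{1-X} vanishes,
    which happens whenever all observations are equal.
    """
    n = len(sample)
    g_x = math.prod(x ** (1 / n) for x in sample)      # geometric mean of X_i
    g_1mx = math.prod((1 - x) ** (1 / n) for x in sample)  # geometric mean of 1-X_i
    denom = 1 - g_x - g_1mx
    if abs(denom) < 1e-12:
        return None
    return 0.5 + g_x / (2 * denom), 0.5 + g_1mx / (2 * denom)

# S_3 from the question: both estimators succeed.
print(beta_mom([0.1, 0.1, 0.3, 0.3, 0.5, 0.7, 0.9]))
# S_j = {0.1, 0.1}: zero sample variance, both estimators return None.
print(beta_mom([0.1, 0.1]), beta_mle_approx([0.1, 0.1]))
```

Running this on the constant samples reproduces exactly the zero-denominator failure described above.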
In short: how can the parameters of a Beta distribution be estimated when the sample variance is zero?