In many places I have read that the following language is not regular, and thus cannot be expressed by a finite automaton:
L = {0^i 1^i | i >= 0}
But I tried to construct one, as follows.
Can somebody explain the fallacy?
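For context, here is a minimal membership test for L (a sketch in Python; the function name is my own). It makes the difficulty concrete: deciding L means comparing the count of 0s against the count of 1s, and that count is unbounded, whereas a DFA has only a fixed, finite number of states of memory.

```python
def in_L(w: str) -> bool:
    """Return True iff w is in L = {0^i 1^i | i >= 0}.

    Note the comparison of two unbounded counts: this is what
    a finite automaton, with only finitely many states, cannot do.
    """
    half = len(w) // 2
    return len(w) % 2 == 0 and w == "0" * half + "1" * half
```

For example, `in_L("0011")` and `in_L("")` hold, while `in_L("0101")` and `in_L("001")` do not.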