
In many places I have read that the following language is not regular, and thus cannot be recognized by a finite automaton.

$L = \{0^i 1^i \mid i \ge 0\}$

But I tried this as follows.

[image: the attempted DFA]

Can somebody explain the fallacy?

gpuguy
  • A DFA must decide the language: not only must every $w \in L$ be accepted by the DFA, but every word $w$ the DFA accepts must be in $L$. Your DFA verifies $L \subseteq L_{DFA}$, but it also accepts strings that are not in $L$. – Tpecatte Sep 06 '13 at 13:54
  • Thanks, can you give an example of a string that it recognizes, but that is not in the language? – gpuguy Sep 06 '13 at 13:56
  • 1
    Well, the language that your DFA decide is $0^*1^+$, hence it recognizes 1, 0000011, 011111,...$\not \in L$ – Tpecatte Sep 06 '13 at 13:58
  • You cleared a big misconception I was having. Thanks – gpuguy Sep 06 '13 at 14:05
  • @Timot Why not make this an answer? – Patrick87 Sep 06 '13 at 14:56
  • I'm closing this as duplicate because the question has been answered already. The language is a standard example of a non-regular language, so NFA can not model it. – Raphael Sep 06 '13 at 16:09
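The attempted DFA itself is not visible here, but going by Tpecatte's comment it decides $0^*1^+$. A minimal sketch (assuming a two-state DFA for $0^*1^+$, which may differ in detail from the pictured one) makes the fallacy concrete: every nonempty string of $L$ is accepted, yet so are many strings outside $L$.

```python
def dfa_accepts(w):
    """Simulate a DFA for 0^*1^+ (the language the pictured DFA reportedly decides).

    States: q0 (start, reading 0s), q1 (accepting, reading 1s), dead (sink).
    """
    state = "q0"
    for c in w:
        if state == "q0":
            state = "q0" if c == "0" else "q1"
        elif state == "q1":
            state = "q1" if c == "1" else "dead"
        # the dead state absorbs everything
    return state == "q1"

def in_L(w):
    """True membership test for L = { 0^i 1^i : i >= 0 }."""
    i = w.count("0")
    return w == "0" * i + "1" * i

# Strings of L (with i >= 1) are accepted, so inclusion holds one way...
print(dfa_accepts("000111"), in_L("000111"))  # True True

# ...but the converse fails: the DFA also accepts strings outside L.
for w in ["1", "0000011", "011111"]:
    print(w, dfa_accepts(w), in_L(w))  # accepted, yet not in L
```

This is exactly the distinction the comments draw: building an automaton that accepts everything in $L$ is easy; the hard (and here impossible) part is rejecting everything outside $L$, which requires counting unboundedly many 0s.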

0 Answers