
[Solved]: If a parser can parse a non-deterministic grammar, is the parser non-deterministic?

Problem Detail: 

I've written a recursive-descent parser generator, and I'm trying to classify it (call me a cowboy coder if you must). According to Wikipedia, S → 0S0 | 1S1 | ε, which generates the even-length palindromes over {0, 1}, is a non-deterministic grammar. My parser generator can handle this grammar. Does that mean my parser is non-deterministic?
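To make the question concrete: a left-to-right parser cannot decide, at any position, which of the three productions to apply, because the right choice depends on where the middle of the string is. A deterministic algorithm can still handle this by tracking every possible parse at once. Here's a minimal sketch (Python; the function names are illustrative, not from the parlex repo):

```python
def parse_s(s: str, i: int) -> set[int]:
    """Return every index j such that nonterminal S can derive s[i:j].

    A left-to-right recursive-descent parser cannot pick one of
    S -> 0S0 | 1S1 | eps deterministically, so we follow all of them
    and return the set of possible end positions.
    """
    ends = {i}  # S -> epsilon: S derives the empty substring s[i:i]
    for c in "01":
        # Try S -> cSc: consume c, parse an inner S, then require a matching c.
        if i < len(s) and s[i] == c:
            for mid in parse_s(s, i + 1):
                if mid < len(s) and s[mid] == c:
                    ends.add(mid + 1)
    return ends

def accepts(s: str) -> bool:
    """The grammar accepts s iff S can derive the whole string."""
    return len(s) in parse_s(s, 0)
```

The algorithm itself is completely deterministic: it makes no guesses, it just explores every alternative.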

To be honest, I'm not even sure that it's proper to try to classify it like this. It doesn't really match the description of a pushdown automaton, since it passes data up and down through the stack (parameters, passed by reference, which may be modified). If anyone would be interested in taking a closer look at it, I'd be most grateful. It handles left recursion and ambiguous grammars in (I believe) polynomial time and space. https://github.com/coder0xff/parlex

Asked By : Brent

Answered By : D.W.

No, it doesn't mean that. A non-deterministic algorithm is one that makes non-deterministic choices (guesses) during execution. You're not using non-determinism; your algorithm is completely deterministic.

Here's what's tripped you up. You can recognize the language of a non-deterministic grammar using a deterministic algorithm. Similarly, you can check whether a word is accepted by a non-deterministic finite automaton (NFA) using a deterministic algorithm (e.g., simply convert the NFA to a DFA, then check whether the DFA accepts the word). So, no, just because the grammar is non-deterministic does not mean that we should call your parsing algorithm non-deterministic.

Best Answer from Stack Exchange

Question Source : http://cs.stackexchange.com/questions/13393
