Conjunctive grammar

Conjunctive grammars are a class of formal grammars studied in formal language theory. They extend the basic type of grammars, the context-free grammars, with a conjunction operation. Besides explicit conjunction, conjunctive grammars allow implicit disjunction, represented by multiple rules for a single nonterminal symbol; disjunction is the only logical connective expressible in context-free grammars. Conjunction can be used, in particular, to specify intersection of languages. A further extension of conjunctive grammars known as Boolean grammars additionally allows explicit negation.

The rules of a conjunctive grammar are of the form


 * $$A \to \alpha_1 \And \ldots \And \alpha_m$$

where $$A$$ is a nonterminal and $$\alpha_1$$, ..., $$\alpha_m$$ are strings formed of symbols in $$\Sigma$$ and $$V$$ (finite sets of terminal and nonterminal symbols respectively). Informally, such a rule asserts that every string $$w$$ over $$\Sigma$$ that satisfies each of the syntactical conditions represented by $$\alpha_1$$, ..., $$\alpha_m$$ therefore satisfies the condition defined by $$A$$.

Formal definition
A conjunctive grammar $$G$$ is defined by the 4-tuple $$G = (V, \Sigma, R, S)$$ where
 * 1) $$V$$ is a finite set; each element $$v \in V$$ is called a nonterminal symbol or a variable. Each variable represents a different type of phrase or clause in the sentence. Variables are also sometimes called syntactic categories.
 * 2) $$\Sigma$$ is a finite set of terminals, disjoint from $$V$$, which make up the actual content of the sentence. The set of terminals is the alphabet of the language defined by the grammar $$G$$.
 * 3) $$R$$ is a finite set of productions, each of the form $$A \rightarrow \alpha_1\&\ldots\&\alpha_m$$ for some $$A$$ in $$V$$ and $$\alpha_i \in (V\cup\Sigma)^*$$. The members of $$R$$ are called the rules or productions of the grammar.
 * 4) $$S$$ is the start variable (or start symbol), used to represent the whole sentence (or program). It must be an element of $$V$$.

It is common to list all right-hand sides for the same left-hand side on the same line, using | (the pipe symbol) to separate them. Rules $$A\rightarrow\alpha_1\&\ldots\&\alpha_m$$ and $$A\rightarrow\beta_1\&\ldots\&\beta_n$$ can hence be written as $$A\rightarrow\alpha_1\&\ldots\&\alpha_m\ |\ \beta_1\&\ldots\&\beta_n$$.
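The rule format and the pipe shorthand can be mirrored directly in plain data. A minimal sketch in Python, where the tuple-based encoding and the `show` helper are illustrative choices rather than any standard format:

```python
# A conjunctive grammar's rules as plain data: each nonterminal maps to
# its alternatives (separated by | in the usual notation), and each
# alternative is a list of conjuncts joined by &.
EPS = ()  # the empty string as a conjunct

rules = {
    "S": [[("A", "B"), ("D", "C")]],   # S -> AB & DC
    "A": [[("a", "A")], [EPS]],        # A -> aA | ε
}

def show(nonterminal, alternatives):
    """Render one left-hand side in the A -> α1 & … & αm | … notation."""
    alts = [" & ".join("".join(c) or "ε" for c in conjuncts)
            for conjuncts in alternatives]
    return f"{nonterminal} -> " + " | ".join(alts)

print(show("S", rules["S"]))  # prints: S -> AB & DC
print(show("A", rules["A"]))  # prints: A -> aA | ε
```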

Two equivalent formal definitions of the language specified by a conjunctive grammar exist. One definition is based upon representing the grammar as a system of language equations with union, intersection and concatenation and considering its least solution. The other definition generalizes Chomsky's generative definition of the context-free grammars using rewriting of terms over conjunction and concatenation.

Definition by derivation
For any strings $$u, v \in (V \cup \Sigma \cup \{ \text{“(”}, \text{“}\&\text{”}, \text{“)”} \})^{*}$$, we say $u$ directly yields $v$, written as $$u\Rightarrow v\,$$, if
 * either there is a rule $$A \rightarrow \alpha_1 \& \ldots \& \alpha_m \in R$$ such that $$u\,=u_{1} A u_{2}$$ and $$v\,=u_{1} (\alpha_1 \& \ldots \& \alpha_m) u_{2}$$,
 * or there exists a string $$w \in (V \cup \Sigma)^{*}$$ such that $$u\,=u_{1} (w \& \ldots \& w) u_{2}$$ and $$v\,=u_{1} w u_{2}$$.

For any string $$w \in \Sigma^{*}$$, we say $$G$$ generates $$w$$, written as $$S \ \stackrel{*}{\Rightarrow} \ w$$, if $$\exists k\geq 1\, \exists \, u_{1}, \ldots, u_{k}\in (V \cup \Sigma \cup \{ \text{“(”}, \text{“}\&\text{”}, \text{“)”} \})^{*}$$ such that $$S = u_{1} \Rightarrow u_{2} \Rightarrow \cdots \Rightarrow u_{k} = w$$.

The language of a grammar $$G = (V, \Sigma, R, S)$$ is the set of all strings it generates.

Example
The grammar $$G = (\{S, A, B, C, D\}, \{a, b, c\}, R, S)$$, with productions
 * $$S\rightarrow AB \& DC$$,
 * $$A\rightarrow aA\ |\ \epsilon$$,
 * $$B\rightarrow bBc\ |\ \epsilon$$,
 * $$C\rightarrow cC\ |\ \epsilon$$,
 * $$D\rightarrow aDb\ |\ \epsilon$$,

is conjunctive. A typical derivation is
 * $$S \Rightarrow (AB \& DC) \Rightarrow (aAB \& DC) \Rightarrow (aB \& DC) \Rightarrow (abBc \& DC) \Rightarrow (abc \& DC) \Rightarrow (abc \& aDbC) \Rightarrow (abc \& abC) \Rightarrow (abc \& abcC) \Rightarrow (abc \& abc) \Rightarrow abc$$

It can be shown that $$L(G) = \{a^nb^nc^n : n \ge 0\}$$: the conjunct $$AB$$ derives exactly the strings of the form $$a^ib^nc^n$$, the conjunct $$DC$$ derives exactly the strings of the form $$a^nb^nc^j$$, and the intersection of these two languages is $$\{a^nb^nc^n\}$$. This language is not context-free, which can be proved using the pumping lemma for context-free languages.
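Membership in the language of a conjunctive grammar can also be decided without building derivations, by computing as a least fixpoint which substrings of the input each nonterminal derives, in line with the language-equation semantics. A naive sketch in Python, applied to the example grammar; the tuple-based rule encoding is illustrative, not a standard format:

```python
from itertools import product

EPS = ()  # the empty string

# The example grammar: S -> AB & DC, A -> aA | ε, B -> bBc | ε,
# C -> cC | ε, D -> aDb | ε.
RULES = {
    "S": [[("A", "B"), ("D", "C")]],
    "A": [[("a", "A")], [EPS]],
    "B": [[("b", "B", "c")], [EPS]],
    "C": [[("c", "C")], [EPS]],
    "D": [[("a", "D", "b")], [EPS]],
}

def recognize(rules, start, w):
    """Decide whether w is generated, by iterating to the least fixpoint
    of the table: derives[A] holds the spans (i, j) with A =>* w[i:j]."""
    n = len(w)
    derives = {A: set() for A in rules}

    def matches(alpha, i, j):
        # Can the concatenation alpha derive w[i:j] under the current table?
        if not alpha:
            return i == j
        head, rest = alpha[0], alpha[1:]
        if head not in rules:  # a terminal symbol consumes one character
            return i < j and w[i] == head and matches(rest, i + 1, j)
        return any((i, k) in derives[head] and matches(rest, k, j)
                   for k in range(i, j + 1))

    changed = True
    while changed:  # monotone iteration, so it terminates at the fixpoint
        changed = False
        for A, alternatives in rules.items():
            for conjuncts in alternatives:
                for i, j in product(range(n + 1), repeat=2):
                    if i <= j and (i, j) not in derives[A] and \
                            all(matches(c, i, j) for c in conjuncts):
                        derives[A].add((i, j))
                        changed = True
    return (0, n) in derives[start]

print(recognize(RULES, "S", "aabbcc"))  # prints: True
print(recognize(RULES, "S", "aabbc"))   # prints: False
```

A string is accepted only when every conjunct of some rule for $$S$$ derives it, which is exactly the intersection of the conditions $$AB$$ and $$DC$$.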

Parsing algorithms
Though the expressive power of conjunctive grammars is greater than that of context-free grammars, conjunctive grammars retain some practical properties of the latter. Most importantly, the main context-free parsing algorithms have been generalized to conjunctive grammars, including linear-time recursive descent, cubic-time generalized LR, the cubic-time Cocke-Kasami-Younger algorithm, and Valiant's algorithm, which runs as fast as matrix multiplication.

Theoretical properties
Any property that is undecidable for context-free languages, or for finite intersections of them, is also undecidable for conjunctive grammars; such properties include emptiness, finiteness, regularity, context-freeness, inclusion, and equivalence.

The family of conjunctive languages is closed under union, intersection, concatenation and Kleene star, but not under string homomorphism, prefix, suffix, and substring. Closure under complement and under ε-free string homomorphism are still open problems (as of 2001).

The expressive power of conjunctive grammars over a one-letter alphabet has been researched; notably, conjunctive grammars can generate non-regular unary languages, in contrast to context-free grammars, which generate only regular languages over a one-letter alphabet. This line of work provided a basis for the study of language equations of a more general form.

Synchronized alternating pushdown automata
Aizikowitz and Kaminski introduced a new class of pushdown automata (PDA) called synchronized alternating pushdown automata (SAPDA). They proved it to be equivalent to conjunctive grammars in the same way as nondeterministic PDAs are equivalent to context-free grammars.