Reading
Section 3.3 of Theory of Computing, A Gentle Introduction.
Homework
Then print out the homework and answer the questions on that paper.

Adding a stack to NDFAs
We've been talking about context-free languages, which include all of our regular languages, plus many more - like aⁿbⁿ. Next we're going to look at augmenting our finite automata with some memory, in the hope of constructing machines capable of recognizing languages like aⁿbⁿ. The simplest kind of memory available to us is the stack, precisely because access to it is so restricted. So we will augment our NDFAs with a stack, and call the resulting machine a pushdown automaton.

Now when we make decisions on what state to move to, we must consider what's on the stack as well as the next input character. Also, when we make a transition from one state to another, we may wish to push more data onto our stack. Thus, every transition must be labeled with three things:

  1. the input character to be read (or the empty string)
  2. the string to be popped off the top of the stack
  3. the string of symbols to be pushed onto the stack
    Note: The stack is a stack of characters. Because of the way stacks work, there's a question about what it means to push or pop a string. Basically the question amounts to whether we push/pop the strings in order front-to-back or back-to-front. For today's problems this will never matter, so I'm going to punt on it 'til next lesson. Then you'll see why a formal definition of "yields in one step" is so necessary!

So, if you consider the simple machine below, the transition labeled a,λ;x means read the character a, pop nothing off the stack, and push an x onto the stack.

Now, consider what this machine does on the input aabbb. In each configuration below, the next character to be read is marked with brackets, and the top of the stack is to the left of the | separator:

STATE   INPUT      STACK
 q0     [a]abbb        |S
 q0     a[a]bbb       x|S
 q0     aa[b]bb      xx|S
 q1     aa[b]bb      xx|S
 q1     aab[b]b       x|S
 q1     aabb[b]        |S   ==>  STUCK! Can't read the last character.
The machine gets stuck with a b left to read, because reading that b requires popping an x off the stack, and there are no x's left! Hopefully you can convince yourself that this machine will accept exactly {aᵐbⁿ | m ≥ n}.
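The trace above can be reproduced mechanically. Here's a minimal sketch in Python (the encoding and names are my own, not from the text): each transition is a 5-tuple, a configuration is (state, remaining input, stack), and the left end of the stack string is its top, written here without the | separator.

```python
def step(config, transition):
    """Apply one transition to a configuration, or return None if it
    doesn't apply.  Configurations are (state, remaining input, stack);
    the left end of the stack string is its top, and "" plays the role of λ."""
    state, rest, stack = config
    p, a, pop, q, push = transition
    if p != state or not rest.startswith(a) or not stack.startswith(pop):
        return None
    return (q, rest[len(a):], push + stack[len(pop):])

# The machine from the trace (S is the bottom-of-stack marker, which
# these transitions never touch):
t1 = ("q0", "a", "", "q0", "x")   # a,λ;x — push an x for each a read
t2 = ("q0", "",  "", "q1", "")    # λ,λ;λ — guess that the a's are done
t3 = ("q1", "b", "x", "q1", "")   # b,x;λ — cancel each b against an x

c = ("q0", "aabbb", "S")
for t in (t1, t1, t2, t3, t3):
    c = step(c, t)
print(c)            # ('q1', 'b', 'S') — a b left to read, no x to pop
print(step(c, t3))  # None: the machine is stuck, exactly as in the trace
```

Note that t2 is a λ-transition: it changes state without consuming input or touching the stack, which is why the same input position appears twice in the trace.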

aⁿbⁿ
There's a trick we can use to require that the stack is empty, which we talked about in class. Use it on the machine above and you get this machine for aⁿbⁿ.

wcwᴿ, where w ∈ {a,b}*
Build a machine for {wcwᴿ | w ∈ {a,b}*}.

Palindromes
Build a machine for the palindromes over {a,b}.

aᵐbⁿ, where 2m = n
Build a machine for {aᵐbⁿ | 2m = n}.

aᵐbⁿ, where m = 2n
Build a machine for {aᵐbⁿ | m = 2n}.

A formal definition of a PDA
These NDFA+stack machines are called Push Down Automata, or PDAs. What we're going to want to prove is that the languages generated by CFGs are exactly the languages accepted by PDAs, so that PDAs and CFGs are equivalent in exactly the same way as NDFAs and regular expressions. Of course, to prove something like this we need a formal definition of what a PDA really is: something more precise than the pictures we've been drawing.
Definition A Push Down Automaton is a 6-tuple (Q,Σ,Γ,Δ,s,W) where
  • Q is the set of states (must be finite!)
  • Σ is the alphabet for input strings (must be finite!)
  • Γ is the set of symbols that can appear on the stack (must be finite!)
  • Δ is the set of transitions, Δ ⊆ Q × (Σ ∪ {λ}) × Γ* × Q × Γ*
  • s is the start state, s ∈ Q
  • W is the set of accepting states, W ⊆ Q
The transition (p,a,x,q,y) is interpreted as: starting at state p, read a from the input, pop x off the stack, move to state q, and push y onto the stack. Note that we can push and pop strings, not just individual characters. We will define precisely how to interpret this next lecture. For the moment, let this suffice: the stack is treated as a string, pushing is equivalent to prefixing, and what can be popped off the stack are prefixes.
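As a sanity check on this definition, here's a sketch of acceptance as a breadth-first search over configurations. The names are mine, and I'm assuming the stack is a string with its top at the left, the machine starts with just a bottom marker S on the stack, and acceptance means all input is read while the machine is in a state of W. Treat this as one plausible reading; the official definition of acceptance comes next lecture.

```python
from collections import deque

def accepts(delta, start, W, w, stack="S"):
    """Does the PDA accept w?  delta is a set of 5-tuples (p, a, x, q, y):
    from state p, read a (possibly ""), pop the prefix x off the stack,
    move to state q, and push y.  Caution: λ-transitions that only push
    can make this search run forever; the machines in these notes have none."""
    agenda = deque([(start, w, stack)])
    seen = set()
    while agenda:
        config = agenda.popleft()
        if config in seen:
            continue
        seen.add(config)
        state, rest, st = config
        if state in W and rest == "":
            return True           # some computation path accepts
        for (p, a, x, q, y) in delta:
            if p == state and rest.startswith(a) and st.startswith(x):
                agenda.append((q, rest[len(a):], y + st[len(x):]))
    return False                  # every path got stuck or ended elsewhere

# The {a^m b^n | m >= n} machine from the trace above:
m_ge_n = {
    ("q0", "a", "", "q0", "x"),   # a,λ;x
    ("q0", "",  "", "q1", ""),    # λ,λ;λ
    ("q1", "b", "x", "q1", ""),   # b,x;λ
}
print(accepts(m_ge_n, "q0", {"q1"}, "aabb"))   # True: m = n is allowed
print(accepts(m_ge_n, "q0", {"q1"}, "aabbb"))  # False: stuck, as in the trace
```

Because the machine is nondeterministic, a single stuck path proves nothing; the search has to try every choice of when to take the λ-transition before rejecting.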


Christopher W Brown
Last modified: Fri Oct 30 09:16:50 EDT 2009