Sections 1.1, 1.2, 1.3 of Theory of Computing, A Gentle Introduction.

Homework
Print out the homework and answer the questions on that paper.

Choose section leader, pass out course policy and departmental honor policy.
Note: Class lecture notes and homework will, generally, be available after class, not before!

What is "theory of computing"?
• Basically, the theory of computing is about using mathematics to model and investigate machines that compute, and the languages that their computation is concerned with.
• It's pretty clear why we want to investigate machines that compute ... I mean it's clear that this is fundamental to computer science. But why languages? We're not talking about programming languages here; we're talking about the language in which the input and output of the machine are written. In that sense, you see that languages and machines that compute are inextricably tied together.
• This course has theory in the title, and as you might expect, it's kind of theoretical. However, most of what we talk about in this class is theory that has lots of practical applications. I'll do my best to bring this out as much as I can - including a few hands-on labs where we work with some tools that apply the theory we discuss in class very directly to practical problems. However, this course also has some theory of another kind — theory with mind-expanding philosophical implications — and I'll also do my best to highlight those when we get there.

Languages, alphabets, strings
• alphabets - Definition: An alphabet is a finite set. The elements of the set are called the symbols or characters of the alphabet, although they need not resemble any of the objects we traditionally think of as symbols or characters. For whatever reason, the usual practice in theory of computing is to use the variable Σ (that's an uppercase "sigma") to represent an alphabet.
• strings - Definition: A string is a finite sequence of symbols from some alphabet. If Σ is an alphabet and w is a string whose characters come from Σ, we say that "w is a string over Σ".
• languages - Definition: A language is a set (not necessarily finite!) of strings. Normally we fix an alphabet Σ and consider a set L of strings over Σ. In this case, we say that "L is a language over Σ".
• notation: |w| = the length of string w. wR = the reverse of string w. λ = the empty string.
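The definitions and notation above can be sketched in a few lines of code. This is my own illustration, not from the notes; the names `SIGMA` and `is_string_over` are invented for the example:

```python
# Sketch of the definitions: an alphabet is a finite set, a string is
# a finite sequence of its symbols, and a language is a set of strings.

SIGMA = {"0", "1"}          # an alphabet (here called Sigma)

def is_string_over(w, sigma):
    """True if every character of w comes from the alphabet sigma."""
    return all(c in sigma for c in w)

w = "0110"
assert is_string_over(w, SIGMA)

# The notation from the notes:
length = len(w)             # |w|, the length of w (here 4)
reverse = w[::-1]           # w^R, the reverse of w ("0110" reversed is "0110")
empty = ""                  # lambda, the empty string; |lambda| = 0

# A (finite) language over SIGMA: all strings of length at most 2.
L = {"", "0", "1", "00", "01", "10", "11"}
print("01" in L)            # membership test: is this string in the language?
```

Note that a language need not be finite; `L` here is just a small example that a Python set can hold.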

Further importance of languages
Most fundamental computing problems can be phrased as the following kind of problem: Given a description of a language L, decide whether a given string w is in the language L. These problems are decision problems, which means they have yes or no answers. Often the decision problem version of a question is a bit easier than the original, but they're related.

For example, suppose your original problem is "Compute (3 + (2*x)) + x*4 given x". As a decision problem this might become "Given a string x:y, where x and y are strings of digits, decide if x:y is in the language of strings for which the numbers encoded by x and y satisfy y = (3 + (2*x)) + x*4".
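A decision procedure for this example language is easy to write down. This is a hedged sketch; the input format `"x:y"` follows the text, but the function name and the choice to reject malformed strings are my own:

```python
# Decide membership in the language of strings x:y (x, y strings of
# digits) for which the encoded numbers satisfy y = (3 + (2*x)) + x*4.

def in_language(s):
    """Return True iff s has the form x:y with y == (3 + (2*x)) + x*4."""
    parts = s.split(":")
    if len(parts) != 2 or not all(p.isdigit() for p in parts):
        return False                 # not even of the right form
    x, y = int(parts[0]), int(parts[1])
    return y == (3 + (2 * x)) + x * 4

print(in_language("5:33"))           # (3 + 10) + 20 = 33, so True
print(in_language("5:34"))           # False
print(in_language("hello"))          # malformed input, so False
```

Notice how the "compute the value" problem and the "decide membership" problem are related: anyone who can compute (3 + (2*x)) + x*4 can answer the decision question, which is why the decision version is no harder than the original.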

By restricting our attention to decision problems like this, our computational models can become much simpler, so at some points we'll just talk about decision problems. True, the decision problem is often easier than the original problem, but this is OK. What we'll show at the end of the semester is that there's a straightforward decision problem of the type described above that can't be solved by any computer or computer program. No matter how clever you are writing your program or building your machine, there will always be some input strings that foil your program. Either it will give wrong answers, or it will run forever, never giving any answer at all! This result about the limitations of computation is one of the most important results, from a philosophical perspective, in computer science and mathematics. And by the end of the semester you will not only understand the result, but how to prove it and, indeed, how it is possible to prove that something is impossible!

Christopher W Brown