{{about|the machine|the test of artificial intelligence|Turing test|the instrumental rock band|Turing Machine (band)}}
{{turing}}
[[Image:Maquina.png|thumb|An artistic representation of a Turing machine (Rules table not represented)]]
 
A '''Turing machine''' is a hypothetical device that manipulates symbols on a strip of tape according to a table of rules. Despite its simplicity, a Turing machine can be adapted to simulate the logic of any [[computer]] [[algorithm]], and is particularly useful in explaining the functions of a [[CPU]] inside a computer.
 
The "Turing" machine was invented in 1936 by [[Alan Turing]]<ref>
The idea came to him in mid-1935 (perhaps, see more in the History section) after a question posed by [[M. H. A. Newman]] in his lectures: "Was there a definite method, or as Newman put it, a ''mechanical process'' which could be applied to a mathematical statement, and which would come up with the answer as to whether it was provable" (Hodges 1983:93). Turing submitted his paper on 31 May 1936 to the London Mathematical Society for its ''Proceedings'' (cf Hodges 1983:112), but it was ''published'' in early 1937 and offprints were available in February 1937 (cf Hodges 1983:129).</ref>
who called it an "a-machine" (automatic machine). The Turing machine is not intended as practical computing technology, but rather as a hypothetical device representing a computing machine. Turing machines help computer scientists understand the limits of mechanical computation.
 
Turing gave a succinct definition of the experiment in his 1948 essay, "Intelligent Machinery". Referring to his 1936 publication, Turing wrote that the Turing machine, here called a Logical Computing Machine, consisted of:
 
<blockquote>
...an unlimited memory capacity obtained in the form of an infinite tape marked out into squares, on each of which a symbol could be printed. At any moment there is one symbol in the machine; it is called the scanned symbol. The machine can alter the scanned symbol and its behavior is in part determined by that symbol, but the symbols on the tape elsewhere do not affect the behavior of the machine. However, the tape can be moved back and forth through the machine, this being one of the elementary operations of the machine. Any symbol on the tape may therefore eventually have an innings.<ref>See the definition of "[[wikt:innings|innings]]" on [[Wiktionary]]</ref> (Turing 1948, p. 61)
</blockquote>
 
A Turing machine that is able to simulate any other Turing machine is called a [[universal Turing machine]] ('''UTM''', or simply a '''universal machine'''). A more mathematically-oriented definition with a similar "universal" nature was introduced by [[Alonzo Church]], whose work on [[lambda calculus]] intertwined with Turing's in a formal theory of [[computation]] known as the [[Church–Turing thesis]]. The thesis states that Turing machines indeed capture the informal notion of effective method in [[logic]] and [[mathematics]], and provide a precise definition of an [[algorithm]] or "mechanical procedure". Studying their  [[abstract machine|abstract properties]] yields many insights into [[computer science]] and [[computational complexity theory|complexity theory]].
 
==Informal description==
{{for|visualizations of Turing machines|Turing machine gallery}}
 
The Turing machine mathematically models a machine that mechanically operates on a tape.  On this tape are symbols, which the machine can read and write, one at a time, using a tape head. Operation is fully determined by a finite set of elementary instructions such as "in state 42, if the symbol seen is 0, write a 1; if the symbol seen is 1, change into state 17; in state 17, if the symbol seen is 0, write a 1 and change to state 6;" etc. In the original article ("On computable numbers, with an application to the [[Entscheidungsproblem]]", see also [[#The Entscheidungsproblem (the "decision problem"): Hilbert's tenth question of 1900|references below]]), Turing imagines not a mechanism, but a person whom he calls the "computer", who executes these deterministic mechanical rules slavishly (or as Turing puts it, "in a desultory manner").
 
[[Image:Turing machine 2a.svg|thumb|right|300px|The head is always over a particular square of the tape; only a finite stretch of squares is shown. The instruction to be performed (q<sub>4</sub>) is shown over the scanned square. (Drawing after Kleene (1952) p.375.)]]
 
[[Image:Turing machine 2b.svg|thumb|right|300px|Here, the internal state (q<sub>1</sub>) is shown inside the head, and the illustration describes the tape as being infinite and pre-filled with "0", the symbol serving as blank. The system's full state (its ''complete configuration'') consists of the internal state, any non-blank symbols on the tape (in this illustration "11B"), and the position of the head relative to those symbols including blanks, i.e. "011B". (Drawing after Minsky (1967) p. 121).]]
 
More precisely, a Turing machine consists of:
{{ordered list
|1 = A '''tape''' divided into cells, one next to the other. Each cell contains a symbol from some finite alphabet. The alphabet contains a special ''blank'' symbol (here written as '0') and one or more other symbols. The tape is assumed to be arbitrarily extendable to the left and to the right, i.e., the Turing machine is always supplied with as much tape as it needs for its computation. Cells that have not been written before are assumed to be filled with the blank symbol. In some models the tape has a left end marked with a special symbol; the tape extends or is indefinitely extensible to the right.
|2 = A '''head''' that can read and write symbols on the tape and move the tape left and right one (and only one) cell at a time. In some models the head moves and the tape is stationary.
|3 = A '''state register''' that stores the state of the Turing machine, one of finitely many. Among these is the special ''start state'' with which the state register is initialized. These states, writes Turing, replace the "state of mind" a person performing computations would ordinarily be in.
|4 = A finite '''table''' (occasionally called an '''action table''' or '''transition function''') of instructions (usually quintuples [5-tuples]: q<sub>i</sub>a<sub>j</sub>→q<sub>i1</sub>a<sub>j1</sub>d<sub>k</sub>, but sometimes quadruples [4-tuples]) that, given the ''state'' (q<sub>i</sub>) the machine is currently in ''and'' the ''symbol'' (a<sub>j</sub>) it is reading on the tape (the symbol currently under the head), tells the machine to do the following in sequence (for the 5-tuple models):
* Either erase or write a symbol (replacing a<sub>j</sub> with a<sub>j1</sub>), ''and then''
* Move the head (which is described by d<sub>k</sub> and can have values: 'L' for one step left ''or'' 'R' for one step right ''or'' 'N' for staying in the same place), ''and then''
* Assume the same or a ''new state'' as prescribed (go to state q<sub>i1</sub>).
In the 4-tuple models, erasing or writing a symbol (a<sub>j1</sub>) and moving the head left or right (d<sub>k</sub>) are specified as separate instructions. Specifically, the table tells the machine to (ia) erase or write a symbol ''or'' (ib) move the head left or right, ''and then'' (ii) assume the same or a new state as prescribed, but not both actions (ia) and (ib) in the same instruction. In some models, if there is no entry in the table for the current combination of symbol and state then the machine will halt; other models require all entries to be filled.
}}
 
Note that every part of the machine (i.e. its state and symbol-collections) and its actions (such as printing, erasing and tape motion) is ''finite'', ''discrete'' and ''distinguishable''; it is the potentially unlimited amount of tape that gives it an unbounded amount of [[Computer storage|storage space]].
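
An instruction table of this kind is nothing more than a finite lookup keyed on the pair (current state, scanned symbol). Below is a minimal sketch of the fragment quoted above ("in state 42 ...") written out as such a table; the 'N' ("stay") head moves and the states retained after writing are placeholder assumptions, since the fragment leaves them unspecified:

<syntaxhighlight lang="python">
# Hedged sketch: the finite table of rules as a dictionary keyed on
# (current state, scanned symbol). Each entry gives (symbol to write,
# head move, next state). The entries merely transcribe the illustrative
# fragment in the text above; the 'N' moves and retained states are
# placeholders, not part of that fragment.
rules = {
    (42, '0'): ('1', 'N', 42),   # in state 42, seeing 0: write a 1
    (42, '1'): ('1', 'N', 17),   # in state 42, seeing 1: change into state 17
    (17, '0'): ('1', 'N', 6),    # in state 17, seeing 0: write a 1, go to state 6
}
</syntaxhighlight>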
 
== Formal definition ==
 
Hopcroft and Ullman (1979, p.&nbsp;148) formally defined a (one-tape) Turing machine as a 7-[[tuple]] <math>M= \langle Q, \Gamma, b, \Sigma, \delta, q_0, F \rangle</math> where
* <math>Q</math> is a finite, non-empty set of ''states''
* <math>\Gamma</math> is a finite, non-empty set of the ''tape alphabet/symbols''
* <math>b \in \Gamma</math> is the ''blank symbol'' (the only symbol allowed to occur on the tape infinitely often at any step during the computation)
* <math>\Sigma\subseteq\Gamma\setminus\{b\}</math> is the set of ''input symbols''
* <math>q_0 \in Q</math> is the ''initial state''
* <math>F \subseteq Q</math> is the set of ''final'' or ''accepting states''.
* <math>\delta: (Q \setminus F) \times \Gamma \rightarrow Q \times \Gamma \times \{L,R\}</math> is a [[partial function]] called the ''[[transition function]]'', where L is left shift, R is right shift. (A relatively uncommon variant allows "no shift", say N, as a third element of the latter set.)
 
Anything that operates according to these specifications is a Turing machine.
 
The 7-tuple for the 3-state [[busy beaver]] looks like this (see more about this busy beaver at [[Turing machine examples]]):
* <math>Q = \{ \mbox{A}, \mbox{B}, \mbox{C}, \mbox{HALT} \}</math>
* <math>\Gamma = \{ 0, 1 \}</math>
* <math>b = 0</math>  ("blank")
* <math>\Sigma = \{ 1 \}</math>
* <math>q_0 = \mbox{A}</math> (the initial state)
* <math>F = \{ \mbox{HALT} \}</math>
* <math>\delta = </math> see state-table below
 
Initially all tape cells are marked with 0.
 
{| class="wikitable" style="text-align:center"
|+ State table for 3 state, 2 symbol busy beaver
! rowspan="2" | Tape symbol
! colspan="3" | Current state A
! colspan="3" | Current state B
! colspan="3" | Current state C
|- style="font-size:9pt"
| Write symbol
| Move tape
| Next state
| Write symbol
| Move tape
| Next state
| Write symbol
| Move tape
| Next state
|-
| 0
| 1
| R
| '''B'''
| 1
| L
| '''A'''
| 1
| L
| '''B'''
|-
| 1
| 1
| L
| '''C'''
| 1
| R
| '''B'''
| 1
| R
| '''HALT'''
|}
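
The formal 7-tuple and the state table above contain everything needed to run the machine step by step. The following sketch (in Python; the dictionary-based tape and the helper <code>run</code> are illustrative implementation choices, not part of the definition) simulates the 3-state, 2-symbol busy beaver from a blank tape until it enters HALT:

<syntaxhighlight lang="python">
# Minimal sketch of a one-tape Turing machine following the 7-tuple
# definition above, run on the 3-state, 2-symbol busy beaver. The tape is a
# dictionary from cell index to symbol; unwritten cells read as the blank.

blank = '0'

delta = {   # (state, scanned symbol) -> (symbol to write, head move, next state)
    ('A', '0'): ('1', 'R', 'B'),
    ('A', '1'): ('1', 'L', 'C'),
    ('B', '0'): ('1', 'L', 'A'),
    ('B', '1'): ('1', 'R', 'B'),
    ('C', '0'): ('1', 'L', 'B'),
    ('C', '1'): ('1', 'R', 'HALT'),
}

def run(delta, start='A', final=('HALT',)):
    tape, head, state, steps = {}, 0, start, 0
    while state not in final:
        write, move, state = delta[(state, tape.get(head, blank))]
        tape[head] = write
        head += {'L': -1, 'R': 1, 'N': 0}[move]
        steps += 1
    return tape, steps

tape, steps = run(delta)
print(steps, ''.join(tape[i] for i in sorted(tape)))   # prints: 13 111111
</syntaxhighlight>

Running the sketch reproduces the behaviour tabulated above: the machine halts with six consecutive 1s on the tape.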
 
== Additional details required to visualize or implement Turing machines ==
 
In the words of van Emde Boas (1990), p.&nbsp;6: "The set-theoretical object [his formal seven-tuple description similar to the above] provides only partial information on how the machine will behave and what its computations will look like."
 
For instance,
* There will need to be many decisions on what the symbols actually look like, and a failproof way of reading and writing symbols indefinitely.
* The shift left and shift right operations may shift the tape head across the tape, but when actually building a Turing machine it is more practical to make the tape slide back and forth under the head instead.
* The tape can be finite, and automatically extended with blanks as needed (which is closest to the mathematical definition), but it is more common to think of it as stretching infinitely at both ends and being pre-filled with blanks except on the explicitly given finite fragment the tape head is on.  (This is, of course, not implementable in practice.)  The tape ''cannot'' be fixed in length, since that would not correspond to the given definition and would seriously limit the range of computations the machine can perform to those of a [[linear bounded automaton]].
 
===Alternative definitions===
Definitions in literature sometimes differ slightly, to make arguments or proofs easier or clearer, but this is always done in such a way that the resulting machine has the same computational power. For example, changing the set <math>\{L,R\}</math> to <math>\{L,R,N\}</math>, where ''N'' ("None" or "No-operation") would allow the machine to stay on the same tape cell instead of moving left or right, does not increase the machine's computational power.
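
To see concretely why adding ''N'' changes nothing, note that any instruction that stays put can be replaced by two ordinary instructions: step right into a fresh helper state, then step back left (leaving whatever symbol is found there untouched) and continue. A hedged sketch of this table rewriting, in which the helper-state naming is an illustrative assumption:

<syntaxhighlight lang="python">
# Sketch: removing 'N' ("stay") moves from a transition table delta that maps
# (state, symbol) -> (symbol to write, move, next state). Each stay move is
# replaced by a step right into a fresh helper state that immediately steps
# back left without altering the symbol it finds there.

def eliminate_stay(delta, alphabet):
    rewritten = {}
    for (q, s), (write, move, q_next) in delta.items():
        if move != 'N':
            rewritten[(q, s)] = (write, move, q_next)
            continue
        helper = (q, s, 'return')                      # assumed-fresh helper state
        rewritten[(q, s)] = (write, 'R', helper)       # write, then step right...
        for a in alphabet:
            rewritten[(helper, a)] = (a, 'L', q_next)  # ...and step straight back
    return rewritten

example = {('A', '0'): ('1', 'N', 'B')}
print(eliminate_stay(example, alphabet=('0', '1')))
</syntaxhighlight>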
 
The most common convention represents each "Turing instruction" in a "Turing table" by one of nine 5-tuples, per the convention of Turing/Davis (Turing (1936) in ''Undecidable'', p.&nbsp;126-127 and Davis (2000) p.&nbsp;152):
 
: (definition 1): '''(q<sub>i</sub>, S<sub>j</sub>, S<sub>k</sub>/E/N, L/R/N, q<sub>m</sub>)'''
:: '''(''' current state '''q<sub>i</sub>''' ''',''' symbol scanned '''S<sub>j</sub>''' ''',''' print symbol '''S<sub>k</sub>'''/erase '''E'''/none '''N''' ''',''' move_tape_one_square left '''L'''/right '''R'''/none '''N''' ''',''' new state '''q<sub>m</sub>''' ''')'''
 
Other authors (Minsky (1967) p.&nbsp;119, Hopcroft and Ullman (1979) p.&nbsp;158, Stone (1972) p.&nbsp;9) adopt a different convention, with new state '''q<sub>m</sub>''' listed immediately after the scanned symbol S<sub>j</sub>:
: (definition 2): '''(q<sub>i</sub>, S<sub>j</sub>, q<sub>m</sub>, S<sub>k</sub>/E/N, L/R/N)'''
:: '''(''' current state '''q<sub>i</sub>''' ''',''' symbol scanned '''S<sub>j</sub>''' ''',''' new state '''q<sub>m</sub>''' ''',''' print symbol '''S<sub>k</sub>'''/erase '''E'''/none '''N''' ''',''' move_tape_one_square left '''L'''/right '''R'''/none '''N''' ''')'''
 
For the remainder of this article "definition 1" (the Turing/Davis convention) will be used.
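
Because the two conventions carry exactly the same five fields and differ only in where the new state q<sub>m</sub> sits, a table written in one convention can be re-ordered mechanically into the other. A trivial sketch of the re-ordering:

<syntaxhighlight lang="python">
# Hedged sketch: re-ordering a single instruction between the two conventions.
# Definition 1 (Turing/Davis): (q_i, S_j, S_k, move, q_m)
# Definition 2 (Minsky, Hopcroft–Ullman, Stone): (q_i, S_j, q_m, S_k, move)

def def1_to_def2(t):
    qi, sj, sk, move, qm = t
    return (qi, sj, qm, sk, move)

def def2_to_def1(t):
    qi, sj, qm, sk, move = t
    return (qi, sj, sk, move, qm)

assert def2_to_def1(def1_to_def2(('A', '0', '1', 'R', 'B'))) == ('A', '0', '1', 'R', 'B')
</syntaxhighlight>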
 
{| class="wikitable" style="text-align: center"
|+ Example: state table for the 3-state 2-symbol busy beaver reduced to 5-tuples
! Current state
! Scanned symbol
!
! Print symbol
! Move tape
! Final (i.e. next) state
! 5-tuples
|-
| '''A'''
| 0
|
| 1
| R
| '''B'''
| ('''A''', 0, 1, R, '''B''')
|-
| '''A'''
| 1
|
| 1
| L
| '''C'''
| ('''A''', 1, 1, L, '''C''')
|-
| '''B'''
| 0
|
| 1
| L
| '''A'''
| ('''B''', 0, 1, L, '''A''')
|-
| '''B'''
| 1
|
| 1
| R
| '''B'''
| ('''B''', 1, 1, R, '''B''')
|-
| '''C'''
| 0
|
| 1
| L
| '''B'''
| ('''C''', 0, 1, L, '''B''')
|-
| '''C'''
| 1
|
| 1
| N
| '''H'''
| ('''C''', 1, 1, N, '''H''')
|}
 
In the following table, Turing's original model allowed only the first three lines that he called N1, N2, N3 (cf Turing in ''Undecidable'', p.&nbsp;126). He allowed for erasure of the "scanned square" by naming a 0th symbol S<sub>0</sub> = "erase" or "blank", etc. However, he did not allow for non-printing, so every instruction-line includes "print symbol S<sub>k</sub>" or "erase" (cf footnote 12 in Post (1947), ''Undecidable'' p.&nbsp;300). The abbreviations are Turing's (''Undecidable'' p.&nbsp;119). Subsequent to Turing's original paper in 1936–1937, machine-models have allowed all nine possible types of five-tuples:
 
{| class="wikitable" style="text-align: center"
|-
!
! style="width:15%" | Current m-configuration (Turing state)
! Tape symbol
! Print-operation
! Tape-motion
! style="width:15%" | Final m-configuration (Turing state)
! 5-tuple
! 5-tuple comments
! 4-tuple
|-
| N1
| q<sub>i</sub>
| S<sub>j</sub>
| Print(S<sub>k</sub>)
| Left L
| q<sub>m</sub>
| (q<sub>i</sub>, S<sub>j</sub>, S<sub>k</sub>, L, q<sub>m</sub>)
| "blank" = S<sub>0</sub>, 1=S<sub>1</sub>, etc.
|
|-
| N2
| q<sub>i</sub>
| S<sub>j</sub>
| Print(S<sub>k</sub>)
| Right R
| q<sub>m</sub>
| (q<sub>i</sub>, S<sub>j</sub>, S<sub>k</sub>, R, q<sub>m</sub>)
| "blank" = S<sub>0</sub>, 1=S<sub>1</sub>, etc.
|
|-
| N3
| q<sub>i</sub>
| S<sub>j</sub>
| Print(S<sub>k</sub>)
| None N
| q<sub>m</sub>
| (q<sub>i</sub>, S<sub>j</sub>, S<sub>k</sub>, N, q<sub>m</sub>)
| "blank" = S<sub>0</sub>, 1=S<sub>1</sub>, etc.
| (q<sub>i</sub>, S<sub>j</sub>, S<sub>k</sub>, q<sub>m</sub>)
|-
| 4
| q<sub>i</sub>
| S<sub>j</sub>
| None N
| Left L
| q<sub>m</sub>
| (q<sub>i</sub>, S<sub>j</sub>, N, L, q<sub>m</sub>)
|
| (q<sub>i</sub>, S<sub>j</sub>, L, q<sub>m</sub>)
|-
| 5
| q<sub>i</sub>
| S<sub>j</sub>
| None N
| Right R
| q<sub>m</sub>
| (q<sub>i</sub>, S<sub>j</sub>, N, R, q<sub>m</sub>)
|
| (q<sub>i</sub>, S<sub>j</sub>, R, q<sub>m</sub>)
|-
| 6
| q<sub>i</sub>
| S<sub>j</sub>
| None N
| None N
| q<sub>m</sub>
| (q<sub>i</sub>, S<sub>j</sub>, N, N, q<sub>m</sub>)
| Direct "jump"
| (q<sub>i</sub>, S<sub>j</sub>, N, q<sub>m</sub>)
|-
| 7
| q<sub>i</sub>
| S<sub>j</sub>
| Erase
| Left L
| q<sub>m</sub>
| (q<sub>i</sub>, S<sub>j</sub>, E, L, q<sub>m</sub>)
|
|
|-
| 8
| q<sub>i</sub>
| S<sub>j</sub>
| Erase
| Right R
| q<sub>m</sub>
| (q<sub>i</sub>, S<sub>j</sub>, E, R, q<sub>m</sub>)
|
|
|-
| 9
| q<sub>i</sub>
| S<sub>j</sub>
| Erase
| None N
| q<sub>m</sub>
| (q<sub>i</sub>, S<sub>j</sub>, E, N, q<sub>m</sub>)
|
| (q<sub>i</sub>, S<sub>j</sub>, E, q<sub>m</sub>)
|}
 
Any Turing table (list of instructions) can be constructed from the above nine 5-tuples. For technical reasons, the three non-printing or "N" instructions (4, 5, 6) can usually be dispensed with. For examples see [[Turing machine examples]].
 
Less frequently, the use of 4-tuples is encountered: these represent a further atomization of the Turing instructions (cf Post (1947), Boolos & Jeffrey (1974, 1999), Davis-Sigal-Weyuker (1994)); see also [[Post–Turing machine]].
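
The "further atomization" splits any 5-tuple that both prints and moves into a print-only 4-tuple followed by a move-only 4-tuple, linked by an extra intermediate state. A hedged sketch of the splitting (the intermediate-state naming scheme is an illustrative assumption; the 4-tuple field orders follow the table above):

<syntaxhighlight lang="python">
# Sketch: splitting a 5-tuple (q_i, S_j, S_k, move, q_m) into 4-tuples of the
# two kinds in the table above: print-type (q_i, S_j, S_k, q_m) and move-type
# (q_i, S_j, L/R/N, q_m). An invented intermediate state links the print step
# to the move step; its name here is purely illustrative.

def split_5tuple(qi, sj, sk, move, qm):
    if move == 'N':                      # print (or erase) only
        return [(qi, sj, sk, qm)]
    if sk == sj:                         # symbol unchanged: move only
        return [(qi, sj, move, qm)]
    mid = f'{qi}|{sj}'                   # assumed-fresh intermediate state
    return [(qi, sj, sk, mid),           # first print the new symbol...
            (mid, sk, move, qm)]         # ...then move, now scanning S_k

print(split_5tuple('A', '0', '1', 'R', 'B'))
# [('A', '0', '1', 'A|0'), ('A|0', '1', 'R', 'B')]
</syntaxhighlight>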
 
===The "state"===
The word "state" used in context of Turing machines can be a source of confusion, as it can mean two things. Most commentators after Turing have used "state" to mean the name/designator of the current instruction to be performed—i.e. the contents of the state register. But Turing (1936) made a strong distinction between a record of what he called the machine's "m-configuration", (its internal state) and the machine's (or person's) "state of progress" through the computation - the current state of the total system. What Turing called "the state formula" includes both the current instruction and ''all'' the symbols on the tape:
 
{{quote|Thus the state of progress of the computation at any stage is completely determined by the note of instructions and the symbols on the tape. That is, the '''state of the system''' may be described by a single expression (sequence of symbols) consisting of the symbols on the tape followed by Δ (which we suppose not to appear elsewhere) and then by the note of instructions. This expression is called the 'state formula'.|''Undecidable'', p.139–140, emphasis added}}
 
Earlier in his paper Turing carried this even further: he gives an example where he placed a symbol of the current "m-configuration"—the instruction's label—beneath the scanned square, together with all the symbols on the tape (''Undecidable'', p.&nbsp;121); this he calls "the ''complete configuration''" (''Undecidable'', p.&nbsp;118). To print the "complete configuration" on one line, he places the state-label/m-configuration to the ''left'' of the scanned symbol.
 
A variant of this is seen in Kleene (1952) where [[Stephen_Cole_Kleene|Kleene]] shows how to write the [[Gödel number]] of a machine's "situation": he places the "m-configuration" symbol q<sub>4</sub> over the scanned square in roughly the center of the 6 non-blank squares on the tape (see the Turing-tape figure in this article) and puts it to the ''right'' of the scanned square. But Kleene refers to "q<sub>4</sub>" itself as "the machine state" (Kleene, p.&nbsp;374-375). Hopcroft and Ullman call this composite the "instantaneous description" and follow the Turing convention of putting the "current state" (instruction-label, m-configuration) to the ''left'' of the scanned symbol (p.&nbsp;149).
 
'''Example: total state of 3-state 2-symbol busy beaver after 3 "moves"''' (taken from example "run" in the figure below):
:: 1'''A'''1
This means: after three moves the tape has ... 000110000 ... on it, the head is scanning the right-most 1, and the state is '''A'''. Blanks (in this case represented by "0"s) can be part of the total state as shown here: '''B'''01; the tape has a single 1 on it, but the head is scanning the 0 ("blank") to its left and the state is '''B'''.
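
Such "total state" strings can be generated mechanically: list the relevant stretch of tape and drop the state label immediately to the left of the scanned cell, following the Turing and Hopcroft–Ullman convention described above. A small sketch (the dictionary representation of the tape is an illustrative assumption):

<syntaxhighlight lang="python">
# Hedged sketch: rendering Turing's "complete configuration" (Hopcroft–Ullman's
# "instantaneous description") as one string, with the state label placed to
# the left of the scanned symbol.

blank = '0'

def complete_configuration(tape, head, state):
    lo = min(min(tape, default=head), head)
    hi = max(max(tape, default=head), head)
    out = []
    for i in range(lo, hi + 1):
        if i == head:
            out.append(state)            # state label goes left of the scanned cell
        out.append(tape.get(i, blank))
    return ''.join(out)

# The two total states described above: "1A1" (head on the right-most 1,
# state A) and "B01" (head on the blank left of a lone 1, state B).
print(complete_configuration({0: '1', 1: '1'}, head=1, state='A'))   # -> 1A1
print(complete_configuration({1: '1'}, head=0, state='B'))           # -> B01
</syntaxhighlight>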
 
"State" in the context of Turing machines should be clarified as to which is being described: (''i'') the current instruction, or (''ii'') the list of symbols on the tape together with the current instruction, or (''iii'') the list of symbols on the tape together with the current instruction placed to the left of the scanned symbol or to the right of the scanned symbol.
 
Turing's biographer Andrew Hodges (1983: 107) has noted and discussed this confusion.
 
===Turing machine "state" diagrams===
 
{|class="wikitable"
|+ The table for the 3-state busy beaver ("P" = print/write a "1")
|-
! Tape symbol
! colspan="3" | Current state A
! colspan="3" | Current state B
! colspan="3" | Current state C
|-
|
| Write symbol
| Move tape
| Next state
| Write symbol
| Move tape
| Next state
| Write symbol
| Move tape
| Next state
|-
| '''0'''
| P
| R
| '''B'''
| P
| L
| '''A'''
| P
| L
| '''B'''
|-
| '''1'''
| P
| L
| '''C'''
| P
| R
| '''B'''
| P
| R
| '''HALT'''
|}
 
[[Image:State diagram 3 state busy beaver 2B.svg|thumb|500px|right|The "3-state busy beaver" Turing machine in a [[Finite State Machine|finite state representation]]. Each circle represents a "state" of the TABLE—an "m-configuration" or "instruction". "Direction" of a state ''transition'' is shown by an arrow. The label (e.g. '''0/P,R''') near the outgoing state (at the "tail" of the arrow) specifies the scanned symbol that causes a particular transition (e.g. '''0''') followed by a slash '''/''', followed by the subsequent "behaviors" of the machine, e.g. "'''P''' '''P'''rint" then move tape "'''R''' '''R'''ight". No generally accepted format exists. The convention shown is after McClusky (1965), Booth (1967), and Hill and Peterson (1974).]]
 
To the right: the above TABLE expressed as a "state transition" diagram.
 
Usually large TABLES are better left as tables (Booth, p.&nbsp;74). They are more readily simulated by computer in tabular form (Booth, p.&nbsp;74). However, certain concepts—e.g. machines with "reset" states and machines with repeating patterns (cf Hill and Peterson p.&nbsp;244ff)—can be more readily seen when viewed as a drawing.
 
Whether a drawing represents an improvement on its TABLE must be decided by the reader for the particular context. See [[Finite state machine]] for more.
 
[[Image:State diagram 3 state busy beaver 4 .JPG|thumbnail|500px|right|The evolution of the busy-beaver's computation starts at the top and proceeds to the bottom.]]
 
The reader should again be cautioned that such diagrams represent a snapshot of their TABLE frozen in time, ''not'' the course ("trajectory") of a computation ''through'' time and/or space. While every time the busy beaver machine "runs" it will always follow the same state-trajectory, this is not true for the "copy" machine that can be provided with variable input "parameters".
 
The diagram "Progress of the computation" shows the 3-state busy beaver's "state" (instruction) progress through its computation from start to finish. On the far right is the Turing "complete configuration" (Kleene "situation"<!-- Stuation? Not Situation? Is that the right word? Yes, it is, cf page 375 for example "The Godel number of the stiuation..." and then he shows a tape, etc. This usage runs throughout the chapter. wvbailey~~~~ --><!--So is it "stuation" or "stiuation"?-->, Hopcroft–Ullman "instantaneous description") at each step. If the machine were to be stopped and cleared to blank both the "state register" and entire tape, these "configurations" could be used to rekindle a computation anywhere in its progress (cf Turing (1936) ''Undecidable'' pp.&nbsp;139–140).
 
==Models equivalent to the Turing machine model==
{{See also|Turing machine equivalents|Register machine|Post–Turing machine}}
 
Many machines that might be thought to have more computational capability than a simple universal Turing machine can be shown to have no more power (Hopcroft and Ullman p.&nbsp;159, cf Minsky (1967)). They might compute faster, perhaps, or use less memory, or their instruction set might be smaller, but they cannot compute more powerfully (i.e. more mathematical functions). (Recall that the [[Church–Turing thesis]] ''hypothesizes'' this to be true for any kind of machine: that anything that can be "computed" can be computed by some Turing machine.)
 
A Turing machine is equivalent to a [[pushdown automaton]] that has been made more flexible and concise by relaxing the [[LIFO (computing)|last-in-first-out]] requirement of its stack.
 
At the other extreme, some very simple models turn out to be [[Turing completeness|Turing-equivalent]], i.e. to have the same computational power as the Turing machine model.
 
Common equivalent models are the [[multi-tape Turing machine]], [[multi-track Turing machine]], machines with input and output, and the [[Non-deterministic Turing machine|''non-deterministic'' Turing machine]] (NDTM) as opposed to the ''deterministic'' Turing machine (DTM) for which the action table has at most one entry for each combination of symbol and state.
 
[[Read-only right moving Turing machines|Read-only, right-moving Turing machines]] are equivalent to [[Nondeterministic finite automaton|NDFAs]] (as well as [[Deterministic finite automaton|DFAs]] by conversion using the [[NDFA to DFA conversion algorithm]]).
 
For practical and didactic purposes, the equivalent [[register machine]] can be used as an ordinary [[Assembly language|assembly]]-like [[programming language]].
 
==Choice c-machines, Oracle o-machines==
Early in his paper (1936) Turing makes a distinction between an "automatic machine"—its "motion ... completely determined by the configuration" and a "choice machine":
 
{{quote|...whose motion is only partially determined by the configuration ... When such a machine reaches one of these ambiguous configurations, it cannot go on until some arbitrary choice has been made by an external operator. This would be the case if we were using machines to deal with axiomatic systems.|''Undecidable'', p. 118}}
 
Turing (1936) does not elaborate further except in a footnote in which he describes how to use an a-machine to "find all the provable formulae of the [Hilbert] calculus" rather than use a choice machine. He "suppose[s] that the choices are always between two possibilities 0 and 1. Each proof will then be determined by a sequence of choices i<sub>1</sub>, i<sub>2</sub>, ..., i<sub>n</sub> (i<sub>1</sub> = 0 or 1, i<sub>2</sub> = 0 or 1, ..., i<sub>n</sub> = 0 or 1), and hence the number 2<sup>n</sup> + i<sub>1</sub>2<sup>n-1</sup> + i<sub>2</sub>2<sup>n-2</sup> + ... +i<sub>n</sub> completely determines the proof. The automatic machine carries out successively proof 1, proof 2, proof 3, ..." (Footnote ‡, ''Undecidable'', p.&nbsp;138)
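
The footnote's encoding simply reads the sequence of binary choices as a number with a leading 1 bit prepended, so every finite choice sequence gets a distinct index that the automatic machine can step through in order. A short sketch of the arithmetic:

<syntaxhighlight lang="python">
# Hedged sketch of the enumeration in Turing's footnote quoted above: choices
# i_1 ... i_n (each 0 or 1) are packed into the integer
# 2**n + i_1*2**(n-1) + ... + i_n, i.e. the binary numeral 1 i_1 i_2 ... i_n.

def proof_number(choices):
    n = len(choices)
    return 2**n + sum(i * 2**(n - 1 - k) for k, i in enumerate(choices))

print(proof_number([0]))         # 2   (n = 1, i_1 = 0)
print(proof_number([1]))         # 3
print(proof_number([0, 1, 1]))   # 11  (8 + 0*4 + 1*2 + 1*1)
</syntaxhighlight>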
 
This is indeed the technique by which a deterministic (i.e. a-) Turing machine can be used to mimic the action of a [[nondeterministic Turing machine]]; Turing solved the matter in a footnote and appears to dismiss it from further consideration.
 
An [[oracle machine]] or o-machine is a Turing a-machine that pauses its computation at state "'''o'''" while, to complete its calculation, it "awaits the decision" of "the oracle"—an unspecified entity "apart from saying that it cannot be a machine" (Turing (1939), Undecidable p.&nbsp;166–168). The concept is now actively used by mathematicians.
 
==Universal Turing machines==
{{Main|Universal Turing machine}}
 
[[File:Model of a Turing machine.jpg|thumb|An implementation of a Turing machine]]
As Turing wrote in ''Undecidable'', p.&nbsp;128 (italics added):
{{quote|It is possible to invent a ''single machine'' which can be used to compute ''any'' computable sequence. If this machine '''U''' is supplied with the tape on the beginning of which is written the string of quintuples separated by semicolons of some computing machine '''M''', then '''U''' will compute the same sequence as '''M'''.}}
 
This finding is now taken for granted, but at the time (1936) it was considered astonishing. The model of computation that Turing called his "universal machine"—"'''U'''" for short—is considered by some (cf Davis (2000)) to have been the fundamental theoretical breakthrough that led to the notion of the [[stored-program computer]].
 
{{quote|Turing's paper ... contains, in essence, the invention of the modern computer and some of the programming techniques that accompanied it.|Minsky (1967), p. 104}}
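
As Turing's quotation above suggests, all that '''U''' needs from '''M''' is M's table of quintuples written out at the start of the tape. A hedged sketch of such a serialization (Turing's actual "standard description" uses a more elaborate letter-based encoding; the comma/semicolon text form here is only an illustration):

<syntaxhighlight lang="python">
# Sketch: writing a machine's quintuples onto a tape prefix as a
# semicolon-separated string, in the spirit of the quotation above. This is
# not Turing's own encoding, merely an illustration of the idea.

busy_beaver = [
    ('A', '0', '1', 'R', 'B'),
    ('A', '1', '1', 'L', 'C'),
    ('B', '0', '1', 'L', 'A'),
    ('B', '1', '1', 'R', 'B'),
    ('C', '0', '1', 'L', 'B'),
    ('C', '1', '1', 'N', 'H'),
]

def encode(quintuples):
    return ';'.join(','.join(q) for q in quintuples) + ';'

def decode(description):
    return [tuple(q.split(',')) for q in description.rstrip(';').split(';')]

tape_prefix = encode(busy_beaver)
assert decode(tape_prefix) == busy_beaver
print(tape_prefix)   # "A,0,1,R,B;A,1,1,L,C;..." — M's table, ready for U to read
</syntaxhighlight>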
 
In terms of [[computational complexity]], a multi-tape universal Turing machine need only be slower by a [[logarithm]]ic factor compared to the machines it simulates. This result was obtained in 1966 by F. C. Hennie and [[R. E. Stearns]]. (Arora and Barak, 2009, theorem 1.9)
 
==Comparison with real machines==
[[File:Lego Turing Machine.jpg|thumb|A Turing machine realisation in LEGO]]
It is often said that Turing machines, unlike simpler automata, are as powerful as real machines, and are able to execute any operation that a real program can. What is neglected in this statement is that, because a real machine can only have a finite number of ''configurations'', this "real machine" is really nothing but a [[linear bounded automaton]]. On the other hand, Turing machines are equivalent to machines that have an unlimited amount of storage space for their computations. As a matter of fact, Turing machines are not intended to model computers, but rather they are intended to model computation itself. Historically, computers, which compute only on their (fixed) internal storage, were developed only later.
 
There are a number of ways to explain why Turing machines are useful models of real computers:
 
# Anything a real computer can compute, a Turing machine can also compute. For example: "A Turing machine can simulate any type of subroutine found in programming languages, including recursive procedures and any of the known parameter-passing mechanisms" (Hopcroft and Ullman p.&nbsp;157). A large enough finite-state automaton (FSA) can also model any real computer, disregarding IO. Thus, a statement about the limitations of Turing machines will also apply to real computers.
# The difference lies only with the ability of a Turing machine to manipulate an unbounded amount of data. However, given a finite amount of time, a Turing machine (like a real machine) can only manipulate a finite amount of data.
# Like a Turing machine, a real machine can have its storage space enlarged as needed, by acquiring more disks or other storage media. If the supply of these runs short, the Turing machine may become less useful as a model. But the fact is that neither Turing machines nor real machines need astronomical amounts of storage space in order to perform useful computation. The processing time required is usually much more of a problem.
# Descriptions of real machine programs using simpler abstract models are often much more complex than descriptions using Turing machines. For example, a Turing machine describing an algorithm may have a few hundred states, while the equivalent deterministic finite automaton (DFA) on a given real machine has quadrillions. This makes the DFA representation infeasible to analyze.
# Turing machines describe algorithms independent of how much memory they use. There is a limit to the memory possessed by any current machine, but this limit can rise arbitrarily in time. Turing machines allow us to make statements about algorithms which will (theoretically) hold forever, regardless of advances in ''conventional'' computing machine architecture.
# Turing machines simplify the statement of algorithms. Algorithms running on Turing-equivalent abstract machines are usually more general than their counterparts running on real machines, because they have arbitrary-precision data types available and never have to deal with unexpected conditions (including, but not limited to, running [[out of memory]]).
 
One way in which Turing machines are a poor model for programs is that many real programs, such as [[operating system]]s and [[word processor]]s, are written to receive unbounded input over time, and therefore do not halt. Turing machines do not model such ongoing computation well (but can still model portions of it, such as individual procedures).
 
===Limitations of Turing machines===
====Computational complexity theory====
{{further2|[[Computational complexity theory]]}}
A limitation of Turing machines is that they do not model the strengths of a particular arrangement well. For instance, modern stored-program computers are actually instances of a more specific form of [[abstract machine]] known as the [[random access stored program machine]] or RASP machine model. Like the [[Universal Turing machine]] the RASP stores its "program" in "memory" external to its finite-state machine's "instructions". Unlike the universal Turing machine, the RASP has an infinite number of distinguishable, numbered but unbounded "registers"—memory "cells" that can contain any integer (cf. Elgot and Robinson (1964), Hartmanis (1971), and in particular Cook and Reckhow (1973); references at [[random access machine]]). The RASP's finite-state machine is equipped with the capability for indirect addressing (e.g. the contents of one register can be used as an address to specify another register); thus the RASP's "program" can address any register in the register-sequence. The upshot of this distinction is that there are computational optimizations that can be performed based on the memory indices, which are not possible in a general Turing machine; thus when Turing machines are used as the basis for bounding running times, a 'false lower bound' can be proven on certain algorithms' running times (due to the false simplifying assumption of a Turing machine). An example of this is [[binary search]], an algorithm that can be shown to perform more quickly when using the RASP model of computation rather than the Turing machine model.
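
For instance, every probe of a binary search jumps to an arbitrary index, which the RASP's indirect addressing supports as a single step but which costs a one-tape Turing machine a head walk across the tape. A small, purely illustrative sketch of the algorithm in question:

<syntaxhighlight lang="python">
# Illustration of the point above: binary search repeatedly jumps to an
# arbitrary index. On a RASP each such probe is one indirectly-addressed step;
# on a single tape the head would have to travel to the cell instead.

def binary_search(a, target):
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # one random-access probe per iteration
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1                       # not present

print(binary_search([1, 3, 4, 7, 9, 11], 7))   # -> 3
</syntaxhighlight>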
 
====Concurrency====
Another limitation of Turing machines is that they do not model concurrency well. For example, there is a bound on the size of integer that can be computed by an always-halting nondeterministic Turing machine starting on a blank tape. (See article on [[unbounded nondeterminism]].) By contrast, there are always-halting concurrent systems with no inputs that can compute an integer of unbounded size. (A process can be created with local storage that is initialized with a count of 0 that concurrently sends itself both a stop and a go message. When it receives a go message, it increments its count by 1 and sends itself a go message. When it receives a stop message, it stops with an unbounded number in its local storage.)
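
A hedged sketch of that counter process follows; the randomized choice merely stands in for the arbitrary (but eventually fair) order in which pending messages may be delivered, and is not itself part of the concurrent model:

<syntaxhighlight lang="python">
# Sketch of the counter process described above: it starts by sending itself
# both a "go" and a "stop" message; every "go" it receives bumps its count and
# re-sends "go"; a "stop" halts it. Because delivery order among pending
# messages is arbitrary, the process always halts, yet there is no a priori
# bound on the count it halts with.
import random

def counter_process(seed=None):
    random.seed(seed)
    pending = ['go', 'stop']       # the two messages the process sends itself
    count = 0
    while True:
        msg = pending.pop(random.randrange(len(pending)))  # arbitrary delivery
        if msg == 'stop':
            return count           # always halts, with some finite count
        count += 1                 # "go": increment ...
        pending.append('go')       # ... and send itself another "go"

print(counter_process())   # some nonnegative integer; unbounded over runs
</syntaxhighlight>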
 
==History==
{{See also|Algorithm|Church–Turing thesis}}
 
Turing machines were described in 1936 by [[Alan Turing]].
 
=== Historical background: computational machinery ===
[[Robin Gandy]] (1919–1995)—a student of [[Alan Turing]] (1912–1954) and his lifelong friend—traces the lineage of the notion of "calculating machine" back to [[Babbage]] (circa 1834) and actually proposes "Babbage's Thesis":
 
{{quote|''That the whole of development and operations of analysis are now capable of being executed by machinery''.|(italics in Babbage as cited by Gandy, p. 54)}}
 
Gandy's analysis of Babbage's [[Analytical Engine]] describes the following five operations (cf p.&nbsp;52–53):
# The arithmetic functions +, &minus;, &times; where &minus; indicates "proper" subtraction x&nbsp;&minus;&nbsp;y&nbsp;=&nbsp;0 if y&nbsp;≥&nbsp;x
# Any sequence of operations is an operation
# Iteration of an operation (repeating n times an operation P)
# Conditional iteration (repeating n times an operation P conditional on the "success" of test T)
# Conditional transfer (i.e. conditional "[[goto]]").
 
Gandy states that "the functions which can be calculated by (1), (2), and (4) are precisely those which are [[Turing computable]]." (p.&nbsp;53). He cites other proposals for "universal calculating machines", including those of [[Percy Ludgate]] (1909), [[Leonardo Torres y Quevedo]] (1914), [[Maurice d'Ocagne]] (1922), [[Louis Couffignal]] (1933), [[Vannevar Bush]] (1936), and [[Howard Aiken]] (1937). However:
 
{{quote|...&nbsp;the emphasis is on programming a fixed iterable sequence of arithmetical operations. The fundamental importance of conditional iteration and conditional transfer for a general theory of calculating machines is not recognized ...|Gandy p. 55}}
 
=== The Entscheidungsproblem (the "decision problem"): Hilbert's tenth question of 1900 ===
 
With regard to [[Hilbert's problems]] posed by the famous mathematician [[David Hilbert]] in 1900, an aspect of problem #10 had been floating about for almost 30 years before it was framed precisely. Hilbert's original expression for #10 is as follows:
 
{{quote|'''10. Determination of the solvability of a Diophantine equation'''. Given a [[Diophantine equation]] with any number of unknown quantities and with rational integral coefficients: To devise a process according to which it can be determined in a finite number of operations whether the equation is solvable in rational integers.
 
The Entscheidungsproblem [decision problem for [[first-order logic]]] is solved when we know a procedure that allows for any given logical expression to decide by finitely many operations its validity or satisfiability ... The Entscheidungsproblem must be considered the main problem of mathematical logic.|quoted, with this translation and the original German, in Dershowitz and Gurevich, 2008}}
 
By 1922, this notion of "[[Entscheidungsproblem]]" had developed a bit, and [[Heinrich Behmann|H. Behmann]] stated that
 
{{quote|...&nbsp;most general form of the Entscheidungsproblem [is] as follows:
:A quite definite generally applicable prescription is required which will allow one to decide in a finite number of steps the truth or falsity of a given purely logical assertion ...|Gandy p. 57, quoting Behmann}}
{{quote|Behmann remarks that ... the general problem is equivalent to the problem of deciding which mathematical propositions are true.|''ibid.''}}
{{quote|If one were able to solve the Entscheidungsproblem then one would have a "procedure for solving many (or even all) mathematical problems".|''ibid.'', p. 92}}
 
By the 1928 international congress of mathematicians, Hilbert "made his questions quite precise. First, was mathematics ''[[Completeness#Mathematical_completeness|complete]]'' ... Second, was mathematics ''[[Consistency proof|consistent]]'' ... And thirdly, was mathematics ''[[Decidability (logic)|decidable]]''?" (Hodges p.&nbsp;91, Hawking p.&nbsp;1121). The first two questions were answered in 1930 by [[Kurt Gödel]] at the very same meeting where Hilbert delivered his retirement speech (much to the chagrin of Hilbert); the third—the Entscheidungsproblem—had to wait until the mid-1930s.
 
The problem was that an answer first required a precise definition of "''definite general applicable prescription''", which Princeton professor [[Alonzo Church]] would come to call "[[effective calculability]]", and in 1928 no such definition existed. But over the next 6–7 years [[Emil Post]] developed his definition of a worker moving from room to room writing and erasing marks per a list of instructions (Post 1936), as did Church and his two students [[Stephen Kleene]] and [[J. B. Rosser]] by use of Church's [[lambda-calculus]] and Gödel's [[recursion theory]] (1934). Church's paper (published 15 April 1936) showed that the Entscheidungsproblem was indeed "undecidable" and beat Turing to the punch by almost a year (Turing's paper submitted 28 May 1936, published January 1937). In the meantime, Emil Post submitted a brief paper in the fall of 1936, so Turing at least had priority over Post. While Church refereed Turing's paper, Turing had time to study Church's paper and add an Appendix where he sketched a proof that Church's lambda-calculus and his machines would compute the same functions.
 
{{quote|But what Church had done was something rather different, and in a certain sense weaker. ... the Turing construction was more direct, and provided an argument from first principles, closing the gap in Church's demonstration.|Hodges p. 112}}
 
And Post had only proposed a definition of [[Church–Turing thesis|calculability]] and criticized Church's "definition", but had proved nothing.
 
=== Alan Turing's a- (automatic-)machine ===
 
In the spring of 1935, Turing, then a young Master's student at [[King's College Cambridge]], took on the challenge; he had been stimulated by the lectures of the logician [[M. H. A. Newman]] "and learned from them of Gödel's work and the Entscheidungsproblem ... Newman used the word 'mechanical' ...". In his 1955 obituary of Turing, Newman writes:
 
{{quote|To the question 'what is a "mechanical" process?' Turing returned the characteristic answer 'Something that can be done by a machine' and he embarked on the highly congenial task of analysing the general notion of a computing machine.|Gandy, p. 74}}
 
Gandy states that:
 
{{quote|I suppose, but do not know, that Turing, right from the start of his work, had as his goal a proof of the undecidability of the Entscheidungsproblem. He told me that the 'main idea' of the paper came to him when he was lying in Grantchester meadows in the summer of 1935. The 'main idea' might have either been his analysis of computation or his realization that there was a universal machine, and so a [[Cantor's diagonal argument|diagonal argument]] to prove unsolvability.|''ibid.'', p. 76}}
 
While Gandy believed that Newman's statement above is "misleading", this opinion is not shared by all. Turing had a lifelong interest in machines: "Alan had dreamt of inventing typewriters as a boy; [his mother] Mrs. Turing had a typewriter; and he could well have begun by asking himself what was meant by calling a typewriter 'mechanical'" (Hodges p.&nbsp;96). While at Princeton pursuing his PhD, Turing built a Boolean-logic multiplier (see below). His PhD thesis, titled "Systems of Logic Based on Ordinals", contains the following definition of "a computable function":
 
{{quote|It was stated above that 'a function is effectively calculable if its values can be found by some purely mechanical process'. We may take this statement literally, understanding by a purely mechanical process one which could be carried out by a machine. It is possible to give a mathematical description, in a certain normal form, of the structures of these machines. The development of these ideas leads to the author's definition of a computable function, and to an identification of computability with effective calculability. It is not difficult, though somewhat laborious, to prove that these three definitions [the 3rd is the λ-calculus] are equivalent.|Turing (1939) in ''The Undecidable'', p. 160}}
 
When Turing returned to the UK he ultimately became jointly responsible for breaking the German secret codes created by encryption machines called "The Enigma"; he also became involved in the design of the ACE ([[Automatic Computing Engine]]): "[Turing's] ACE proposal was effectively self-contained, and its roots lay not in the [[EDVAC]] [the USA's initiative], but in his own universal machine" (Hodges p.&nbsp;318). Arguments still continue concerning the origin and nature of what Kleene (1952) named [[Turing's Thesis]]. But what Turing ''did prove'' with his computational-machine model appears in his paper ''On Computable Numbers, With an Application to the Entscheidungsproblem'' (1937):
 
{{quote|[that] the Hilbert Entscheidungsproblem can have no solution ... I propose, therefore to show that there can be no general process for determining whether a given formula U of the functional calculus K is provable, i.e. that there can be no machine which, supplied with any one U of these formulae, will eventually say whether U is provable.|from Turing's paper as reprinted in ''The Undecidable'', p. 145}}
 
Turing's example (his second proof): if one asks for a general procedure to tell us "Does this machine ever print 0?", the question is "undecidable".
 
=== 1937–1970: The "digital computer", the birth of "computer science" ===
 
In 1937, while at Princeton working on his PhD thesis, Turing built a digital (Boolean-logic) multiplier from scratch, making his own electromechanical relays (Hodges p.&nbsp;138). "Alan's task was to embody the logical design of a Turing machine in a network of relay-operated switches ..." (Hodges p.&nbsp;138). While Turing might have been just initially curious and experimenting, quite-earnest work in the same direction was going on in Germany ([[Konrad Zuse]] (1938)) and in the United States ([[Howard Aiken]] and [[George Stibitz]] (1937)); the fruits of their labors were used by the Axis and Allied military in [[World War II]] (cf Hodges p.&nbsp;298–299). In the early to mid-1950s [[Hao Wang (academic)|Hao Wang]] and [[Marvin Minsky]] reduced the Turing machine to a simpler form (a precursor to the [[Post-Turing machine]] of [[Martin Davis]]); simultaneously European researchers were reducing the new-fangled [[electronic computer]] to a computer-like theoretical object equivalent to what was now being called a "Turing machine". In the late 1950s and early 1960s, the coincidentally parallel developments of Melzak and Lambek (1961), Minsky (1961), and Shepherdson and Sturgis (1961) carried the European work further and reduced the Turing machine to a more friendly, computer-like abstract model called the [[counter machine]]; Elgot and Robinson (1964), Hartmanis (1971), Cook and Reckhow (1973) carried this work even further with the [[register machine]] and [[random access machine]] models—but basically all are just multi-tape Turing machines with an arithmetic-like instruction set.
 
=== 1970–present: the Turing machine as a model of computation ===
 
Today, the counter, register and random-access machines and their sire the Turing machine continue to be the models of choice for theorists investigating questions in the [[theory of computation]]. In particular, [[computational complexity theory]] makes use of the Turing machine:
 
{{quote|Depending on the objects one likes to manipulate in the computations (numbers like nonnegative integers or alphanumeric strings), two models have obtained a dominant position in machine-based complexity theory:
:''the off-line multitape Turing machine''..., which represents the standard model for string-oriented computation, and
:the ''random access machine (RAM)'' as introduced by Cook and Reckhow ..., which models the idealized Von Neumann style computer.|van Emde Boas 1990:4}}
{{quote|Only in the related area of analysis of algorithms this role is taken over by the RAM model.|van Emde Boas 1990:16}}
 
==See also==
{{col-begin}}
{{col-break}}
* [[Algorithm]], for a brief history of some of the inventions and the mathematics leading to Turing's definition of what he called his "a-machine"
* [[Arithmetical hierarchy]]
* [[Bekenstein bound]], showing the impossibility of infinite-tape Turing machines of finite size and bounded energy
* [[BlooP and FlooP]]
* [[Busy beaver]]
* [[Chaitin constant]] or [[Omega (computer science)]] for information relating to the halting problem
* [[Church-Turing thesis]], which says Turing machines can perform any computation that can be performed
* [[Claude Shannon]], another leading thinker in information theory
* [[Conway's Game of Life]], a Turing-complete cellular automaton
* [[Digital infinity]]
* [[Genetix]], a virtual machine created by Bernard Hodson containing only 34 executable instructions
* ''[[Gödel, Escher, Bach: An Eternal Golden Braid]]'', an influential book largely about the Church–Turing thesis
{{col-break}}
* [[Halting problem]], for more references
* [[Harvard architecture]]
* [[Langton's ant]] and [[Turmite]]s, simple two-dimensional analogues of the Turing machine
* [[Modified Harvard architecture]]
* [[Probabilistic Turing machine]]
* [[Quantum Turing machine]]
* [[Turing completeness]], an attribute used in [[Computability theory (computation)|computability theory]] to describe computing systems with power equivalent to a universal Turing machine
* [[Turing switch]]
* [[Turing machine examples]]
* [[Turing tarpit]], any computing system or language which, despite being Turing complete, is generally considered useless for practical computing
* [[Von Neumann architecture]]
 
{{col-end}}
 
==Notes==
<references/>
 
==References==
=== Primary literature, reprints, and compilations ===
* B. [[Jack Copeland]] ed. (2004), ''The Essential Turing: Seminal Writings in Computing, Logic, Philosophy, Artificial Intelligence, and Artificial Life plus The Secrets of Enigma,'' Clarendon Press (Oxford University Press), Oxford UK, ISBN 0-19-825079-7. Contains the Turing papers plus a draft letter to [[Emil Post]] re his criticism of "Turing's convention", and Donald W. Davies' ''Corrections to Turing's Universal Computing Machine''
* [[Martin Davis]] (ed.) (1965), ''The Undecidable'', Raven Press, Hewlett, NY.
* Emil Post (1936), "Finite Combinatory Processes—Formulation 1", ''Journal of Symbolic Logic'', 1, 103–105, 1936. Reprinted in ''The Undecidable'' pp.&nbsp;289ff.
* Emil Post (1947), "Recursive Unsolvability of a Problem of Thue", ''Journal of Symbolic Logic'', vol. 12, pp.&nbsp;1–11. Reprinted in ''The Undecidable'' pp.&nbsp;293ff. In the Appendix of this paper Post comments on and gives corrections to Turing's paper of 1936–1937. In particular see the footnotes 11 with corrections to the universal computing machine coding and footnote 14 with comments on [[Turing's proof|Turing's first and second proofs]].
* {{Cite journal | last= Turing | first= A.M. | publication-date = 1937 | year = 1936 | title = On Computable Numbers, with an Application to the Entscheidungsproblem | journal = Proceedings of the London Mathematical Society | series = 2 | volume = 42 | pages = 230–265 | doi= 10.1112/plms/s2-42.1.230 }} (and {{Cite journal | last = Turing | first = A.M. | publication-date = 1937 | title = On Computable Numbers, with an Application to the Entscheidungsproblem: A correction | journal = Proceedings of the London Mathematical Society | series = 2 | volume = 43 | pages = 544–6 | doi = 10.1112/plms/s2-43.6.544 | year = 1938 | issue = 6 }}). Reprinted in many collections, e.g. in ''The Undecidable'' pp.&nbsp;115–154; available on the web in many places, e.g. [http://www.scribd.com/doc/2937039/Alan-M-Turing-On-Computable-Numbers at Scribd].
* Alan Turing, 1948, "Intelligent Machinery."  Reprinted in "Cybernetics: Key Papers." Ed. C.R. Evans and A.D.J. Robertson.  Baltimore: University Park Press, 1968. p.&nbsp;31.
* F. C. Hennie and [[R. E. Stearns]]. ''Two-tape simulation of multitape Turing machines''. [[JACM]], 13(4):533–546, 1966.
 
=== Computability theory ===
 
* {{cite book | last = Boolos | first = George | coauthors = Richard Jeffrey | title = Computability and Logic | edition = 3rd | publisher = Cambridge University Press | location = Cambridge UK | year = 1989, 1999| isbn = 0-521-20402-X }}
*{{cite book | last = Boolos | first = George | coauthors = John Burgess, Richard Jeffrey,  | title = Computability and Logic | edition = 4th | publisher = Cambridge University Press | location = Cambridge UK | year = 2002 | isbn = 0-521-00758-5 (pb.) }} Some parts have been significantly rewritten by Burgess. Presentation of Turing machines in context of Lambek "abacus machines" (cf [[Register machine]]) and [[Computable function|recursive functions]], showing their equivalence.
* [[Taylor L. Booth]] (1967), ''Sequential Machines and Automata Theory'', John Wiley and Sons, Inc., New York. Graduate level engineering text; ranges over a wide variety of topics, Chapter IX ''Turing Machines'' includes some recursion theory.
* {{cite book|author = [[Martin Davis]] | year = 1958| title = Computability and Unsolvability| publisher = McGraw-Hill Book Company, Inc, New York }}. On pages 12–20 he gives examples of 5-tuple tables for Addition, The Successor Function, Subtraction (x ≥ y), Proper Subtraction (0 if x < y), The Identity Function and various identity functions, and Multiplication.
* {{cite book | last = Davis| first = Martin | coauthors = Ron Sigal, Elaine J. Weyuker| title = Computability, Complexity, and Languages and Logic: Fundamentals of Theoretical Computer Science | edition = 2nd | publisher = Academic Press, Harcourt, Brace & Company| location = San Diego | year = 1994 | isbn = 0-12-206382-1}}
* {{cite book|last =Hennie |first = Fredrick | year = 1977| title = Introduction to Computability| publisher = Addison–Wesley, Reading, Mass|unused_data =id: QA248.5H4 1977 }}. On pages 90–103 Hennie discusses the UTM with examples and flow-charts, but no actual 'code'.
* {{cite book|author = [[John Hopcroft]] and [[Jeffrey Ullman]], | year = 1979| title = [[Introduction to Automata Theory, Languages, and Computation]]| publisher = Addison–Wesley, Reading Mass| edition = 1st | isbn = 0-201-02988-X}} A difficult book. Centered around the issues of machine-interpretation of "languages", NP-completeness, etc.
* {{cite book | last = Hopcroft | first = John E.| coauthors = Rajeev Motwani, Jeffrey D. Ullman| title = Introduction to Automata Theory, Languages, and Computation| edition = 2nd | publisher = Addison–Wesley | location = Reading Mass | year = 2001 | isbn = 0-201-44124-1}} Distinctly different and less intimidating than the first edition.
* [[Stephen Kleene]] (1952), ''Introduction to Metamathematics'', North–Holland Publishing Company, Amsterdam Netherlands, 10th impression (with corrections of 6th reprint 1971). Graduate level text; most of Chapter XIII ''Computable functions'' is on Turing machine proofs of computability of recursive functions, etc.
* {{cite book | last = Knuth | first = Donald E. | authorlink = Donald Knuth | title = Volume 1/Fundamental Algorithms: The Art of Computer Programming| edition = 2nd | publisher = Addison–Wesley Publishing Company| location = Reading, Mass.| year = 1973 }}. With reference to the role of Turing machines in the development of computation (both hardware and software) see 1.4.5 ''History and Bibliography'' pp.&nbsp;225ff and 2.6 ''History and Bibliography'' pp.&nbsp;456ff.
*[[Zohar Manna]], 1974, ''[[Mathematical Theory of Computation]]''. Reprinted, Dover, 2003. ISBN 978-0-486-43238-0
* [[Marvin Minsky]], ''Computation: Finite and Infinite Machines'', Prentice–Hall, Inc., N.J., 1967. See Chapter 8, Section 8.2 "Unsolvability of the Halting Problem." Excellent, i.e. relatively readable, sometimes funny.
* {{cite book|author = [[Christos Papadimitriou]] | year = 1993 | title = Computational Complexity | publisher = Addison Wesley | edition = 1st | isbn = 0-201-53082-1}} Chapter 2: Turing machines, pp.&nbsp;19–56.
* {{cite book|author = [[Michael Sipser]] | year = 1997 | title = Introduction to the Theory of Computation | publisher = PWS Publishing | isbn = 0-534-94728-X}} Chapter 3: The Church–Turing Thesis, pp.&nbsp;125–149.
* {{cite book | last = Stone | first = Harold S. | title = Introduction to Computer Organization and Data Structures| edition = 1st | publisher = McGraw–Hill Book Company | location = New York | year = 1972 | isbn = 0-07-061726-0 }}
* [[Peter van Emde Boas]] 1990, ''Machine Models and Simulations'', pp.&nbsp;3–66, in [[Jan van Leeuwen]], ed., ''Handbook of Theoretical Computer Science, Volume A: Algorithms and Complexity'', The MIT Press/Elsevier, [place?], ISBN 0-444-88071-2 (Volume A). QA76.H279 1990. Valuable survey, with 141 references.
 
=== Church's thesis ===
* {{cite journal |author=Nachum Dershowitz |coauthors=Yuri Gurevich |date=September 2008 |title=A natural axiomatization of computability and proof of Church's Thesis |journal=Bulletin of Symbolic Logic |volume=14 |issue=3 |url=http://research.microsoft.com/en-us/um/people/gurevich/Opera/188.pdf |accessdate=2008-10-15}}
* {{cite book|author = [[Roger Penrose]] | year = 1989, 1990 | title = The Emperor's New Mind | publisher = Oxford University Press, New York| edition = 2nd | isbn = 0-19-851973-7}}
 
=== Small Turing machines ===
* Rogozhin, Yurii, 1998, "[http://web.archive.org/web/20050308141040/http://www.imt.ro/Romjist/Volum1/Vol1_3/turing.htm A Universal Turing Machine with 22 States and 2 Symbols]", ''Romanian Journal Of Information Science and Technology'', 1(3), 259–265, 1998. (surveys known results about small universal Turing machines)
* [[Stephen Wolfram]], 2002, [http://www.wolframscience.com/nksonline/page-707 ''A New Kind of Science''], Wolfram Media, ISBN 1-57955-008-8
* Brumfiel, Geoff, [http://www.nature.com/news/2007/071024/full/news.2007.190.html Student snags maths prize], ''Nature'', October 24, 2007.
* Jim Giles (2007), [http://technology.newscientist.com/article/dn12826-simplest-universal-computer-wins-student-25000.html Simplest 'universal computer' wins student $25,000], New Scientist, October 24, 2007.
* Alex Smith, [http://www.wolframscience.com/prizes/tm23/TM23Proof.pdf Universality of Wolfram’s 2, 3 Turing Machine], Submission for the Wolfram 2, 3 Turing Machine Research Prize.
* Vaughan Pratt, 2007, "[http://cs.nyu.edu/pipermail/fom/2007-October/012156.html Simple Turing machines, Universality, Encodings, etc.]", FOM email list. October 29, 2007.
* Martin Davis, 2007, "[http://cs.nyu.edu/pipermail/fom/2007-October/012132.html Smallest universal machine]", and [http://cs.nyu.edu/pipermail/fom/2007-October/012145.html Definition of universal Turing machine] FOM email list. October 26–27, 2007.
 
* Alasdair Urquhart, 2007 "[http://cs.nyu.edu/pipermail/fom/2007-October/012140.html Smallest universal machine]", FOM email list. October 26, 2007.
* Hector Zenil (Wolfram Research), 2007 "[http://cs.nyu.edu/pipermail/fom/2007-October/012163.html smallest universal machine]", FOM email list. October 29, 2007.
* Todd Rowland, 2007, "[http://forum.wolframscience.com/showthread.php?s=&threadid=1472 Confusion on FOM]", Wolfram Science message board, October 30, 2007.
 
===Other===
* {{cite book|author = [[Martin Davis]] | year = 2000| title = Engines of Logic: Mathematicians and the origin of the Computer| publisher = W. W. Norton & Company, New York| edition = 1st | isbn = 0-393-32229-7 pbk.}}
* [[Robin Gandy]], "The Confluence of Ideas in 1936", pp.&nbsp;51–102 in [[Rolf Herken]], see below.
* [[Stephen Hawking]] (editor), 2005, ''God Created the Integers: The Mathematical Breakthroughs that Changed History'', Running Press, Philadelphia, ISBN 978-0-7624-1922-7. Includes Turing's 1936–1937 paper, with brief commentary and biography of Turing as written by Hawking.
* {{cite book | author = Rolf Herken | title = The Universal Turing Machine—A Half-Century Survey | publisher = Springer Verlag | isbn = 3-211-82637-8 | year = 1995}}
* [[Andrew Hodges]], ''Alan Turing: The Enigma'', Simon and Schuster, New York, 1983. Cf Chapter "The Spirit of Truth" for a history leading to, and a discussion of, his proof.
* {{cite book|author = [[Ivars Peterson]] | year = 1988| title = The Mathematical Tourist: Snapshots of Modern Mathematics| publisher = W. H. Freeman and Company, New York| edition = 1st | isbn = 0-7167-2064-7 (pbk.)}}
* {{cite book | author = [[Paul Strathern]] | title = Turing and the Computer—The Big Idea | publisher = Anchor Books/Doubleday | isbn = 0-385-49243-X | year = 1997}}
* [[Hao Wang (academic)|Hao Wang]], "A variant to Turing's theory of computing machines", ''Journal of the Association for Computing Machinery'' (JACM) 4, 63–92 (1957).
* [[Charles Petzold]], [http://www.theannotatedturing.com/ ''The Annotated Turing''], John Wiley & Sons, Inc., ISBN 0-470-22905-5
* Arora, Sanjeev; Barak, Boaz, [http://www.cs.princeton.edu/theory/complexity/ "Complexity Theory: A Modern Approach"], Cambridge University Press, 2009, ISBN 978-0-521-42426-4, section 1.4, "Machines as strings and the universal Turing machine" and 1.7, "Proof of theorem 1.9"
* [[A Note On Turing Machine Computability Of Rule Driven Systems]], SIGACT News December 2005
 
==External links==
{{Commons category|Turing machines}}
* {{springer|title=Turing machine|id=p/t094460}}
* [http://plato.stanford.edu/entries/turing-machine/ Turing Machine on Stanford Encyclopedia of Philosophy]
* [http://plato.stanford.edu/entries/church-turing/ Detailed info on the Church–Turing Hypothesis] (Stanford Encyclopedia of Philosophy)
* [http://webturingmachine.appspot.com/ Web Turing Machine] Web application to construct and execute Turing machines (Javascript)
* [http://www.weizmann.ac.il/mathusers/lbn/new_pages/Research_Turing.html Turing Machine-Like Models] in Molecular Biology, to understand life mechanisms with a DNA-tape processor.
* [http://www.SaschaSeidel.de/html/programmierung/download_The_Turing_machine.php The Turing machine]—Summary about the Turing machine, its functionality and historical facts
* [http://www.wolframscience.com/prizes/tm23/ The Wolfram 2,3 Turing Machine Research Prize]—Stephen Wolfram's $25,000 prize for the proof or disproof of the universality of the potentially smallest universal Turing Machine. The contest has ended, with the proof affirming the machine's universality.
* "[http://demonstrations.wolfram.com/TuringMachineCausalNetworks/ Turing Machine Causal Networks]" by Enrique Zeleny, [[Wolfram Demonstrations Project]].
* {{dmoz|Computers/Computer_Science/Theoretical/Automata_Theory/Turing_Machines|Turing Machines}}
* [http://www.turing2012.fr/?p=530&lang=en Purely mechanical Turing Machine]
 
{{Formal languages and grammars}}
 
{{DEFAULTSORT:Turing machine}}
[[Category:1937 in computer science]]
[[Category:Turing machine]]
[[Category:Educational abstract machines]]
[[Category:Theoretical computer science]]
[[Category:Alan Turing]]
[[Category:Models of computation]]
[[Category:Formal methods]]
[[Category:Computability theory]]
[[Category:English inventions]]
