In [[mathematical logic]] and [[theoretical computer science]], a '''register machine''' is a generic class of [[abstract machine]]s used in a manner similar to a [[Turing machine]]. All the models are [[Turing completeness|Turing equivalent]].
 
==Overview==
The register machine gets its name from its use of one or more [[Processor register|"registers"]]. In contrast to the tape and head used by a Turing machine, the model uses '''multiple, uniquely addressed registers''', each of which holds a single non-negative [[integer]].


There are at least four sub-classes found in the literature, here listed from the most primitive to the most like a [[computer]]:
* [[Counter machine]] – the most primitive and reduced theoretical model of computer hardware. Lacks indirect addressing. Instructions are in the finite state machine in the manner of the [[Harvard architecture]].
* [[Pointer machine]] – a blend of the counter machine and RAM models. Less common and more abstract than either model. Instructions are in the finite state machine in the manner of the Harvard architecture.
* [[Random access machine]] (RAM) – a counter machine with indirect addressing and, usually, an augmented instruction set. Instructions are in the finite state machine in the manner of the Harvard architecture.
* [[Random-access stored-program machine]] model (RASP) – a RAM with its instructions in its registers, analogous to the [[Universal Turing machine]]; thus it is an example of the [[von Neumann architecture]]. But unlike a computer, the model is ''idealized'' with effectively infinite registers (and, if used, effectively infinite special registers such as an accumulator). Unlike a computer or even a [[RISC]], the instruction set is much reduced in number.
 
Any properly defined register machine model is [[Turing completeness|Turing equivalent]]. Computational speed is highly dependent on the specifics of the model.


In practical computer science, a similar concept known as a [[virtual machine]] is sometimes used to minimise dependencies on underlying machine architectures. Such machines are also used for teaching. The term "register machine" is sometimes used to refer to a virtual machine in textbooks.<ref>[[Harold Abelson]] and [[Gerald Jay Sussman]] with Julie Sussman, [[Structure and Interpretation of Computer Programs]], [[MIT Press]], [[Cambridge, Massachusetts]], 2nd Ed, 1996</ref>
 
 
== Formal definition ==
 
:''No standard terminology exists; each author is responsible for defining in prose the meanings of their mnemonics or symbols. Many authors use a "register-transfer"-like symbolism to explain the actions of their models, but again they are responsible for defining its syntax.''
 
 
A register machine consists of:
 
 
#'''An unbounded number of labeled, discrete registers, each unbounded in extent (capacity)''': a finite (or infinite in some models) set of registers <math>r_0 \ldots r_n</math>, each considered to be of infinite extent and each of which holds a single non-negative integer (0, 1, 2, ...).<ref>". . . a denumerable sequence of registers numbered 1, 2, 3, ..., each of which can store any natural number 0, 1, 2, .... Each particular program, however, involves only a finite number of these registers, the others remaining empty (i.e. containing 0) throughout the computation." Shepherdson and Sturgis 1961:219. Lambek 1961:295 proposed "a countably infinite set of ''locations'' (holes, wires, etc.)".</ref> The registers may do their own arithmetic, or there may be one or more special registers that do the arithmetic, e.g. an "accumulator" and/or "address register". ''See also [[Random access machine]].''
#'''Tally counters or marks''':<ref>For example, Lambek 1961:295 proposed the use of pebbles, beads, etc.</ref> discrete, indistinguishable objects or marks of only one sort suitable for the model. In the most-reduced [[counter machine]] model, each arithmetic operation either adds a single object/mark to, or removes one from, its location/tape. In some counter machine models (e.g. Melzak (1961), Minsky (1961)) and in most RAM and RASP models, more than one object/mark can be added or removed in one operation with "addition" and usually "subtraction", sometimes with "multiplication" and/or "division". Some models have control operations such as "copy" (variously: "move", "load", "store") that move "clumps" of objects/marks from register to register in one action.
#'''A (very) limited set of instructions''': the instructions tend to divide into two classes: arithmetic and control. The instructions are drawn from the two classes to form "instruction-sets", such that an instruction set must allow the model to be [[Turing completeness|Turing equivalent]] (it must be able to compute any [[partial recursive function]]). A minimal interpreter sketch for one such instruction set appears after this list.
##'''Arithmetic''': arithmetic instructions may operate on all registers or on just a special register (e.g. accumulator). They are ''usually'' chosen from the following sets (but exceptions abound):
##*Counter machine: { Increment (r), Decrement (r), Clear-to-zero (r) }
##*Reduced RAM, RASP: { Increment (r), Decrement (r), Clear-to-zero (r), Load-immediate-constant k, Add (r<sub>1</sub>,r<sub>2</sub>), proper-Subtract (r<sub>1</sub>,r<sub>2</sub>), Increment accumulator, Decrement accumulator, Clear accumulator, Add to accumulator contents of register r, proper-Subtract from accumulator contents of register r }
##*Augmented RAM, RASP: All of the reduced instructions plus: { Multiply, Divide, various Boolean bit-wise (left-shift, bit test, etc.)}
##'''Control''':
##*Counter machine models: optional { Copy (r<sub>1</sub>,r<sub>2</sub>) }
##*RAM and RASP models: most have { Copy (r<sub>1</sub>,r<sub>2</sub>) }, or { Load Accumulator from r, Store accumulator into r, Load Accumulator with immediate constant }
##*All models: at least one ''conditional "jump"'' (branch, goto) following test of a register e.g. { Jump-if-zero, Jump-if-not-zero (i.e. Jump-if-positive), Jump-if-equal, Jump-if-not-equal }
##*All models optional: { unconditional program jump (goto) }
##'''Register-addressing method''':
##*Counter machine: no indirect addressing, immediate operands possible in highly atomized models
##*RAM and RASP: indirect addressing available, immediate operands typical
##'''Input-output''': optional in all models
#'''State register''': A special Instruction Register "IR", finite and separate from the registers above, stores the current instruction to be executed and its address in the TABLE of instructions; this register and its TABLE is located in the finite state machine.
#*The IR is off-limits to all models. In the case of the RAM and RASP, for purposes of determining the "address" of a register, the model can select either (i) in the case of direct addressing—the address specified by the TABLE and temporarily located in the IR or (ii) in the case of indirect addressing—the contents of the register specified by the IR's instruction.
#*The IR is ''not'' the "program counter" (PC) of the RASP (or conventional [[computer]]). The PC is just another register similar to an accumulator, but dedicated to holding the number of the RASP's current register-based instruction. Thus a RASP has ''two'' "instruction/program" registers—(i) the IR (the finite state machine's Instruction Register), and (ii) a PC (Program Counter) for the program located in the registers. (As well as a register dedicated to "the PC", a RASP may dedicate another register to "the Program-Instruction Register", going by any number of names such as "PIR", "IR", "PR", etc.)
#'''List of labeled instructions, usually in sequential order''': A finite list of instructions <math>I_1 \ldots I_m</math>. In the case of the counter machine, random access machine (RAM) and pointer machine the instruction store is in the "TABLE" of the finite state machine; thus these models are examples of the [[Harvard architecture]]. In the case of the RASP the program store is in the registers; thus this is an example of the [[von Neumann architecture]]. ''See also Random access machine and [[Random access stored program machine]].''<br>Usually, like [[computer program]]s, the instructions are listed in sequential order; unless a jump is successful the default sequence continues in numerical order. An exception to this is the abacus (Lambek (1961), Minsky (1961)) counter machine models—every instruction has at least one "next" instruction identifier "z", and the conditional branch has two.
#*Observe also that the abacus model combines two instructions, JZ then DEC: e.g. { INC ( r, z ), JZDEC ( r, z<sub>true</sub>, z<sub>false</sub> ) }.<br>See [[McCarthy Formalism]] for more about the ''conditional expression'' "IF r=0 THEN z<sub>true</sub> ELSE  z<sub>false</sub>" (cf McCarthy (1960)).
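
To make the counter-machine end of this definition concrete, the following is a minimal sketch of an interpreter for the two-instruction set { INC (r), JZDEC (r, z) } mentioned above. It is only an illustration: the Python program format, the register names, the use of an always-empty register as an unconditional jump, and the HALT convention are assumptions of this sketch, not part of any particular author's model.

<syntaxhighlight lang="python">
# Minimal counter-machine interpreter for the instruction set { INC (r), JZDEC (r, z) }.
# A program is a list of tuples held in a "TABLE"; execution falls through to the
# next instruction unless JZDEC finds register r empty, in which case it jumps to z.
# The ("HALT",) sentinel and the register names are assumptions of this sketch.

def run(program, registers):
    pc = 0                                     # index into the instruction TABLE
    while True:
        instruction = program[pc]
        if instruction == ("HALT",):
            return registers
        if instruction[0] == "INC":            # INC (r): add one mark to register r
            _, r = instruction
            registers[r] += 1
            pc += 1
        else:                                  # JZDEC (r, z): test, then decrement
            _, r, z = instruction
            if registers[r] == 0:
                pc = z                         # register empty: jump to address z
            else:
                registers[r] -= 1              # otherwise remove one mark
                pc += 1

# Example: empty register "a" into register "b" (b := b + a; a := 0).
program = [
    ("JZDEC", "a", 3),     # 0: if a == 0 jump to 3, else a -= 1
    ("INC", "b"),          # 1: b += 1
    ("JZDEC", "zero", 0),  # 2: unconditional jump to 0 via an always-empty register
    ("HALT",),             # 3:
]
print(run(program, {"a": 3, "b": 2, "zero": 0}))   # {'a': 0, 'b': 5, 'zero': 0}
</syntaxhighlight>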
 
== Historical development of the register machine model ==
 
Two trends appeared in the early 1950s—the first to characterize the [[computer]] as a [[Turing machine]], the second to define computer-like models—models with sequential instruction sequences and conditional jumps—with the power of a Turing machine, i.e. a so-called [[Turing completeness|Turing equivalence]]. This work was carried out in the context of two "hard" problems: the unsolvable word problem posed by [[Emil Post]]—his problem of "tag"—and the very "hard" problem of [[Hilbert's problems]]—the 10th question, concerning [[Diophantine equation]]s. Researchers were questing for Turing-equivalent models that were less "logical" in nature and more "arithmetic" (cf. Melzak (1961) p.&nbsp;281, Shepherdson-Sturgis (1963) p.&nbsp;218).
 
The first trend—toward characterizing computers—seems to have originated<ref>See the "Note" in Shepherdson and Sturgis 1963:219. In their Appendix A the authors follow up with a listing and discussions of Kaphengst's, Ershov's and Péter's instruction sets (cf p. 245ff).</ref> with [[Hans Hermes]] (1954), [[Rózsa Péter]] (1958), and [[Heinz Kaphengst]] (1959), the second trend with [[Hao Wang (academic)|Hao Wang]] (1954, 1957) and, as noted above, furthered along by [[Zdzislaw Alexander Melzak]] (1961), [[Joachim Lambek]] (1961), [[Marvin Minsky]] (1961, 1967), and [[John Shepherdson]] and [[Howard E. Sturgis]] (1963).
 
The last five names are listed explicitly in that order by [[Yuri Matiyasevich]]. He follows up with:
:"Register machines [some authors use "register machine" synonymous with "counter-machine"] are particularly suitable for constructing Diophantine equations. Like Turing machines, they have very primitive instructions and, in addition, they deal with numbers" (Yuri Matiyasevich (1993), ''Hilbert's Tenth Problem'', commentary to Chapter 5 of the book, at http://logic.pdmi.ras.ru/yumat/H10Pbook/commch_5htm. )
 
It appears that Lambek, Melzak, Minsky, and Shepherdson and Sturgis independently anticipated the same idea at the same time. See the Precedence section below.
 
The history begins with Wang's model.
 
=== (1954, 1957) Wang's model: Post-Turing machine ===
Wang's work followed from [[Emil Post]]'s (1936) paper and led Wang to his definition of his [[Wang B-machine]]—a two-symbol [[Post-Turing machine]] computation model with only four atomic instructions:
:{ LEFT, RIGHT, PRINT, JUMP_if_marked_to_instruction_z }
 
To these four both Wang (1954, 1957) and then C. Y. Lee (1961) added another instruction from the Post set { ERASE }, and then Post's unconditional jump { JUMP_to_instruction_z } (or, to make things easier, the conditional jump JUMP_IF_blank_to_instruction_z, or both). Lee named this a "W-machine" model:
:{ LEFT, RIGHT, PRINT, ERASE, JUMP_if_marked, [maybe JUMP or JUMP_IF_blank] }
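
The following Python sketch illustrates a machine with this repertoire: one two-way tape whose squares are either marked or blank, a single head, and a sequential program. It is an illustration only; the instruction spellings, the program encoding as tuples, and the convention that running off the end of the program halts the machine are assumptions of the sketch, not Wang's or Lee's notation.

<syntaxhighlight lang="python">
# Illustrative sketch of a Wang/Lee "W-machine": one two-way tape of squares that
# are either marked or blank, a single head, and a sequentially executed program.
# Instruction names and the program encoding below are assumptions for illustration.

def run_w_machine(program, marked=frozenset()):
    marked = set(marked)     # positions of marked squares; everything else is blank
    head, pc = 0, 0
    while pc < len(program):                  # running off the end halts the machine
        op, *arg = program[pc]
        if op == "LEFT":
            head -= 1
        elif op == "RIGHT":
            head += 1
        elif op == "PRINT":
            marked.add(head)
        elif op == "ERASE":
            marked.discard(head)
        elif op == "JUMP_IF_MARKED":
            if head in marked:
                pc = arg[0]
                continue
        elif op == "JUMP":                    # Post's unconditional jump
            pc = arg[0]
            continue
        pc += 1                               # default: fall through to next instruction
    return marked

# Example: march right over a block of marks and print one more at its end.
program = [
    ("JUMP_IF_MARKED", 3),   # 0: still on the block? go to 3
    ("PRINT",),              # 1: first blank square reached: mark it
    ("JUMP", 5),             # 2: done (jumps past the end of the program)
    ("RIGHT",),              # 3: step right
    ("JUMP", 0),             # 4: repeat the test
]
print(sorted(run_w_machine(program, marked={0, 1, 2})))   # [0, 1, 2, 3]
</syntaxhighlight>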
 
Wang expressed hope that his model would be "a rapprochement" (p.&nbsp;63) between the theory of Turing machines and the practical world of the computer.
 
Wang's work was highly influential. We find him referenced by Minsky (1961) and (1967), Melzak (1961), Shepherdson and Sturgis (1963). Indeed, Shepherdson and Sturgis (1963) remark that:
:"...we have tried to carry a step further the 'rapprochement' between the practical and theoretical aspects of computation suggested by Wang" (p. 218)
 
[[Martin Davis]] eventually evolved this model into the (2-symbol) Post-Turing machine.
 
'''Difficulties with the Wang/Post-Turing model''':
 
Except there was a problem: the Wang model (the six instructions of the 7-instruction Post-Turing machine) was still a single-tape Turing-like device, however nice its ''sequential program instruction-flow'' might be. Both Melzak (1961) and Shepherdson and Sturgis (1963) observed this (in the context of certain proofs and investigations):
 
:"...a Turing machine has a certain opacity... a Turing machine is slow in (hypothetical) operation and, usually, complicated. This makes it rather hard to design it, and even harder to investigate such matters as time or storage optimization or a comparison between efficiency of two algorithms. (Melzak (1961) p. 281)
 
:"...although not difficult ... proofs are complicated and tedious to follow for two reasons: (1) A Turing machine has only head so that one is obliged to break down the computation into very small steps of operations on a single digit. (2) It has only one tape so that one has to go to some trouble to find the number one wishes to work on and keep it separate from other numbers" (Shepherdson and Sturgis (1963) p. 218).
 
Indeed, as the examples at [[Turing machine examples]], Post-Turing machine and [[partial function]] show, the work can be "complicated".
<!-- Example: Multiply '''a''' x '''b''' = '''c''', for example: 3 x 4 = 12.
 
The scanned square is indicated by brackets around the mark i.e. ['''1''']. An extra mark serves to indicate the symbol "0".
 
At the start of a computation, just as Shepherdson-Sturgis and Melzak complain, we see the variables expressed in unary—i.e. the tally marks for '''a'''= '''| | | |''' and '''b''' = '''| | | | |''' – "in a line" (concatenated on what Melzak calls a "linear tape"). Space must be available for '''c''' at the end of the computation, extending without bounds to the right:
{|class="wikitable"
|- style="font-size:9pt" align="center" valign="bottom"
| width="14.4" Height="11.4" |
| width="13.8" |
| width="13.8" |
| width="13.8" | top
| width="13.8" | a
| width="13.8" | a
| width="13.8" | a
| width="13.8" |
| width="13.8" | top
| width="13.8" | b
| width="13.8" | b
| width="13.8" | b
| width="15.6" | b
| width="13.8" |
| width="13.8" | btm
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" |
| width="13.8" |
| width="13.8" |
|- style="font-size:9pt" align="center" valign="bottom"
| Height="11.4" |
|
|
|style="background-color:#FFFF99" | [1]
|style="background-color:#FFFF99" | 1
|style="background-color:#FFFF99" | 1
|style="background-color:#FFFF99" | 1
|
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|}
 
At the end of the computation the multiplier '''b''' is 5 marks "in a line" (i.e. concatenated) to left of the 13 marks of product '''c'''.
{|class="wikitable"
|- style="font-size:9pt" align="center" valign="bottom"
| width="14.4" Height="11.4" |
| width="13.8" |
| width="13.8" |
| width="13.8" | top
| width="13.8" | a
| width="13.8" | a
| width="13.8" | a
| width="13.8" |
| width="13.8" | top
| width="13.8" | b
| width="13.8" | b
| width="13.8" | b
| width="15.6" | b
| width="13.8" |
| width="16.8" | btm
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
| width="13.8" | c
|- style="font-size:9pt" align="center" valign="bottom"
| Height="11.4" |
|
|
|
|
|
|
|
|style="background-color:#CCFFCC" | [1]
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|style="background-color:#99CCFF" | 1
|
|
|
|}-->
 
===Minsky, Melzak-Lambek and Shepherdson-Sturgis models "cut the tape" into many===
So why not 'cut the tape' so each is infinitely long (to accommodate any size integer) but left-ended, and call each of these tapes a "Post-Turing (i.e. Wang-like) tape"? The individual heads will move left (for decrement) and right (for increment). In one sense the heads indicate "the tops of the stacks" of concatenated marks. Or, as in Minsky (1961) and Hopcroft and Ullman (1979, p.&nbsp;171ff), the tape is always blank except for a mark at the left end—at no time does a head ever print or erase.
 
We just have to be careful to write our instructions so that a test-for-zero and jump occurs ''before'' we decrement otherwise our machine will "fall off the end" or "bump against the end"—we will have an instance of a [[partial function]]. Before a decrement our machine must always ask the question: "Is the tape/counter empty? If so then I can't decrement, otherwise I can."
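
A concrete instance of this "test before decrement" discipline is proper (truncated) subtraction. The following program uses the same assumed tuple encoding as the interpreter sketch in the Formal definition section; it tests each register for zero before removing a mark, so the machine can never fall off the left end of a tape. The register and label names are illustrative assumptions.

<syntaxhighlight lang="python">
# Proper ("truncated") subtraction a -. b on the { INC, JZDEC } machine sketched
# in the Formal definition section: decrement b and a together, but always test a
# register for zero *before* decrementing it, so the machine never "falls off the end".
# Register names and the always-empty "zero" register trick are illustrative assumptions.
proper_subtract = [
    ("JZDEC", "b", 3),     # 0: b == 0?  then a already holds a -. b, jump to HALT
    ("JZDEC", "a", 3),     # 1: a == 0?  then a -. b is 0, stop here as well
    ("JZDEC", "zero", 0),  # 2: unconditional jump back to 0 ("zero" stays empty)
    ("HALT",),             # 3:
]
# run(proper_subtract, {"a": 3, "b": 5, "zero": 0})  leaves a == 0 (never negative);
# run(proper_subtract, {"a": 5, "b": 3, "zero": 0})  leaves a == 2.
</syntaxhighlight>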
 
:''For example of the addition algorithm written for a counter machine see [[Algorithm examples]], and for an example of (im-) proper subtraction see [[Partial function]].''
 
Minsky (1961) and Shepherdson-Sturgis (1963) prove that only a few tapes—as few as one—still allow the machine to be Turing equivalent ''IF'' the data on the tape is represented as a [[Gödel number]] (or some other uniquely encodable-decodable number); this number will evolve as the computation proceeds. In the one tape version with Gödel number encoding the counter machine must be able to (i) multiply the Gödel number by a constant (numbers "2" or "3"), and (ii) divide by a constant (numbers "2" or "3") and jump if the remainder is zero. Minsky (1967) shows that the need for this bizarre instruction set can be relaxed to { INC (r), JZDEC (r, z) } and the convenience instructions { CLR (r), J (r) } if two tapes are available. A simple Gödelization is still required, however. A similar result appears in Elgot-Robinson (1964) with respect to their RASP model.
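
As a hedged illustration of the single-register encoding (not a transcription of Minsky's construction), two counters a and b can be packed into one Gödel number G = 2<sup>a</sup>·3<sup>b</sup>: incrementing a counter multiplies G by its prime, and the test-and-decrement becomes a divisibility test followed by one division.

<syntaxhighlight lang="python">
# Illustrative sketch (an assumed encoding, not a transcription of Minsky 1961/1967):
# pack two counters a and b into a single Goedel number G = 2**a * 3**b.
PRIME = {"a": 2, "b": 3}

def inc(G, r):
    """INC (r): multiply G by the register's prime."""
    return G * PRIME[r]

def jzdec(G, r):
    """JZDEC (r): if G is not divisible by the prime the counter is zero (take the
    jump); otherwise divide once, i.e. decrement the counter (fall through)."""
    p = PRIME[r]
    if G % p != 0:
        return G, True            # counter was zero: jump
    return G // p, False          # counter decremented: fall through

G = inc(inc(inc(1, "a"), "a"), "b")    # a = 2, b = 1  ->  G = 2 * 2 * 3 = 12
G, jumped = jzdec(G, "a")              # a = 1         ->  G = 6,  jumped = False
G, jumped = jzdec(G, "b")              # b = 0         ->  G = 2,  jumped = False
G, jumped = jzdec(G, "b")              # b already 0   ->  G = 2,  jumped = True
print(G, jumped)                       # 2 True
</syntaxhighlight>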
<!-- To do a multiplication algorithm we don't need the extra mark to indicate "0", but we will need an extra "temporary" tape '''t'''.  And we will need an extra "blank/zero" register (e.g. register #0) for an unconditional jump:
 
{|class="wikitable"
|- style="font-size:9pt" align="center" valign="bottom"
|style="font-weight:bold" width="83.4" Height="12" | At  the start:
| width="16.8" |
| width="15" |
| width="16.2" |
| width="15" |
| width="15" |
| width="15" |
| width="15" |
| width="15" |
| width="15" |
| width="15" |
| width="15" |
| width="15" |
| width="15" |
| width="15" |
|- style="font-size:9pt" align="center" valign="bottom"
| Height="12" | register 0:
|style="font-weight:bold" | []
|
|
|
|
|
|
|
|
|
|
|
|
|
|- style="font-size:9pt" align="center" valign="bottom"
| Height="12" | a = register 1:
|style="background-color:#FFFF99" | 1
|style="background-color:#FFFF99" | 1
|style="background-color:#FFFF99" | 1
|style="background-color:#FFFF99;font-weight:bold" | []
|style="font-weight:bold" |
|
|
|
|
|
|
|
|
|
|- style="font-size:9pt" align="center" valign="bottom"
| Height="12" | b = register 2:
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC;font-weight:bold" | []
|
|
|
|
|
|
|
|
|
|- style="font-size:9pt" align="center" valign="bottom"
| Height="12" | c = register 3:
|style="background-color:#CCFFFF;font-weight:bold" | []
|
|
|
|
|
|
|
|
|
|
|
|
|
|- style="font-size:9pt" align="center" valign="bottom"
| Height="12" | t = register 4:
|style="font-weight:bold" | []
|
|
|
|
|
|
|
|
|
|
|
|
|
|- style="font-size:9pt" align="center" valign="bottom"
| Height="3" |
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|- style="font-size:9pt"
|style="font-weight:bold" Height="12" align="center" valign="bottom" | At the end:
|  valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
| align="center" valign="bottom" |
|- style="font-size:9pt" align="center" valign="bottom"
| Height="12" | register 0:
|style="font-weight:bold" | []
|
|
|
|
|
|
|
|
|
|
|
|
|
|- style="font-size:9pt" align="center" valign="bottom"
| Height="12" | a = register 1:
|style="background-color:#FFFF99;font-weight:bold" | []
|
|
|
|style="font-weight:bold" |
|
|
|
|
|
|
|
|
|
|- style="font-size:9pt" align="center" valign="bottom"
| Height="12" | b = register 2:
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC" | 1
|style="background-color:#CCFFCC;font-weight:bold" | []
|
|
|
|
|
|
|
|
|
|- style="font-size:9pt" align="center" valign="bottom"
| Height="12" | c = register 3:
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF" | 1
|style="background-color:#CCFFFF;font-weight:bold" | []
|
|- style="font-size:9pt" align="center" valign="bottom"
| Height="12" | t = register 4:
| 1
| 1
| 1
| 1
|style="font-weight:bold" | []
|
|
|
|
|
|
|
|
|
|}
 
We can write simple Post-Turing "subroutines" to atomize "increment" and "decrement" into Post-Turing instructions. Note that the head stays always just one square to the right of the top-most printed mark, i.e. at the "top of the stack". "r" is a parameter in the instructions that symbolizes the tape-as-register to be moved and printed or erased, and tested:
:
: "Increment r" = PRINT_SCANNED_SQUARE_of_TAPE_r, MOVE_TAPE_r_LEFT; i.e. (or: move tape r's head right)
::'''X+''' r is equivalent to '''P''' r; '''L''' r
: "Decrement r" = JUMP_IF_TAPE_r_BLANK(ZERO) TO XXX, ELSE MOVE_TAPE_r_RIGHT, ERASE_SCANNED_SQUARE_of_TAPE_rN; (or: move tape r's head left)
::'''X-''' r is equivalent to '''J0''' r, xxx; '''R''' r; '''E''' r
 
Indeed this is similar to the approach that Minsky (1961) took. He started with 4 left-ended tape-machine that:
: "used the basic arithmetic device of the present paper. Then, two of the tapes were eliminated by the prime-factor method" (p. 438).
 
He then observed that:
: "we may formulate these results so that the operations act essentially only on the ''length'' of the strings" (his italics, p. 449).
 
His first model, "1961" (it had changed by 1967) started out with only a single mark at the left end of each tape-as-register. The machine was not allowed to '''P'''rint any marks, just move '''L'''eft or '''R'''ight and test for the mark = "1" in the following example. Thus the conventional Post-Turing-like instruction set went from
: { R; L; P; E; J0 xxx; J1 xxx, H }
 
to, for each tape-as-register:
: { R; L; J1 xxx, H }
 
where '''R''' can be renamed '''INC'''rement, '''R''' can be renamed '''DEC'''rement, "J1" can be combined with "DEC" to create a non-atomized instruction, or can be kept separate and renamed '''J'''ump if '''Z'''ero. -->
 
=== (1961) Melzak's model is different: clumps of pebbles go into and out of holes ===
Melzak's (1961) model is significantly different. He took his own model, flipped the tapes vertically, called them "holes in the ground" to be filled with "pebble counters". Unlike Minsky's "increment" and "decrement", Melzak allowed for proper subtraction of any count of pebbles and "adds" of any count of pebbles.
 
He defines indirect addressing for his model (p.&nbsp;288) and provides two examples of its use (p.&nbsp;89); his "proof" (p.&nbsp;290-292) that his model is [[Turing completeness|Turing equivalent]] is so sketchy that the reader cannot tell whether or not he intended the indirect addressing to be a requirement for the proof.
 
The legacy of Melzak's model is Lambek's simplification and the reappearance of his mnemonic conventions in Cook and Reckhow (1973).
 
=== Lambek (1961) atomizes Melzak's model into the Minsky (1961) model: INC and DEC-with-test ===
Lambek (1961) took Melzak's ternary model and atomized it down to the two unary instructions—X+, X- if possible else jump—exactly the same two that Minsky (1961) had come up with.
 
However, like the Minsky (1961) model, the Lambek model does not rely on a default instruction sequence: both X+ and X- carry the identifier of the next instruction, and X- also carries the jump-to instruction if the zero-test is successful.
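
Because every instruction names its successor(s) explicitly, an abacus program can be written as a table keyed by arbitrary labels rather than by sequential position. The sketch below is a hedged illustration of that convention; the tuple layout, the label names and the HALT sentinel are assumptions, not Lambek's own notation.

<syntaxhighlight lang="python">
# Abacus-machine sketch in the Lambek/Minsky style: every instruction carries the
# label of its successor; X- carries two successors (non-zero branch, zero branch).
# Layout and names below are illustrative assumptions.
def run_abacus(program, registers, start):
    label = start
    while label != "HALT":
        instruction = program[label]
        if instruction[0] == "X+":                 # (X+, register, next)
            _, r, nxt = instruction
            registers[r] += 1
            label = nxt
        else:                                      # (X-, register, next_if_nonzero, next_if_zero)
            _, r, nxt, if_zero = instruction
            if registers[r] == 0:
                label = if_zero
            else:
                registers[r] -= 1
                label = nxt
    return registers

# Add register a into register b:  while a > 0: a -= 1; b += 1
program = {
    "loop": ("X-", "a", "bump", "HALT"),
    "bump": ("X+", "b", "loop"),
}
print(run_abacus(program, {"a": 2, "b": 5}, "loop"))   # {'a': 0, 'b': 7}
</syntaxhighlight>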
 
=== Elgot-Robinson (1964) and the problem of the RASP without indirect addressing ===
A RASP or [[Random access stored program machine]] begins as a counter machine with its "program of instruction" placed in its "registers". Analogous to, but independent of, the finite state machine's "Instruction Register", at least one of the registers (nicknamed the "program counter" (PC)) and one or more "temporary" registers maintain a record of, and operate on, the current instruction's number. The finite state machine's TABLE of instructions is responsible for (i) fetching the current ''program'' instruction from the proper register, (ii) parsing the ''program'' instruction, (iii) fetching operands specified by the ''program'' instruction, and (iv) executing the ''program'' instruction.

Except there is a problem: if based on the ''counter machine'' chassis, this computer-like, [[von Neumann]] machine will not be Turing equivalent. It cannot compute everything that is computable. Intrinsically the model is bounded by the size of its (very-) ''finite'' state machine's instructions. The counter machine-based RASP can compute any [[primitive recursive function]] (e.g. multiplication) but not all [[mu recursive function]]s (e.g. the [[Ackermann function]]).
 
Elgot-Robinson investigate the possibility of allowing their RASP model to "self modify" its program instructions. The idea was an old one, proposed by Burks-Goldstine-von Neumann (1946-7), and sometimes called "the computed goto." Melzak (1961) specifically mentions the "computed goto" by name but instead provides his model with indirect addressing.
 
'''Computed goto:''' A RASP ''program'' of instructions that modifies the "goto address" in a conditional- or unconditional-jump ''program'' instruction.
 
But this does not solve the problem (unless one resorts to [[Gödel number]]s). What is necessary is a method to fetch the address of a program instruction that lies (far) "beyond/above" the upper bound of the ''finite'' state machine's instruction register and TABLE.
 
:Example: A counter machine equipped with only four unbounded registers can e.g. multiply any two numbers ( m, n ) together to yield p—and thus be a primitive recursive function—no matter how large the numbers m and n; moreover, fewer than 20 instructions are required to do this! e.g. { 1: CLR ( p ), 2: JZ ( m, done ), 3: outer_loop: JZ ( n, done ), 4: CPY ( m, temp ), 5: inner_loop: JZ ( temp, end_inner ), 6: DEC ( temp ), 7: INC ( p ), 8: J ( inner_loop ), 9: end_inner: DEC ( n ), 10: J ( outer_loop ), 11: done: HALT }. (A runnable rendering of this program follows this example.)
 
:However, with only 4 registers, this machine is not nearly big enough to build a RASP that can execute the multiply algorithm as a ''program''. No matter how big we build our finite state machine there will always be a ''program'' (including its parameters) which is larger. So by definition the bounded program machine that does not use unbounded encoding tricks such as Gödel numbers cannot be ''universal''.
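
A direct, hedged Python rendering of the four-register multiply program above; the while/if structure stands in for the JZ tests, and the comments give the corresponding instruction numbers. The counter machine itself would hold these steps as a fixed instruction TABLE in its finite state machine rather than as Python.

<syntaxhighlight lang="python">
# A direct rendering of the four-register multiply program above (illustrative only:
# the real counter machine holds the numbered instructions in its finite-state TABLE).
def multiply(m, n):
    p = 0                       # 1: CLR ( p )
    if m == 0:                  # 2: JZ ( m, done )
        return p
    while n != 0:               # 3: outer_loop: JZ ( n, done )
        temp = m                # 4: CPY ( m, temp )
        while temp != 0:        # 5: inner_loop: JZ ( temp, end_inner )
            temp -= 1           # 6: DEC ( temp )
            p += 1              # 7: INC ( p )
                                # 8: J ( inner_loop )
        n -= 1                  # 9: end_inner: DEC ( n )
                                # 10: J ( outer_loop )
    return p                    # 11: done: HALT

print(multiply(6, 7))           # 42
</syntaxhighlight>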
 
Minsky (1967) hints at the issue in his investigation of counter machines (he calls them "program computer models") equipped with the instructions { CLR (r), INC (r), and RPT ("a" times the instructions m to n) }. He doesn't tell us how to fix the problem, but he does observe that:
: "... the program computer has to have some way to keep track of how many RPT's remain to be done, and this might exhaust any particular amount of storage allowed in the finite part of the computer. RPT operations require infinite registers of their own, in general, and they must be treated differently from the other kinds of operations we have considered." (p. 214)
 
But Elgot and Robinson solve the problem: They augment their P<sub>0</sub> RASP with an indexed set of instructions—a somewhat more complicated (but more flexible) form of indirect addressing. Their P'<sub>0</sub> model addresses the registers by adding the contents of the "base" register (specified in the instruction) to the "index" specified explicitly in the instruction (or vice versa, swapping "base" and "index"). Thus the indexing P'<sub>0</sub> instructions have one more parameter than the non-indexing P<sub>0</sub> instructions:
: Example: INC ( r<sub>base</sub>, index ) ; effective address will be [r<sub>base</sub>] + index, where the natural number "index" is derived from the finite-state machine instruction itself.
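
A small sketch of this indexed form of INC, with the register file held as a Python dictionary; the function name and the choice of register 1 as the base register are illustrative assumptions.

<syntaxhighlight lang="python">
# Indexed addressing in the style of the P'0 instructions (illustrative sketch):
# the effective address is the contents of a named base register plus a constant
# index carried by the instruction itself.
def inc_indexed(registers, base, index):
    effective = registers[base] + index              # [r_base] + index
    registers[effective] = registers.get(effective, 0) + 1
    return registers

registers = {1: 40, 43: 7}                           # register 1 serves as the base register
inc_indexed(registers, base=1, index=3)              # effective address = 40 + 3 = 43
print(registers[43])                                 # 8
</syntaxhighlight>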
 
=== Hartmanis (1971) ===
By 1971 Hartmanis has simplified the indexing to [[indirection]] for use in his RASP model.
 
'''Indirect addressing:''' A pointer-register supplies the finite state machine with the address of the target register required for the instruction. Said another way: The ''contents'' of the pointer-register is the ''address'' of the "target" register to be used by the instruction. If the pointer-register is unbounded, the RAM, and a suitable RASP built on its chassis, will be Turing equivalent. The target register can serve either as a source or destination register, as specified by the instruction.
 
Note that the finite state machine does not have to explicitly specify this target register's address. It just says to the rest of the machine: Get me the contents of the register pointed to by my pointer-register and then do xyz with it. It must specify explicitly by name, via its instruction, this pointer-register (e.g. "N", or "72" or "PC", etc.) but it doesn't have to know what number the pointer-register actually contains (perhaps 279,431).
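
A small sketch of the indirection just described, with assumed names: the instruction supplies only the name of the pointer register, and the machine dereferences its contents to find the target register.

<syntaxhighlight lang="python">
# Indirect addressing (illustrative sketch): the instruction names only a pointer
# register; the *contents* of that register is the address of the target register.
def load_indirect(registers, pointer):
    """Load the accumulator from the register that the pointer register points at."""
    target = registers[pointer]                      # dereference the pointer register
    registers["ACC"] = registers[target]

def store_indirect(registers, pointer):
    """Store the accumulator into the register that the pointer register points at."""
    target = registers[pointer]
    registers[target] = registers["ACC"]

registers = {"ACC": 0, "N": 279431, 279431: 17}      # "N" points at register 279,431
load_indirect(registers, "N")
print(registers["ACC"])                              # 17
</syntaxhighlight>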
 
=== Cook and Reckhow (1973) describe the RAM ===
Cook and Reckhow (1973) cite Hartmanis (1971) and simplify his model to what they call a [[Random access machine]] (RAM—i.e. a machine with indirection and the [[Harvard architecture]]). In a sense we are back to Melzak (1961) but with a much simpler model than Melzak's.
 
== Precedence ==
 
Minsky was working at the [[Lincoln Laboratory|M.I.T. Lincoln Labs]] and published his work there; his paper was received for publication in the ''Annals of Mathematics'' on August 15, 1960, but not published until November 1961. Receipt thus occurred a full year before the work of Melzak and Lambek was received (respectively, in May and on June 15, 1961) and published side by side in September 1961. That (i) both were Canadians and published in the Canadian Mathematical Bulletin, (ii) neither would have had reference to Minsky's work because it was not yet published in a peer-reviewed journal, but (iii) Melzak references Wang, and Lambek references Melzak, leads one to hypothesize that their work occurred simultaneously and independently.
 
Almost exactly the same thing happened to Shepherdson and Sturgis. Their paper was received in December 1961—just a few months after Melzak and Lambek's work was received. Again, they had little (at most 1 month) or no benefit of reviewing the work of Minsky. They were careful to observe in footnotes that papers by Ershov, Kaphengst and Peter had "recently appeared" (p.&nbsp;219). These were published much earlier but appeared in the German language in German journals so issues of accessibility present themselves.
 
The final paper of Shepherdson and Sturgis did not appear in a peer-reviewed journal until 1963. And as they fairly and honestly note in their Appendix A, the 'systems' of Kaphengst (1959), Ershov (1958) and Peter (1958) are all so similar to the results obtained later as to be indistinguishable from a set of the following:
: produce 0 i.e. 0 --> n
: increment a number i.e. n+1 --> n
::"i.e. of performing the operations which generate the natural numbers" (p. 246)
: copy a number i.e. n --> m
: to "change the course of a computation", either comparing two numbers or decrementing until 0
 
Indeed, Shepherdson and Sturgis conclude
::"The various minimal systems are very similar" (p. 246)
 
By order of ''publishing'' date, the works of Kaphengst (1959), Ershov (1958) and Peter (1958) were first.
 
==See also==
<div style="-moz-column-count:3; column-count:3;">
* [[Counter machine]]
:* [[Counter machine:Reference model]]
:* [[Counter machine models]]
* [[Pointer machine]]
* [[Random access machine]]
* [[Random access stored program machine]]
* [[Turing machine]]
:* [[Universal Turing machine]]
:* [[Turing machine gallery]]
:* [[Turing machine examples]]
* [[Wang B-machine]]
* [[Post-Turing machine]] - description plus examples
* [[Algorithm]]
:* [[Algorithm characterizations]]
:* [[Algorithm examples]]
* [[Halting problem]]
* [[Busy beaver]]
* [[Stack machine]]
</div>
 
== Bibliography ==
'''Background texts:''' The following bibliography of source papers includes a number of texts to be used as background. The mathematics that led to the flurry of papers about abstract machines in the 1950s and 1960s can be found in van Heijenoort (1967)—an assemblage of original papers spanning the 50 years from Frege (1879) to Gödel (1931). Davis (ed.) ''The Undecidable'' (1965) carries the torch onward beginning with Gödel (1931) through Gödel's (1964) postscriptum (p.&nbsp;71); the original papers of [[Alan Turing]] (1936-7) and [[Emil Post]] (1936) are included in ''The Undecidable''. The mathematics of Church, Rosser and Kleene that appear as reprints of original papers in ''The Undecidable'' is carried further in Kleene (1952), a mandatory text for anyone pursuing a deeper understanding of the mathematics behind the machines. Both Kleene (1952) and Davis (1958) are referenced by a number of the papers.
 
For a good treatment of the counter machine see Minsky (1967) Chapter 11 "Models similar to Digital Computers"—he calls the counter machine a "program computer". A recent overview is found in van Emde Boas (1990). A recent treatment of the Minsky (1961)/Lambek (1961) model can be found in Boolos-Burgess-Jeffrey (2002); they reincarnate Lambek's "abacus model" to demonstrate equivalence of Turing machines and partial recursive functions, and they provide a graduate-level introduction to both abstract machine models (counter- and Turing-) and the mathematics of recursion theory. Beginning with the first edition, Boolos-Burgess (1970), this model has appeared with virtually the same treatment.
 
'''The papers''': The papers begin with Wang (1957) and his dramatic simplification of the Turing machine. Turing (1936), Kleene (1952), Davis (1958) and in particular Post (1936) are cited in Wang (1957); in turn, Wang is referenced by Melzak (1961), Minsky (1961) and Shepherdson-Sturgis (1961-3) as they independently reduce the Turing tapes to "counters". Melzak (1961) provides his pebble-in-holes counter machine model with indirection but doesn't carry the treatment further. The work of Elgot-Robinson (1964) defines the RASP—the computer-like [[random access stored program machine]]s—and appears to be the first to investigate the failure of the bounded [[counter machine]] to calculate the mu-recursive functions. This failure—except with the draconian use of [[Gödel number]]s in the manner of Minsky (1961)—leads to their definition of "indexed" instructions (i.e. indirect addressing) for their RASP model. Elgot-Robinson (1964) and, more so, Hartmanis (1971) investigate RASPs with self-modifying programs. Hartmanis (1971) specifies an instruction set with indirection, citing lecture notes of Cook (1970). For use in investigations of computational complexity, Cook and his graduate student Reckhow (1973) provide the definition of a RAM (their model and mnemonic convention are similar to Melzak's, but they offer him no reference in the paper). The pointer machines are an offshoot of Knuth (1968, 1973) and, independently, Schönhage (1980).
 
For the most part the papers contain mathematics beyond the undergraduate level—in particular the [[primitive recursive function]]s and [[mu recursive function]]s presented elegantly in Kleene (1952) and less in depth, but useful nonetheless, in Boolos-Burgess-Jeffrey (2002).
 
All texts and papers excepting the four starred have been witnessed. These four are written in German and appear as references in Shepherdson-Sturgis (1963) and Elgot-Robinson (1964); Shepherdson-Sturgis (1963) offer a brief discussion of their results in their Appendix A. The terminology of at least one paper (Kaphengst (1959)) seems to hark back to the Burks-Goldstine-von Neumann (1946-7) analysis of computer architecture.
 
{|class="wikitable"
|-  style="text-align:center; font-size:9pt; vertical-align:bottom;"
!  style="width:114.6; height:49.2;"| Author
! style="width:43.2;"| Year
! style="width:46.8;"| Reference
! style="width:41.4;"| Turing machine
! style="width:43.2;"| Counter machine
! style="width:27.6;"| RAM
! style="width:28.2;"| RASP
! style="width:38.4;"| Pointer machine
! style="width:54px;"| Indirect addressing
! style="width:43.8;"| Self-modifying program
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| Goldstine & von Neumann
|  style="text-align:center; "| 1947
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| Kleene
|  style="text-align:center; "| 1952
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| *Hermes
|  style="text-align:center; "| 1954, 5
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| ?
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| Wang
|  style="text-align:center; "| 1957
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "| hints
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| hints
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| *Peter
|  style="text-align:center; "| 1958
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| ?
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:12px; "| Davis
|  style="text-align:center; "| 1958
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| *Ershov
|  style="text-align:center; "| 1959
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| ?
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| *Kaphengst
|  style="text-align:center; "| 1959
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| ?
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| Melzak
|  style="text-align:center; "| 1961
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "| hints
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| Lambek
|  style="text-align:center; "| 1961
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| Minsky
|  style="text-align:center; "| 1961
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| Shepherdson & Sturgis
|  style="text-align:center; "| 1963
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| hints
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:12px; "| Elgot & Robinson
|  style="text-align:center; "| 1964
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:12px; "| Davis- Undecidable
|  style="text-align:center; "| 1965
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:12px; "| van Heijenoort
|  style="text-align:center; "| 1967
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:12px; "| Minsky
|  style="text-align:center; "| 1967
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "| hints
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| hints
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| Knuth
|  style="text-align:center; "| 1968, 73
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:12px; "| Hartmanis
|  style="text-align:center; "| 1971
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:12px; "| Cook & Reckhow
|  style="text-align:center; "| 1973
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| Schonhage
|  style="text-align:center; "| 1980
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:11.4; "| van Emde Boas
|  style="text-align:center; "| 1990
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|-  style="font-size:9pt; vertical-align:bottom;"
|  style="height:12.6; "| Boolos & Burgess; Boolos, Burgess & Jeffrey
|  style="text-align:center; "| 1970–2002
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "| X
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|  style="text-align:center; "|
|}
 
==References==
<references/>
* [[George Boolos]], [[John P. Burgess]], [[Richard Jeffrey]] (2002), ''Computability and Logic: Fourth Edition'', Cambridge University Press, Cambridge, England. The original Boolos-Jeffrey text has been extensively revised by Burgess: more advanced than an introductory textbook. "Abacus machine" model is extensively developed in Chapter 5 ''Abacus Computability''; it is one of three models extensively treated and compared—the Turing machine (still in Boolos' original 4-tuple form) and recursion the other two.
* [[Arthur Burks]], [[Herman Goldstine]], [[John von Neumann]] (1946), ''Preliminary discussion of the logical design of an electronic computing instrument'', reprinted pp.&nbsp;92ff in [[Gordon Bell]] and [[Allen Newell]] (1971), ''Computer Structures: Readings and Examples'', McGraw-Hill Book Company, New York. ISBN 0-07-004357-4.
* [[Stephen Cook|Stephen A. Cook]] and Robert A. Reckhow (1972), ''Time-bounded random access machines'', Journal of Computer Systems Science 7 (1973), 354-375.
* [[Martin Davis]] (1958), ''Computability & Unsolvability'', McGraw-Hill Book Company, Inc. New York.
* [[Calvin Elgot]] and [[Abraham Robinson]] (1964), ''Random-Access Stored-Program Machines, an Approach to Programming Languages'', Journal of the Association for Computing Machinery, Vol. 11, No. 4 (October, 1964), pp.&nbsp;365–399.
* [[Juris Hartmanis|J. Hartmanis]] (1971), "Computational Complexity of Random Access Stored Program Machines," Mathematical Systems Theory 5, 3 (1971) pp.&nbsp;232–245.
* [[John Hopcroft]], [[Jeffrey Ullman]] (1979). ''Introduction to Automata Theory, Languages and Computation'', 1st ed., Reading Mass: Addison-Wesley. ISBN 0-201-02988-X.  A difficult book centered around the issues of machine-interpretation of "languages", NP-Completeness, etc.
* [[Stephen Kleene]] (1952), ''Introduction to Metamathematics'', North-Holland Publishing Company, Amsterdam, Netherlands. ISBN 0-7204-2103-9.
*[[Donald Knuth]] (1968), ''The Art of Computer Programming'', Second Edition 1973, Addison-Wesley, Reading, Massachusetts. Cf pages 462-463 where he defines "a new kind of abstract machine or 'automaton' which deals with linked structures."
*[[Joachim Lambek]] (1961, received 15 June 1961), ''How to Program an Infinite Abacus'', Canadian Mathematical Bulletin, vol. 4, no. 3. September 1961 pages 295-302. In his Appendix II, Lambek proposes a "formal definition of 'program'". He references Melzak (1961) and Kleene (1952) ''Introduction to Metamathematics''.
*[[Zdzislaw Alexander Melzak|Z. A. Melzak]] (1961, received 15 May 1961), ''An informal Arithmetical Approach to Computability and Computation'', Canadian Mathematical Bulletin, vol. 4, no. 3. September 1961 pages 279-293. Melzak offers no references but acknowledges "the benefit of conversations with Drs. R. Hamming, D. McIlroy and V. Vyssots of the Bell telephone Laborators and with Dr. H. Wang of Oxford University."
*{{cite journal|author=[[Marvin Minsky]]
|title=Recursive Unsolvability of Post's Problem of 'Tag' and Other Topics in Theory of Turing Machines
|journal=Annals of Math
|date=1961, received August 15, 1960
|volume=74
|pages=437–455
|doi=10.2307/1970290|issue=3|publisher=The Annals of Mathematics, Vol. 74, No. 3|jstor=1970290
}}
*{{cite book |author= Marvin Minsky |title = Computation: Finite and Infinite Machines | edition = 1st | publisher = Prentice-Hall, Inc.| location = Englewood Cliffs, N. J. | year = 1967}} In particular see chapter 11: ''Models Similar to Digital Computers'' and chapter 14: ''Very Simple Bases for Computability''. In the former chapter he defines "Program machines" and in the later chapter he discusses "Universal Program machines with Two Registers" and "...with one register", etc.
*[[John C. Shepherdson]] and [[H. E. Sturgis]] (1961) received December 1961 ''Computability of Recursive Functions'', Journal of the Association for Computing Machinery (JACM) 10:217-255, 1963. An extremely valuable reference paper. In their Appendix A the authors cite 4 others with reference to "Minimality of Instructions Used in 4.1: Comparison with Similar Systems".
:*Kaphengst, Heinz, ''Eine abstrakte programmgesteuerte Rechenmaschine'', Zeitschrift für mathematische Logik und Grundlagen der Mathematik ''5'' (1959), 366-379.
:*[[Andrey Ershov|Ershov, A. P.]] ''On operator algorithms'', (Russian) Dok. Akad. Nauk 122 (1958), 967-970. English translation, Automat. Express 1 (1959), 20-23.
:*[[Rózsa Péter|Péter, Rózsa]] ''Graphschemata und rekursive Funktionen'', Dialectica 12 (1958), 373.
:*Hermes, Hans, ''Die Universalität programmgesteuerter Rechenmaschinen'', Math.-Phys. Semesterberichte (Göttingen) 4 (1954), 42-53.
* [[Arnold Schönhage]] (1980), ''Storage Modification Machines'', Society for Industrial and Applied Mathematics, SIAM J. Comput. Vol. 9, No. 3, August 1980. Wherein Schönhage shows the equivalence of his SMM with the "successor RAM" (Random Access Machine), etc. resp. ''Storage Modification Machines'', in ''Theoretical Computer Science'' (1979), pp.&nbsp;36–37
*[[Peter van Emde Boas]], "Machine Models and Simulations" pp.&nbsp;3–66, in: [[Jan van Leeuwen]], ed. ''Handbook of Theoretical Computer Science. Volume A: Algorithms and Complexity'', The MIT PRESS/Elsevier, 1990. ISBN 0-444-88071-2 (volume A). QA 76.H279 1990. van Emde Boas' treatment of SMMs appears on pp.&nbsp;32–35. This treatment clarifies Schönhage 1980—it closely follows but expands slightly the Schönhage treatment. Both references may be needed for effective understanding.
*[[Hao Wang (academic)|Hao Wang]] (1957), ''A Variant to Turing's Theory of Computing Machines'', JACM (Journal of the Association for Computing Machinery) 4; 63-92. Presented at the meeting of the Association, June 23–25, 1954.
 
==External links==
* {{MathWorld|title=Register machine|urlname=RegisterMachine}}
* [http://www.igblan.free-online.co.uk/igblan/ca/minsky.html Igblan - Minsky Register Machines]
 
{{DEFAULTSORT:Register Machine}}
[[Category:Models of computation]]
[[Category:Register machines|*]]

Latest revision as of 04:09, 22 December 2013


In mathematical logic and theoretical computer science a register machine is a generic class of abstract machines used in a manner similar to a Turing machine. All the models are Turing equivalent.

Overview

The register machine gets its name from its use of one or more "registers". In contrast to the tape and head used by a Turing machine, the model uses multiple, uniquely addressed registers, each of which holds a single positive integer.

There are at least four sub-classes found in the literature, here listed from the most primitive to the most computer-like:

  • Counter machine – the most primitive and reduced theoretical model of computer hardware. It lacks indirect addressing. Instructions are in the finite state machine in the manner of the Harvard architecture.
  • Pointer machine – a blend of counter machine and RAM models. Less common and more abstract than either model. Instructions are in the finite state machine in the manner of the Harvard architecture.
  • Random access machine (RAM) – a counter machine with indirect addressing and, usually, an augmented instruction set. Instructions are in the finite state machine in the manner of the Harvard architecture.
  • Random-access stored-program machine model (RASP) – a RAM with instructions in its registers analogous to the Universal Turing machine; thus it is an example of the von Neumann architecture. But unlike a computer, the model is idealized with effectively infinite registers (and if used, effectively infinite special registers such as an accumulator). Unlike a computer or even a RISC, the instruction set is much reduced in number.

Any properly defined register machine model is Turing equivalent. Computational speed is very dependent on the model specifics.

In practical computer science, a similar concept known as a virtual machine is sometimes used to minimise dependencies on underlying machine architectures. Such machines are also used for teaching. The term "register machine" is sometimes used to refer to a virtual machine in textbooks.[1]

Formal definition

No standard terminology exists; each author is responsible for defining in prose the meanings of their mnemonics or symbols. Many authors use a "register-transfer"-like symbolism to explain the actions of their models, but again they are responsible for defining its syntax.

A register machine consists of:

  1. An unbounded number of labeled, discrete, unbounded registers unbounded in extent (capacity): a finite (or infinite in some models) set of registers r0, r1, ..., rn, each considered to be of infinite extent and each of which holds a single non-negative integer (0, 1, 2, ...).[2] The registers may do their own arithmetic, or there may be one or more special registers that do the arithmetic e.g. an "accumulator" and/or "address register". See also Random access machine.
  2. Tally counters or marks:[3] discrete, indistinguishable objects or marks of only one sort suitable for the model. In the most-reduced counter machine model, per each arithmetic operation only one object/mark is either added to or removed from its location/tape. In some counter machine models (e.g. Melzak (1961), Minsky (1961)) and most RAM and RASP models more than one object/mark can be added or removed in one operation with "addition" and usually "subtraction"; sometimes with "multiplication" and/or "division". Some models have control operations such as "copy" (variously: "move", "load", "store") that move "clumps" of objects/marks from register to register in one action.
  3. A (very) limited set of instructions: the instructions tend to divide into two classes: arithmetic and control. The instructions are drawn from the two classes to form "instruction-sets", such that an instruction set must allow the model to be Turing equivalent (it must be able to compute any partial recursive function). A small interpreter for one such instruction set is sketched just after this list.
    1. Arithmetic: arithmetic instructions may operate on all registers or on just a special register (e.g. accumulator). They are usually chosen from the following sets (but exceptions abound):
      • Counter machine: { Increment (r), Decrement (r), Clear-to-zero (r) }
      • Reduced RAM, RASP: { Increment (r), Decrement (r), Clear-to-zero (r), Load-immediate-constant k, Add (r1,r2), proper-Subtract (r1,r2), Increment accumulator, Decrement accumulator, Clear accumulator, Add to accumulator contents of register r, proper-Subtract from accumulator contents of register r, }
      • Augmented RAM, RASP: All of the reduced instructions plus: { Multiply, Divide, various Boolean bit-wise (left-shift, bit test, etc.)}
    2. Control:
      • Counter machine models: optional { Copy (r1,r2) }
      • RAM and RASP models: most have { Copy (r1,r2) }, or { Load Accumulator from r, Store accumulator into r, Load Accumulator with immediate constant }
      • All models: at least one conditional "jump" (branch, goto) following test of a register e.g. { Jump-if-zero, Jump-if-not-zero (i.e. Jump-if-positive), Jump-if-equal, Jump-if-not equal }
      • All models optional: { unconditional program jump (goto) }
    3. Register-addressing method:
      • Counter machine: no indirect addressing, immediate operands possible in highly atomized models
      • RAM and RASP: indirect addressing available, immediate operands typical
    4. Input-output: optional in all models
  4. State register: A special Instruction Register "IR", finite and separate from the registers above, stores the current instruction to be executed and its address in the TABLE of instructions; this register and its TABLE are located in the finite state machine.
    • The IR is off-limits to all models. In the case of the RAM and RASP, for purposes of determining the "address" of a register, the model can select either (i) in the case of direct addressing—the address specified by the TABLE and temporarily located in the IR or (ii) in the case of indirect addressing—the contents of the register specified by the IR's instruction.
    • The IR is not the "program counter" (PC) of the RASP (or conventional computer). The PC is just another register similar to an accumulator, but dedicated to holding the number of the RASP's current register-based instruction. Thus a RASP has two "instruction/program" registers—(i) the IR (the finite state machine's Instruction Register), and (ii) a PC (Program Counter) for the program located in the registers. (As well as a register dedicated to "the PC", a RASP may dedicate another register to "the Program-Instruction Register", going by any number of names such as "PIR", "IR", "PR", etc.)
  5. List of labeled instructions, usually in sequential order: A finite list of instructions I1, ..., Im. In the case of the counter machine, random access machine (RAM) and pointer machine the instruction store is in the "TABLE" of the finite state machine; thus these models are examples of the Harvard architecture. In the case of the RASP the program store is in the registers; thus this is an example of the von Neumann architecture. See also Random access machine and Random access stored program machine.
    Usually, like computer programs, the instructions are listed in sequential order; unless a jump is successful the default sequence continues in numerical order. An exception to this is the abacus (Lambek (1961), Minsky (1961)) counter machine models—every instruction has at least one "next" instruction identifier "z", and the conditional branch has two.
    • Observe also that the abacus model combines two instructions, JZ then DEC: e.g. { INC ( r, z ), JZDEC ( r, ztrue, zfalse ) }.
      See McCarthy Formalism for more about the conditional expression "IF r=0 THEN ztrue ELSE zfalse" (cf McCarthy (1960)).
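
The following Python sketch (added for illustration) shows how an instruction set of the kind just listed might be interpreted. The function name run_counter_machine, the tuple encoding of instructions and the particular mnemonics (INC, DEC, CLR, CPY, JZ, J, HALT) are choices made for this sketch, not notation taken from any of the cited authors.

    # Illustrative counter-machine interpreter (an editorial sketch, not any cited
    # author's canonical model).  A program is a dict mapping an integer label to an
    # instruction tuple; control falls through to label+1 unless a jump fires.
    def run_counter_machine(program, registers, start=1, max_steps=100000):
        """Run `program` on `registers` (a dict holding non-negative integers)."""
        pc = start
        for _ in range(max_steps):
            instr = program[pc]
            name, args = instr[0], instr[1:]
            if name == "HALT":
                return registers
            elif name == "INC":                 # [r] <- [r] + 1
                registers[args[0]] += 1
            elif name == "DEC":                 # [r] <- [r] - 1 (caller must guard with JZ)
                registers[args[0]] -= 1
            elif name == "CLR":                 # [r] <- 0
                registers[args[0]] = 0
            elif name == "CPY":                 # [r2] <- [r1]
                registers[args[1]] = registers[args[0]]
            elif name == "JZ":                  # jump to label z if [r] == 0
                if registers[args[0]] == 0:
                    pc = args[1]
                    continue
            elif name == "J":                   # unconditional jump to label z
                pc = args[0]
                continue
            pc += 1                             # default: the next instruction in sequence
        raise RuntimeError("step limit exceeded")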

Historical development of the register machine model

Two trends appeared in the early 1950s—the first to characterize the computer as a Turing machine, the second to define computer-like models—models with sequential instruction sequences and conditional jumps—with the power of a Turing machine, i.e. a so-called Turing equivalence. The need for this work was felt in the context of two "hard" problems: the unsolvable word problem posed by Emil Post—his problem of "tag"—and Hilbert's very "hard" tenth problem, concerning Diophantine equations. Researchers were questing for Turing-equivalent models that were less "logical" in nature and more "arithmetic" (cf Melzak (1961) p. 281, Shepherdson-Sturgis (1963) p. 218).

The first trend—toward characterizing computers—seems to have originated[4] with Hans Hermes (1954), Rózsa Péter (1958), and Heinz Kaphengst (1959), the second trend with Hao Wang (1954, 1957) and, as noted above, furthered along by Zdzislaw Alexander Melzak (1961), Joachim Lambek (1961), Marvin Minsky (1961, 1967), and John Shepherdson and Howard E. Sturgis (1963).

The last five names are listed explicitly in that order by Yuri Matiyasevich. He follows up with:

"Register machines [some authors use "register machine" synonymous with "counter-machine"] are particularly suitable for constructing Diophantine equations. Like Turing machines, they have very primitive instructions and, in addition, they deal with numbers" (Yuri Matiyasevich (1993), Hilbert's Tenth Problem, commentary to Chapter 5 of the book, at http://logic.pdmi.ras.ru/yumat/H10Pbook/commch_5htm. )

It appears that Lambek, Melzak, Minsky and Shepherdson and Sturgis independently anticipated the same idea at the same time. See Note On Precedence below.

The history begins with Wang's model.

(1954, 1957) Wang's model: Post-Turing machine

Wang's work followed from Emil Post's (1936) paper and led him to his definition of the Wang B-machine—a two-symbol Post-Turing machine computation model with only four atomic instructions:

{ LEFT, RIGHT, PRINT, JUMP_if_marked_to_instruction_z }

To these four, both Wang (1954, 1957) and then C. Y. Lee (1961) added another instruction from the Post set { ERASE }, and then Post's unconditional jump { JUMP_to_instruction_z } (or, to make things easier, the conditional jump JUMP_IF_blank_to_instruction_z, or both). Lee named this a "W-machine" model:

{ LEFT, RIGHT, PRINT, ERASE, JUMP_if_marked, [maybe JUMP or JUMP_IF_blank] }
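
A minimal sketch of such a W-machine, assuming a sparse two-symbol tape and the tuple encoding shown below; the function name run_w_machine and the exact spelling of the mnemonics are editorial assumptions, not Wang's or Lee's own notation.

    # Illustrative W-machine interpreter on a sparse two-symbol tape
    # (absent position = blank, 1 = mark).  An editorial sketch only.
    def run_w_machine(program, tape=None, max_steps=100000):
        tape = dict(tape or {})            # tape position -> 1 for a mark
        head, pc = 0, 1
        for _ in range(max_steps):
            if pc not in program:          # running off the program halts the machine
                return tape
            instr = program[pc]
            name = instr[0]
            if name == "LEFT":
                head -= 1
            elif name == "RIGHT":
                head += 1
            elif name == "PRINT":
                tape[head] = 1
            elif name == "ERASE":
                tape.pop(head, None)
            elif name == "JUMP_IF_MARKED":
                if tape.get(head, 0) == 1:
                    pc = instr[1]
                    continue
            elif name == "JUMP":
                pc = instr[1]
                continue
            pc += 1
        raise RuntimeError("step limit exceeded")

    # Example: mark the three squares to the right of the starting square.
    program = {1: ("RIGHT",), 2: ("PRINT",), 3: ("RIGHT",), 4: ("PRINT",),
               5: ("RIGHT",), 6: ("PRINT",)}
    print(run_w_machine(program))          # {1: 1, 2: 1, 3: 1}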

Wang expressed hope that his model would be "a rapprochement" (p. 63) between the theory of Turing machines and the practical world of the computer.

Wang's work was highly influential. We find him referenced by Minsky (1961) and (1967), Melzak (1961), Shepherdson and Sturgis (1963). Indeed, Shepherdson and Sturgis (1963) remark that:

"...we have tried to carry a step further the 'rapprochement' between the practical and theoretical aspects of computation suggested by Wang" (p. 218)

Martin Davis eventually evolved this model into the (2-symbol) Post-Turing machine.

Difficulties with the Wang/Post-Turing model:

Except there was a problem: the Wang model (the six instructions of the 7-instruction Post-Turing machine) was still a single-tape Turing-like device, however nice its sequential program instruction-flow might be. Both Melzak (1961) and Shepherdson and Sturgis (1963) observed this (in the context of certain proofs and investigations):

"...a Turing machine has a certain opacity... a Turing machine is slow in (hypothetical) operation and, usually, complicated. This makes it rather hard to design it, and even harder to investigate such matters as time or storage optimization or a comparison between efficiency of two algorithms. (Melzak (1961) p. 281)
"...although not difficult ... proofs are complicated and tedious to follow for two reasons: (1) A Turing machine has only head so that one is obliged to break down the computation into very small steps of operations on a single digit. (2) It has only one tape so that one has to go to some trouble to find the number one wishes to work on and keep it separate from other numbers" (Shepherdson and Sturgis (1963) p. 218).

Indeed, as the examples at Turing machine examples, Post-Turing machine and partial function show, the work can be "complicated".

Minsky, Melzak-Lambek and Shepherdson-Sturgis models "cut the tape" into many

So why not 'cut the tape' so each is infinitely long (to accommodate any size integer) but left-ended, and call these three tapes "Post-Turing (i.e. Wang-like) tapes"? The individual heads will move left (for decrement) and right (for increment). In one sense the heads indicate "the tops of the stack" of concatenated marks. Or in Minsky (1961) and Hopcroft and Ullman (1979, p. 171ff) the tape is always blank except for a mark at the left end—at no time does a head ever print or erase.

We just have to be careful to write our instructions so that a test-for-zero and jump occurs before we decrement otherwise our machine will "fall off the end" or "bump against the end"—we will have an instance of a partial function. Before a decrement our machine must always ask the question: "Is the tape/counter empty? If so then I can't decrement, otherwise I can."

For an example of the addition algorithm written for a counter machine see Algorithm examples, and for an example of (im-)proper subtraction see Partial function.
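
As a further illustration of the test-before-decrement discipline described above, proper subtraction (subtraction that floors at zero) might be written as follows for the run_counter_machine sketch given under "Formal definition"; both the program and the interpreter are editorial sketches, not taken from the cited papers.

    # Proper subtraction (result floors at zero): n := n - m, run on the
    # run_counter_machine sketch above.  The zero-test always precedes the decrement.
    monus = {
        1: ("JZ", "m", 6),    # nothing left to subtract
        2: ("JZ", "n", 6),    # n is already empty: stop rather than "fall off the end"
        3: ("DEC", "n"),
        4: ("DEC", "m"),
        5: ("J", 1),
        6: ("HALT",),
    }
    print(run_counter_machine(monus, {"n": 3, "m": 5}))   # {'n': 0, 'm': 2}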

Minsky (1961) and Shepherdson-Sturgis (1963) prove that only a few tapes—as few as one—still allow the machine to be Turing equivalent IF the data on the tape is represented as a Gödel number (or some other uniquely encodable-decodable number); this number will evolve as the computation proceeds. In the one tape version with Gödel number encoding the counter machine must be able to (i) multiply the Gödel number by a constant (numbers "2" or "3"), and (ii) divide by a constant (numbers "2" or "3") and jump if the remainder is zero. Minsky (1967) shows that the need for this bizarre instruction set can be relaxed to { INC (r), JZDEC (r, z) } and the convenience instructions { CLR (r), J (r) } if two tapes are available. A simple Gödelization is still required, however. A similar result appears in Elgot-Robinson (1964) with respect to their RASP model.
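
A sketch of the Gödelization idea just described: two counters a and b can be packed into a single number g = 2^a * 3^b, and the only operations needed on g are multiplication by a constant and division by a constant with a remainder test. The function names below are illustrative only, not notation from Minsky or Elgot-Robinson.

    # Two counters a, b packed into one Goedel number g = 2**a * 3**b.  The only
    # operations used on g are multiply-by-constant and divide-by-constant with a
    # remainder test, exactly the repertoire described above.
    def inc_a(g): return g * 2                 # a := a + 1
    def inc_b(g): return g * 3                 # b := b + 1

    def dec_a(g):                              # a := a - 1 if a > 0; report whether it worked
        return (g // 2, True) if g % 2 == 0 else (g, False)

    def dec_b(g):
        return (g // 3, True) if g % 3 == 0 else (g, False)

    g = 1                                      # a = 0, b = 0
    g = inc_a(inc_a(g))                        # a = 2
    g = inc_b(g)                               # b = 1
    g, ok = dec_a(g)                           # a = 1
    print(g, ok)                               # 6 True   (6 = 2**1 * 3**1)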

(1961) Melzak's model is different: clumps of pebbles go into and out of holes

Melzak's (1961) model is significantly different. He took his own model, flipped the tapes vertically, called them "holes in the ground" to be filled with "pebble counters". Unlike Minsky's "increment" and "decrement", Melzak allowed for proper subtraction of any count of pebbles and "adds" of any count of pebbles.

He defines indirect addressing for his model (p. 288) and provides two examples of its use (p. 89); his "proof" (p. 290-292) that his model is Turing equivalent is so sketchy that the reader cannot tell whether or not he intended the indirect addressing to be a requirement for the proof.

The legacy of Melzak's model is Lambek's simplification of it and the reappearance of his mnemonic conventions in Cook and Reckhow (1973).

Lambek (1961) atomizes Melzak's model into the Minsky (1961) model: INC and DEC-with-test

Lambek (1961) took Melzak's ternary model and atomized it down to the two unary instructions—X+, X- if possible else jump—exactly the same two that Minsky (1961) had come up with.

However, like the Minsky (1961) model, the Lambek model does not execute its instructions in a default-sequential manner—both X+ and X- carry the identifier of the next instruction, and X- also carries the jump-to instruction if the zero-test is successful.
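
A sketch of this abacus-style control flow, in which every instruction explicitly names its successor(s); the Python encoding and the name run_abacus are assumptions made for the sketch, not Lambek's or Minsky's notation.

    # Abacus-style control flow: every instruction explicitly names its successor(s).
    #   ("X+", r, z)             increment register r, then go to instruction z
    #   ("X-", r, z_ok, z_zero)  if r > 0 decrement and go to z_ok, else go to z_zero
    def run_abacus(program, registers, start, halt="HALT"):
        label = start
        while label != halt:
            instr = program[label]
            if instr[0] == "X+":
                _, r, z = instr
                registers[r] += 1
                label = z
            else:                              # "X-"
                _, r, z_ok, z_zero = instr
                if registers[r] > 0:
                    registers[r] -= 1
                    label = z_ok
                else:
                    label = z_zero
        return registers

    # Move the contents of register "a" into register "b", emptying "a".
    move = {1: ("X-", "a", 2, "HALT"), 2: ("X+", "b", 1)}
    print(run_abacus(move, {"a": 3, "b": 0}, start=1))     # {'a': 0, 'b': 3}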

Elgot-Robinson (1964) and the problem of the RASP without indirect addressing

A RASP or Random access stored program machine begins as a counter machine with its "program of instruction" placed in its "registers". Analogous to, but independent of, the finite state machine's "Instruction Register", at least one of the registers (nicknamed the "program counter" (PC)) and one or more "temporary" registers maintain a record of, and operate on, the current instruction's number. The finite state machine's TABLE of instructions is responsible for (i) fetching the current program instruction from the proper register, (ii) parsing the program instruction, (iii) fetching operands specified by the program instruction, and (iv) executing the program instruction.
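
A sketch of the fetch-parse-execute cycle just described, with the program held in the ordinary registers and the PC treated as just another register. The three-cell instruction encoding (opcode, operand, operand) and the opcode numbering are assumptions made for this sketch, not the Elgot-Robinson formalism.

    # Sketch of the fetch-parse-execute cycle: the program lives in ordinary
    # registers, and the PC is itself just another register.  Each instruction is
    # assumed to occupy three consecutive registers: (opcode, operand, operand).
    def rasp_step(registers, pc="PC"):
        addr = registers[pc]                              # (i)  fetch the current instruction's address
        op, a, b = registers[addr], registers[addr + 1], registers[addr + 2]   # (ii)/(iii) parse it and its operands
        if op == 0:                                       # HALT
            return False
        if op == 1:                                       # INC a
            registers[a] += 1
        elif op == 2:                                     # DEC a (unguarded in this sketch)
            registers[a] -= 1
        elif op == 3 and registers[a] == 0:               # JZ a, b: jump to program address b
            registers[pc] = b
            return True
        registers[pc] += 3                                # (iv) otherwise fall through to the next instruction
        return True

    def run_rasp(registers, pc="PC", max_steps=10000):
        for _ in range(max_steps):
            if not rasp_step(registers, pc):
                return registers
        raise RuntimeError("step limit exceeded")

    # Program in registers 100..108: INC 7, INC 7, HALT; the PC starts at 100.
    regs = {"PC": 100, 7: 0,
            100: 1, 101: 7, 102: 0,
            103: 1, 104: 7, 105: 0,
            106: 0, 107: 0, 108: 0}
    print(run_rasp(regs)[7])                              # 2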

Except there is a problem: If based on the counter machine chassis this computer-like, von Neumann machine will not be Turing equivalent. It cannot compute everything that is computable. Intrinsically the model is bounded by the size of its (very-) finite state machine's instructions. The counter machine based RASP can compute any primitive recursive function (e.g. multiplication) but not all mu recursive functions (e.g. the Ackermann function ).

Elgot-Robinson investigate the possibility of allowing their RASP model to "self modify" its program instructions. The idea was an old one, proposed by Burks-Goldstine-von Neumann (1946-7), and sometimes called "the computed goto." Melzak (1961) specifically mentions the "computed goto" by name but instead provides his model with indirect addressing.

Computed goto: A RASP program of instructions that modifies the "goto address" in a conditional- or unconditional-jump program instruction.

But this does not solve the problem (unless one resorts to Gödel numbers). What is necessary is a method to fetch the address of a program instruction that lies (far) "beyond/above" the upper bound of the finite state machine's instruction register and TABLE.

Example: A counter machine equipped with only four unbounded registers can e.g. multiply any two numbers ( m, n ) together to yield p—and thus compute a primitive recursive function—no matter how large the numbers m and n; moreover, fewer than 20 instructions are required to do this! e.g. { 1: CLR ( p ), 2: JZ ( m, done ), 3: outer_loop: JZ ( n, done ), 4: CPY ( m, temp ), 5: inner_loop: JZ ( temp, end_inner ), 6: DEC ( temp ), 7: INC ( p ), 8: J ( inner_loop ), 9: end_inner: DEC ( n ), 10: J ( outer_loop ), 11: done: HALT }
However, with only 4 registers, this machine is not nearly big enough to build a RASP that can execute the multiply algorithm as a program. No matter how big we build our finite state machine there will always be a program (including its parameters) which is larger. So by definition the bounded program machine that does not use unbounded encoding tricks such as Gödel numbers cannot be universal.
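
For concreteness, the multiply program above can be transcribed for the run_counter_machine sketch given under "Formal definition"; the transcription, the label numbering and the interpreter itself are editorial illustrations.

    # The multiply program above, transcribed for the run_counter_machine sketch
    # (labels 1..11, with "done" = 11 and "end_inner" = 9).  Computes p = m * n
    # while preserving m.
    multiply = {
        1:  ("CLR", "p"),
        2:  ("JZ", "m", 11),
        3:  ("JZ", "n", 11),          # outer_loop
        4:  ("CPY", "m", "temp"),
        5:  ("JZ", "temp", 9),        # inner_loop
        6:  ("DEC", "temp"),
        7:  ("INC", "p"),
        8:  ("J", 5),
        9:  ("DEC", "n"),             # end_inner
        10: ("J", 3),
        11: ("HALT",),
    }
    print(run_counter_machine(multiply, {"m": 4, "n": 3, "p": 0, "temp": 0}))
    # {'m': 4, 'n': 0, 'p': 12, 'temp': 0}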

Minsky (1967) hints at the issue in his investigation of a counter machine (he calls them "program computer models") equipped with the instructions { CLR (r), INC (r), and RPT ("a" times the instructions m to n) }. He doesn't tell us how to fix the problem, but he does observe that:

"... the program computer has to have some way to keep track of how many RPT's remain to be done, and this might exhaust any particular amount of storage allowed in the finite part of the computer. RPT operations require infinite registers of their own, in general, and they must be treated differently from the other kinds of operations we have considered." (p. 214)

But Elgot and Robinson solve the problem: They augment their P0 RASP with an indexed set of instructions—a somewhat more complicated (but more flexible) form of indirect addressing. Their P'0 model addresses the registers by adding the contents of the "base" register (specified in the instruction) to the "index" specified explicitly in the instruction (or vice versa, swapping "base" and "index"). Thus the indexing P'0 instructions have one more parameter than the non-indexing P0 instructions:

Example: INC ( rbase, index ) ; effective address will be [rbase] + index, where the natural number "index" is derived from the finite-state machine instruction itself.
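
A sketch of this indexed-addressing rule, with the effective address computed as the contents of the named base register plus the index constant carried by the instruction; the helper names below are illustrative only.

    # Indexed addressing in the Elgot-Robinson style (sketch): the effective address
    # is the contents of the named base register plus an index constant taken from
    # the instruction itself.
    def effective_address(registers, base, index):
        return registers[base] + index

    def inc_indexed(registers, base, index):
        """INC ( r_base, index ): increment the register at address [base] + index."""
        registers[effective_address(registers, base, index)] += 1

    regs = {0: 7, 7: 0, 9: 0}        # register 0 serves as the base register, holding 7
    inc_indexed(regs, base=0, index=2)
    print(regs)                      # {0: 7, 7: 0, 9: 1} -- register [0] + 2 = 9 was incremented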

Hartmanis (1971)

By 1971 Hartmanis had simplified the indexing to indirection for use in his RASP model.

Indirect addressing: A pointer-register supplies the finite state machine with the address of the target register required for the instruction. Said another way: The contents of the pointer-register is the address of the "target" register to be used by the instruction. If the pointer-register is unbounded, the RAM, and a suitable RASP built on its chassis, will be Turing equivalent. The target register can serve either as a source or destination register, as specified by the instruction.

Note that the finite state machine does not have to explicitly specify this target register's address. It just says to the rest of the machine: Get me the contents of the register pointed to by my pointer-register and then do xyz with it. It must specify explicitly by name, via its instruction, this pointer-register (e.g. "N", or "72" or "PC", etc.) but it doesn't have to know what number the pointer-register actually contains (perhaps 279,431).
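
A sketch of indirect addressing in this sense: the instruction names only the pointer register, and the register actually read or written is the one whose address the pointer register holds. The helper names below are illustrative.

    # Indirect addressing (sketch): the instruction names only the pointer register;
    # the register actually read or written is the one whose address the pointer holds.
    def load_indirect(registers, pointer):
        """Return the contents of the register addressed by [pointer]."""
        return registers[registers[pointer]]

    def store_indirect(registers, pointer, value):
        """Write value into the register addressed by [pointer]."""
        registers[registers[pointer]] = value

    regs = {"N": 72, 72: 0}          # pointer register "N" happens to hold the address 72
    store_indirect(regs, "N", 5)
    print(load_indirect(regs, "N"))  # 5 -- register 72 was the target, not "N" itself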

Cook and Reckhow (1973) describe the RAM

Cook and Reckhow (1973) cite Hartmanis (1971) and simplify his model to what they call a Random access machine ( RAM—i.e. a machine with indirection and the Harvard architecture). In a sense we are back to Melzak (1961) but with a much simpler model than Melzak's.

Precedence

Minsky was working at the MIT Lincoln Laboratory and published his work there; his paper was received for publishing in the Annals of Mathematics on August 15, 1960 but not published until November 1961. Receipt thus occurred a full year before the work of Melzak and Lambek was received and published (received, respectively, on May 15 and June 15, 1961, and published side-by-side in September 1961). That (i) both were Canadians and published in the Canadian Mathematical Bulletin, (ii) neither would have had reference to Minsky's work because it was not yet published in a peer-reviewed journal, but (iii) Melzak references Wang, and Lambek references Melzak, leads one to hypothesize that their work occurred simultaneously and independently.

Almost exactly the same thing happened to Shepherdson and Sturgis. Their paper was received in December 1961—just a few months after Melzak and Lambek's work was received. Again, they had little (at most one month) or no benefit of reviewing the work of Minsky. They were careful to observe in footnotes that papers by Ershov, Kaphengst and Péter had "recently appeared" (p. 219). These were published much earlier but appeared in German- and Russian-language journals, so issues of accessibility present themselves.

The final paper of Shepherdson and Sturgis did not appear in a peer-reviewed journal until 1963. And as they fairly and honestly note in their Appendix A, the 'systems' of Kaphengst (1959), Ershov (1958) and Péter (1958) are all so similar to the results obtained later as to be indistinguishable from a set of the following:

  • produce 0, i.e. 0 --> n
  • increment a number, i.e. n + 1 --> n
    "i.e. of performing the operations which generate the natural numbers" (p. 246)
  • copy a number, i.e. n --> m
  • to "change the course of a computation", either comparing two numbers or decrementing until 0

Indeed, Shepherdson and Sturgis conclude:

"The various minimal systems are very similar"( p. 246)

By order of publication date, the work of Kaphengst (1959), Ershov (1958) and Péter (1958) came first.

See also

Bibliography

Background texts: The following bibliography of source papers includes a number of texts to be used as background. The mathematics that led to the flurry of papers about abstract machines in the 1950s and 1960s can be found in van Heijenoort (1967)—an assemblage of original papers spanning the 50 years from Frege (1879) to Gödel (1931). Davis (ed.) The Undecidable (1965) carries the torch onward beginning with Gödel (1931) through Gödel's (1964) postscriptum (p. 71); the original papers of Alan Turing (1936-7) and Emil Post (1936) are included in The Undecidable. The mathematics of Church, Rosser and Kleene that appear as reprints of original papers in The Undecidable is carried further in Kleene (1952), a mandatory text for anyone pursuing a deeper understanding of the mathematics behind the machines. Both Kleene (1952) and Davis (1958) are referenced by a number of the papers.

For a good treatment of the counter machine see Minsky (1967) Chapter 11 "Models similar to Digital Computers"—he calls the counter machine a "program computer". A recent overview is found in van Emde Boas (1990). A recent treatment of the Minsky (1961)/Lambek (1961) model can be found in Boolos-Burgess-Jeffrey (2002); they reincarnate Lambek's "abacus model" to demonstrate equivalence of Turing machines and partial recursive functions, and they provide a graduate-level introduction to both abstract machine models (counter- and Turing-) and the mathematics of recursion theory. Beginning with the first edition, Boolos-Burgess (1970), this model has appeared with virtually the same treatment.

The papers: The papers begin with Wang (1957) and his dramatic simplification of the Turing machine. Turing (1936), Kleene (1952), Davis (1958) and in particular Post (1936) are cited in Wang (1957); in turn, Wang is referenced by Melzak (1961), Minsky (1961) and Shepherdson-Sturgis (1961-3) as they independently reduce the Turing tapes to "counters". Melzak (1961) provides his pebble-in-holes counter machine model with indirection but doesn't carry the treatment further. The work of Elgot-Robinson (1964) defines the RASP—the computer-like random access stored program machines—and appears to be the first to investigate the failure of the bounded counter machine to calculate the mu-recursive functions. This failure—except with the draconian use of Gödel numbers in the manner of Minsky (1961)—leads to their definition of "indexed" instructions (i.e. indirect addressing) for their RASP model. Elgot-Robinson (1964) and more so Hartmanis (1971) investigate RASPs with self-modifying programs. Hartmanis (1971) specifies an instruction set with indirection, citing lecture notes of Cook (1970). For use in investigations of computational complexity Cook and his graduate student Reckhow (1973) provide the definition of a RAM (their model and mnemonic convention are similar to Melzak's, but they offer him no reference in the paper). The pointer machines are an offshoot of Knuth (1968, 1973) and independently Schönhage (1980).

For the most part the papers contain mathematics beyond the undergraduate level—in particular the primitive recursive functions and mu recursive functions presented elegantly in Kleene (1952) and less in depth, but useful nonetheless, in Boolos-Burgess-Jeffrey (2002).

All texts and papers excepting the four starred have been witnessed. These four are written in German or Russian and appear as references in Shepherdson-Sturgis (1963) and Elgot-Robinson (1964); Shepherdson-Sturgis (1963) offer a brief discussion of their results in their Appendix A. The terminology of at least one paper (Kaphengst (1959)) seems to hark back to the Burks-Goldstine-von Neumann (1946-7) analysis of computer architecture.

Author Year Reference Turing machine Counter machine RAM RASP Pointer machine Indirect addressing Self-modifying program
Goldstine & von Neumann 1947 X X
Kleene 1952 X
*Hermes 1954, 5 ?
Wang 1957 X X hints hints
*Peter 1958 ?
Davis 1958 X X
*Ershov 1959 ?
*Kaphengst 1959 ? X
Melzak 1961 X X hints
Lambek 1961 X
Minsky 1961 X
Shepherdson & Sturgis 1963 X hints
Elgot & Robinson 1964 X X X
Davis- Undecidable 1965 X X
van Heijenoort 1967 X
Minsky 1967 X hints hints
Knuth 1968, 73 X X X X
Hartmanis 1971 X X
Cook & Reckhow 1973 X X X X
Schonhage 1980 X X X
van Emde Boas 1990 X X X X X X
Boolos & Burgess; Boolos, Burgess & Jeffrey 1970–2002 X X X

References

  1. Harold Abelson and Gerald Jay Sussman with Julie Sussman, Structure and Interpretation of Computer Programs, MIT Press, Cambridge, Massachusetts, 2nd Ed, 1996
  2. ". . . a denumerable sequence of registers numbered 1, 2, 3, ..., each of which can sto3e any natural number 0, 1, 2, .... Each particular program, however, involves only a finite number of these registers, the others remaining empty (i.e. containing 0) throughout the computation." Shepherdson and Sturgis 1961:219. Lambek 1961:295 proposed: "a countably infinite set of locations (holes, wires, etc).
  3. For example, Lambek 1961:295 proposed the use of pebbles, beads, etc.
  4. See the "Note" in Shepherdson and Sturgis 1963:219. In their Appendix A the authors follow up with a listing and discussions of Kaphengst's, Ershov's and Péter's instruction sets (cf p. 245ff).
  • George Boolos, John P. Burgess, Richard Jeffrey (2002), Computability and Logic: Fourth Edition, Cambridge University Press, Cambridge, England. The original Boolos-Jeffrey text has been extensively revised by Burgess: more advanced than an introductory textbook. "Abacus machine" model is extensively developed in Chapter 5 Abacus Computability; it is one of three models extensively treated and compared—the Turing machine (still in Boolos' original 4-tuple form) and recursion the other two.
  • Arthur Burks, Herman Goldstine, John von Neumann (1946), Preliminary discussion of the logical design of an electronic computing instrument, reprinted pp. 92ff in Gordon Bell and Allen Newell (1971), Computer Structures: Readings and Examples, McGraw-Hill Book Company, New York. ISBN 0-07-004357-4.
  • Stephen A. Cook and Robert A. Reckhow (1972), Time-bounded random access machines, Journal of Computer Systems Science 7 (1973), 354-375.
  • Martin Davis (1958), Computability & Unsolvability, McGraw-Hill Book Company, Inc. New York.
  • Calvin Elgot and Abraham Robinson (1964), Random-Access Stored-Program Machines, an Approach to Programming Languages, Journal of the Association for Computing Machinery, Vol. 11, No. 4 (October, 1964), pp. 365–399.
  • J. Hartmanis (1971), "Computational Complexity of Random Access Stored Program Machines," Mathematical Systems Theory 5, 3 (1971) pp. 232–245.
  • John Hopcroft, Jeffrey Ullman (1979). Introduction to Automata Theory, Languages and Computation, 1st ed., Reading Mass: Addison-Wesley. ISBN 0-201-02988-X. A difficult book centered around the issues of machine-interpretation of "languages", NP-Completeness, etc.
  • Stephen Kleene (1952), Introduction to Metamathematics, North-Holland Publishing Company, Amsterdam, Netherlands. ISBN 0-7204-2103-9.
  • Donald Knuth (1968), The Art of Computer Programming, Second Edition 1973, Addison-Wesley, Reading, Massachusetts. Cf pages 462-463 where he defines "a new kind of abstract machine or 'automaton' which deals with linked structures."
  • Joachim Lambek (1961, received 15 June 1961), How to Program an Infinite Abacus, Canadian Mathematical Bulletin, vol. 4, no. 3. September 1961 pages 295-302. In his Appendix II, Lambek proposes a "formal definition of 'program'". He references Melzak (1961) and Kleene (1952) Introduction to Metamathematics.
  • Z. A. Melzak (1961, received 15 May 1961), An informal Arithmetical Approach to Computability and Computation, Canadian Mathematical Bulletin, vol. 4, no. 3. September 1961 pages 279-293. Melzak offers no references but acknowledges "the benefit of conversations with Drs. R. Hamming, D. McIlroy and V. Vyssots of the Bell telephone Laborators and with Dr. H. Wang of Oxford University."
  • Marvin Minsky (1961, received August 15, 1960), "Recursive Unsolvability of Post's Problem of 'Tag' and Other Topics in Theory of Turing Machines", Annals of Mathematics, Vol. 74, No. 3, pages 437-455.
  • Marvin Minsky (1967), Computation: Finite and Infinite Machines, 1st edition, Prentice-Hall, Inc., Englewood Cliffs, N.J. In particular see chapter 11: Models Similar to Digital Computers and chapter 14: Very Simple Bases for Computability. In the former chapter he defines "Program machines" and in the later chapter he discusses "Universal Program machines with Two Registers" and "...with one register", etc.
  • John C. Shepherdson and H. E. Sturgis (1961) received December 1961 Computability of Recursive Functions, Journal of the Association for Computing Machinery (JACM) 10:217-255, 1963. An extremely valuable reference paper. In their Appendix A the authors cite 4 others with reference to "Minimality of Instructions Used in 4.1: Comparison with Similar Systems".
  • Kaphengst, Heinz, Eine abstrakte programmgesteuerte Rechenmaschine, Zeitschrift für mathematische Logik und Grundlagen der Mathematik 5 (1959), 366-379.
  • Ershov, A. P. On operator algorithms, (Russian) Dok. Akad. Nauk 122 (1958), 967-970. English translation, Automat. Express 1 (1959), 20-23.
  • Péter, Rózsa Graphschemata und rekursive Funktionen, Dialectica 12 (1958), 373.
  • Hermes, Hans Die Universalität programmgesteuerter Rechenmaschinen. Math.-Phys. Semesterberichte (Göttingen) 4 (1954), 42-53.
  • Arnold Schönhage (1980), Storage Modification Machines, Society for Industrial and Applied Mathematics, SIAM J. Comput. Vol. 9, No. 3, August 1980. Wherein Schönhage shows the equivalence of his SMM with the "successor RAM" (Random Access Machine), etc. resp. Storage Modification Machines, in Theoretical Computer Science (1979), pp. 36–37
  • Peter van Emde Boas, "Machine Models and Simulations" pp. 3–66, in: Jan van Leeuwen, ed. Handbook of Theoretical Computer Science. Volume A: Algorithms and Complexity, The MIT PRESS/Elsevier, 1990. ISBN 0-444-88071-2 (volume A). QA 76.H279 1990. van Emde Boas' treatment of SMMs appears on pp. 32–35. This treatment clarifies Schönhage 1980—it closely follows but expands slightly the Schönhage treatment. Both references may be needed for effective understanding.
  • Hao Wang (1957), A Variant to Turing's Theory of Computing Machines, JACM (Journal of the Association for Computing Machinery) 4; 63-92. Presented at the meeting of the Association, June 23–25, 1954.

External links



  • Igblan - Minsky Register Machines