
The following discussion is an archived discussion of the DYK nomination of the article below. Please do not modify it. Subsequent comments should be made on the appropriate discussion page (such as this nomination's talk page, the article's talk page or Wikipedia talk:Did you know), unless there is consensus to re-open the discussion at this page. No further edits should be made to this page.

The result was: rejected by Harrias talk 23:57, 9 December 2011 (UTC)

Sparse Distributed Memory

... that Sparse Distributed Memory is used to mathematically model human long-term memory?
  • ALT1: ... that you can mathematically model forgetting using Sparse Distributed Memory?

Created by OneThousandTwentyFour (talk). Self nom at 21:12, 3 November 2011 (UTC)

  • The article is currently ineligible for DYK because it is so poorly written. The article apparently deals with a particular content-addressable memory structure that is used as a model for long-term memory, especially for human semantic memory. Its first "section" contains exactly one sentence: "The general formula is 2^n, where n is the number of dimensions of the space and 2^n is the number of feasible memory items", which tells the reader little. In the example, I suspect that each sentence should be parsed for semantic information, so that some relation between this "formula" and the "example" can be suggested.

The lede confuses the model and the modelled. There are typographical errors with spacing. The split infinitive should be avoided in the DYK blurb. The article seems very short at about 2,000 characters, which is only 500 over the minimum size. The formatting of the sources often has just a title linked to a technical report, without mentioning that the working paper was published in conference proceedings; use Google Scholar for publication data. Kiefer.Wolfowitz 09:38, 6 November 2011 (UTC)
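
The "general formula" quoted in the review above, 2^n, is the number of possible n-bit addresses in the kind of content-addressable memory the article describes; a sparse distributed memory stores only a small random sample of "hard locations" drawn from those 2^n points and distributes each write over every hard location within some Hamming radius of the target address. The following Python sketch illustrates that scheme in a Kanerva-like style. The parameter values (n = 256 address bits, 1,000 hard locations, activation radius 112) and the helper names are illustrative assumptions made for this sketch, not figures or code taken from the nominated article or from this discussion.

    import numpy as np

    # Illustrative parameters (assumptions, not from the nominated article):
    # n-bit binary addresses give 2**n possible addresses, but only a sparse
    # random sample of "hard locations" is physically stored.
    N_BITS = 256    # address/word length n
    N_HARD = 1000   # number of hard locations, far fewer than 2**N_BITS
    RADIUS = 112    # Hamming radius used to activate hard locations

    rng = np.random.default_rng(0)
    hard_addresses = rng.integers(0, 2, size=(N_HARD, N_BITS), dtype=np.int8)
    counters = np.zeros((N_HARD, N_BITS), dtype=np.int32)  # one counter per bit per location

    def _active(address):
        # Indices of hard locations within RADIUS Hamming distance of the address.
        distances = np.count_nonzero(hard_addresses != address, axis=1)
        return np.flatnonzero(distances <= RADIUS)

    def write(address, word):
        # Distribute the 0/1 word over all activated hard locations.
        signs = 2 * word.astype(np.int32) - 1   # map 0/1 to -1/+1
        counters[_active(address)] += signs

    def read(address):
        # Majority vote over the counters of all activated hard locations.
        total = counters[_active(address)].sum(axis=0)
        return (total > 0).astype(np.int8)

    # Auto-associative use: store a pattern under its own address, then
    # retrieve it from a noisy copy of that address.
    pattern = rng.integers(0, 2, size=N_BITS, dtype=np.int8)
    write(pattern, pattern)

    noisy = pattern.copy()
    flip = rng.choice(N_BITS, size=20, replace=False)  # flip 20 of the 256 bits
    noisy[flip] ^= 1
    print("bits recovered:", np.count_nonzero(read(noisy) == pattern), "of", N_BITS)

Read-out works by majority vote over the activated counters, which is why a moderately noisy address (20 flipped bits out of 256 here) will usually still recover the stored pattern; the exact noise tolerance depends on the assumed radius and number of hard locations.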

  • Comment: The nominator expanded the article a few days after these comments were made but did not leave any comment here. To be honest, I do not believe the article in its current state is ready for Wikipedia, let alone the main page. It omits core parts of the explanation, e.g. why so many dimensions are useful, and it confuses the matter to an extent that, even with a bit of background in mathematics, it becomes totally incomprehensible. For instance, the aim is to retrieve about 1000 bits of information, not 2^1000. We cannot store the latter amount for quite a few millennia to come, and we will therefore not need an algorithm to retrieve it any time soon. 2^1000 is a lot; see the chessboard paradox. I thought of a hoax, but Kanerva's paper has over 600 citations. I suggest closing this nomination as unsuccessful. --Pgallert (talk) 20:35, 6 December 2011 (UTC)
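
To put the scale argument above in concrete terms, the lines below simply evaluate the numbers involved: 2^1000 is the size of the address space of a 1000-bit memory, whereas a query is only expected to return roughly 1000 bits. The comparison with 2^64 - 1 grains comes from the linked wheat-and-chessboard problem; the figure of roughly 10^80 atoms in the observable universe is an added, commonly cited ballpark for comparison, not something stated in the discussion.

    # Rough scale check for the numbers in the comment above; Python integers
    # have arbitrary precision, so 2**1000 is computed exactly.
    address_space = 2 ** 1000        # possible 1000-bit addresses, not bits to retrieve
    bits_to_retrieve = 1000          # what a single query is expected to return
    chessboard_grains = 2 ** 64 - 1  # total grains in the wheat-and-chessboard problem
    atoms_ballpark = 10 ** 80        # commonly cited estimate, added here for comparison

    print(len(str(address_space)))              # 302: 2**1000 has 302 decimal digits
    print(len(str(address_space // chessboard_grains)))  # 282: still a 282-digit number
    print(address_space > atoms_ballpark ** 3)  # True: larger than (10**80) cubed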