{{Refimprove|date=September 2013}}
{{technical|date=January 2014}}
 
{{Probability fundamentals}}
In [[probability theory]], a '''conditional probability''' measures the [[probability]] of an [[Event (probability theory)|event]] given that (by assumption, presumption, assertion or evidence)         
another event has occurred.<ref name="Allan Gut 2013">Allan Gut, ''Probability: A Graduate Course'', 2nd Edition,(2013), Springer Texts in Statistics, Vol. 75, New York, NY  USA, ISBN 13: 9781461447078</ref>
If the events are ''A'' and ''B'' respectively, this is said to be "the probability of ''A'' given ''B''".
It is commonly denoted by ''P''(''A''|''B''), or sometimes ''P''{{sub|''B''}}(''A'').
 
The concept of conditional probability is one of the most fundamental and important concepts in probability theory.<ref name="Sheldon Ross 2010">Sheldon Ross, ''A First Course in Probability'', 8th Edition (2010), Pearson Prentice Hall, ISBN 9780136033134</ref> But conditional probabilities can be quite slippery and require careful interpretation.<ref name="Casella and Berger 2002">George Casella and Roger L. Berger, ''Statistical Inference'' (2002), Duxbury Press, ISBN 9780534243128</ref> In statistical inference, the conditional probability is an update of the probability of an [[Event (probability theory)|event]] based on new information.<ref name="Casella and Berger 2002"/> Incorporating the new information can be done as follows:<ref name="Allan Gut 2013"/>
 
:* We start with a [[probability measure]] on a [[sample space]], say (''X'',''P'').
:* Let the event of interest be ''A''.
:* If we wish to measure the probability of the event ''A'' knowing that event ''B'' has or will have occurred, we need to examine event ''A'' restricted to event ''B''.
:* Since both ''A'' and ''B'' are events in the same sample space, ''A'' restricted to ''B'' is ''A'' <math>\cap</math> ''B''.
:* Whenever ''P''(''B'')&nbsp;>&nbsp;0 under the original probability measure on the original sample space (''X'',''P''), ''B'' is the sure event in the restricted space (''B'',''P''{{sub|''B''}}), and thus ''P''{{sub|''B''}}(''B'') must equal 1.
:* To obtain ''P''(''A''|''B'')&nbsp;=&nbsp;''P''{{sub|''B''}}(''A'') with ''P''(''B''|''B'')&nbsp;=&nbsp;1, we re-scale ''P''(''A'' <math>\cap</math> ''B'') by dividing by ''P''(''B'').
:* This results in ''P''(''A''|''B'')&nbsp;=&nbsp;''P''(''A'' <math>\cap</math> ''B'')/''P''(''B'') whenever ''P''(''B'')&nbsp;>&nbsp;0 (the case ''P''(''B'')&nbsp;=&nbsp;0 is discussed below).
 
Note:  This approach results in a probability measure that is consistent with the original probability measure and satisfies all the [[Probability axioms|Kolmogorov Axioms]].
 
Note:  The phraseology "evidence" or "information" is generally used in the [[Bayesian probability|Bayesian interpretation of probability]]. The conditioning event is interpreted as evidence for the conditioned event. That is, ''P''(''A'') is the probability of ''A'' before accounting for evidence ''E'', and ''P''(''A''|''E'') is the probability of ''A'' after having accounted for evidence ''E'' or after having updated ''P''(''A'').  This is consistent with the frequentist interpretation.
 
Note:  ''P''(''A''|''B'') (the conditional probability of ''A'' given ''B'') may or may not be equal to ''P''(''A'') (the unconditional probability of ''A''). If ''P''(''A''|''B'') = ''P''(''A''), then ''A'' and ''B'' are said to be independent.
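
The re-scaling described above can be checked on a small discrete example. The following minimal sketch (the sample space, events and probabilities are invented purely for illustration and are not taken from the cited sources) builds a toy probability measure, restricts it to ''B'', and confirms that dividing by ''P''(''B'') gives ''P''(''B''|''B'')&nbsp;=&nbsp;1.

<syntaxhighlight lang="python">
# Minimal sketch: conditioning on an event in a finite sample space.
# The sample space, events and probabilities are invented for illustration.
from fractions import Fraction

X = range(1, 7)                                  # toy sample space: a fair die
P = {omega: Fraction(1, 6) for omega in X}       # original probability measure

A = {2, 4, 6}        # event of interest ("even")
B = {1, 2, 3, 4}     # conditioning event ("at most 4")

P_B = sum(P[w] for w in B)                       # P(B) = 2/3
P_A_and_B = sum(P[w] for w in A & B)             # P(A ∩ B) = 1/3
P_A_given_B = P_A_and_B / P_B                    # re-scale by dividing by P(B)

print(P_A_given_B)                               # 1/2
print(P_B / P_B)                                 # P(B|B) = 1, as required
</syntaxhighlight>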
 
==Definition==
 
[[File:Conditional probability.svg|thumb|Illustration of conditional probabilities with an [[Euler diagram]]. The unconditional [[probability]] ''P''(''A'') = 0.52. However, the conditional probability ''P''(''A''&#124;''B''{{sub|1}}) = 1, ''P''(''A''&#124;''B''{{sub|2}}) ≈ 0.75, and ''P''(''A''&#124;''B''{{sub|3}}) = 0.]]
 
[[File:Probability tree diagram.svg|thumb|On a [[Tree diagram (probability theory)|tree diagram]], branch probabilities are conditional on the event associated with the parent node.]]
 
[[File:Venn Pie Chart describing Bayes' law.png|thumb|Venn Pie Chart describing conditional probabilities]]
 
===Conditioning on an event===
 
====Kolmogorov definition====
Given two [[event (probability theory)|events]] ''A'' and ''B'' from the sigma-field of a probability space with ''P''(''B'')&nbsp;>&nbsp;0, the conditional probability of ''A'' given ''B'' is defined as the [[quotient]] of the [[probability]] of the joint occurrence of ''A'' and ''B'' (the intersection ''A''&nbsp;∩&nbsp;''B'') and the probability of ''B'':
 
:<math>P(A|B) = \frac{P(A \cap B)}{P(B)}</math>
 
This may be visualized as restricting the sample space to ''B''. The logic behind this equation is that if the outcomes are restricted to ''B'', this set serves as the new sample space.
 
Note that this is a definition, not a theoretical result. We simply denote the quantity ''P''(''A''&nbsp;∩&nbsp;''B'')/''P''(''B'') by ''P''(''A''|''B'') and call it the conditional probability of ''A'' given ''B''.
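
As an illustration of the definition, the ratio ''P''(''A''&nbsp;∩&nbsp;''B'')/''P''(''B'') can be approximated by a ratio of observed relative frequencies. The following sketch uses an invented coin-flipping setup (two fair flips; ''A''&nbsp;= "first flip is heads", ''B''&nbsp;= "at least one head"), chosen only for illustration.

<syntaxhighlight lang="python">
# Sketch: estimating P(A|B) as a ratio of relative frequencies (invented setup).
import random

random.seed(0)
n = 100_000
count_B = 0
count_A_and_B = 0
for _ in range(n):
    first = random.random() < 0.5    # first coin flip is heads
    second = random.random() < 0.5   # second coin flip is heads
    A = first                        # event A: first flip is heads
    B = first or second              # event B: at least one head
    count_B += B
    count_A_and_B += A and B

# P(A|B) = P(A ∩ B)/P(B); the exact value here is (1/2)/(3/4) = 2/3
print(count_A_and_B / count_B)
</syntaxhighlight>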
 
====As an axiom of probability====
Some authors, such as [[Bruno de Finetti|De Finetti]], prefer to introduce conditional probability as an [[Probability axioms|axiom of probability]]:
 
:<math>P(A \cap B) = P(A|B)P(B)</math>
 
Although mathematically equivalent, this may be preferred philosophically; under major [[probability interpretations]] such as the [[Subjective probability|subjective theory]], conditional probability is considered a primitive entity. Further, this "multiplication axiom" introduces a symmetry with the summation axiom for [[mutually exclusive events]]:<ref>Gillies, Donald (2000); "Philosophical Theories of Probability"; Routledge; Chapter 4 "The subjective theory"</ref>
 
:<math>P(A \cup B) = P(A) + P(B)</math>
 
===Definition with σ-algebra===
 
If ''P''(''B'')&nbsp;=&nbsp;0, then the simple definition of ''P''(''A''|''B'') is [[defined and undefined|undefined]]. However, it is possible to define a conditional probability with respect to a [[sigma algebra|σ-algebra]] of such events (such as those arising from a [[continuous random variable]]).
 
For example, if ''X'' and ''Y'' are non-degenerate and jointly continuous random variables with density ''ƒ''{{sub|''X'',''Y''}}(''x'',&nbsp;''y'') then, if ''B'' has positive [[Measure (mathematics)|measure]],
:<math>
P(X \in A \mid Y \in B) =
\frac{\int_{y\in B}\int_{x\in A} f_{X,Y}(x,y)\,dx\,dy}{\int_{y\in B}\int_{x\in\Omega} f_{X,Y}(x,y)\,dx\,dy} .</math>
 
The case where ''B'' has zero measure can be dealt with directly only when ''B''&nbsp;=&nbsp;{''y''{{sub|0}}}, representing a single point, in which case
:<math>
P(X \in A \mid Y = y_0) = \frac{\int_{x\in A} f_{X,Y}(x,y_0)\,dx}{\int_{x\in\Omega} f_{X,Y}(x,y_0)\,dx} .
</math>
 
If ''A'' has measure zero then the conditional probability is zero. An indication of why the more general case of zero measure cannot be dealt with in a similar way can be seen by noting that the limit, as all ''δy''{{sub|''i''}} approach zero, of
:<math>
P(X \in A \mid Y \in \cup_i[y_i,y_i+\delta y_i]) \approxeq
\frac{\sum_{i} \int_{x\in A} f_{X,Y}(x,y_i)\,dx\,\delta y_i}{\sum_{i}\int_{x\in\Omega} f_{X,Y}(x,y_i) \,dx\, \delta y_i} , </math>
depends on their relationship as they approach zero. See [[conditional expectation]] for more information.
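
The displayed ratio of integrals can be evaluated numerically. The sketch below uses an invented joint density, ''f''(''x'',''y'')&nbsp;=&nbsp;''x''&nbsp;+&nbsp;''y'' on the unit square, and a crude midpoint rule; none of this comes from the cited sources, it only illustrates the formula for ''P''(''X''&nbsp;∈&nbsp;''A''&nbsp;|&nbsp;''Y''&nbsp;∈&nbsp;''B'').

<syntaxhighlight lang="python">
# Sketch: P(X in A | Y in B) as a ratio of integrals of a joint density.
# The density f(x, y) = x + y on [0, 1] x [0, 1] is chosen only for illustration.

def f(x, y):
    return x + y                       # joint density on the unit square

def riemann_2d(f, x_lo, x_hi, y_lo, y_hi, n=200):
    """Crude midpoint Riemann sum of f over a rectangle."""
    dx = (x_hi - x_lo) / n
    dy = (y_hi - y_lo) / n
    total = 0.0
    for i in range(n):
        x = x_lo + (i + 0.5) * dx
        for j in range(n):
            y = y_lo + (j + 0.5) * dy
            total += f(x, y)
    return total * dx * dy

A = (0.0, 0.5)                         # event X in [0, 0.5]
B = (0.0, 0.5)                         # event Y in [0, 0.5]

numerator = riemann_2d(f, A[0], A[1], B[0], B[1])       # integral over A x B
denominator = riemann_2d(f, 0.0, 1.0, B[0], B[1])       # integral over Omega x B

print(numerator / denominator)         # approximately 1/3 for this density
</syntaxhighlight>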
 
===Conditioning on a random variable===
Conditioning on an event may be generalized to conditioning on a random variable. Let ''X'' be a discrete random variable taking values ''x''{{sub|''n''}}. Let ''A'' be an event. The conditional probability of ''A'' given ''X'' is defined as the random variable:
 
:<math>P(A|X) \text{  taking on the value } P(A\mid X=x_n) \text{  if } X=x_n</math>
 
More formally:
:<math>P(A|X)(\omega)=P(A\mid X=X(\omega)) .</math>
 
The conditional probability ''P''(''A''|''X'') is a function of ''X'', e.g., if the function ''g'' is defined as
:<math>g(x)= P(A\mid X=x)</math>,
 
then
:<math>P(A|X) =g\circ X</math>
 
Note that ''P''(''A''|''X'') and ''X'' are now both [[random variable]]s. From the [[law of total probability]], the [[expected value]] of ''P''(''A''|''X'') is equal to the unconditional [[probability]] of ''A''.
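
The fact that the expected value of ''P''(''A''|''X'') equals ''P''(''A'') can be verified on a toy model. In the sketch below, the distribution of ''X'' and the conditional probabilities ''g''(''x'') are invented solely for illustration.

<syntaxhighlight lang="python">
# Sketch: P(A|X) as the random variable g(X), with E[P(A|X)] = P(A).
# Toy model (invented): X uniform on {1, 2, 3}; given X = x, A occurs
# with probability x/4.
from fractions import Fraction

p_X = {x: Fraction(1, 3) for x in (1, 2, 3)}     # distribution of X

def g(x):
    """g(x) = P(A | X = x) in this toy model."""
    return Fraction(x, 4)

# The conditional probability P(A|X) is the random variable g(X).
# By the law of total probability its expectation equals P(A).
P_A = sum(g(x) * p_X[x] for x in p_X)
print(P_A)                                       # 1/2
</syntaxhighlight>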
 
==Example==
 
Suppose that somebody secretly rolls two fair six-sided [[dice]], and we must predict the outcome.
 
* Let ''A'' be the value rolled on [[dice|die]] 1
* Let ''B'' be the value rolled on [[dice|die]] 2
 
What is the probability that ''A''&nbsp;=&nbsp;2? Table 1 shows the [[sample space]]. ''A''&nbsp;=&nbsp;2 in 6 of the 36 outcomes, thus ''P''(''A''=2)&nbsp;=&nbsp;{{frac|6|36}}&nbsp;=&nbsp;{{frac|1|6}}.
 
:{| class="wikitable" border="1" style="background:silver; text-align:center; width:300px"
|+ Table 1: sum ''A''&nbsp;+&nbsp;''B'' of the two dice
! scope="col" | +
! scope="col" | B=1
! scope="col" | 2
! scope="col" | 3
! scope="col" | 4
! scope="col" | 5
! scope="col" | 6
|-
! scope="row" | A=1
| 2 || 3 || 4 || 5 || 6 || 7
|- style="background: red;"
! scope="row" | 2
| 3 || 4 || 5 || 6 || 7 || 8
|-
! scope="row" | 3
| 4 || 5 || 6 || 7 || 8 || 9
|-
! scope="row" | 4
| 5 || 6 || 7 || 8 || 9 || 10
|-
! scope="row" | 5
| 6 || 7 || 8 || 9 || 10 || 11
|-
! scope="row" | 6
| 7 || 8 || 9 || 10 || 11 || 12
|}
 
Suppose it is revealed that ''A''+''B''&nbsp;≤&nbsp;5. Table 2 shows that ''A''+''B''&nbsp;≤&nbsp;5 for 10 outcomes. For 3 of these, ''A''&nbsp;=&nbsp;2. So the probability that ''A''&nbsp;=&nbsp;2 ''given that'' ''A''+''B''&nbsp;≤&nbsp;5 is ''P''(''A''=2&nbsp;|&nbsp;''A''+''B''&nbsp;≤&nbsp;5)&nbsp;=&nbsp;{{frac|3|10}}&nbsp;=&nbsp;0.3.
 
:{| class="wikitable" border="1" style="text-align:center; width:300px"
|+ Table 2: sum ''A''&nbsp;+&nbsp;''B'', with outcomes satisfying ''A''&nbsp;+&nbsp;''B''&nbsp;≤&nbsp;5 shaded
! scope="col" | +
! scope="col" | B=1
! scope="col" | 2
! scope="col" | 3
! scope="col" | 4
! scope="col" | 5
! scope="col" | 6
|-
! scope="row" | A=1
| style="background:silver;" | 2 || style="background:silver;" | 3 || style="background:silver;" | 4 || style="background:silver;" | 5 || 6 || 7
|-
! scope="row" | 2
| style="background:red;" | 3 || style="background:red;" | 4 || style="background:red;" | 5 || 6 || 7 || 8
|-
! scope="row" | 3
| style="background:silver;" | 4 || style="background:silver;" | 5 || 6 || 7 || 8 || 9
|-
! scope="row" | 4
| style="background:silver;" | 5 || 6 || 7 || 8 || 9 || 10
|-
! scope="row" | 5
| 6 || 7 || 8 || 9 || 10 || 11
|-
! scope="row" | 6
| 7 || 8 || 9 || 10 || 11 || 12
|}
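
The probabilities in this example can be re-derived by enumerating the 36 equally likely outcomes; the following sketch simply repeats the counting done in Tables 1 and 2.

<syntaxhighlight lang="python">
# Sketch: re-deriving the dice probabilities of Tables 1 and 2 by enumeration.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))           # all 36 pairs (A, B)

p_A2 = sum(1 for a, b in outcomes if a == 2) / len(outcomes)

restricted = [(a, b) for a, b in outcomes if a + b <= 5]  # the 10 shaded cells
p_A2_given_sum = sum(1 for a, b in restricted if a == 2) / len(restricted)

print(p_A2)             # 6/36 ≈ 0.1667
print(p_A2_given_sum)   # 3/10 = 0.3
</syntaxhighlight>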
 
==Statistical independence==
{{Main|Independence (probability theory)}}
 
Events ''A'' and ''B'' are defined to be [[Independence (probability theory)|statistically independent]] if:
 
:<math>P(A \cap B) \ = \ P(A) P(B)</math>
:<math>\Leftrightarrow P(A|B) \ = \ P(A)</math>
:<math>\Leftrightarrow P(B|A) \ = \ P(B)</math>.
 
That is, the occurrence of ''A'' does not affect the probability of ''B'', and vice versa. Although the derived forms may seem more intuitive, they are not the preferred definition, as the conditional probabilities may be undefined if ''P''(''A'') or ''P''(''B'') is 0, and the preferred definition is symmetrical in ''A'' and ''B''.
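
On the two-dice sample space of the example above, the defining product identity can be checked directly. In the sketch below, the events "''A''&nbsp;=&nbsp;2" and "''B''&nbsp;=&nbsp;3" are independent, while "''A''&nbsp;=&nbsp;2" and "''A''&nbsp;+&nbsp;''B''&nbsp;≤&nbsp;5" are not.

<syntaxhighlight lang="python">
# Sketch: checking the product definition of independence on the two-dice space.
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    """Probability of an event (a predicate on outcomes) under the uniform measure."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

E1 = lambda o: o[0] == 2             # "A = 2"
E2 = lambda o: o[1] == 3             # "B = 3"
E3 = lambda o: o[0] + o[1] <= 5      # "A + B <= 5"

print(prob(lambda o: E1(o) and E2(o)) == prob(E1) * prob(E2))   # True: independent
print(prob(lambda o: E1(o) and E3(o)) == prob(E1) * prob(E3))   # False: not independent
</syntaxhighlight>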
 
==Common fallacies==
 
:''These fallacies should not be confused with Robert K. Shope's 1978 [http://lesswrong.com/r/discussion/lw/9om/the_conditional_fallacy_in_contemporary_philosophy/ "conditional fallacy"], which deals with counterfactual examples that [[beg the question]].''
 
===Assuming conditional probability is of similar size to its inverse=== <!-- Brief example (+ diagram?) would be nice here -->
{{main|Confusion of the inverse}}
 
In general, it cannot be assumed that ''P''(''A''|''B'')&nbsp;≈&nbsp;''P''(''B''|''A''). This can be an insidious error, even for those who are highly conversant with statistics.<ref>Paulos, J.A. (1988) ''Innumeracy: Mathematical Illiteracy and its Consequences'', Hill and Wang. ISBN 0-8090-7447-8 (p. 63 ''et seq.'')</ref> The relationship between ''P''(''A''|''B'') and ''P''(''B''|''A'') is given by [[Bayes' theorem]]:
 
:<math>P(B|A) = \frac{P(A|B) P(B)}{P(A)}.</math>
 
That is, ''P''(''A''|''B'')&nbsp;≈&nbsp;''P''(''B''|''A'') only if ''P''(''B'')/''P''(''A'')&nbsp;≈&nbsp;1, or equivalently, ''P''(''A'')&nbsp;≈&nbsp;''P''(''B'').
 
Alternatively, noting that ''A'' ∩ ''B'' = ''B'' ∩ ''A'', and applying conditional probability:
 
:<math>P(A \cap B) = P(A|B)P(B)= P(B \cap A) = P(B|A)P(A)</math>
 
Rearranging gives the result.
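
A small numerical sketch makes the point concrete; the probabilities below are invented for illustration only and show that ''P''(''A''|''B'') and ''P''(''B''|''A'') can be far apart when ''P''(''A'') and ''P''(''B'') differ greatly.

<syntaxhighlight lang="python">
# Sketch: P(A|B) and P(B|A) can differ greatly.  All numbers are invented.
P_B = 0.01                   # P(B)
P_A_given_B = 0.95           # P(A|B)
P_A_given_notB = 0.05        # P(A|B^c)

# Law of total probability for P(A), then Bayes' theorem for P(B|A).
P_A = P_A_given_B * P_B + P_A_given_notB * (1 - P_B)
P_B_given_A = P_A_given_B * P_B / P_A

print(P_A_given_B)           # 0.95
print(round(P_B_given_A, 3)) # 0.161 — far from 0.95, since P(B) is much smaller than P(A)
</syntaxhighlight>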
 
===Assuming marginal and conditional probabilities are of similar size=== <!-- Diagram might be nice here  -->
In general, it cannot be assumed that ''P''(''A'')&nbsp;≈&nbsp;''P''(''A''|''B''). These probabilities are linked through the formula for [[total probability]]:
 
:<math>P(A) \, = \, \sum_n P(A \cap B_n) \, = \, \sum_n P(A|B_n)P(B_n),</math>

where the events <math>B_n</math> are pairwise disjoint (<math>B_i \cap B_j = \emptyset</math> for <math>i \neq j</math>) and together cover the whole sample space. This fallacy may arise through [[selection bias]].<ref>Thomas Bruss, F; Der Wyatt Earp Effekt; Spektrum der Wissenschaft; March 2007</ref> For example, in the context of a medical claim, let ''S''{{sub|''C''}} be the event that a [[sequelae|sequela]] (chronic disease) ''S'' occurs as a consequence of circumstance (acute condition) ''C''. Let ''H'' be the event that an individual seeks medical help. Suppose that in most cases, ''C'' does not cause ''S'', so ''P''(''S''{{sub|''C''}}) is low. Suppose also that medical attention is sought only if ''S'' has occurred due to ''C''. From experience of patients, a doctor may therefore erroneously conclude that ''P''(''S''{{sub|''C''}}) is high. The actual probability observed by the doctor is ''P''(''S''{{sub|''C''}}|''H'').
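
The total probability formula above can be illustrated with a small invented partition; ''P''(''A'') is a weighted average of the conditional probabilities and need not be close to any single ''P''(''A''|''B''{{sub|''n''}}).

<syntaxhighlight lang="python">
# Sketch: P(A) as a weighted average of P(A|B_n) over an invented partition.
from fractions import Fraction

P_Bn = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]           # P(B_n), summing to 1
P_A_given_Bn = [Fraction(1, 10), Fraction(1, 2), Fraction(9, 10)]  # P(A|B_n)

P_A = sum(pa * pb for pa, pb in zip(P_A_given_Bn, P_Bn))
print(P_A)    # 11/30 — need not be close to any single P(A|B_n)
</syntaxhighlight>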
 
===Over- or under-weighting priors===
 
Not taking prior probability into account partially or completely is called ''[[base rate neglect]]''. The reverse, insufficient adjustment from the prior probability, is ''[[conservatism (Bayesian)|conservatism]]''.
 
==Formal derivation==
 
Formally, ''P''(''A''|''B'') is defined as the probability of ''A'' according to a new probability function on the sample space, such that outcomes not in ''B'' have probability 0 and that it is consistent with the original [[probability measure]].<ref>George Casella and Roger L. Berger (1990), ''Statistical Inference'', Duxbury Press, ISBN 0-534-11958-1 (p. 18 ''et seq.'')</ref><ref name="grinstead">[http://math.dartmouth.edu/~prob/prob/prob.pdf Grinstead and Snell's Introduction to Probability], p. 134</ref>
 
Let Ω be a [[sample space]] with [[elementary event]]s {ω}. Suppose we are told the event ''B''&nbsp;⊆&nbsp;Ω has occurred. A new probability distribution (denoted by the conditional notation) is to be assigned on {ω} to reflect this. For events in ''B'', it is reasonable to assume that the relative magnitudes of the probabilities will be preserved. For some constant scale factor α, the new distribution will therefore satisfy:
 
:<math>\text{1. }\omega \in B : P(\omega|B) = \alpha P(\omega)</math>
:<math>\text{2. }\omega \notin B : P(\omega|B) = 0</math>
:<math>\text{3. }\sum_{\omega \in \Omega} {P(\omega|B)} = 1.</math>
 
Substituting 1 and 2 into 3 to select α:
 
:<math>\begin{align}
\sum_{\omega \in \Omega} {P(\omega | B)} &= \sum_{\omega \in B} {\alpha P(\omega)} + \cancelto{0}{\sum_{\omega \notin B} 0} \\
&= \alpha \sum_{\omega \in B} {P(\omega)} \\
&= \alpha \cdot P(B) \\
\end{align}
</math>
 
:<math>\implies \alpha = \frac{1}{P(B)}</math>
 
So the new probability distribution is
 
:<math>\text{1. }\omega \in B : P(\omega|B) = \frac{P(\omega)}{P(B)}</math>
:<math>\text{2. }\omega \notin B : P(\omega| B) = 0</math>
 
Now for a general event ''A'',
 
:<math>\begin{align}
P(A|B) &= \sum_{\omega \in A \cap B} {P(\omega | B)} + \cancelto{0}{\sum_{\omega \in A \cap B^c} P(\omega|B)} \\
&= \sum_{\omega \in A \cap B} {\frac{P(\omega)}{P(B)}} \\
&= \frac{P(A \cap B)}{P(B)}
\end{align}</math>
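
On a finite sample space the derivation can be verified directly. The sketch below uses an invented four-point measure and checks that the choice α&nbsp;=&nbsp;1/''P''(''B'') makes the conditional probabilities sum to 1 and reproduces ''P''(''A''&nbsp;∩&nbsp;''B'')/''P''(''B'').

<syntaxhighlight lang="python">
# Sketch: the formal derivation on an invented four-point sample space.
from fractions import Fraction

P = {"a": Fraction(1, 2), "b": Fraction(1, 4), "c": Fraction(1, 8), "d": Fraction(1, 8)}

B = {"b", "c", "d"}
A = {"a", "b"}

alpha = 1 / sum(P[w] for w in B)                         # alpha = 1/P(B) = 2
P_given_B = {w: (alpha * P[w] if w in B else Fraction(0)) for w in P}

print(sum(P_given_B.values()))                           # 1: a probability measure
print(sum(P_given_B[w] for w in A))                      # P(A|B) = 1/2
print(sum(P[w] for w in A & B) / sum(P[w] for w in B))   # P(A ∩ B)/P(B) = 1/2, the same
</syntaxhighlight>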
 
==See also==
*[[Borel–Kolmogorov paradox]]
*[[Chain rule (probability)]]
*[[Class membership probabilities]]
*[[Conditional probability distribution]]
*[[Conditioning (probability)]]
*[[Joint probability distribution]]
*[[Monty Hall problem]]
*[[Posterior probability]]
 
==References==
{{Reflist}}
 
==External links==
* {{MathWorld | urlname=ConditionalProbability | title=Conditional Probability}}
*[[F. Thomas Bruss]] Der Wyatt-Earp-Effekt oder die betörende Macht kleiner Wahrscheinlichkeiten (in German), Spektrum der Wissenschaft (German Edition of Scientific American), Vol 2, 110&ndash;113, (2007).
*[http://eduflix.tv/ask/tags/probability/ Conditional Probability Problems with Solutions]
 
[[Category:Probability theory]]
[[Category:Logical fallacies]]
[[Category:Conditionals]]
[[Category:Statistical ratios]]
