Cryoscopic constant: Difference between revisions

In [[probability theory]], two [[random variable]]s being [[Pearson product-moment correlation coefficient|uncorrelated]] does not imply their [[statistical independence|independence]].  In some contexts, uncorrelatedness implies at least [[pairwise independence]] (as when the random variables involved have [[Bernoulli distribution]]s).


It is sometimes mistakenly thought that one context in which uncorrelatedness implies independence is when the random variables involved are [[normal distribution|normally distributed]]. However, this is incorrect if the variables are merely [[Marginal distribution|marginally]] normally distributed but not [[multivariate normal distribution|jointly normally distributed]].
 
 
Suppose two random variables ''X'' and ''Y'' are ''jointly'' normally distributed.  That is the same as saying that the random vector (''X'',&nbsp;''Y'') has a [[multivariate normal distribution]].  It means that the [[joint probability distribution]] of ''X'' and ''Y'' is such that each linear combination of ''X'' and ''Y'' is normally distributed, i.e. for any two constant (i.e., non-random) scalars ''a'' and ''b'', the random variable ''aX''&nbsp;+&nbsp;''bY'' is normally distributed.  ''In that case'' if ''X'' and ''Y'' are uncorrelated, i.e., their [[covariance]] cov(''X'',&nbsp;''Y'') is zero, ''then'' they are independent.<ref>{{cite book |title=Probability and Statistical Inference |year=2001 |last1=Hogg |first1=Robert |authorlink1=Robert Hogg |last2=Tanis |first2=Elliot | authorlink2=Elliot Tanis |edition=6th |chapter=Chapter 5.4 The Bivariate Normal Distribution |pages=258–259 |isbn=0130272949}}</ref> ''However'', it is possible for two random variables ''X'' and ''Y'' to be so distributed jointly that each one alone is marginally normally distributed, and they are uncorrelated, but they are not independent; examples are given below.
 
 
==Examples==
 
 
=== A symmetric example ===
 
 
[[File:uncorrelated sym.png|thumb|alt=Two normally distributed, uncorrelated but dependent variables.|Joint range of ''X'' and ''Y''. Darker indicates higher value of the density function.]]
 
 
Suppose ''X'' has a normal distribution with [[expected value]] 0 and variance 1. Let ''W'' have the [[Rademacher distribution]], so that ''W'' = 1 or &minus;1, each with probability 1/2, and assume ''W'' is independent of ''X''. Let ''Y''&nbsp;=&nbsp;''WX''.  Then<ref>[http://www.math.uiuc.edu/~r-ash/Stat/StatLec21-25.pdf UIUC, Lecture 21. ''The Multivariate Normal Distribution''], 21.6:"Individually Gaussian Versus Jointly Gaussian".</ref>  
* ''X'' and ''Y'' are uncorrelated;
* Both have the same normal distribution; and
* ''X'' and ''Y'' are not independent.
Note that the distribution of the simple linear combination ''X''&nbsp;+&nbsp;''Y'' concentrates positive probability at 0: Pr(''X''&nbsp;+&nbsp;''Y''&nbsp;=&nbsp;0)&nbsp;=&nbsp;1/2, and so it is not normally distributed. By the definition above, ''X'' and ''Y'' are not jointly normally distributed.
 
To see that ''X'' and ''Y'' are uncorrelated, consider
 
: <math> \begin{align}
\operatorname{cov}(X,Y) &{} = E(XY) - E(X)E(Y) = E(XY) - 0 = E(E(XY\mid W)) \\
& {} = E(X^2)\Pr(W=1) + E(-X^2)\Pr(W=-1) \\
& {} = 1\cdot\frac12 + (-1)\cdot\frac12 = 0.
\end{align}
</math>
 
To see that ''Y'' has the same normal distribution as&nbsp;''X'', consider
 
: <math>\begin{align}
\Pr(Y \le x) & {} = E(\Pr(Y \le x\mid W)) \\
& {} = \Pr(X \le x)\Pr(W = 1) + \Pr(-X\le x)\Pr(W = -1) \\
& {} = \Phi(x) \cdot\frac12 + \Phi(x)\cdot\frac12 = \Phi(x)
\end{align}</math>
 
(since ''X'' and &minus;''X'' both have the same normal distribution), where <math>\Phi(x)</math> is the [[cumulative distribution function]] of the standard normal distribution.
 
To see that ''X'' and ''Y'' are not independent, observe that |''Y''|&nbsp;=&nbsp;|''X''| or that Pr(''Y''&nbsp;>&nbsp;1&nbsp;|&nbsp;''X''&nbsp;=&nbsp;1/2)&nbsp;=&nbsp;0.
 
=== An asymmetric example ===
 
[[File:Uncorrelated asym.png|thumb|The joint density of X and Y. Darker indicates a higher value of the density.]]
 
Suppose ''X'' has a normal distribution with [[expected value]] 0 and variance 1. Let
 
: <math>Y=\left\{\begin{matrix} X & \text{if } \left|X\right| \leq c \\
-X & \text{if } \left|X\right|>c \end{matrix}\right.</math>
 
where ''c'' is a positive number to be specified below. If ''c'' is very small, then the [[correlation]] corr(''X'',&nbsp;''Y'') is near &minus;1; if ''c'' is very large, then corr(''X'',&nbsp;''Y'') is near 1. Since the correlation is a [[continuous function]] of ''c'', the [[intermediate value theorem]] implies there is some particular value of ''c'' that makes the correlation&nbsp;0. That value is approximately&nbsp;1.54.  In that case, ''X'' and ''Y'' are uncorrelated, but they are clearly not independent, since ''X'' completely determines&nbsp;''Y''.
 
To see that ''Y'' is normally distributed&mdash;indeed, that its distribution is the same as that of ''X''&mdash;let us find its [[cumulative distribution function]]:
 
: <math>\begin{align}\Pr(Y \leq x) &= \Pr(\{|X| \leq c\text{ and }X \leq x\}\text{ or }\{|X|>c\text{ and }-X \leq x\})\\
&= \Pr(|X| \leq c\text{ and }X \leq x) + \Pr(|X|>c\text{ and }-X \leq x)\\
&= \Pr(|X| \leq c\text{ and }X \leq x) + \Pr(|X|>c\text{ and }X \leq x) \\
&= \Pr(X \leq x). \end{align}\,</math>
 
where the next-to-last equality follows from the symmetry of the distribution of ''X'' and the symmetry of the condition that |''X''|&nbsp;≤&nbsp;''c''.
 
Observe that the difference ''X''&nbsp;&minus;&nbsp;''Y'' is nowhere near being normally distributed, since it has a substantial probability (about&nbsp;0.88) of being equal to&nbsp;0, whereas the normal distribution, being a continuous distribution, has no discrete part, i.e., does not concentrate more than zero probability at any single point. Consequently ''X'' and ''Y'' are not ''jointly'' normally distributed, even though they are separately normally distributed.<ref>Edward L. Melnick and Aaron Tenenbein, "Misspecifications of the Normal Distribution", ''[[The American Statistician]]'', volume 36, number 4, November 1982, pages 372&ndash;373</ref>
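The value ''c''&nbsp;≈&nbsp;1.54 and the probability Pr(''X''&nbsp;&minus;&nbsp;''Y''&nbsp;=&nbsp;0)&nbsp;≈&nbsp;0.88 can be computed numerically. Since ''X'' and ''Y'' both have unit variance, corr(''X'',&nbsp;''Y'')&nbsp;=&nbsp;E[''XY'']&nbsp;=&nbsp;2E[''X''²;&nbsp;|''X''|&nbsp;≤&nbsp;''c'']&nbsp;&minus;&nbsp;1, and integration by parts gives E[''X''²;&nbsp;|''X''|&nbsp;≤&nbsp;''c'']&nbsp;=&nbsp;2Φ(''c'')&nbsp;&minus;&nbsp;1&nbsp;&minus;&nbsp;2''c''φ(''c''), so corr(''X'',&nbsp;''Y'')&nbsp;=&nbsp;4Φ(''c'')&nbsp;&minus;&nbsp;4''c''φ(''c'')&nbsp;&minus;&nbsp;3. A minimal Python sketch (an illustration, not part of the article) that bisects for the root:

```python
import math

def pdf(x):  # standard normal density phi(x)
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def cdf(x):  # standard normal CDF Phi(x)
    return (1 + math.erf(x / math.sqrt(2))) / 2

def corr(c):
    # corr(X, Y) = E[XY] = 2*E[X^2; |X| <= c] - E[X^2]
    # with E[X^2; |X| <= c] = 2*cdf(c) - 1 - 2*c*pdf(c) by parts.
    return 4 * cdf(c) - 4 * c * pdf(c) - 3

# Bisection: corr(c) is continuous, negative near 0, positive for large c,
# so the intermediate value theorem guarantees a root in between.
lo, hi = 0.5, 3.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if corr(mid) < 0 else (lo, mid)
c = (lo + hi) / 2

print(round(c, 2))               # 1.54
print(round(2 * cdf(c) - 1, 2))  # Pr(X - Y = 0) = Pr(|X| <= c), about 0.88
```

The second printed value is Pr(|''X''|&nbsp;≤&nbsp;''c''), the probability that ''Y''&nbsp;=&nbsp;''X'' and hence that ''X''&nbsp;&minus;&nbsp;''Y''&nbsp;=&nbsp;0, matching the figure of about 0.88 quoted above.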
 
== References ==
{{reflist}}
 
[[Category:Theory of probability distributions]]
[[Category:Statistical dependence]]

Revision as of 07:40, 7 January 2014
