In [[statistics]], particularly [[analysis of variance]] and [[linear regression]], a '''contrast''' is a linear combination of two or more factor level means ([[average]]s) whose coefficients add up to zero.<ref name="nist">[http://www.itl.nist.gov/div898/handbook/prc/section4/prc426.htm NIST/SEMATECH e-Handbook of Statistical Methods]</ref><ref>Everitt, B.S. (2002) ''Cambridge Dictionary of Statistics'', CUP, ISBN 0-521-81099-X (entry for "Contrast")</ref> Contrasts should be constructed "to answer specific research questions", and do not necessarily have to be orthogonal.<ref name=Kuehl>{{cite book|last=Kuehl|first=Robert O.|title=Design of experiments : statistical principles of research design and analysis|date=2000|publisher=Duxbury/Thomson Learning|location=Pacific Grove, CA|isbn=0534368344|edition=2nd ed.}}</ref>
A contrast is defined as the sum of each group mean multiplied by a coefficient for each group (i.e., a signed number, c<sub>j</sub>).<ref name="Clark">{{cite book|last=Clark|first=James M.|title=Intermediate Data Analysis: Multiple Regression and Analysis of Variance|year=2007|location=University of Winnipeg}}</ref> In equation form,

<math> L = c_1 \bar X_1 + c_2 \bar X_2 + \cdots + c_k \bar X_k = \sum c_j \bar X_j </math>, where ''L'' is the weighted sum of group means, the c<sub>j</sub> coefficients represent the assigned weights of the means (these must sum to zero), and <math>\bar X_j</math> represents the group means.<ref name="Howell">{{cite book|last=Howell|first=David C.|title=Statistical methods for psychology|year=2010|publisher=Thomson Wadsworth|location=Belmont, CA|isbn=978-0-495-59784-1|edition=7th ed.}}</ref> Coefficients can be positive or negative, fractions or whole numbers, depending on the comparison of interest. Linear contrasts are very useful and can be used to test complex hypotheses in conjunction with ANOVA or multiple regression. In essence, each contrast defines and tests for a particular pattern of differences among the means.<ref name="Clark" />

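
As an illustration, the value of ''L'' can be computed directly from the definition above; this is a minimal sketch, and the group means and coefficients below are hypothetical examples:

```python
# Sketch: computing a linear contrast L = sum(c_j * xbar_j).
# The group means and coefficients are hypothetical examples.

def contrast_value(coeffs, means):
    """Weighted sum of group means; the coefficients must sum to zero."""
    assert abs(sum(coeffs)) < 1e-12, "contrast coefficients must sum to zero"
    return sum(c * m for c, m in zip(coeffs, means))

# Coefficients (1, -1, 0, 0) compare the first two group means only.
means = [10.0, 12.0, 11.0, 15.0]
L = contrast_value([1, -1, 0, 0], means)
print(L)  # -2.0
```

Here the coefficients (+1, −1, 0, 0) compare the first two group means and ignore the other two.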
==Background==

A simple contrast is the difference between two means. A more complex contrast can test differences among several means (e.g., with four means, assign coefficients of −3, −1, +1, and +3), or test the difference between a single mean and the combined mean of several groups (e.g., with four means, assign coefficients of −3, +1, +1, and +1), or test the difference between the combined mean of several groups and the combined mean of several other groups (e.g., with four means, assign coefficients of −1, −1, +1, and +1).<ref name="Howell" /> The coefficients for the means to be combined (or averaged) must be the same in magnitude and direction; in other words, they are weighted equally. When means are assigned different coefficients (either in magnitude or direction, or both), the contrast tests for a difference between those means. A ''contrast'' may refer to any of the following: the set of coefficients used to specify a comparison; the specific value of the linear combination obtained for a given study or experiment; or the [[random variable|random quantity]] defined by applying the linear combination to treatment effects when these are themselves considered as random variables. In the last context, the term '''contrast variable''' is sometimes used.
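
The three coefficient patterns described above can be written out and checked; this is a minimal sketch using only the example patterns from the text:

```python
# Sketch: the example coefficient sets from the text; each is a valid
# contrast because its coefficients sum to zero.
patterns = {
    "trend across four means":    [-3, -1, +1, +3],
    "one mean vs. three others":  [-3, +1, +1, +1],
    "first pair vs. second pair": [-1, -1, +1, +1],
}
for name, coeffs in patterns.items():
    print(name, "sums to", sum(coeffs))  # each sums to 0
```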

Contrasts are sometimes used to compare [[mixed model|mixed effects]]. A common example is the difference between two test scores: one at the beginning of the semester and one at its end. We are not interested in either score by itself, but only in the contrast (in this case, the difference). Since this is a linear combination of independent variables, its variance is the correspondingly weighted sum of the individual variances; in this case both squared weights are one. This "blending" of two variables into one can be useful in many settings, such as [[ANOVA]], [[Regression analysis|regression]], or even as a descriptive statistic in its own right.
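
Because the weights for a difference are +1 and −1, the variance of the contrast is the sum of the two variances. A minimal sketch with hypothetical score variances:

```python
# Sketch: Var(L) = sum(c_j**2 * Var(X_j)) for a contrast of independent
# variables. The score variances below are hypothetical.

def contrast_variance(coeffs, variances):
    return sum(c * c * v for c, v in zip(coeffs, variances))

var_begin, var_end = 25.0, 16.0  # variances of the two test scores
print(contrast_variance([1, -1], [var_begin, var_end]))  # 41.0
```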

An example of a complex contrast is comparing five standard treatments to a new treatment, giving each standard treatment mean a weight of 1/5 and the new (sixth) treatment mean a weight of −1 (using the equation above). If the value of this linear combination is zero, the old treatments are, on average, not different from the new treatment. If the value of the linear combination is positive, the combined mean of the five standard treatments is higher than the new treatment mean; if it is negative, the combined mean of the five standard treatments is lower than the new treatment mean.<ref name="Clark" /> However, the value of the linear combination is not a significance test; see testing significance below to learn how to determine whether a contrast is significant.
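
A sketch of this comparison, using hypothetical treatment means:

```python
# Sketch: five standard treatments weighted 1/5 each vs. a new treatment
# weighted -1. All treatment means are hypothetical.
standard_means = [8.0, 9.0, 7.5, 8.5, 9.5]
new_mean = 9.2
coeffs = [1 / 5] * 5 + [-1]
L = sum(c * m for c, m in zip(coeffs, standard_means + [new_mean]))
print(L)
# L > 0: the standard treatments average higher than the new treatment;
# L < 0: the new treatment mean is higher (as in this hypothetical case).
```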

The usual results for linear combinations of [[Statistical independence|independent random variables]] mean that the variance of a contrast is equal to the weighted sum of the variances.<ref name="nist"/> If two contrasts are [[Orthogonality|orthogonal]], estimates created by using them will be [[uncorrelated]], which helps to control the [[Type I and type II errors|Type I error rate]], the rate of falsely rejecting a true [[null hypothesis]]. Because orthogonal contrasts test different aspects of the data, they are independent: the results of one contrast have no effect on the results of the others. When contrasts are not orthogonal, they are not testing completely different aspects of the data, and the results of one contrast can influence the results of others. This can increase the chance of falsely rejecting a true null hypothesis.<ref name="Howell" />
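
Orthogonality of a pair of contrasts can be checked by summing the cross-products of their coefficients (assuming equal group sizes); the coefficient sets below are illustrative:

```python
# Sketch: two contrasts are orthogonal when the cross-products of their
# coefficients sum to zero (equal group sizes assumed).

def are_orthogonal(c1, c2):
    return sum(a * b for a, b in zip(c1, c2)) == 0

print(are_orthogonal([1, -1, 0, 0], [0, 0, 1, -1]))  # True
print(are_orthogonal([1, -1, 0, 0], [1, 0, -1, 0]))  # False
```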

If orthogonal contrasts are available, it is possible to summarize the results of a statistical analysis in the form of a simple analysis of variance table, in such a way that it contains the results for different test statistics relating to different contrasts, each of which is statistically independent. Linear contrasts can easily be converted into [[partition of sums of squares|sums of squares]]: SS<sub>contrast</sub> = <math>\tfrac{n(\sum c_j \bar X_j)^2 }{\sum c_j^2} </math>, with 1 [[Degrees of freedom (statistics)|degree of freedom]], where ''n'' represents the number of observations per group. If the contrasts are orthogonal, the sum of the SS<sub>contrast</sub> values equals SS<sub>treatment</sub>. Testing the significance of a contrast requires the computation of SS<sub>contrast</sub>.<ref name="Howell" /> A recent development in statistical analysis is the [[standardized mean of a contrast variable]], which compares the size of the differences between groups, as measured by a contrast, with the accuracy with which that contrast can be measured by a given study or experiment.<ref name=ZhangBook2011>{{cite book |author=Zhang XHD |year=2011 |title=Optimal High-Throughput Screening: Practical Experimental Design and Data Analysis for Genome-scale RNAi Research |publisher=Cambridge University Press |isbn=978-0-521-73444-8}}</ref>
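
As a numerical sketch (with hypothetical group means and a hypothetical group size ''n''), the SS values of a complete orthogonal set add up to SS<sub>treatment</sub>:

```python
# Sketch: SS_contrast = n * L**2 / sum(c_j**2); for a complete set of
# orthogonal contrasts the SS values sum to SS_treatment.
# Group means and n are hypothetical; equal group sizes are assumed.

def ss_contrast(coeffs, means, n):
    L = sum(c * m for c, m in zip(coeffs, means))
    return n * L * L / sum(c * c for c in coeffs)

means = [4.0, 6.0, 5.0]          # three hypothetical group means
n = 10                           # observations per group
grand = sum(means) / len(means)  # grand mean
ss_treatment = n * sum((m - grand) ** 2 for m in means)

# A complete orthogonal set for k = 3 groups (k - 1 = 2 contrasts).
orthogonal_set = [[1, -1, 0], [1, 1, -2]]
total = sum(ss_contrast(c, means, n) for c in orthogonal_set)
print(ss_treatment, total)  # the two values agree
```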

==Types of contrast==

*'''Orthogonal contrasts''' are a set of contrasts in which, for any distinct pair, the sum of the cross-products of the coefficients is zero (assuming equal sample sizes).<ref name=EV>Everitt, B.S. (2002) ''The Cambridge Dictionary of Statistics'', CUP. ISBN 0-521-81099-X (entry for "Orthogonal contrasts")</ref> Although there are potentially infinitely many sets of orthogonal contrasts, within any given set there will always be a maximum of exactly ''k'' − 1 possible orthogonal contrasts (where ''k'' is the number of group means available).<ref name="Howell" />
*'''Polynomial contrasts''' are a special set of orthogonal contrasts that test polynomial patterns in data with more than two means (e.g., linear, quadratic, cubic, quartic).<ref>{{cite web|last=Kim|first=Jong Sung|title=Orthogonal Polynomial Contrasts|url=http://www.mth.pdx.edu/~jkim/Teaching/S4566/S2012/Handout/Hd6_1%28Polycnst_RCBD%29.pdf|accessdate=27 April 2012}}</ref>
*'''Orthonormal contrasts''' are orthogonal contrasts which satisfy the additional condition that, for each contrast, the squares of the coefficients sum to one.<ref name=EV/>
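
A sketch of rescaling an orthogonal contrast into an orthonormal one (the coefficients are illustrative):

```python
# Sketch: dividing a contrast's coefficients by their Euclidean norm so
# that the squared coefficients sum to one. Coefficients are examples.
import math

def orthonormalize(coeffs):
    norm = math.sqrt(sum(c * c for c in coeffs))
    return [c / norm for c in coeffs]

c = orthonormalize([1, -1, 0, 0])
print(c)                      # roughly [0.707, -0.707, 0.0, 0.0]
print(sum(x * x for x in c))  # 1.0, up to floating-point rounding
```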

==Testing significance==

SS<sub>contrast</sub> is also a mean square, because every contrast has 1 degree of freedom. Dividing MS<sub>contrast</sub> by MS<sub>error</sub> produces an [[F-test|F-statistic]] with one and df<sub>error</sub> degrees of freedom; the [[statistical significance]] of F<sub>contrast</sub> can be determined by comparing the obtained F-statistic with a critical value of F with the same degrees of freedom.<ref name="Howell" />
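
A numerical sketch of the computation (all values below are hypothetical):

```python
# Sketch: F = MS_contrast / MS_error. Because a contrast has 1 degree of
# freedom, MS_contrast equals SS_contrast. All values are hypothetical.
ss_contrast = 20.0  # sum of squares for the contrast (1 df)
ms_error = 3.5      # mean square error from the ANOVA table
df_error = 27       # error degrees of freedom

f_stat = ss_contrast / ms_error
print(f_stat)
# Compare f_stat with the critical value of F(1, df_error) at the chosen
# alpha; for alpha = 0.05 and df = (1, 27) that value is about 4.21.
```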

==References==
| {{reflist}}

==External links==
* [http://www.soton.ac.uk/~cpd/anovas/datasets/Orthogonal%20contrasts.htm Examples of orthogonal contrasts for analysis of variance]
* [http://www.utdallas.edu/~herve/abdi-contrasts2010-pretty.pdf Contrast Analysis (Abdi & Williams, 2010)]

[[Category:Analysis of variance]]
[[Category:Statistical terminology]]
[[Category:Multiple comparisons]]