Total sum of squares
In [[statistics|statistical data analysis]] the '''total sum of squares''' (TSS or SST) is a quantity that appears as part of a standard way of presenting results of such analyses. It is defined as being the sum, over all observations, of the squared differences of each observation from the overall [[mean]].<ref>Everitt, B.S. (2002) ''The Cambridge Dictionary of Statistics'', CUP, ISBN 0-521-81099-X</ref>
 
In [[statistics|statistical]] [[linear model]]s (particularly in standard [[regression model]]s), the '''TSS''' is the [[sum]] of the [[square (algebra)|square]]s of the differences between the observed values of the dependent variable and their [[mean]]:
 
:<math>\sum_{i=1}^{n}\left(y_{i}-\bar{y}\right)^2</math>
 
where <math>\bar{y}</math> is the mean of the observed values <math>y_i</math>.
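The definition above can be computed directly. The following is a minimal sketch using NumPy with made-up sample values:

```python
import numpy as np

# Hypothetical sample of observations y_i
y = np.array([2.0, 4.0, 6.0, 8.0])

# Total sum of squares: sum of squared deviations from the sample mean
tss = np.sum((y - y.mean()) ** 2)
print(tss)  # 20.0  (mean is 5; deviations -3, -1, 1, 3)
```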
 
For wide classes of linear models, the total sum of squares equals the [[explained sum of squares]] plus the [[residual sum of squares]]. For a proof of this in the multivariate OLS case, see [[Explained sum of squares#Partitioning in the general OLS model|partitioning in the general OLS model]].
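The partition TSS = ESS + RSS can be checked numerically for an ordinary least squares fit with an intercept; the data below are illustrative, not from any particular source:

```python
import numpy as np

# Hypothetical data for a simple OLS regression with intercept
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.8])

X = np.column_stack([np.ones_like(x), x])     # design matrix with intercept column
beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # OLS coefficient estimates
y_hat = X @ beta                              # fitted values

tss = np.sum((y - y.mean()) ** 2)             # total sum of squares
ess = np.sum((y_hat - y.mean()) ** 2)         # explained sum of squares
rss = np.sum((y - y_hat) ** 2)                # residual sum of squares

print(np.isclose(tss, ess + rss))  # True: TSS = ESS + RSS holds for OLS with intercept
```

Note that the identity relies on the model containing an intercept (so the residuals sum to zero); without one the cross term need not vanish.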
 
In [[analysis of variance]] (ANOVA) the total sum of squares is partitioned into the so-called "within-samples" sum of squares and the "between-samples" sum of squares.
In [[multivariate analysis of variance]] (MANOVA) the following equation applies<ref name="MardiaK1979Multivariate">{{Cite book
| author = [[K. V. Mardia]], J. T. Kent and J. M. Bibby
| title = Multivariate Analysis
| publisher = [[Academic Press]]
| year = 1979
| isbn = 0-12-471252-5
}} Especially chapters 11 and 12.</ref>
:<math>\mathbf{T} = \mathbf{W} + \mathbf{B},</math>
where '''T''' is the total sum of squares and products (SSP) [[Matrix (mathematics)|matrix]], '''W''' is the within-samples SSP matrix and '''B''' is the between-samples SSP matrix.
Similar terminology may also be used in [[linear discriminant analysis]], where '''W''' and '''B''' are respectively referred to as the within-groups and between-groups SSP matrices.<ref name="MardiaK1979Multivariate"/>
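The matrix identity '''T''' = '''W''' + '''B''' can be verified directly from grouped data. A sketch with hypothetical two-group bivariate observations:

```python
import numpy as np

# Hypothetical bivariate observations in two groups
g1 = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 3.0]])
g2 = np.array([[4.0, 5.0], [5.0, 4.0], [6.0, 6.0]])
groups = [g1, g2]

all_obs = np.vstack(groups)
grand_mean = all_obs.mean(axis=0)

def ssp(data, center):
    """Sum-of-squares-and-products matrix of data about a given center."""
    d = data - center
    return d.T @ d

T = ssp(all_obs, grand_mean)                      # total SSP matrix
W = sum(ssp(g, g.mean(axis=0)) for g in groups)   # within-groups SSP matrix
B = sum(len(g) * np.outer(g.mean(axis=0) - grand_mean,
                          g.mean(axis=0) - grand_mean)
        for g in groups)                          # between-groups SSP matrix

print(np.allclose(T, W + B))  # True
```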
 
==See also==
*[[Sum of squares (statistics)]]
*[[Lack-of-fit sum of squares]]
 
==References==
{{Reflist}}
 
[[Category:Regression analysis]]
[[Category:Least squares]]
