{{Regression bar}}
{{context|date=February 2012}}
[[File:Isotonic regression.svg|thumb|400px|An example of isotonic regression]]
 
In [[numerical analysis]], '''isotonic regression''' ('''IR''') involves finding a weighted [[least-squares]] fit <math>x\in \Bbb{R}^n</math> to a [[Euclidean space|vector]] <math>a\in \Bbb{R}^n</math> with weight vector <math>w\in \Bbb{R}^n</math>, subject to a set of non-contradictory constraints of the form <math>x_i \ge x_j</math>.
 
Such constraints define a [[partial order]] or [[total order]] and can be represented as a [[directed graph]] <math>G=(N,E)</math>, where ''N'' is the set of variables involved and ''E'' is the set of pairs (''i'', ''j''), one for each constraint <math>x_i \ge x_j</math>. Thus, the IR problem corresponds to the following [[quadratic programming|quadratic program]] (QP):
 
:<math>\min \sum_{i=1}^n w_i (x_i - a_i)^2</math> <math>\text{subject to }x_i\ge x_j~ \text{ for all } (i,j)\in E.</math>
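As a sketch of this representation (with hypothetical example constraints), each edge <math>(i, j)</math> of the graph simply asserts <math>x_i \ge x_j</math>, so feasibility of a candidate vector can be checked edge by edge:

```python
# Hypothetical constraint set: nodes are variable indices, and an
# edge (i, j) encodes the constraint x_i >= x_j.
E = [(1, 0), (2, 1), (3, 1)]  # x_1 >= x_0, x_2 >= x_1, x_3 >= x_1

def is_feasible(x, edges):
    """Return True if x satisfies every constraint x_i >= x_j."""
    return all(x[i] >= x[j] for i, j in edges)

print(is_feasible([0.0, 1.0, 2.0, 1.5], E))  # True
print(is_feasible([0.0, 1.0, 0.5, 1.5], E))  # False: x_2 < x_1
```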
 
When <math>G=(N,E)</math> is a [[total order]], a simple [[iterative algorithm]] for solving this QP is the [[pool adjacent violators algorithm]] (PAVA). Best and Chakravarti (1990) studied the problem as an active set identification problem and proposed a primal algorithm with the same O(''n'') [[computational complexity theory|complexity]] as the PAVA, which can be seen as a dual algorithm.<ref>{{Cite journal | doi = 10.1007/BF01580873 | last1 = Best | first1 = M. J. | last2 = Chakravarti | first2 = N. | year = 1990 | title = Active set algorithms for isotonic regression; a unifying framework | journal = Mathematical Programming | volume = 47 | pages = 425–439 }}</ref>
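For the totally ordered case (<math>x_1 \le x_2 \le \ldots \le x_n</math>), PAVA scans the data once, pooling adjacent values that violate the order into blocks carrying their weighted mean. A minimal Python sketch, written as an illustration rather than a reproduction of any particular paper's formulation:

```python
def pava(a, w):
    """Pool Adjacent Violators Algorithm for weighted least-squares
    isotonic regression on a totally ordered sequence.

    Returns x minimizing sum(w_i * (x_i - a_i)**2) subject to
    x_1 <= x_2 <= ... <= x_n.
    """
    # Each block stores [weighted mean, total weight, count of pooled points].
    blocks = []
    for ai, wi in zip(a, w):
        blocks.append([ai, wi, 1])
        # Pool adjacent blocks while the monotonicity constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(w1 * m1 + w2 * m2) / wt, wt, n1 + n2])
    # Expand the blocks back to one fitted value per input point.
    x = []
    for mean, _, count in blocks:
        x.extend([mean] * count)
    return x
```

For example, `pava([1, 3, 2, 4], [1, 1, 1, 1])` pools the violating pair (3, 2) into their mean, giving the non-decreasing fit `[1, 2.5, 2.5, 4]`.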
 
IR has applications in [[statistical inference]], for example, to fit an isotonic curve to mean experimental results when an order is expected. A benefit of isotonic regression is that it does not assume any form for the target function, such as the linearity assumed by [[linear regression]].
 
Another application is nonmetric [[multidimensional scaling]],<ref>{{Cite journal | doi = 10.1007/BF02289694 | author = [[Joseph Kruskal|Kruskal, J. B.]] | year = 1964 | title = Nonmetric Multidimensional Scaling: A numerical method | journal = Psychometrika | volume = 29 | issue = 2 | pages = 115–129 }}</ref> where a low-dimensional [[embedding]] for data points is sought such that the order of distances between points in the embedding matches the order of dissimilarity between points. Isotonic regression is used iteratively to fit ideal distances to preserve relative dissimilarity order.
 
Isotonic regression is also sometimes referred to as ''[[monotonic]] regression''. Strictly speaking, ''isotonic'' is used when the direction of the trend is increasing, while ''monotonic'' could imply a trend that is either increasing or decreasing.
 
Isotonic regression under the <math>L_p</math> norm for <math>p>0</math> is defined as follows:
 
:<math>\min \sum_{i=1}^n w_i |x_i - a_i|^p</math> <math>\text{subject to }x_i\ge x_j~ \text{ for all } (i,j)\in E.</math>
 
== Simply ordered case ==
To illustrate the above, suppose the observation points are ordered so that
<math>x_1 \leq x_2 \leq \ldots \leq x_n</math>, with observed values <math>f(x_i)</math> and weights <math>w_i \geq 0</math>.
 
The isotonic estimator, <math>g^*</math>, minimizes the weighted least-squares criterion over non-decreasing functions:
 
:<math>\min_g \sum_{i=1}^n w_i (g(x_i) - f(x_i))^2 \quad \text{subject to } g(x_1) \le g(x_2) \le \ldots \le g(x_n),</math>
 
where <math>g</math> is the unknown non-decreasing function being estimated, and <math>f</math> gives the observed values, which need not themselves be ordered.
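As a small numerical sketch with hypothetical data (the observed values are deliberately non-monotone, since a monotone <math>f</math> would make the fit trivial): for <math>f = (3, 1, 2)</math> and <math>w = (1, 2, 1)</math>, the first two values violate the order and are pooled to their weighted mean <math>(1\cdot 3 + 2\cdot 1)/3 = 5/3</math>, giving the fit <math>g^* = (5/3,\, 5/3,\, 2)</math>. The snippet below checks this against a brute-force search over monotone candidates on a grid:

```python
from itertools import product

# Hypothetical data: non-monotone observations with weights.
f = [3.0, 1.0, 2.0]
w = [1.0, 2.0, 1.0]

def objective(g):
    """Weighted least-squares criterion sum w_i * (g_i - f_i)^2."""
    return sum(wi * (gi - fi) ** 2 for wi, gi, fi in zip(w, g, f))

# Pooling the violating pair (3, 1) to its weighted mean 5/3.
pooled = [5.0 / 3.0, 5.0 / 3.0, 2.0]

# Brute-force check: the pooled fit beats every monotone candidate
# on a coarse grid of values 0.0, 0.1, ..., 4.0.
grid = [i / 10.0 for i in range(41)]
best = min(
    (g for g in product(grid, repeat=3) if g[0] <= g[1] <= g[2]),
    key=objective,
)
print(objective(pooled) <= objective(best) + 1e-9)  # True
```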
 
Software for computing isotone (monotonic) regression has been developed for the R statistical environment.<ref>{{cite journal|last1=De Leeuw|first1=Jan|last2=Hornik|first2=K.|last3=Mair|first3=P.|title=Isotone Optimization in R: Pool-Adjacent-Violators Algorithm (PAVA) and Active Set Methods|journal=Journal of Statistical Software|year=2009|volume=32|issue=5|pages=1}}</ref>
 
== References ==
{{Reflist}}
 
== Further reading ==
* {{cite book |last=Robertson |first=T. |last2=Wright |first2=F. T. |last3=Dykstra |first3=R. L. |year=1988 |title=Order restricted statistical inference |location=New York |publisher=Wiley |isbn=0-471-91787-7 }}
* {{cite book |last=Barlow |first=R. E. |last2=Bartholomew |first2=D. J. |last3=Bremner |first3=J. M. |last4=Brunk |first4=H. D. |year=1972 |title=Statistical inference under order restrictions; the theory and application of isotonic regression |location=New York |publisher=Wiley |isbn=0-471-04970-0 }}
* {{Cite journal | doi = 10.1111/j.1467-9868.2008.00677.x | author = Shively, T.S., Sager, T.W., Walker, S.G. | year = 2009 | title = A Bayesian approach to non-parametric monotone function estimation | url = | journal = Journal of the Royal Statistical Society, Series B | volume = 71 | issue = 1| pages = 159–175 }}
* {{Cite journal | doi = 10.1093/biomet/88.3.793 | author = Wu, W. B.; Woodroofe, M.; & Mentz, G. | year = 2001 | title = Isotonic regression: Another look at the changepoint problem | url = | journal = Biometrika | volume = 88 | issue = 3| pages = 793–804 }}
 
{{Statistics|correlation}}
 
{{DEFAULTSORT:Isotonic Regression}}
[[Category:Regression analysis]]
[[Category:Nonparametric regression]]
[[Category:Non-parametric Bayesian methods]]
[[Category:Numerical analysis]]
