{{Statistical mechanics}}
 
'''Statistical mechanics''' is a branch of [[mathematical physics]] that studies, using [[probability theory]], the average behaviour of a [[mechanics|mechanical]] system where the state of the system is uncertain.<ref name="gibbs"/><ref name="tolman"/><ref name="balescu"/><ref group=note>The term ''statistical mechanics'' is sometimes used to refer to only ''statistical thermodynamics''. This article takes the broader view. By some definitions, ''[[statistical physics]]'' is an even broader term which statistically studies any type of physical system, but is often taken to be synonymous with statistical mechanics.</ref>
 
The present understanding of the universe indicates that its fundamental laws are mechanical in nature, and that all physical systems are therefore governed by mechanical laws at a microscopic level. These laws are precise equations of motion that map any given initial state to a corresponding future state at a later time. There is however a disconnect between these laws and everyday life, as we do not find it necessary (nor easy) to know exactly at a microscopic level the simultaneous positions and velocities of each molecule while carrying out processes at the human scale (for example, when performing a chemical reaction). Statistical mechanics is a collection of mathematical tools that bridge this disconnect between the laws of mechanics and the practical experience of incomplete knowledge.
 
A common use of statistical mechanics is in explaining the [[thermodynamics|thermodynamic]] behaviour of large systems. Microscopic mechanical laws do not contain concepts such as temperature, heat, or entropy; statistical mechanics, however, shows how these concepts arise from the natural uncertainty about the state of a system when that system is prepared in practice. The benefit of using statistical mechanics is that it provides exact methods to connect thermodynamic quantities (such as [[heat capacity]]) to microscopic behaviour, whereas in [[classical thermodynamics]] the only available option would be to measure and tabulate such quantities for various materials. Statistical mechanics also makes it possible to ''extend'' the laws of thermodynamics to cases which are not considered in classical thermodynamics, for example microscopic systems and other mechanical systems with few degrees of freedom.<ref name="gibbs"/> This branch of statistical mechanics, which treats and extends classical thermodynamics, is known as '''statistical thermodynamics''' or '''equilibrium statistical mechanics'''.
 
Statistical mechanics also finds use outside equilibrium. An important subbranch known as '''non-equilibrium statistical mechanics''' deals with the issue of microscopically modelling the speed of [[irreversible process]]es that are driven by imbalances. Examples of such processes include chemical reactions, or flows of particles and heat. Unlike with equilibrium, there is no exact formalism that applies to non-equilibrium statistical mechanics in general and so this branch of statistical mechanics remains an active area of theoretical research.
 
==Principles: mechanics and ensembles==
 
{{main|Mechanics|Statistical ensemble (mathematical physics)|l2=Statistical ensemble}}
 
In physics there are two types of mechanics usually examined: [[classical mechanics]] and [[quantum mechanics]]. For both types of mechanics, the standard mathematical approach is to consider two ingredients:
# The complete state of the mechanical system at a given time, mathematically encoded as a [[phase space|phase point]] (classical mechanics) or a pure [[quantum state vector]] (quantum mechanics).
# An equation of motion which carries the state forward in time: [[Hamilton's equations]] (classical mechanics) or the [[time-dependent Schrödinger equation]] (quantum mechanics)
Using these two ingredients, the state at any other time, past or future, can in principle be calculated.
 
Whereas ordinary mechanics only considers the behaviour of a single state, statistical mechanics introduces the [[Statistical ensemble (mathematical physics)|statistical ensemble]], which is a large collection of virtual, independent copies of the system in various states. The statistical ensemble is a [[probability distribution]] over all possible states of the system. In classical statistical mechanics, the ensemble is a probability distribution over phase points (as opposed to a single phase point in ordinary mechanics), usually represented as a distribution in a [[phase space]] with [[canonical coordinates]]. In quantum statistical mechanics, the ensemble is a probability distribution over pure states,<ref group=note>The probabilities in quantum statistical mechanics should not be confused with [[quantum superposition]]. While a quantum ensemble can contain states with quantum superpositions, a single quantum state cannot be used to represent an ensemble.</ref> and can be compactly summarized as a [[density matrix]].
 
As is usual for probabilities, the ensemble can be interpreted in different ways:<ref name="gibbs"/>
* an ensemble can be taken to represent the various possible states that a ''single system'' could be in ([[epistemic probability]], a form of knowledge), or
* the members of the ensemble can be understood as the states of systems in experiments repeated on independent systems which have been prepared in a similar but imperfectly controlled manner ([[empirical probability]]), in the limit of an infinite number of trials.
These two meanings are equivalent for many purposes, and will be used interchangeably in this article.
 
However the probability is interpreted, each state in the ensemble evolves over time according to the equation of motion. Thus, the ensemble itself (the probability distribution over states) also evolves, as the virtual systems in the ensemble continually leave one state and enter another. The ensemble evolution is given by the [[Liouville's theorem (Hamiltonian)|Liouville equation]] (classical mechanics) or the [[von Neumann equation]] (quantum mechanics). These equations are simply derived by the application of the mechanical equation of motion separately to each virtual system contained in the ensemble, with the probability of the virtual system being conserved over time as it evolves from state to state.
 
A special class of ensembles consists of those that do not evolve over time. These ensembles are known as ''equilibrium ensembles'' and their condition is known as ''statistical equilibrium''. Statistical equilibrium occurs if, for each state in the ensemble, the ensemble also contains all of its future and past states with probabilities equal to the probability of that state.<ref group=note>Statistical equilibrium should not be confused with ''[[mechanical equilibrium]]''. The latter occurs when a mechanical system has completely ceased to evolve even on a microscopic scale, due to being in a state with a perfect balancing of forces. Statistical equilibrium generally involves states that are very far from mechanical equilibrium.</ref> The study of equilibrium ensembles of isolated systems is the focus of statistical thermodynamics. Non-equilibrium statistical mechanics addresses the more general case of ensembles that change over time, and/or ensembles of non-isolated systems.
 
==Statistical thermodynamics==
 
The primary goal of statistical thermodynamics (also known as equilibrium statistical mechanics) is to explain the [[classical thermodynamics]] of materials in terms of the properties of their constituent particles and the interactions between them. In other words, statistical thermodynamics provides a connection between the macroscopic properties of materials in [[thermodynamic equilibrium]], and the microscopic behaviours and motions occurring inside the material.
 
As an example, one might ask what it is about a [[thermodynamic system]] of [[ammonia|NH<sub>3</sub>]] molecules that determines the [[thermodynamic free energy|free energy]] characteristic of that compound.  Classical thermodynamics does not provide the answer.  If, for example, we were given spectroscopic data of this body of gas molecules, such as [[bond length]], [[bond angle]], [[bond rotation]], and flexibility of the bonds in NH<sub>3</sub>, we should see that the free energy could not be other than it is.  To prove this true, we need to bridge the gap between the microscopic realm of atoms and molecules and the macroscopic realm of classical thermodynamics.  Statistical mechanics demonstrates how the [[thermodynamic parameters]] of a system, such as temperature and pressure, are related to the microscopic behaviours of its constituent atoms and molecules.<ref>{{cite book | author=Nash, Leonard K. | title=Elements of Statistical Thermodynamics, 2nd Ed. | publisher=Dover Publications, Inc. | year=1974 | isbn=0-486-44978-5 | oclc=61513215}}</ref>
 
Although we may understand a system generically, in general we lack information about the state of a specific instance of that system. For this reason the notion of [[statistical ensemble (mathematical physics)|statistical ensemble]] (a probability distribution over possible states) is necessary. Furthermore, in order to reflect that the material is in thermodynamic equilibrium, it is necessary to introduce a corresponding statistical mechanical definition of equilibrium. The analogue of thermodynamic equilibrium in statistical thermodynamics is the ensemble property of statistical equilibrium, described in the previous section. An additional assumption in statistical thermodynamics is that the system is isolated (no varying external forces are acting on the system), so that its total energy does not vary over time. A [[sufficient condition|sufficient]] (but not necessary) condition for statistical equilibrium with an isolated system is that the probability distribution is a function only of conserved properties (total energy, total particle numbers, etc.).<ref name="gibbs"/>
 
===Fundamental postulate===
 
There are many different equilibrium ensembles that can be considered, and only some of them correspond to thermodynamics.<ref name="gibbs"/> An additional postulate is necessary to motivate why the ensemble for a given system should have one form or another.
 
A common approach found in many textbooks is to take the ''equal a priori probability postulate''.<ref name="tolman"/> This postulate states that
: ''For an isolated system with an exactly known energy and exactly known composition, the system can be found with ''equal probability'' in any [[microstate (statistical mechanics)|microstate]] consistent with that knowledge.''
The equal a priori probability postulate therefore provides a motivation for the [[microcanonical ensemble]] described below. There are various arguments in favour of the equal a priori probability postulate:
* [[Ergodic hypothesis]]: An ergodic system is one that evolves over time to explore "all accessible" states: all those with the same energy and composition. In an ergodic system, the microcanonical ensemble is the only possible equilibrium ensemble with fixed energy. This approach has limited applicability, since most systems are not ergodic.
* [[Principle of indifference]]: In the absence of any further information, we can only assign equal probabilities to each compatible situation.
* [[Maximum entropy thermodynamics|Maximum information entropy]]: A more elaborate version of the principle of indifference states that the correct ensemble is the ensemble that is compatible with the known information and that has the largest [[Gibbs entropy]] ([[information entropy]]).<ref>{{cite doi|10.1103/PhysRev.106.620}}</ref>
Other fundamental postulates for statistical mechanics have also been proposed.<ref name="uffink"/>
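The maximum-entropy criterion mentioned above can be made concrete with a short sketch (Python here, purely for illustration; the distributions are made-up examples): among probability distributions over a fixed set of microstates, the uniform distribution is the one with the largest Gibbs (information) entropy.

```python
import math

def gibbs_entropy(probs):
    """Gibbs/Shannon entropy S = -sum_i p_i ln p_i, in units of k_B."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Four microstates: a uniform distribution vs. a skewed one
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.70, 0.10, 0.10, 0.10]

# The uniform distribution attains the maximum, S = ln 4
print(gibbs_entropy(uniform))  # ln 4 ≈ 1.386
print(gibbs_entropy(skewed))   # ≈ 0.940
```

Any deviation from uniformity lowers the entropy, which is why, with no information beyond energy and composition, the maximum-entropy prescription reproduces the equal a priori probability postulate.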
 
In any case, the reason for establishing the microcanonical ensemble is mainly [[axiom]]atic.<ref name="uffink"/> The microcanonical ensemble itself is mathematically awkward to use for real calculations, and even very simple finite systems can only be solved approximately. However, it is possible to use the microcanonical ensemble to construct a hypothetical infinite [[thermodynamic reservoir]] that has an exactly defined notion of temperature and chemical potential. Once this reservoir has been established, it can be used to justify exactly the [[canonical ensemble]] or [[grand canonical ensemble]] (see below) for any other system by considering the contact of this system with the reservoir.<ref name="gibbs"/> These other ensembles are those actually used in practical statistical mechanics calculations as they are mathematically simpler and also correspond to a much more realistic situation (energy not known exactly).<ref name="tolman"/>
 
===Three thermodynamic ensembles===
{{main|Microcanonical ensemble|Canonical ensemble|Grand canonical ensemble}}
 
There are three equilibrium ensembles with a simple form that can be defined for any [[isolated system]] bounded inside a finite volume.<ref name="gibbs"/> These are the most often discussed ensembles in statistical thermodynamics. In the macroscopic limit (defined below) they all correspond to classical thermodynamics.
* The [[microcanonical ensemble]] describes a system with a precisely given energy and fixed composition (precise number of particles). The microcanonical ensemble contains with equal probability each possible state that is consistent with that energy and composition.
* The [[canonical ensemble]] describes a system of fixed composition that is in [[thermal equilibrium]]<ref group=note>The transitive thermal equilibrium (as in, "X is in thermal equilibrium with Y") used here means that the ensemble for the first system is not perturbed when the system is allowed to weakly interact with the second system.</ref> with a [[heat bath]] of a precise [[thermodynamic temperature|temperature]]. The canonical ensemble contains states of varying energy but identical composition; the different states in the ensemble are accorded different probabilities depending on their total energy.
* The [[grand canonical ensemble]] describes a system with non-fixed composition (uncertain particle numbers) that is in thermal and chemical equilibrium with a [[thermodynamic reservoir]]. The reservoir has a precise temperature, and precise [[chemical potential]]s for various types of particle. The grand canonical ensemble contains states of varying energy and varying numbers of particles; the different states in the ensemble are accorded different probabilities depending on their total energy and total particle numbers.
 
:{| class="wikitable"
|-
! rowspan="2"|
! colspan="3"| Thermodynamic ensembles<ref name="gibbs"/>
|-
! [[Microcanonical ensemble|Microcanonical]]
! [[Canonical ensemble|Canonical]]
! [[Grand canonical ensemble|Grand canonical]]
|-
! Fixed variables
| <center>{{math|''N, V, E''}}</center>
| <center>{{math|''N, V, T''}}</center>
| <center>{{math|''μ, V, T''}}</center>
|-
! Microscopic features
| <div class="plainlist">
*<center>Number of [[Microstate (statistical mechanics)|microstates]]</center>
*<center><math> W </math></center>
</div>
| <div class="plainlist">
*<center>[[Canonical partition function]]</center>
*<center><math>Z = \sum_k e^{- E_k / k_B T}</math></center>
</div>
| <div class="plainlist">
*<center>[[Grand partition function]]</center>
*<center><math>\mathcal Z = \sum_k e^{ ( \mu N_k - E_k) /k_B T}</math></center>
</div>
|-
! Macroscopic function
| <div class="plainlist">
*<center>[[Boltzmann entropy]]</center>
*<center><math>S =  k_B \ln W</math></center>
</div>
| <div class="plainlist">
*<center>[[Helmholtz free energy]]</center>
*<center><math>F = - k_B T \ln Z</math></center>
</div>
| <div class="plainlist">
*<center>[[Grand potential]]</center>
*<center><math>\Omega =- k_B T \ln \mathcal Z </math></center>
</div>
|-
|}
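The canonical column of the table can be sketched numerically. The following Python fragment (a hypothetical two-level system with illustrative energies, not any particular material) evaluates the partition function ''Z'' and the Helmholtz free energy ''F'' = −''k''<sub>B</sub>''T'' ln ''Z'':

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def partition_function(energies, T):
    """Canonical partition function Z = sum_k exp(-E_k / k_B T)."""
    return sum(math.exp(-E / (k_B * T)) for E in energies)

def helmholtz_free_energy(energies, T):
    """F = -k_B T ln Z."""
    return -k_B * T * math.log(partition_function(energies, T))

# Hypothetical two-level system: ground state plus one excited state
levels = [0.0, 1.0e-21]  # energies in joules (illustrative values)
T = 300.0                # temperature in kelvin

Z = partition_function(levels, T)
F = helmholtz_free_energy(levels, T)
```

Enumerating a sum over microstates like this is only feasible for very small systems; the rest of this section concerns what to do when it is not.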
 
====Statistical fluctuations and the macroscopic limit====
 
{{main|Thermodynamic limit}}
 
The most significant difference between the thermodynamic ensembles is whether they admit uncertainty in the energy or particle number, or instead fix those variables to particular values. While this difference can be observed in some cases, for macroscopic systems the thermodynamic ensembles are usually observationally equivalent.
 
The limit of large systems in statistical mechanics is known as the [[thermodynamic limit]]. In the thermodynamic limit the microcanonical, canonical, and grand canonical ensembles tend to give identical predictions about thermodynamic characteristics. This means that one can specify either total energy or temperature and arrive at the same result; likewise one can specify either total particle number or chemical potential. Given these considerations, the best ensemble to choose for the calculation of the properties of a macroscopic system is usually just the ensemble which allows the result to be derived most easily.<ref name="Reif">{{cite isbn|9780070518001}}</ref>
 
Important cases where the thermodynamic ensembles ''do not'' give identical results include:
* Systems at a phase transition.
* Systems with long-range interactions.
* Microscopic systems.
In these cases the correct thermodynamic ensemble must be chosen as there are observable differences between these ensembles not just in the size of fluctuations, but also in average quantities such as the distribution of particles. The correct ensemble is that which corresponds to the way the system has been prepared and characterized—in other words, the ensemble that reflects the knowledge about that system.<ref name="tolman"/>
 
===Illustrative example (a gas)===
 
The above concepts can be illustrated for the specific case of one litre of ammonia gas at standard conditions. (Note that statistical thermodynamics is not restricted to the study of macroscopic gases, and the example of a gas is given here to illustrate concepts. Statistical mechanics and statistical thermodynamics apply to all mechanical systems (including microscopic systems) and to all phases of matter: [[liquid]]s, [[solid]]s, [[plasma (physics)|plasma]]s, [[gas]]es, [[nuclear matter]], [[quark matter]].)
 
A simple way to prepare a one-litre sample of ammonia at standard conditions is to take a very large reservoir of ammonia at those standard conditions, and connect it to a previously evacuated one-litre container. After ammonia gas has entered the container and the container has been given time to reach thermodynamic equilibrium with the reservoir, the container is then sealed and isolated. In thermodynamics, this is a repeatable process resulting in a very well defined sample of gas with a precise description. We now consider the corresponding precise description in statistical thermodynamics.
 
Although this process is well defined and repeatable in a macroscopic sense, we have no information about the exact locations and velocities of each and every molecule in the container of gas. Moreover, we do not even know exactly how many molecules are in the container; even supposing we knew exactly the average density of the ammonia gas in general, we do not know how many molecules of the gas happened to be inside our container at the moment when we sealed it. The sample is in internal equilibrium and in equilibrium with the reservoir: we could reconnect it to the reservoir for some time, and then re-seal it, and our knowledge about the state of the gas would not change. In this case, our knowledge about the state of the gas is precisely described by the [[grand canonical ensemble]]. Provided we have an accurate microscopic model of the ammonia gas, we could in principle compute all thermodynamic properties of this sample of gas by using the distribution provided by the grand canonical ensemble.
 
Hypothetically, we could use an extremely sensitive weight scale to measure exactly the mass of the container before and after introducing the ammonia gas, so that we can exactly know the number of ammonia molecules. After we make this measurement, then our knowledge about the gas would correspond to the [[canonical ensemble]]. Finally, suppose by some hypothetical apparatus we can measure exactly the number of molecules and also measure exactly the total energy of the system. Supposing furthermore that this apparatus gives us no further information about the molecules' positions and velocities, our knowledge about the system would correspond to the [[microcanonical ensemble]].
 
Even after making such measurements, however, our expectations about the behaviour of the gas do not change appreciably. This is because the gas sample is macroscopic and approximates very well the [[thermodynamic limit]], so the different ensembles behave similarly. This can be demonstrated by considering how small the actual fluctuations would be.
Suppose that we knew the number density of ammonia gas was exactly {{val|3.04|e=22}} molecules per litre inside the reservoir of ammonia gas used to fill the one-litre container. In describing the container with the grand canonical ensemble, then, the average number of molecules would be <math>\langle N\rangle = 3.04\times 10^{22}</math> and the uncertainty ([[standard deviation]]) in the number of molecules would be <math>\sigma_N = \sqrt{\langle N \rangle} \approx 2\times 10^{11}</math> (assuming a [[Poisson distribution]]), which is very small compared to the total number of molecules. Upon measuring the particle number (thus arriving at a canonical ensemble) we should find very nearly {{val|3.04|e=22}} molecules. For example, the probability of finding more than {{val|3.040001|e=22}} or less than {{val|3.039999|e=22}} molecules would be about 1 in 10<sup>3000000000</sup>.<ref group=note>This is so unlikely as to be practically impossible. The statistical physicist [[Émile Borel]] noted that, compared to the improbabilities found in statistical mechanics, it would be more likely that monkeys typing randomly on a typewriter would happen to reproduce the books of the world. See [[infinite monkey theorem]].</ref>
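These fluctuation estimates can be reproduced in a few lines of Python (using the same illustrative number density as above):

```python
import math

N_avg = 3.04e22             # mean number of molecules in the container
sigma_N = math.sqrt(N_avg)  # Poisson statistics: std. dev. = sqrt of the mean
relative = sigma_N / N_avg  # fractional uncertainty in particle number

print(sigma_N)   # ≈ 1.74e11 molecules
print(relative)  # ≈ 5.7e-12, utterly negligible at macroscopic scale
```

The fractional uncertainty shrinks as <math>1/\sqrt{\langle N\rangle}</math>, which is why the different ensembles become observationally equivalent in the thermodynamic limit.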
 
===Calculation methods===
 
Once the characteristic state function for an ensemble has been calculated for a given system, that system is 'solved' (macroscopic observables can be extracted from the characteristic state function). Calculating the characteristic state function of a thermodynamic ensemble is not necessarily a simple task, however, since it involves considering every possible state of the system. While some hypothetical systems have been exactly solved, the most general (and realistic) case is too complex for exact solution. Various approaches exist to approximate the true ensemble and allow calculation of average quantities.
 
====Exact====
 
There are some cases which allow exact solutions.
 
* For very small microscopic systems, the ensembles can be directly computed by simply enumerating over all possible states of the system (using [[exact diagonalization]] in quantum mechanics, or integral over all phase space in classical mechanics).
* Some large systems consist of many separable microscopic systems, and each of the subsystems can be analysed independently. Notably, idealized gases of non-interacting particles have this property, allowing exact derivations of [[Maxwell–Boltzmann statistics]], [[Fermi–Dirac statistics]], and [[Bose–Einstein statistics]].<ref name="tolman"/>
* A few large systems with interaction have been solved. By the use of subtle mathematical techniques, exact solutions have been found for a few [[toy model]]s.<ref>{{cite isbn|9780120831807}}</ref> Some examples include the [[Bethe ansatz]], [[square-lattice Ising model]] in zero field, [[hard hexagon model]].
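For the non-interacting case, the mean occupation of a single-particle level follows closed-form distributions. A minimal sketch (energies measured in units of ''k''<sub>B</sub>''T'' relative to the chemical potential, with purely illustrative values):

```python
import math

def maxwell_boltzmann(x):
    """Mean occupation for classical particles; x = (eps - mu) / k_B T."""
    return math.exp(-x)

def fermi_dirac(x):
    """Mean occupation for fermions (never exceeds 1)."""
    return 1.0 / (math.exp(x) + 1.0)

def bose_einstein(x):
    """Mean occupation for bosons (requires x > 0)."""
    return 1.0 / (math.exp(x) - 1.0)

# All three statistics agree in the dilute (large-x) limit:
x = 10.0
print(fermi_dirac(x), bose_einstein(x), maxwell_boltzmann(x))

# At eps = mu, a fermionic level is exactly half filled:
print(fermi_dirac(0.0))  # 0.5
```

The agreement at large ''x'' reflects the fact that quantum statistics reduce to Maxwell–Boltzmann statistics when occupations are much less than one.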
 
====Monte Carlo====
 
{{main|Monte Carlo method}}
 
One approximate approach that is particularly well suited to computers is the [[Monte Carlo method]], which examines just a few of the possible states of the system, chosen at random with appropriate weighting. As long as these states form a representative sample of the whole set of states of the system, the approximate characteristic function is obtained. As more and more random samples are included, the errors are reduced to an arbitrarily low level.
 
* The [[Metropolis–Hastings algorithm]] is a classic Monte Carlo method which was initially used to sample the canonical ensemble.
* [[Path integral Monte Carlo]], also used to sample the canonical ensemble.
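A minimal Metropolis-style sketch for the canonical ensemble of a toy two-level system (Python, with illustrative parameters; real applications sample vastly larger state spaces such as spin lattices):

```python
import math
import random

def metropolis_mean_energy(energies, kT, n_steps, seed=1):
    """Estimate the canonical average energy <E> by Metropolis sampling
    over a small discrete set of microstate energies."""
    rng = random.Random(seed)
    state = 0
    total = 0.0
    for _ in range(n_steps):
        proposal = rng.randrange(len(energies))  # symmetric proposal
        dE = energies[proposal] - energies[state]
        # Accept downhill moves always, uphill moves with Boltzmann weight
        if dE <= 0 or rng.random() < math.exp(-dE / kT):
            state = proposal
        total += energies[state]
    return total / n_steps

# Two microstates with energies 0 and 1 (in units of kT)
estimate = metropolis_mean_energy([0.0, 1.0], kT=1.0, n_steps=200_000)
# Exact canonical average for comparison: e^{-1} / (1 + e^{-1}) ≈ 0.269
```

The acceptance rule guarantees that, in the long run, states are visited with their canonical (Boltzmann) probabilities, so the running average converges to the ensemble average.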
 
====Other====
 
* For rarefied non-ideal gases, approaches such as the [[cluster expansion]] use [[perturbation theory]] to include the effect of weak interactions, leading to a [[virial expansion]].<ref name="balescu"/>
* For dense fluids, another approximate approach is based on [[reduced distribution function]]s, in particular the [[radial distribution function]].<ref name="balescu"/>
* [[Molecular dynamics]] computer simulations can be used to calculate [[microcanonical ensemble]] averages, in ergodic systems. With the inclusion of a connection to a stochastic heat bath, they can also model canonical and grand canonical conditions.
* Mixed methods involving non-equilibrium statistical mechanical results (see below) may be useful.
 
==Non-equilibrium statistical mechanics==
 
{{seealso|Non-equilibrium thermodynamics}}
 
There are many physical phenomena of interest that involve quasi-thermodynamic processes out of equilibrium, for example:
* [[Thermal conduction|heat transport by the internal motions in a material]], driven by a temperature imbalance,
* [[Electrical conduction|electric currents carried by the motion of charges in a conductor]], driven by a voltage imbalance,
* spontaneous [[chemical reaction]]s driven by a decrease in free energy,
* [[friction]], [[dissipation]], [[quantum decoherence]],
* systems being pumped by external forces ([[optical pumping]], etc.),
* and irreversible processes in general.
All of these processes occur over time with characteristic rates, and these rates are of importance for engineering. The field of non-equilibrium statistical mechanics is concerned with understanding these non-equilibrium processes at the microscopic level. (Statistical thermodynamics can only be used to calculate the final result, after the external imbalances have been removed and the ensemble has settled back down to equilibrium.)
 
In principle, non-equilibrium statistical mechanics could be mathematically exact: ensembles for an isolated system evolve over time according to deterministic equations such as [[Liouville's theorem (Hamiltonian)|Liouville's equation]] or its quantum equivalent, the [[von Neumann equation]]. These equations are the result of applying the mechanical equations of motion independently to each state in the ensemble. Unfortunately, these ensemble evolution equations inherit much of the complexity of the underlying mechanical motion, and so exact solutions are very difficult to obtain. Moreover, the ensemble evolution equations are fully reversible and do not destroy information (the ensemble's [[Gibbs entropy]] is preserved). In order to make headway in modelling irreversible processes, it is necessary to add additional ingredients besides probability and reversible mechanics.
 
Non-equilibrium mechanics is therefore an active area of theoretical research as the range of validity of these additional assumptions continues to be explored. A few approaches are described in the following subsections.
 
=== Stochastic methods ===
 
One approach to non-equilibrium statistical mechanics is to incorporate [[stochastic]] (random) behaviour into the system. Stochastic behaviour destroys information contained in the ensemble. While this is technically inaccurate (aside from [[Black hole information paradox|hypothetical situations involving black holes]], a system cannot in itself cause loss of information), the randomness is added to reflect that information of interest becomes converted over time into subtle correlations within the system, or to correlations between the system and environment. These correlations appear as [[chaotic]] or [[pseudorandom]] influences on the variables of interest. By replacing these correlations with randomness proper, the calculations can be made much easier.
 
{{unordered list
|1= ''[[Boltzmann transport equation]]'': An early form of stochastic mechanics appeared even before the term "statistical mechanics" had been coined, in studies of [[kinetic theory]]. [[James Clerk Maxwell]] had demonstrated that molecular collisions would lead to apparently chaotic motion inside a gas. [[Ludwig Boltzmann]] subsequently showed that, by taking this [[molecular chaos]] for granted as a complete randomization, the motions of particles in a gas would follow a simple [[Boltzmann transport equation]] that would rapidly restore a gas to an equilibrium state (see [[H-theorem]]).
 
The [[Boltzmann transport equation]] and related approaches are important tools in non-equilibrium statistical mechanics due to their extreme simplicity. These approximations work well in systems where the "interesting" information is immediately (after just one collision) scrambled up into subtle correlations, which essentially restricts them to rarefied gases. The Boltzmann transport equation has been found to be very useful in simulations of electron transport in lightly doped [[semiconductor]]s (in [[transistor]]s), where the electrons are indeed analogous to a rarefied gas.
 
A quantum technique related in theme is the [[random phase approximation]].
 
|2= ''[[BBGKY hierarchy]]'':
In liquids and dense gases, it is not valid to immediately discard the correlations between particles after one collision. The [[BBGKY hierarchy]] (Bogoliubov–Born–Green–Kirkwood–Yvon hierarchy) gives a method for deriving Boltzmann-type equations but also extending them beyond the dilute gas case, to include correlations after a few collisions.
 
|3= ''[[Keldysh formalism]]'' (a.k.a. NEGF—non-equilibrium Green functions):
A quantum approach to including stochastic dynamics is found in the Keldysh formalism. This approach is often used in electronic [[quantum transport]] calculations.
}}
 
=== Near-equilibrium methods ===
 
Another important class of non-equilibrium statistical mechanical models deals with systems that are only very slightly perturbed from equilibrium. With very small perturbations, the response can be analysed in [[linear response theory]]. A remarkable result, as formalized by the [[fluctuation-dissipation theorem]], is that the response of a system when near equilibrium is precisely related to the [[Statistical fluctuations|fluctuations]] that occur when the system is in total equilibrium. Essentially, a system that is slightly away from equilibrium—whether put there by external forces or by fluctuations—relaxes towards equilibrium in the same way, since the system cannot tell the difference or "know" how it came to be away from equilibrium.<ref name="balescu"/>{{rp|664}}
 
This provides an indirect avenue for obtaining numbers such as [[Ohm's law|ohmic conductivity]] and [[thermal conductivity]] by extracting results from equilibrium statistical mechanics. Since equilibrium statistical mechanics is mathematically well defined and (in some cases) more amenable for calculations, the fluctuation-dissipation connection can be a convenient shortcut for calculations in near-equilibrium statistical mechanics.
 
A few of the theoretical tools used to make this connection include:
* [[Fluctuation–dissipation theorem]]
* [[Onsager reciprocal relations]]
* [[Green–Kubo relations]]
* [[Landauer–Büttiker formalism]]
* [[Mori–Zwanzig formalism]]
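Schematically, a Green–Kubo relation expresses a linear transport coefficient <math>\gamma</math> as a time integral of an equilibrium autocorrelation function of the associated microscopic flux <math>\dot{A}</math>:

:<math>\gamma = \int_0^\infty \left\langle \dot{A}(0)\, \dot{A}(t) \right\rangle_{\mathrm{eq}}\, \mathrm{d}t,</math>

making explicit how fluctuations in total equilibrium determine transport behaviour near equilibrium.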
 
=== Hybrid methods ===
 
An advanced approach uses a combination of stochastic methods and linear response theory. As an example, one approach to compute quantum coherence effects ([[weak localization]], [[conductance fluctuations]]) in the conductance of an electronic system is the use of the Green–Kubo relations, with the inclusion of stochastic [[dephasing]] caused by electron–electron interactions, treated with the Keldysh method.<ref>{{cite doi|10.1088/0022-3719/15/36/018}}</ref><ref>{{cite doi|10.1103/PhysRevB.65.115317}}</ref>
 
==Applications outside thermodynamics==
 
The ensemble formalism can also be used to analyze general mechanical systems in which knowledge of the system's state is uncertain. Ensembles are also used in:
* [[propagation of uncertainty]] over time,<ref name="gibbs"/>
* [[regression analysis]] of gravitational [[orbit]]s,
* [[ensemble forecasting]] of weather,
* dynamics of [[neural networks]].
 
==History==
 
In 1738, Swiss physicist and mathematician [[Daniel Bernoulli]] published ''Hydrodynamica'' which laid the basis for the [[kinetic theory of gases]].  In this work, Bernoulli posited the argument, still used to this day, that gases consist of great numbers of molecules moving in all directions, that their impact on a surface causes the gas pressure that we feel, and that what we experience as [[heat]] is simply the kinetic energy of their motion.<ref name="uffink"/>
 
In 1859, after reading a paper on the diffusion of molecules by [[Rudolf Clausius]], Scottish physicist [[James Clerk Maxwell]] formulated the [[Maxwell distribution]] of molecular velocities, which gave the proportion of molecules having a certain velocity in a specific range.  This was the first-ever statistical law in physics.<ref>{{cite book|author=Mahon, Basil |title=The Man Who Changed Everything – the Life of James Clerk Maxwell|location=Hoboken, NJ | publisher=Wiley|year=2003|isbn=0-470-86171-1|oclc=52358254 62045217}}</ref>  Five years later, in 1864, [[Ludwig Boltzmann]], a young student in Vienna, came across Maxwell’s paper and was so inspired by it that he spent much of his life developing the subject further.
 
Statistical mechanics proper was initiated in the 1870s with the work of Ludwig Boltzmann, much of which was collectively published in Boltzmann's 1896 ''Lectures on Gas Theory''.<ref>{{cite book|title=Statistical Thermodynamics and Stochastic Theory of Nonequilibrium Systems| last1=Ebeling| first1=Werner| last2= Sokolov| first2= Igor M.| year=2005| publisher=World Scientific Publishing Co. Pte. Ltd.| isbn=978-90-277-1674-3| pages=3–12|url=http://books.google.de/books?id=KUjFHbid8A0C}} (section 1.2)</ref> Boltzmann's original papers on the statistical interpretation of thermodynamics, the [[H-theorem]], [[transport theory (statistical physics)|transport theory]], [[thermal equilibrium]], the [[equation of state]] of gases, and similar subjects, occupy about 2,000 pages in the proceedings of the Vienna Academy and other societies. Boltzmann introduced the concept of an equilibrium statistical ensemble and also investigated for the first time non-equilibrium statistical mechanics, with his [[H-theorem|''H''-theorem]].
 
The term "statistical mechanics" was coined by the American mathematical physicist [[Josiah Willard Gibbs|J. Willard Gibbs]] in 1902.<ref group=note>According to Gibbs, the term "statistical", in the context of mechanics, i.e. statistical mechanics, was first used by the Scottish physicist [[James Clerk Maxwell]] in 1871.</ref> "Probabilistic mechanics" might today seem a more appropriate term, but "statistical mechanics" is firmly entrenched.<ref>{{cite book|title=The enigma of probability and physics| last=Mayants| first=Lazar| year=1984| publisher=Springer| isbn=978-90-277-1674-3| page=174|url=http://books.google.com/books?id=zmwEfXUdBJ8C&pg=PA174}}</ref> Whereas Boltzmann had focussed almost entirely on the case of a macroscopic ideal gas, Gibbs' 1902 book formalized statistical mechanics as a fully general approach to address all mechanical systems—macroscopic or microscopic, gaseous or non-gaseous.<ref name="gibbs"/> Gibbs' methods were initially derived in the framework of [[classical mechanics]], but they were of such generality that they adapted easily to the later [[quantum mechanics]], and they still form the foundation of statistical mechanics to this day.<ref name="tolman"/>
 
==See also==
{{Commons category|{{PAGENAME}}}}
{{colbegin|2}}
* [[Thermodynamics]]: [[Non-equilibrium thermodynamics|non-equilibrium]], [[Chemical thermodynamics|chemical]]
* [[Mechanics]]: [[Classical mechanics|classical]], [[Quantum mechanics|quantum]]
* [[Probability]], [[Statistical ensemble (mathematical physics)|statistical ensemble]]
* Numerical methods: [[Monte Carlo method]], [[molecular dynamics]]
* [[Statistical physics]]
* [[Quantum statistical mechanics]]
* [[List of notable textbooks in statistical mechanics]]
* [[List of publications in physics#Statistical mechanics|List of important publications in statistical mechanics]]
{{colend}}
{{Wikipedia books link|Fundamentals of Statistical Mechanics}}
 
==Notes==
{{reflist|group=note}}
 
==References==
{{reflist|refs=
<ref name="gibbs">{{cite book |last=Gibbs |first=Josiah Willard |authorlink=Josiah Willard Gibbs |title=[[Elementary Principles in Statistical Mechanics]] |year=1902 |publisher=[[Charles Scribner's Sons]] |location=New York}}</ref>
<ref name="tolman">{{cite isbn|9780486638966}}</ref>
<ref name="balescu">{{cite isbn|9780471046004}}</ref>
<ref name="uffink">J. Uffink, "[http://philsci-archive.pitt.edu/2691/1/UffinkFinal.pdf Compendium of the foundations of classical statistical physics.]" (2006)</ref>
}}
 
==External links==
* [http://plato.stanford.edu/entries/statphys-statmech/ Philosophy of Statistical Mechanics] article by Lawrence Sklar for the [[Stanford Encyclopedia of Philosophy]].
* [http://www.sklogwiki.org/ Sklogwiki - Thermodynamics, statistical mechanics, and the computer simulation of materials.]  SklogWiki is particularly orientated towards liquids and soft condensed matter.
*[http://history.hyperjeff.net/statmech.html Statistical Thermodynamics] - Historical Timeline
* [http://farside.ph.utexas.edu/teaching/sm1/statmech.pdf Thermodynamics and Statistical Mechanics] by Richard Fitzpatrick
* [http://arxiv.org/abs/1107.0568 Lecture Notes in Statistical Mechanics and Mesoscopics] by Doron Cohen
 
{{Physics-footer}}
{{Statistical mechanics topics}}
 
[[Category:Concepts in physics]]
[[Category:Physics]]
[[Category:Statistical mechanics|*]]
[[Category:Thermodynamics]]
 
[[fr:Physique statistique]]
