'''Loop entropy''' is the entropy lost upon bringing together two residues of a polymer within a prescribed distance. For a single loop, the entropy varies logarithmically with the number of residues <math>N</math> in the loop
:<math>
\Delta S = \alpha k_{B} \ln N \,
</math>
where <math>k_{B}</math> is Boltzmann's constant and <math>\alpha</math> is a coefficient that depends on the properties of the polymer. This entropy formula corresponds to a power-law distribution <math>P \sim N^{-\alpha}</math> for the probability that the two residues come into contact.

The loop entropy may also vary with the position of the contacting residues. Residues near the ends of the polymer are more likely to come into contact (quantitatively, they have a lower <math>\alpha</math>) than residues in the middle (i.e., far from the ends), primarily because of excluded volume effects.
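The logarithmic form can be checked with a short numerical sketch (Python; the function name and the default <math>\alpha = 3/2</math>, the value for an ideal Gaussian chain, are illustrative assumptions rather than part of the formula above):
<syntaxhighlight lang="python">
import math

def loop_entropy(N, alpha=1.5):
    """Entropy lost on closing a single loop of N residues, in units of k_B.

    alpha = 3/2 corresponds to an ideal Gaussian chain (an assumption here);
    real polymers have other effective values.
    """
    return alpha * math.log(N)

# Logarithmic scaling: doubling the loop length adds only
# alpha * ln 2 ~ 1.04 k_B to the entropic cost.
print(loop_entropy(50) - loop_entropy(25))  # ~ 1.04
</syntaxhighlight>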
==Wang-Uhlenbeck entropy==
The loop entropy formula becomes more complicated with multiple loops, but may be determined for a Gaussian polymer using a matrix method developed by Wang and Uhlenbeck. Let there be <math>M</math> contacts among the residues, which define <math>M</math> loops in the polymer. The Wang-Uhlenbeck matrix <math>\mathbf{W}</math> is an <math>M \times M</math> symmetric, real matrix whose elements <math>W_{ij}</math> equal the number of residues shared by loops <math>i</math> and <math>j</math>. The entropy of making the specified contacts equals
:<math>
\Delta S = \alpha k_{B} \ln \det \mathbf{W}
</math>
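In practice <math>\mathbf{W}</math> can be assembled directly from the list of contact pairs: the loop closed by a contact between residues <math>a_i</math> and <math>b_i</math> spans residues <math>a_i</math> to <math>b_i</math>, so loops <math>i</math> and <math>j</math> share <math>\max(0,\, \min(b_i, b_j) - \max(a_i, a_j))</math> residues. The following sketch (Python with NumPy; the function name and the default <math>\alpha = 3/2</math> for an ideal Gaussian chain are illustrative assumptions, not part of the method as published) computes the entropy in units of <math>k_B</math>:
<syntaxhighlight lang="python">
import numpy as np

def wang_uhlenbeck_entropy(contacts, alpha=1.5):
    """Entropy lost (in units of k_B) on forming the given contacts.

    contacts: list of (a, b) residue pairs with a < b, one per loop.
    alpha:    polymer-dependent coefficient (3/2 for an ideal Gaussian chain).
    """
    # W[i][j] = number of residues shared by loops i and j;
    # the diagonal entry W[i][i] is simply the length b - a of loop i.
    W = np.array([[max(0, min(b1, b2) - max(a1, a2))
                   for (a2, b2) in contacts]
                  for (a1, b1) in contacts], dtype=float)
    return alpha * np.log(np.linalg.det(W))
</syntaxhighlight>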
As an example, consider the entropy lost upon making contacts between residues 26 and 84 and between residues 58 and 110 of a polymer (cf. [[ribonuclease A]]). The first and second loops have lengths 58 (=84-26) and 52 (=110-58), respectively, and they have 26 (=84-58) residues in common. The corresponding Wang-Uhlenbeck matrix is
:<math>
\mathbf{W}\ \overset{\underset{\mathrm{def}}{}}{=}\begin{bmatrix}
58 & 26 \\
26 & 52
\end{bmatrix}
</math>
whose determinant is <math>58 \times 52 - 26 \times 26 = 2340</math>. Taking the logarithm and multiplying by the factor <math>\alpha k_{B}</math> gives the entropy.
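Evaluated with the sketch above, the example reproduces the same determinant:
<syntaxhighlight lang="python">
# Contacts (26, 84) and (58, 110) from the example above:
print(wang_uhlenbeck_entropy([(26, 84), (58, 110)], alpha=1.5))
# det W = 58*52 - 26*26 = 2340, so the entropy is 1.5 * ln(2340) ~ 11.6 k_B
</syntaxhighlight>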
==References==
* Wang MC and Uhlenbeck GE. (1945) "On the theory of the Brownian motion II". ''Rev. Mod. Phys.'', '''17''', 323.

{{polymer-stub}}

[[Category:Thermodynamic entropy]]
[[Category:Polymer physics]]