Talk:An Intuitive Guide to the Concept of Entropy Arising in Various Sectors of Science

I am new to wikibooks and not so proficient in its formatting language. The equations on the page look rather unsatisfactory. Could someone help me with that? Thank you. Tschijnmotschau (talk) 14:22, 24 November 2010 (UTC)

Please enhance readability by avoiding 'run-on' sentences...
http://en.wikipedia.org/wiki/Run-on_sentence

I am very interested to read your book, but I find the following stats a bit hard to deal with:

The first 3 sentences only (character count includes the 'thought-terminating period' at the end of the sentence):

The concept of entropy, traditionally derived as the only function satisfying certain criteria for a consistent measure of the "amount of uncertainty" in information theory and derived by classical analogy in statistical mechanics, can in fact be given a unified and intuitive interpretation as the logarithm of the "effective number of states" of a system or the "effective number of possible values" for a random variable, for the reason that systems with that number of equally probable states or equally probable random variables with that number of possible values can be shown to behave in a way identical with the system or the random variable under consideration when a certain aspect of their behavior is focused upon, and the number of possible states of the reference equally probable system or the number of all possible values for the reference equally probable random variable can be used to characterize the system or random variable under consideration because some aspects of their behavior can be encapsulated in this number.

1043 char, 166 words

The concepts of entropy in three particular sectors of science, namely the statistical mechanical entropy used to characterize the disorderedness of a particular system, the information entropy used to measure the amount of information that a particular message conveys and the information entropy which can be used to give the most unbiased statistical inference, can all be shown to be able to be interpreted in that way. And within the framework of that interpretation, the possible non-uniqueness for the definition of entropy arises in a natural way.

556 char, 86 words

Since its early inception in the field of thermodynamics in the early 1850s by Rudolf Clausius [1], the concept of entropy had been introduced into a range of sectors of science and had been proved to be a quantity of great importance. Notably among them are the statistical mechanical entropy proposed by Boltzmann, Planck [2]and Gibbs [3], which gives the phenomenological thermodynamical entropy an insightful microscopic understanding, the information entropy devised by Shannon [4] to measure the information content of a particular message, and its later further development by Jaynes [5] that the Shannon entropy for a probability distribution of a random variable can be used to quantize our uncertainty about it and the distribution which maximizes the entropy in compliance with a set of predefined constraints gives the most unbiased estimate of the probability distribution of the random variable.

909 char, 139 words

I did not intend to seem rude; this is only a suggestion. I believe people (including highly intelligent people) need shorter sentences to aid retention of technical material. You are already familiar with the material you are writing about, so you might not see the complexity of the sentences you are crafting. But for someone (like me...) who is reading your material for the first time, by the time I get to the end of a sentence, I have to skim back through it to identify the subject and predicate, then filter the clauses, just to see what you are saying...

If you want people to read your book, please, help us out just a little!!!

Sincerely,

RMartin-2