User talk:Vamshisuram

Entropy - the basic idea
Entropy fundamentally represents missing information about the microscopic state of a system. Suppose you give some energy to a material: we cannot know how many atoms took how much energy, nor how much energy the electrons within each atom absorbed. This missing information is accounted for in the equations under the name of entropy. Entropy is thus a microscopic concept, and it is calculated as

S = −kB Σi Pi ln Pi

where kB is the Boltzmann constant, equal to 1.38065×10−23 J K−1. The summation is over all the microstates the system can be in, and Pi is the probability that the system is in the ith microstate.
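As a small numerical sketch of the Gibbs formula above (the function name `gibbs_entropy` is illustrative, not from the text):

```python
import math

K_B = 1.38065e-23  # Boltzmann constant in J/K, value as quoted in the text


def gibbs_entropy(probabilities):
    """Gibbs entropy S = -kB * sum(Pi * ln Pi) over microstates.

    Terms with Pi = 0 contribute nothing (the limit p ln p -> 0),
    so they are skipped to avoid math.log(0).
    """
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)


# Example: a two-state system with unequal occupation probabilities
print(gibbs_entropy([0.75, 0.25]))
```

For a system certain to be in one microstate (Pi = 1), the entropy is zero, matching the idea that no information is missing.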

The entropy of a system in which all states, of number Ω, are equally likely, is given by

S = kB ln Ω

which follows from the general formula by setting Pi = 1/Ω for every microstate.
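As a consistency check, the equal-probability special case of the general formula can be compared against kB ln Ω directly (function names here are illustrative):

```python
import math

K_B = 1.38065e-23  # Boltzmann constant in J/K, value as quoted in the text


def boltzmann_entropy(omega):
    """S = kB * ln(Omega) for Omega equally likely microstates."""
    return K_B * math.log(omega)


def uniform_gibbs_entropy(omega):
    """Gibbs entropy -kB * sum(Pi ln Pi) with Pi = 1/Omega for each state."""
    p = 1.0 / omega
    return -K_B * sum(p * math.log(p) for _ in range(omega))


# The two expressions agree (up to floating-point rounding), e.g. for Omega = 6
omega = 6
print(boltzmann_entropy(omega), uniform_gibbs_entropy(omega))
```

The agreement holds because −Ω · (1/Ω) ln(1/Ω) = ln Ω, which is exactly the reduction described in the text.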