Physics Asked by Xeнεi Ξэnвϵς on March 26, 2021
Entropy is a measure of disorder: the higher the entropy, the greater the disorder.
Uniform systems have less entropy than random systems.
Computers express data in binary, as ones and zeros. Eight bits make a byte, and data is stored in bytes; if a stream of data is shorter than a byte, it is padded with leading zeros. An unsigned byte can express the numbers 00000000 to 11111111 in binary, which is 0 to 255 in decimal and 00 to FF in hexadecimal.
A freshly formatted, "empty" disk has uniform bytes, and uniform bytes carry no information. Writing data to the disk increases its randomness, and thus its entropy.
Now, I am wondering: 1. Do freshly formatted disks, with all bytes the same (i.e. 00), have an entropy of 0?
2. Let's say the disk is empty when all its bytes are 00, and full when all its bytes are used. When does the disk have maximum entropy? When it is half full? Or when all 256 possible byte values appear in equal numbers, randomly distributed?
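One way to make question 2 concrete is to compute the Shannon (statistical) entropy of the byte-value distribution on the disk. This is a sketch for illustration only, using made-up byte sequences rather than any real disk contents; `shannon_entropy_bits` is a hypothetical helper, not a standard API:

```python
import math
from collections import Counter

def shannon_entropy_bits(data: bytes) -> float:
    """Shannon entropy, in bits per byte, of a byte sequence."""
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    h = 0.0
    for c in counts.values():
        p = c / n
        h -= p * math.log2(p)
    return h

blank = bytes(4096)               # freshly formatted: all bytes 0x00
uniform = bytes(range(256)) * 16  # every byte value equally often

print(shannon_entropy_bits(blank))    # 0.0 -> a single value, no disorder
print(shannon_entropy_bits(uniform))  # 8.0 -> the maximum possible for bytes
```

Under this (statistical) definition, the all-zero disk has entropy 0 and the entropy is maximized when all 256 byte values occur equally often, in line with the second alternative in the question.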
By entropy I mean this: https://en.wikipedia.org/wiki/Entropy_(order_and_disorder)
By hard drive I specifically mean a Hard Disk Drive, i.e. the spinning electromagnetic disk.
As far as I know, binary data is stored in such a device by changing the magnetic microstate of the platter: the read/write head magnetizes a small area, changing its magnetic polarity. This process requires energy. Though information itself is hard to associate with energy, storing, retrieving, and processing information all require energy.
So the individual bytes of information do have energy: magnetic energy, which is a form of electromagnetic energy, equal to the energy difference in the hard disk caused by the process of storing the information. And by E = mc^2 they have mass, however small it is.
And there are four fundamental forces: electromagnetic, gravitational, strong nuclear, and weak nuclear. Every macroscopic force that is not gravitation is electromagnetic, and thermodynamic energy is the average kinetic energy of the random motion of the constituent particles; particles are waves, and changes in the electromagnetic field cause changes in thermodynamic energy, so thermodynamic energy is a form of electromagnetic energy.
Thus the binary data is related to thermodynamic energy, QED.
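The mass attributed to the stored energy above can be estimated from E = mc^2. The energy figure below is purely an illustrative assumption (no real drive's write energy is being quoted), chosen only to show the scale of the mass equivalent:

```python
# Mass equivalent of a hypothetical energy change from writing data,
# via m = E / c^2. The 1 microjoule figure is an assumed, illustrative
# value, not a measured number for any real hard drive.
c = 299_792_458.0           # speed of light in vacuum, m/s
delta_E = 1e-6              # assumed energy difference, joules
delta_m = delta_E / c**2    # mass equivalent, kilograms

print(f"{delta_m:.3e} kg")  # on the order of 1e-23 kg
```

Whatever the true energy cost of a write is, the corresponding mass is this many orders of magnitude below anything measurable, which is why it never shows up on a scale.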
"Entropy is a measure of disorder" is not a useful statement, because if you follow that path, what usually happens is that you define entropy by disorder and disorder by entropy.
A more useful way to understand it is to note that entropy is defined as it is only because there are far more possibilities we consider disordered than ordered. Let's clear this up with an example.
Imagine a child's nursery with lots of toys and clothes in the room, and note that we can arrange them in nearly infinite ways. Imagine a computer simulation randomly rearranging the stuff in the room. By random I truly mean random: the rugs can be on the walls, the crib can be upside down, the toys all over the place, etc. We can place the toys in the room in countless ways, but we consider most of these arrangements disordered. Only a few are considered organized: the few where the rug is on the floor, the toys are properly placed in their chest, etc.
This is why the entropy of the universe always increases: the number of arrangements of atoms that construct something we consider ordered, say a human, is so small compared to the number we consider random lumps of atoms that the chance of atoms spontaneously moving to an ordered state is practically zero.
But note that if we choose to consider some specific arrangement an ordered one, the law does not break. For an 8-bit string there are 256 possible arrangements, but 00000000 isn't any different from the others: 00000000 and 10101010 each have a 1-in-256 probability of occurring. So no specific arrangement of bits is intrinsically more ordered or disordered; order only appears once you single out specific arrangements that you consider to be "ordered".
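The counting argument above can be made concrete. If we group the 256 bit strings by how many ones they contain (a "macrostate", in the statistical-mechanics sense), every individual string is equally likely, yet the middle groups contain far more strings than the extremes — a sketch:

```python
import math

n = 8  # bits in a byte

# Number of distinct n-bit strings with exactly k ones: C(n, k).
for k in range(n + 1):
    print(k, math.comb(n, k))

# The count peaks at k = n // 2: C(8, 4) = 70 of the 256 strings,
# while the all-zeros and all-ones strings are unique (C(8,0) = C(8,8) = 1).
assert sum(math.comb(n, k) for k in range(n + 1)) == 2 ** n
```

This also bears on question 2 of the original post: if "ordered" means "all zeros", then the macrostate with roughly equal numbers of zeros and ones has by far the most microstates, just as most random placements of the nursery toys look like a mess.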
Correct answer by GUNDOGAN on March 26, 2021
From the point of view of algorithmic information theory, the information content of a hard drive is the length of the shortest description that reproduces it. If all bits are equal, that description could be "all 0s". If the contents look random, they could be the digits of pi (short to describe using an algorithm), or an equally random-looking incompressible string, which is what happens in most cases.
Now, if you have an ensemble of hard drives, each initialized in a random state, short descriptions would be the rarest, and the lowest-entropy, ones.
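A practical proxy for description length is compressed size: a general-purpose compressor gives an upper bound on the Kolmogorov complexity, not its true value, but it separates the two cases above cleanly. A minimal sketch, assuming zlib as the (arbitrary) choice of compressor:

```python
import os
import zlib

blank = bytes(1 << 16)             # 64 KiB of zeros: very short description
random_data = os.urandom(1 << 16)  # 64 KiB of random bytes: incompressible

# Compressed size approximates (an upper bound on) description length.
print(len(zlib.compress(blank, 9)))        # a few hundred bytes at most
print(len(zlib.compress(random_data, 9)))  # roughly 64 KiB, sometimes more
```

Note that the digits of pi would defeat this proxy: zlib cannot find the short algorithmic description, which is exactly why true Kolmogorov complexity is uncomputable and compression only bounds it from above.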
Answered by Wolphram jonny on March 26, 2021
Entropy is a measure of disorder.
This is one of the most ambiguous and potentially misleading statements in the whole of physics, for two reasons. On the one hand, it does not specify which entropy it refers to, and unfortunately more than one concept named entropy is around; even worse, those concepts are only partially equivalent. On the other hand, without specifying how order is defined, it would remain an empty statement even once the concept of entropy has been specified.
Here, you are dealing with the entropy of a hard disk, in connection with the sequence of the stored data.
Certainly, this has nothing to do with thermodynamic entropy. Thermodynamic entropy requires a thermodynamic system, i.e. an equilibrium system able to exchange heat and work with its surroundings. Moreover, thermodynamic entropy is defined for equilibrium states described by a set of thermodynamic variables.
The hard disk material is a thermodynamic system, of course, but the stored bits play no role in its thermodynamic behavior. They do not control the disk's energy exchanges, and there is no meaningful concept of spontaneously reaching equilibrium that can be associated with them.
One could argue that the bits represent the microstates in a statistical description of the system of binary digits. This possibility too has to be excluded. First of all, the statistical entropy is associated with macrostates, not with single microstates. Furthermore, there is no spontaneous underlying dynamics associated with the sequence of all the possible static bit sequences.
Is there any way to associate entropy to the sequence of bits of a hard disk? The possibility exists, in principle, but one has to use the algorithmic entropy (or Kolmogorov complexity), which has only a loose relation with the statistical and thermodynamic entropy. The definition of such entropy is based on the size of the shortest possible description of the sequence in some description language.
However, it is necessary to note that evaluating the algorithmic entropy in practice may be quite a complex task for any real hard disk size.
Answered by GiorgioP on March 26, 2021