
How can "information" be a useful physical quantity given that its value is model-dependent?

Asked by Chuquicamata on April 2, 2021

From @Humble’s answer to “What is information?”:

Information contained in a physical system = the number of yes/no questions you need to get answered to fully specify the system.

That definition is, however, relative to a given model: either a countably infinite or a continuous model would have infinite (possible) information content, as the sketch below illustrates for the finite case.
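To make the counting concrete, here is a minimal Python sketch (the state counts are made up for illustration) of how many yes/no questions a finite model requires:

```python
import math

# A model with N distinguishable states needs ceil(log2(N)) yes/no
# questions to pin down one state (each answer halves the candidates).
for n_states in (2, 6, 1024):
    questions = math.ceil(math.log2(n_states))
    print(f"{n_states} states -> {questions} yes/no questions")

# A continuum model (e.g. a real-valued coordinate) has no finite N,
# so the same counting gives unbounded information content.
```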

The notion of information seems to find many uses in modern statistical mechanics, quantum computing and other fields, so how do physicists formulate a sound and unambiguous definition of information, given that, naïvely, it would seem to be model-dependent?

2 Answers

Algorithmic information theory defines the complexity of a given string of bits (which could represent either the number $\pi$ or a full axiomatic theory) as the length of the smallest program that can produce that string as output. Some irrational numbers, such as $\pi$, carry little information, since you can compress them into a short algorithm that outputs one digit after another without end. But most irrational numbers are not compressible in this way: for most reals there is no program that can compute all their digits.
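One quick way to see this compressibility gap is to use an off-the-shelf compressor as a rough, computable proxy (true Kolmogorov complexity is uncomputable, so this is only a sketch):

```python
import os
import zlib

# A highly structured string is the output of a tiny program
# ("repeat '01'"), so it compresses well; random bytes do not.
structured = b"01" * 50_000           # 100,000 bytes of pure pattern
random_bits = os.urandom(100_000)     # incompressible with high probability

for name, s in (("structured", structured), ("random", random_bits)):
    ratio = len(zlib.compress(s, 9)) / len(s)
    print(f"{name}: {len(s)} bytes -> compressed ratio {ratio:.3f}")
```

The compressed size only upper-bounds the true algorithmic complexity, but the contrast between the two ratios makes the point.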

Any formal system, or theory, in physics is a finite set of strings. Thus you can map it to the natural numbers and find that its complexity is always low, at least compared with the complexity of random strings of the same size.

But to answer your question specifically: it is a theorem of model theory (the upward Löwenheim–Skolem theorem) that if a countable first-order theory has an infinite model, then for every infinite cardinal number $\kappa$ it has a model of size $\kappa$.

In short, for most theories you might come up with, there are infinitely many models that satisfy them, of any cardinality you choose.

Answered by Wolphram jonny on April 2, 2021

how do physicists formulate a sound and unambiguous definition of information, given that, naïvely, it would seem to be model-dependent?

The practice is: take the model you are interested in, with its set of possible states, work with probability distributions on those states, and, optionally, discuss the information entropy or information content of those distributions. This is useful even if another, more refined or simply different, model of the same physical system uses different states and thus assigns a different information entropy.

In short, it is model-dependent. There is no universal answer to the question: what is the information entropy of an ice cube with 1 cm sides? It depends on the model of the cube. If the model cares only about which side is up, there are only 6 possible states, so the maximum entropy is $\ln 6$. If the model is a molecular simulation whose state involves the positions and orientations of all the water molecules in the cube, at 1 atm and 0 °C, the state space is immensely bigger and the information entropy is a much larger number.
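A minimal Python sketch of the comparison (the molecular state count below is purely illustrative, not a physical value):

```python
import math

def entropy_nats(probs):
    """Shannon entropy (in nats) of a discrete probability distribution."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Model 1: only "which face is up" matters -> 6 equally likely states.
six_faces = [1 / 6] * 6
print(f"face-up model: {entropy_nats(six_faces):.4f} nats "
      f"(= ln 6 = {math.log(6):.4f})")

# Model 2: a molecular model has vastly more states; with an
# illustrative count of 10**20 equally likely microstates, the
# maximum entropy is ln(10**20) = 20 ln 10 -- a much larger number.
print(f"molecular model (illustrative): {20 * math.log(10):.1f} nats")
```

Same cube, two models, two entropies: the quantity is well defined once the model (the state space and its distribution) is fixed.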

Answered by Ján Lalinský on April 2, 2021
