
Conservation of Information?


StarMountainKid


I'm not sure if the following questions make sense or not.

Is there a law of conservation of information?

For instance, if we build a complex mechanism we are increasing or ordering information, since the device contains more information than whatever occupied its space before it was built. If we then destroy the mechanism, what happens to the information inherent in its construction? When we build another identical device, are we using the information released by the destruction of the original device?

In other words, by building the device we are adding order (information) to the universe. When we destroy the device are we creating disorder (disinformation), or returning to the state of entropy that existed before we constructed the device?

I think the concepts of information and order are interchangeable. An ordered system contains more coherent information than a disordered system.

Is the information of anything we can build already enfolded in the universe? Where does this information come from?

Consider entropy as the loss of information. In this sense, the lowest-entropy state contains the most information. As entropy increases, is this information spread into the universe or is it lost forever?
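To put numbers on the order/information link, here is a minimal Python sketch using Shannon's entropy measure (an illustration only, not a physical model). Note that it actually cuts against the intuition above: in Shannon's sense the disordered sequence carries more bits per symbol, which is part of the definitional tension this thread wrestles with.

# Shannon entropy in bits per symbol, from empirical symbol frequencies.
from collections import Counter
from math import log2

def shannon_entropy(seq):
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in counts.values())

ordered = "AAAAAAAAAA"               # one symbol only: zero entropy
disordered = "AGCTTGACCA"            # mixed symbols: higher entropy
print(shannon_entropy(ordered))      # 0.0
print(shannon_entropy(disordered))   # ~1.97 bits per symbol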


 

Thanks for the article, synchronomy.


Is there a law of conservation of information?

Information is heat and heat energy is conserved.


In physics there is a commonly accepted "conservation of information" law, just like there is a "conservation of energy" law.

Just like conservation of energy, "conservation of information" does not imply that the information is useful, or easily accessible.

For example, a book is a good source of information. If we burn the book, do we lose the information?

In theory, if we had perfect detectors everywhere we could deduce, by the infinitesimally small variations in temperature, colour, and smoke of the fire, what the original content of the book was (including the text!).

Of course this is not possible to do in practice. (You can read up on Maxwell's demon for a classic example of this line of thinking.)

On a quantum level, information is intimately tied to energy.

In fact, the conservation of information is tied to the black hole information paradox, which suggests that if a book falls into a black hole it is impossible to retrieve any of the information from that book (unlike burning the book, in which the information is not really accessible but is technically still present, dispersed throughout the environment).


If conservation of information were absolute, then you'd be able to tell which two numbers I just multiplied together to get 64.
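A quick way to see the point: multiplication is many-to-one, so the product alone does not determine the inputs. A minimal Python sketch listing every whole-number pair that yields 64:

# Every ordered pair of positive integers whose product is 64.
n = 64
pairs = [(a, n // a) for a in range(1, n + 1) if n % a == 0]
print(pairs)  # [(1, 64), (2, 32), (4, 16), (8, 8), (16, 4), (32, 2), (64, 1)]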


If conservation of information were absolute, then you'd be able to tell which two numbers I just multiplied together to get 64.

I disagree. Conservation of information does not imply that information of a "final state" is the sum of all the information of the formative "initial states", any more than conservation of energy implies that the energy of a "final state" is the sum of all the energy of all the formative "initial states".

In an open system, the energy in that system is often "lost" or "created" by interaction with the environment; the same is true for information.

If I had access to the full final state (i.e. not just the number 64, but also all the information about your state of mind) then presumably I could determine how you calculated that number.

However, I am pretty sure that in quantum mechanics it is more a matter of information capacity being conserved, rather than the actual information content (since spontaneous wavefunction collapse does sort of destroy information content).


Information capacity I can believe is conserved. Actual information, not so much. Entropy pretty much guarantees information loss over time, surely?


Information capacity I can believe is conserved. Actual information, not so much. Entropy pretty much guarantees information loss over time, surely?

I guess it depends on what you mean by "loss". If you mean that the information is inaccessible, then I agree completely.

For example, in the classical limit if you had complete knowledge of the position, momentum, mass, charge, etc. of every particle (which is theoretically possible in classical physics) and complete knowledge of the boundary conditions (if any) at a particular instant in time, and knew how to analytically solve the many-body problem, then you could theoretically calculate the state of the system at any past or future time.
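A toy version of that reversibility is easy to sketch (a minimal one-particle example, not a many-body solver): velocity-Verlet integration of a harmonic oscillator is time-reversible, so flipping the velocity and integrating forward again recovers the initial state.

# Integrate a unit-mass harmonic oscillator with velocity Verlet,
# then reverse the velocity and integrate again to march back in time.
def verlet(x, v, steps, dt=0.01, k=1.0):
    a = -k * x
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt
        a_new = -k * x
        v += 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

x0, v0 = 1.0, 0.0
x1, v1 = verlet(x0, v0, 10_000)   # forward in time
x2, v2 = verlet(x1, -v1, 10_000)  # flip velocity, run forward again
print(x2, -v2)                    # recovers (x0, v0) up to roundoff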

Obviously we don't know how to analytically solve the many-body problem, and even if we could, we couldn't set up detectors to determine the positions, momenta, etc. of every particle in the system (because obviously the detectors can't detect themselves, and they are also composed of particles that are now within the system).

It might even be the case that if you knew how to analytically solve the many-body problem you would find that the evolution of a many-body quantum wavefunction also preserves information; "spontaneous wavefunction collapse" is only a by-product of treating a many-body system as a discrete interaction between multiple single-body systems. (But this is a bit of a stretch, I admit.)

But even with quantum mechanics taken into account, I still think you can extract "data" (i.e. useful information) from entropic ensembles, just like you can extract work (i.e. useful energy) from heat baths, and with similar consequences (neither process can reach 100% yield: extracting data from entropy requires a lot of energy, and extracting energy from heat baths generates a lot of entropy).


"An expenditure of energy is required to acquire information." Information is inaccessible if there is not the appropriate energy available to acquire it. In a sense, information is "out there", waiting for enough of the correct organization of energy to be used to acquire that information.

So, in order to solve, for example, the many-body problem, all we have to do is organize energy properly and we will have the correct information that solves the problem!

I could use this idea in one of my Zarkor and Zerak stories. A device that organizes energy. I suppose a computer or a brain does this already, but working from information is maybe the long way 'round. Just the correct manipulation of energy would bypass the step by step process of gaining incremental bits of information to eventually solve complex problems.

I'll have to start working on that device.


I'm not sure if the following questions make sense or not.

Is there a law of conservation of information?

"Through various observations it can be seen that the universe is created through a variety of complex patterns but nonetheless patterns, much like a spiderweb.

If the law of conservation applies to X then it can said that it also applies to Y if both X and Y belong to the same universe in which the law of conservation applies."

That said it can be said that if X is the Law of Conservation of Energy and Y is The Law of Conservation of Information then it is highly probable.

Although we can not be sure if Y holds the same process of conservation as X.


I guess it depends on what you mean by "loss". If you mean that the information is inaccessible, then I agree completely.

By loss, I mean a decline in the number of bits required to store the state of a system.

For example, in the classical limit if you had complete knowledge of the position, momentum, mass, charge, etc. of every particle (which is theoretically possible in classical physics) and complete knowledge of the boundary conditions (if any) at a particular instant in time, and knew how to analytically solve the many-body problem, then you could theoretically calculate the state of the system at any past or future time.

Obviously we don't know how to analytically solve the many-body problem, and even if we could, we couldn't set up detectors to determine the positions, momenta, etc. of every particle in the system (because obviously the detectors can't detect themselves, and they are also composed of particles that are now within the system).

While I agree that, from a single snapshot in a clockwork universe, it should be possible to perfectly predict any future state of the system, I don't see how it's possible to perfectly predict the past state of that system, at least not from a single snapshot.

The reason I'm having trouble seeing that is that reversing the entropy in the system increases the amount of information within the system (the number of bits required to hold that information).

In other words, if you're currently given a room where the air is uniformly 70 degrees and still, wouldn't accurately predicting what the temperature of each molecule was last Wednesday be impossible?


 

In other words, if you're currently given a room where the air is uniformly 70 degrees and still, wouldn't accurately predicting what the temperature of each molecule was last Wednesday be impossible?

(I assume we are speaking in the classical limit here).

What you describe is an ensemble average. The air molecules are obviously not still if they are at 70 degrees, and to have a complete picture you would need to know the momentum, rotation, position, etc. of each molecule. (AND the momentum, position, etc. of every particle in the walls, AND of every particle that could have entered or left the room during the last few days.)

I would argue that just like you can say Energy = Heat + Work, you can say Information = Entropy + Data (not sure if "Data" is the right word here, but whatever).

There is a vast amount of energy in a room full of air at 70 F, but it isn't really accessible; the same goes for the information.
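The ensemble-average point can be made concrete by counting: one macrostate (a fixed total energy) corresponds to an enormous number of microstates, and the macrostate alone cannot tell you which one you are in. A minimal Einstein-solid-style count in Python (illustrative numbers only):

# Ways to distribute E energy quanta among N oscillators: C(E+N-1, N-1).
from math import comb

N, E = 10, 20
print(comb(E + N - 1, N - 1))  # 10015005 microstates, all one macrostate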

----------

I think our argument ultimately stems from a difference in definitions; it seems like you are defining "entropy" as the opposite of "information". In that case, I agree completely with your arguments, and there should not be a law of "conservation of information".

In my arguments I have defined "entropy" as a type of "information" (i.e. the least informative type).


I think our argument ultimately stems from a difference in definitions; it seems like you are defining "entropy" as the opposite of "information". In that case, I agree completely with your arguments, and there should not be a law of "conservation of information".

In my arguments I have defined "entropy" as a type of "information" (i.e. the least informative type).

Information and entropy have been closely related ever since Shannon. It's probably my confusion, rather than yours.


In physics and cosmology, digital physics is a collection of theoretical perspectives based on the premise that the universe is, at heart, describable by information, and is therefore computable. Therefore, the universe can be conceived of as either the output of a computer program, a vast, digital computation device, or mathematically isomorphic to such a device.

Some try to identify single physical particles with simple bits. For example, if one particle, such as an electron, is switching from one quantum state to another, it may be the same as if a bit is changed from one value (0, say) to the other (1). A single bit suffices to describe a single quantum switch of a given particle. As the universe appears to be composed of elementary particles whose behavior can be completely described by the quantum switches they undergo, that implies that the universe as a whole can be described by bits. Every state is information, and every change of state is a change in information (requiring the manipulation of one or more bits). Setting aside dark matter and dark energy, which are poorly understood at present, the known universe consists of about 10^80 protons and the same number of electrons. Hence, the universe could be simulated by a computer capable of storing and manipulating about 10^90 bits. If such a simulation is indeed the case, then hypercomputation would be impossible.

http://en.wikipedia.org/wiki/Digital_physics
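As a back-of-envelope sanity check on the quoted figures, the two powers of ten imply a budget of roughly 10^10 bits of descriptive state per particle:

particles = 10**80
bits_total = 10**90
print(bits_total // particles)  # 10**10 bits per particle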

There are objections to this view, as outlined in the article. However, in my view, understanding the universe as information seems plausible with or without a computer generating it. What lies behind or determines the behavior of elementary particles, the forces, and the laws that govern them may be the imperative of the articulation of information, whether there is an observer present or not.

The prime characteristic of the universe is that it expresses information about itself. The universe is this information. If this information were hidden, the universe could not exist, as no event could convey its consequence to any other event. It is the universe's ability to communicate within itself that enables it to function. This exchange of information is primary to its operation and its very existence.


Information and entropy have been closely related ever since Shannon. It's probably my confusion, rather than yours.

That is true, but I think the concept of "Conservation of Information" treats information as something analogous to energy, and thus is slightly different from the concept of entropy described by Shannon for classical channels (or von Neumann for quantum channels) as something that reduces the information content.

To illustrate further: since it is possible to extract work (i.e. useful energy) from two heat baths (as long as the two baths are at different temperatures) via a Carnot engine (or a less efficient real-life implementation), I think it should be possible to extract data (i.e. useful information about the microstate of a system) from two ensembles (as long as the two ensembles are at different specific entropies) via time-resolved measurements of lattice distortions and excitations of a perfect (or near-perfect) crystal forming the boundary between the two ensembles.

A perfect crystal at near-zero temperature is an ideal zero-entropy state. By measuring the response of this crystal to interactions with a high-entropy bath (i.e. particle collisions) one could start to estimate the positions and momenta of those particles (obviously you could not know the details of all the particles in the system, just like you could not extract all the energy from a high-temperature heat bath); the crystal would be kept in a near-ideal state by transferring the excitations into the low-entropy bath.

Of course, in practice this would be even less efficient, but a Carnot engine is impossible to realize as well; and neither is 100% efficient even in theory.
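For reference, the thermodynamic bound the analogy leans on is simple to state numerically; the information-side counterpart is this post's conjecture, not an established result.

# Carnot efficiency: the maximum fraction of heat convertible to work.
def carnot_efficiency(t_hot, t_cold):
    return 1.0 - t_cold / t_hot     # temperatures in kelvin

print(carnot_efficiency(400.0, 300.0))  # 0.25: at most 25% becomes work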


If conservation of information were absolute, then you'd be able to tell which two numbers I just multiplied together to get 64.

Yes, if we can locate the information wherever it currently is in the universe.

