See: Maxwell’s Demon

See: Entropy

Information storage is a way of temporarily inhibiting entropy. Entropy is only realized when said information is destroyed.
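This lines up with Landauer’s principle: erasing one bit of information dissipates at least

E = k_B · T · ln 2

of energy as heat (where k_B is Boltzmann’s constant and T is temperature), while storing or copying a bit can, in principle, be done reversibly for free.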

In the demon example, then, the trick is to figure out how an entity can maintain near-infinite information storage. If that storage can stay stable over time, you can modify the laws of the universe.

Woah, this is essentially what computers do. They store information, which creates a digital environment where we can modify the laws of the universe.

I’m going to assume that the door in the demon example is actuated with no energy, even though that raises the question of how it would work. If we assume perfect awareness of information, we may as well assume a massless door (both seem equally impossible).

How is entropy modified when one cleans their room? When I put things away, I essentially associate each thing with information about it, i.e. its location. This supposedly makes it easier to actuate in the future. In a theoretically perfect clean room you know exactly where everything is, and you use less energy existing in that space. Does this come at the expense of the energy spent putting things away? Yes. A stochastic system where everything’s location is still known would be theoretically more efficient.

Let’s take an example where two rooms contain exactly the same things. One room is perfectly organized, the other perfectly messy. Both require the same amount of memory storage. In the clean room there is essentially a framework: a predetermined place where things “belong”. This initial framework must be stored in memory, and then energy must be used to place the things within it. Now, my question is whether a stochastic framework requires the same amount of memory storage, or whether it is somehow more “lightweight”. My intuition leads me to believe that a stochastic framework requires much less information storage, if any at all.

Now I want to use the things in my room. In both examples I have perfect memory of where my things are located. Which room requires more energy to use my things? Let’s assume equal physical distance traveled and restrict the analysis to the energy of cognition.

Information Storage -> Sort -> Search -> Retrieve -> Actuate/Modify -> Replace

Memory < Compute = Stochastic

Memory > Compute = Organized

In the stochastic example, more energy is needed in real time to search and retrieve, but less energy was needed up front to organize the information. In addition, you never have to replace things.

In the organized example, less energy is needed to search and retrieve, but energy is required to continually maintain the initial framework and to replace things as needed.
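Here is a minimal sketch of the tradeoff in code (Python, with invented sizes; retrieve_organized and retrieve_stochastic are hypothetical names, not from any real library). The organized room pays up front to build a framework; the stochastic room stores nothing extra and pays a linear search on every retrieval.

```python
import random

THINGS = [f"thing_{i}" for i in range(1000)]

# Organized room: pay up front to build a framework (Memory > Compute).
framework = {thing: slot for slot, thing in enumerate(sorted(THINGS))}

def retrieve_organized(thing):
    # One lookup per query: cheap in real time, but the framework
    # itself must be stored and maintained in memory.
    return framework[thing]

# Stochastic room: no framework stored at all (Memory < Compute).
pile = list(THINGS)
random.shuffle(pile)

def retrieve_stochastic(thing):
    # Linear scan per query: expensive in real time, but nothing
    # was ever sorted, indexed, or put back where it "belongs".
    for slot, item in enumerate(pile):
        if item == thing:
            return slot
    raise KeyError(thing)
```

The total energy is roughly the upfront cost plus the number of retrievals times the per-query cost, so which room wins depends on how often you use your things.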

Are these two systems theoretically equivalent??

The organized framework might fall apart at scale. There is a point where the memory becomes unstable and breaks down.

This is because the entire system must be maintained, or “rendered”, at all times. In the stochastic example you only render the items in question when they are needed.

In a warehouse, a stochastic system is superior because you do not have to place an item in a specific spot. It can go in the nearest free location, with no dependence on that location’s previous occupant. Furthermore, if you fix a framework (like alphabetical) for a given set of inputs, what happens when you receive new inputs or delete old ones? Energy is wasted in unused cells or in reorganizing the framework.
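A toy sketch of how that might work, assuming the warehouse keeps only a small item-to-bin index and a pool of free bins (all names and sizes invented):

```python
free_bins = list(range(100))  # bin ids, ordered nearest-first from the dock
index = {}                    # the only framework: a small item -> bin map

def stow(item):
    # No predetermined place: take the nearest free bin,
    # regardless of what used to live there.
    bin_id = free_bins.pop(0)
    index[item] = bin_id
    return bin_id

def pick(item):
    # Retrieval is one index lookup; the freed bin rejoins the pool.
    bin_id = index.pop(item)
    free_bins.append(bin_id)
    free_bins.sort()  # keep the nearest-first ordering
    return bin_id
```

New inputs never force a reorganization; they just take the nearest free bin, and deletions simply return a bin to the pool.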

In a shop, an organized system tends to be superior, because it is difficult to maintain memory of locations across the human nodes in the network. Even if I remember where I put my tools, others will not. If there is a predetermined location for a tool, then independent nodes can predict where it is supposed to be. This system breaks down when no one puts away the tools that are supposed to be shared. The model also becomes obsolete if tools are duplicated across nodes: if we all have our own drill, we can each leave it wherever we want. That comes at the expense of the initial energy required to establish the tool base (in the form of capital expense).

Predetermined locations require energy to maintain.

The difference hinges on whether or not there is perfect memory of the stochastic framework. If not, much energy is needed to compute the search and retrieval. In a computer analogy: one is RAM and the other is disk.

I don’t know which data storage mechanism is more efficient.

As it relates to ML (machine learning), it appears the stochastic model is preferred, and the trick is to come up with unique ways to compute in real time. This is “learning”, because the information is far too big to store in memory. So you trade for real-time compute and eventually sacrifice memory entirely. In the case of IoT, this means that eventually all the world’s data will be streamed and none will be stored. You would then use statistical probabilities to make inferences about the past and future.
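A minimal sketch of that trade, using Welford’s online algorithm: the stream is summarized with constant memory and constant per-sample compute, so the raw data never has to be stored (the sensor readings below are invented):

```python
class StreamSummary:
    """Running mean and variance of a stream, in O(1) memory."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the running mean

    def update(self, x):
        # Constant memory and constant work per sample (Welford's algorithm).
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / self.n if self.n else 0.0

summary = StreamSummary()
for reading in (0.9, 1.1, 1.0, 1.2, 0.8):  # stand-in for an IoT sensor stream
    summary.update(reading)
print(summary.mean, summary.variance)  # all that survives of the stream
```

After the stream passes, only the summary remains; the past can only be inferred from it statistically.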

In the critique of Maxwell’s demon, the experiment breaks down when the demon can no longer maintain the perfect information in its memory. At that point the memory is lost and the entropy is manifest, albeit delayed.

So, to solve Maxwell’s demon, one must come up with either infinite memory storage or perfect real-time compute.

In the thought experiment, energy is needed both to decide when to open the door and to actuate the door. In the classical example, both compute and memory are required:

Knowledge of the particle’s position & velocity, and the ability to decide to open the door. It is impossible with memory alone, but may be possible with compute alone.

Can isolation of the particles be achieved with only stochastic opening and closing of the door??????

Can you use the previous result of the open door to inform the next opening? Is this pure compute, or must it be understood as memory: random and short-lived?

Is compute really just real-time memory? Are memory and compute just two sides of the same coin?

Can you achieve the particle separation of Maxwell’s demon with infinite iterations of stochastic compute? Does this break thermodynamics? There never was any stored information to eventually be lost.
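Here is a toy Monte Carlo sketch of that question (arbitrary parameters, nothing like a real physics model): a velocity-aware demon versus a door that opens on a coin flip.

```python
import random

def simulate(door_policy, steps=100_000):
    # Each particle is [side, speed]; side 0 = left, side 1 = right.
    particles = [[random.randint(0, 1), random.random()] for _ in range(200)]
    for _ in range(steps):
        p = random.choice(particles)  # a particle arrives at the door
        if door_policy(p):
            p[0] = 1 - p[0]           # door open: it passes through
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return (avg([s for side, s in particles if side == 0]),
            avg([s for side, s in particles if side == 1]))

# Classical demon (memory + compute): let fast particles through to the
# right and slow particles through to the left.
demon = lambda p: (p[1] > 0.5) == (p[0] == 0)

# Pure stochastic actuation: open with probability 0.5, knowing nothing.
coin_flip = lambda p: random.random() < 0.5

print("demon:    ", simulate(demon))      # average speeds separate
print("coin flip:", simulate(coin_flip))  # averages stay roughly equal
```

In this toy version the coin-flip door never separates the speeds on average; the separation only appears once the policy consumes information about the particle.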

Which method of information processing requires less energy: storage or compute?

Which method, then, is more compatible with entropy? Does either slow entropy?

In this case, the thought experiment is really about the idea of “hacking” entropy. After all, that is the interesting thing: can we manipulate the supposed laws of the universe?