Let me clarify what I mean:
If you put 80W of electrical energy into an ideal electrical heater, it will put out 80W of heat energy.
But what if you put 80W of electrical energy into a CPU and let it calculate things, creating new information?
Will it also output exactly 80W of heat?
Or is some energy transformed into “information”, so the CPU will radiate less heat?
My instinct is that if information isn’t energy, then you could theoretically create it (thereby reducing entropy) without expending energy, and that’s a no-no.
But if it is energy, then a CPU running a random number generator (creating no information) at max load would get hotter than one doing actual calculations. Which also sounds wrong.
(I’m neither a physicist nor a computer scientist, in case that wasn’t obvious)
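That said, here's a back-of-the-envelope I tried, assuming Landauer's principle applies (erasing one bit must dissipate at least kT·ln 2 of heat) and guessing 300 K for room temperature:

    import math

    # Landauer's limit: erasing one bit dissipates at least k_B * T * ln(2) joules.
    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # assumed room temperature, K

    e_bit = k_B * T * math.log(2)    # minimum heat per erased bit, ~2.9e-21 J
    bits_per_sec = 80.0 / e_bit      # bits/s that 80W could "pay for" erasing

    print(f"Landauer limit: {e_bit:.2e} J/bit")
    print(f"80W could cover erasing ~{bits_per_sec:.1e} bits/s")

If that's right, 80W would cover erasing on the order of 10^22 bits per second, wildly more than any real CPU flips, so the "information" share of the energy budget would be far too small to measure: both CPUs should behave essentially identically as heaters.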


Hehe, I can see each comment has its own definition of ‘information’. Not sure mine is any better…
There seem to me to be two types of ‘information’ depending on perspective: ‘energy entropy’ and ‘information entropy’. Information entropy is what is ‘interesting’ to normal people, to observers; after all, we evolved to observe our environment. A small black dot on a white wall makes almost no difference to the overall energy entropy, but carries high information entropy for us, the observer (especially a dot that says “Don’t Touch!”). I think that dichotomy causes some of the confusion here.
We can view ‘information’ as the ‘configuration of X energy units’. Configurations can be complex, multilayered and interdependent, with emergent configurations popping in and out. Each configuration is a system that is more or less stable, so information can come and go while the energy units persist - just somewhere else. An ‘oscillator’ is one of the simplest and most stable 1D (a -> middle -> b) information systems: it just changes configuration states at regular intervals, in a regular order.
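As a toy sketch of that oscillator idea (the names and the three-state cycle are just my own illustration, in Python):

    # Toy 1D oscillator: the energy sits in one of three configurations
    # and cycles a -> middle -> b -> middle -> ... at regular ticks.
    # The "information" is which configuration it currently occupies.
    CYCLE = ["a", "middle", "b", "middle"]

    def oscillator_states(n_ticks):
        for t in range(n_ticks):
            yield CYCLE[t % len(CYCLE)]

    print(list(oscillator_states(8)))
    # ['a', 'middle', 'b', 'middle', 'a', 'middle', 'b', 'middle']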
However, we can't have or detect information without energy in some configuration. That strongly suggests there is a point where energy and information converge, and at that point information is energy. At that convergence point, 1 energy unit = 1 bit.
So all information is energy, but the black dot on the wall says “no! there’s a difference”, so we are back to ‘interestingness’.
I think the problem is that life evolved as observers. Observers need something like the ‘free energy’ principle and react to surprise: we notice sharp gradients in our input. The black dot on the wall has potential entropy only in combination with an observer’s configuration. So the low-energy black dot influenced an external high-energy system (the observer) far more than its energy configuration alone would indicate. Observers filter reality through ‘observational frames’, and information as energy entropy gets transformed into information entropy - “interestingness”.
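To make the ‘surprise’ part concrete, a hand-wavy sketch using Shannon surprisal (-log2 p); the pixel probabilities are invented:

    import math

    def surprisal_bits(p):
        # Shannon surprisal: rarer observations carry more bits of surprise.
        return -math.log2(p)

    # Invented expectations for an observer scanning a white wall:
    p_white = 0.999   # almost every spot is white
    p_black = 0.001   # the black dot is rare

    print(f"white spot: {surprisal_bits(p_white):.4f} bits")  # ~0.0014 bits
    print(f"black dot:  {surprisal_bits(p_black):.1f} bits")  # ~10 bits

The dot is energetically almost nothing, but to the observer it is worth ~10 bits - that is the information-entropy plane at work.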
So we have energy, complexity and an observer axis in this 3D information model. That is hard to imagine, so perhaps picture two separate 2D planes instead: an ‘energy-entropy’ plane, with energy vs. configuration axes, which we let evolve enough complexity. Then at some point the first observer configuration evolves and instantly creates an orthogonal (right-angled) 2D ‘information-entropy’ plane, with energy vs. interestingness axes - a plane where energy gradients are the information. Our observer reality is a subset of the whole, but it also creates novel information of both configuration types in the universe: a new branch of information configurations growing, actuated by energy entropy but steered by information entropy.
Boolean logic is a natural outgrowth of the converging 1-bit configuration proto-state at 0.0, while waves and thermodynamics come from many of these emergent configurations influencing each other. That is all entropy and information, but observers add a ‘potential entropy’ dimension - interestingness - to raw ‘kinetic’ energy entropy. Observers notice deviation from the average as information. Notably, both very high and very low energy entropy are alluring to us, so such phenomena create the same high value in the information-entropy plane. Maybe it would be valuable to divide entropy into kinetic and potential types?
A bit speculative/abstract of course, but interesting (pun intended)… Anyway, both information types/sets seem valid to me, and both evolve from tiny 1+ energy units interacting and configuring themselves fractally.
I completely forgot the question. If you reach the same state in the CPU as before, then you have just spent a lot of energy changing other states around it in a pattern. Each bit flip in the CPU influences everything else via waves or radiation, so a bit flip will be exposed by heat generation or other ‘noise’ from that source. All the combined state changes - in the energy input, in the processing, and in the influences delivered to the environment as different forms of radiation - put new information entropy into the universe, projected from all the processing that happened. It doesn’t matter whether you keep a copy of the result or not: the universe KNOWS that you spent 80Wh watching fap-movies and playing endless DK64 speedruns instead of donating CPU time to save the world! (Just saying ;) - but it IS a real concern in hw security; see the sketch below.)
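A toy sketch of that hardware-security worry - the classic Hamming-weight power model behind differential power analysis; the noise level and the secret byte are invented:

    import random

    def hamming_weight(x):
        return bin(x).count("1")

    def noisy_power_sample(byte, noise=0.5):
        # Toy model: power draw tracks how many bits are set, plus noise.
        return hamming_weight(byte) + random.gauss(0, noise)

    # Averaging many 'heat/noise' traces recovers the secret's Hamming weight:
    secret = 0b10110101  # weight 5
    traces = [noisy_power_sample(secret) for _ in range(10_000)]
    print(f"estimated weight: {sum(traces) / len(traces):.2f}")  # ~5.00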
I would assume that if you leave a single bit on, then the 80W transformed one local bit into a (nanowatt-scale) high-entropy state, and maybe triggered you to wonder why only one bit survived all that work.