Let me clarify what I mean:
If you put 80W of electrical energy into an ideal electrical heater, it will put out 80W of heat energy.
But what if you put 80W of electrical energy into a CPU and let it calculate things, creating new information?
Will it also output exactly 80W of heat?

Or is some energy transformed into “information”, so the CPU will radiate less heat?

My instinct is that if information isn’t energy, then you could theoretically create it (thereby reducing entropy) without expending energy, and that’s a no-no.

But if it is energy, then a CPU running a random number generator (creating no information) at max load would get hotter than one doing actual calculations. Which also sounds wrong.

(I’m neither a physicist nor a computer scientist, in case that wasn’t obvious)

  • lime!@feddit.nu
    6 days ago

    energy is work, and cpus do no work. from a purely electrical standpoint, they are resistive heaters. some energy goes into switching the state of transistors, but that state change generates heat.

    basically, if you flip a power switch, the energy doesn’t go into the state of the switch (which is information), but the act of flipping it (which is work).
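    To put rough numbers on that "energy goes into switching" point, here is a minimal sketch of the standard CMOS dynamic-power estimate P = α·C·V²·f. All figures below are illustrative assumptions, not measurements of any real chip:

```python
# Standard CMOS dynamic-power estimate: P = alpha * C * V^2 * f.
# All numbers below are illustrative assumptions, not measurements.
def dynamic_power(alpha, c_farads, v_volts, f_hertz):
    """Average power dissipated by charging/discharging gate capacitance."""
    return alpha * c_farads * v_volts ** 2 * f_hertz

# activity factor 0.1, 100 nF total switched capacitance,
# 1.0 V supply, 4 GHz clock:
print(f"{dynamic_power(0.1, 100e-9, 1.0, 4e9):.0f} W")  # -> 40 W
```

    All of that switched energy ultimately ends up as heat, which is the point being made above.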

    • BussyGyatt@feddit.org
      6 days ago

      yea, they do work on the flow of electrons. switching the channels of electron flow is work.

  • raicon@lemmy.world
    6 days ago

    Information is entropy. The 80W is going out as heat, noise, vibrations, radiation.

    If we could run a theoretical computer which is 100% efficient, it could compute forever with the same electrons.

      • courgette@lemmy.world
        6 days ago

        Reversible computing tries to counter this effect. The idea is to avoid any loss of information. It does not require quantum computing.
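        A toy sketch of the idea, using the Toffoli (CCNOT) gate - a classical reversible gate that can compute AND without erasing its inputs (no quantum hardware assumed):

```python
# Toy model of a reversible logic gate: the Toffoli (CCNOT) gate.
# It flips c only when both a and b are 1, so it can compute AND
# (with c = 0) without erasing any information.
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

# Applying the gate twice restores every input state: reversibility.
for bits in [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

print(toffoli(1, 1, 0))  # -> (1, 1, 1): third output is a AND b
```

        Because the mapping is a bijection on input states, no bit is ever destroyed, which is exactly the loss that Landauer's bound charges energy for.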

    • Successful_Try543@feddit.org
      6 days ago

      Information is entropy.

      Precisely, information has lower entropy than chaos.

      If we could run a theoretical computer which is 100% efficient, it could compute forever with the same electrons.

      Yet, it would still need extra energy to move the electrons around.

      Do not summon Maxwell’s Demon (this is spelled out in the German Wikipedia article; the English one doesn’t state it as explicitly):

      The minimum energy E necessary to process n bits of information is E = n·k·T·ln(2).
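      Plugging in numbers, a quick sanity check of that limit for one bit at room temperature (T = 300 K is an assumed value):

```python
import math

# Landauer's principle: erasing n bits at temperature T costs at least
# E = n * k * T * ln(2). T = 300 K below is an assumed room temperature.
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact, SI definition)

def landauer_energy(n_bits, temp_kelvin=300.0):
    return n_bits * k_B * temp_kelvin * math.log(2)

print(f"{landauer_energy(1):.2e} J")  # -> 2.87e-21 J per bit
```

      That is roughly twenty orders of magnitude below what real transistors dissipate per switch, which is why this bound never shows up in the 80W figure.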

  • nialv7@lemmy.world
    6 days ago

    Information doesn’t hold energy itself, so yes, the CPU produces 80W of heat no matter what computation it does.

    (Also, it’s pretty hard to define what “information” is; people generally talk about information entropy, which is different from thermodynamic entropy. This can be confusing.)
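    To illustrate the information-entropy sense (Shannon entropy, measured in bits - not the thermodynamic kind), a minimal sketch:

```python
import math
from collections import Counter

# Shannon entropy of a message in bits per symbol - the
# "information entropy" sense, not the thermodynamic one.
def shannon_entropy(message):
    total = len(message)
    return sum(-(c / total) * math.log2(c / total)
               for c in Counter(message).values())

print(shannon_entropy("aaaa"))  # -> 0.0 (fully predictable)
print(shannon_entropy("abab"))  # -> 1.0 (one bit per symbol)
```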

    But your instinct is correct: one cannot reduce entropy without spending energy. And the kind of computation computers do today does change entropy, so some energy must be dissipated as heat to do computation. This is Landauer’s principle: https://en.wikipedia.org/wiki/Landauer’s_principle

    Also, you need to remember that energy cannot be destroyed or created. Expending energy to do something doesn’t mean there will be less energy in the end.

    and that’s a no-no.

    Not sure why you think that? If information holds no energy, then creating it without expending energy isn’t against any physical laws.

  • Sims@lemmy.ml
    6 days ago

    Hehe, I can see each comment has its own definition of ‘information’. Not sure mine is better…

    There seem to me to be two types of ‘information’, depending on perspective: ‘energy entropy’ and ‘information entropy’. Information entropy is what is ‘interesting’ to normal people, observers; after all, we evolved to observe our environment. A small black dot on a white wall makes almost no difference in overall energy entropy, but is high information entropy to us, the observers (especially one that says “Don’t Touch!”). I think that dichotomy causes some confusion?

    We can view ‘information’ as the ‘configuration of X energy units’. Configurations can be complex, multilayered and interdependent, with emergent configurations popping in and out. Each configuration is a system that is more or less stable, so information can come and go while the energy units remain - somewhere else. An ‘oscillator’ is one of the most stable and simple 1D (a->middle->b) information systems: it just changes configuration states at regular intervals in a regular order.

    However, we can’t have or detect information without energy in some configuration. That strongly indicates that there is a point where energy and information converge, and at that point, information is energy. At that convergence point, 1 energy unit = 1 bit.

    So all information is energy, but the black dot on the wall says “no! there’s a difference”, so we are back to ‘interestingness’.

    I think the problem is that life evolved as observers. Observers need something like the ‘free energy’ principle and react to surprise. We notice sharp gradients of input. The black dot on the wall has potential entropy, but only combined with an observer’s configuration. So the low-energy black-dot wall influenced an external high-energy system much more than its energy configuration indicated. Observers filter reality through ‘observational frames’, and information as energy entropy gets transformed into information entropy - “interestingness”.

    So, we have energy, complexity and an observer axis in this 3D information model. That is hard to imagine, so perhaps imagine two separate 2D planes instead: an ‘energy-entropy’ plane, with energy vs. configuration axes, and let that evolve enough complexity. Then at some point, somewhere, the first observer configuration evolves and instantly creates an orthogonal (right-angled) 2D ‘information-entropy’ plane, with energy vs. interestingness axes - a plane where energy gradients are the information. Our observer reality is both a subset of the whole and also creates novel information of both configuration types in the universe. It is a new branch of information configurations growing, actuated by energy entropy but steered by information entropy.

    Boolean logic is a natural outgrowth from the converging 1-bit configuration proto-state at 0.0, while waves and thermodynamics come from many of these emergent configurations influencing each other. That is all entropy and information, but observers add a ‘potential entropy’ dimension - interestingness - to raw ‘kinetic’ energy entropy. Observers notice deviation from the average as information. It is notable that both very high and very low energy entropy are very alluring to us, so such phenomena create the same high value in the information-entropy plane. Maybe it would be valuable to divide entropy into kinetic and potential types?

    A bit speculative/abstract of course, but interesting (pun intended)… Anyway, both information types/sets seem valid to me, and both evolve from tiny 1+ energy units interacting and configuring themselves fractally.

    • Sims@lemmy.ml
      6 days ago

      Will it also output exactly 80W of heat?

      Or is some energy transformed into “information”

      I completely forgot the question. If you reach the same state in the CPU as before, then you have just spent a lot of energy changing other states around it in a pattern. Each bit flip in the CPU influences everything else through waves or radiation, so a bit flip will be exposed by heat generation or other ‘noise’ from that source. All of those combined state changes - in the energy input, in the processing, and in the influences delivered to the environment as different forms of radiation - put new information entropy into the universe, projected from all the processing happening. It doesn’t matter if you keep a copy of the result or not; the universe KNOWS that you spent 80 Wh watching fap-movies and playing endless DK64 speedruns instead of donating CPU time to save the world! (just saying ;), but it IS a reason for concern in hardware security!)

      I would assume that if you leave a single bit on, then the 80 W transformed a local bit into a high nanowatt-entropy state, and maybe triggered you to wonder why only one bit survived all that work.

  • CombatWombat@feddit.online
    6 days ago

    Energy can neither be created nor destroyed, per the law of conservation of energy. But information certainly can.

  • Brkdncr@lemmy.world
    6 days ago

    You calculate server-room HVAC based on the electrical usage. There’s no difference between a server and a resistive heater: if they are consuming 1800 W, then it’s all turning into heat.
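    A rough sketch of that sizing rule (the 3.412 BTU/h-per-watt factor is the standard conversion; the wattage is just the example from above):

```python
# All electrical input ends up as heat, so HVAC sizing can start
# straight from the wattage. 1 W is approximately 3.412 BTU/h.
def watts_to_btu_per_hour(watts):
    return watts * 3.412

print(f"{watts_to_btu_per_hour(1800):.0f} BTU/h")  # -> 6142 BTU/h
```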