Let me clarify what I mean:
If you put 80W of electrical energy into an ideal electrical heater, it will put out 80W of heat energy.
But what if you put 80W of electrical energy into a CPU and let it calculate things, creating new information?
Will it also output exactly 80W of heat?

Or is some energy transformed into “information”, so the CPU will radiate less heat?

My instinct is that if information isn’t energy, then you could theoretically create it (thereby reducing entropy) without expending energy, and that’s a no-no.

But if it is energy, then a CPU running a random number generator (creating no information) at max load would get hotter than one doing actual calculations. Which also sounds wrong.

(I’m neither a physicist nor a computer scientist, in case that wasn’t obvious)

  • nialv7@lemmy.world · 13 days ago

    Information doesn’t hold energy itself, so yes, the CPU produces 80W of heat no matter what computation it does.

    (Also, it’s pretty hard to define what “information” is; people generally talk about information entropy, which is different from thermodynamic entropy. This can be confusing.)

    But your instinct is correct: one cannot reduce entropy without spending energy. And the kind of computation computers do today does change entropy, so some energy must be dissipated as heat to do computation — specifically, erasing a bit of information has a minimum heat cost. This is Landauer’s principle: https://en.wikipedia.org/wiki/Landauer’s_principle
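    To get a feel for the scale, here’s a quick back-of-envelope calculation of the Landauer limit (k·T·ln 2 per bit erased), assuming room temperature and a made-up erasure rate of 10^18 bits/s, which is far beyond any real chip:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K (assumption)

# Landauer limit: minimum heat dissipated per bit of information erased
e_bit = k_B * T * math.log(2)
print(f"{e_bit:.3e} J per bit erased")

# Hypothetical CPU erasing 1e18 bits/s: power spent on the Landauer minimum
landauer_power = 1e18 * e_bit
print(f"{landauer_power:.3e} W at the Landauer limit")
```

    Even at that absurd rate, the thermodynamic minimum is a few milliwatts — a negligible slice of the 80W, which is why essentially all of a real CPU’s power still comes out as heat.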

    Also, remember that energy cannot be created or destroyed. Expending energy to do something doesn’t mean there will be less energy in the end — it just ends up as heat.

    “and that’s a no-no.”

    Not sure why you think that? If information holds no energy, then creating it without expending energy isn’t against any physical laws — what costs energy is erasing it.