In the framework of information entropy, a common question is how to design a code that minimizes the expected cost of transmitting a message, given that each bit has a fixed unit price. If the distribution of words is known, a Huffman code provides that minimum among prefix codes, and the expected price paid per word is then, to within one bit, the Shannon entropy of the source.
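For concreteness, here is a minimal sketch of what I mean (the function name and the word distribution are my own inventions): it builds a Huffman code via the usual two-smallest-merge and compares the expected code length against the Shannon entropy H, which satisfies H <= E[length] < H + 1.

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Return the code lengths of an optimal (Huffman) prefix code
    for the given probability distribution."""
    # Heap entries: (probability, unique tie-breaker, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, t, s2 = heapq.heappop(heap)
        # Merging two subtrees adds one bit to every symbol inside them.
        for s in s1 + s2:
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

# A made-up word distribution; dyadic, so Huffman meets the entropy exactly.
probs = [0.5, 0.25, 0.125, 0.125]
lengths = huffman_lengths(probs)
entropy = -sum(p * log2(p) for p in probs)
avg_len = sum(p * l for p, l in zip(probs, lengths))
print(f"entropy H = {entropy:.3f} bits, expected Huffman length = {avg_len:.3f} bits")
```

For this distribution both numbers come out to 1.75 bits per word, which is the "minimum price" I have in mind.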
I'm interested in understanding how this framework can be applied in the context of thermodynamic entropy. Can entropy (or rather, the product TS) be seen as the minimum amount of energy required to measure, i.e. fully determine, the microstate of a system, given its ensemble? If so, why does measuring a high-temperature state cost more energy?
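To spell out the reasoning behind my guess (assuming, perhaps naively, a microcanonical ensemble with $\Omega$ equally likely microstates, and taking Landauer's bound of $k_B T \ln 2$ as the energy price of acquiring or erasing one bit): specifying the microstate takes $\log_2 \Omega$ bits, so the minimum energy would be

$$E_{\min} \;\ge\; k_B T \ln 2 \cdot \log_2 \Omega \;=\; k_B T \ln \Omega \;=\; TS,$$

which, if valid, would at least make the explicit factor of $T$ plausible. My question is whether this identification is actually correct.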