My book makes the following claim: suppose you drop a number of large and small ice cubes, all at 0 degrees Celsius, into a beaker of water and then start heating the beaker. The temperature of the water will only begin to rise after all the ice cubes have melted.
When something changes phase from solid to liquid, or from liquid to gas, it takes energy to break the intermolecular interactions. In ice, these interactions between the water molecules are what hold the solid together. As far as I know, the heat supplied to ice at 0 degrees Celsius (about 80 cal/g, the latent heat of fusion) is used up entirely in converting the ice in the beaker to water. Only after that process is complete does the temperature start rising again.
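To get a feel for the scale of the latent heat mentioned above, here is a back-of-the-envelope sketch. The 100 g mass is made up purely for illustration; the 80 cal/g latent heat and the 1 cal/(g·°C) specific heat of liquid water are the standard textbook figures:

```python
# Rough numbers: melting ice vs. warming the resulting water.
m = 100.0   # g of ice (hypothetical amount, for illustration only)
L_f = 80.0  # cal/g, latent heat of fusion of ice
c = 1.0     # cal/(g.degC), specific heat of liquid water

heat_to_melt = m * L_f    # energy absorbed at a constant 0 degC
heat_per_degree = m * c   # energy to warm the melted water by 1 degC

print(heat_to_melt)       # 8000.0 cal just to melt the ice
print(heat_per_degree)    # 100.0 cal per degree afterwards
```

So melting 100 g of ice soaks up as much heat as raising the same 100 g of water through 80 degrees, which is why the flat part of the heating curve lasts so long.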
Why is this true? Intuitively, some energy should go into breaking the bonds in the ice cubes, but why should all of it be used to break bonds? It is almost as if the energy “knows” it has to break the bonds first…
The key fact here is that melting is a bulk phenomenon: particles throughout the whole of the ice change into water, not just those at one spot. The energy doesn't "know" anything; the mechanism is thermal equilibrium. While ice and liquid water coexist in a well-mixed beaker, any parcel of water that is warmed above 0 degrees immediately transfers that excess heat to the neighbouring ice, melting a little of it and dropping back to 0 degrees. So the supplied heat keeps ending up breaking bonds rather than raising the temperature, and the mixture cannot climb above 0 degrees until all the ice is gone.
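The equilibrium argument above can be sketched as a toy simulation. It assumes a perfectly well-mixed beaker, so every calorie of heat first melts remaining ice at 0 degrees and only then warms the water; the masses and the heat added per step are made-up illustrative numbers, not from the question:

```python
# Toy model of heating a well-mixed ice/water beaker.
L_f = 80.0  # cal/g, latent heat of fusion
c = 1.0     # cal/(g.degC), specific heat of liquid water

def heat_step(ice_g, water_g, temp_c, q_cal):
    """Apply q_cal of heat to the mixture; return the new state."""
    if ice_g > 0:
        melted = min(ice_g, q_cal / L_f)  # heat goes into melting first
        q_cal -= melted * L_f
        ice_g -= melted
        water_g += melted
    if q_cal > 0:                         # leftover heat warms the water
        temp_c += q_cal / (water_g * c)
    return ice_g, water_g, temp_c

ice, water, T = 50.0, 200.0, 0.0          # hypothetical starting mixture
for step in range(10):
    ice, water, T = heat_step(ice, water, T, 1000.0)
    print(f"step {step + 1}: ice = {ice:5.1f} g, T = {T:5.2f} degC")
```

Running this, the temperature stays pinned at 0 degrees for the first four steps (while ice remains) and only then begins to climb, reproducing the flat-then-rising heating curve the book describes.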