My guess would be that light with a higher energy such as visible or UV would feel hotter, but this is not the case!
Is this something to do with human senses or is there a physics explanation?
The short answer is: of course we do.
The long answer has a few parts.
Absorption
Different wavelengths have different absorption ratios in the same materials. The typical example is a plastic bag, which is transparent to visible light, but opaque to infrared light. This means that it mostly lets visible light through (no absorption, no heating), while capturing infrared light (absorption, heating).
The human body is mostly transparent to both very high and very low frequency light. Radio passes straight through, and so do X-rays for the most part (don't try hiding from a nuclear blast behind another human - not a lot of protection). There could be kilowatts of radio waves passing right through your body without you noticing any heating, because your body absorbs only a tiny fraction at those frequencies. Infrared is very important because it's readily absorbed in water - and there's a lot of water in a human body. Still, visible light is readily absorbed in the human body as well - you do in fact feel the heat of visible light (if you've ever tried focusing sunlight with a lens onto a piece of paper, you're doing it mainly with visible light; infrared focuses at a slightly different point). However, under normal conditions, this tends to be dwarfed by infrared light, because...
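To make the absorption point concrete, here is a minimal sketch of the Beer-Lambert law with illustrative, order-of-magnitude absorption coefficients for liquid water; the exact coefficients and the 1 mm depth are assumptions for the sake of the example, not measurements:

```python
import math

def transmitted_fraction(alpha_per_cm, depth_cm):
    """Beer-Lambert law: fraction of light remaining after depth_cm of material."""
    return math.exp(-alpha_per_cm * depth_cm)

# Illustrative, order-of-magnitude absorption coefficients of liquid water (1/cm).
# Real values vary strongly with wavelength; these only show the trend.
absorption = {
    "green (~530 nm)":   5e-4,   # water is nearly transparent to visible light
    "near-IR (~1.5 um)": 10.0,   # strongly absorbed within millimetres
    "mid-IR (~3 um)":    1e4,    # absorbed within a micrometre-scale layer
}

depth_cm = 0.1  # ~1 mm of water
for band, alpha in absorption.items():
    f = transmitted_fraction(alpha, depth_cm)
    print(f"{band}: {100 * f:.3g}% transmitted through 1 mm")
```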
Emission
... most of the light sources around us are pretty close to black body emitters. You may be familiar with the rather distinctive curve derived from Planck's Law for the photon emission of a black body. Now compare the area under the curve in the IR region with the one in the visible or UV region - for low-temperature sources (simple incandescent lightbulbs) IR utterly dominates, and even for sunlight, before accounting for all the trickiness of the atmosphere etc., we get a whole lot more IR light than visible light. While the per-photon energy of UV light is much higher than for visible light, the total amount of energy carried by all the photons is much smaller - and most of the UV light is absorbed in the atmosphere anyway.
In fact, even modern high-efficiency light bulbs still tend to produce more IR light than visible light; light sources with efficiency higher than 50% are quite rare. A decent LED light bulb might have an efficiency around 20%, which means that for each watt of light, it emits four watts of heat (either as direct IR radiation or as heat passed on to its surroundings).
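As a rough numerical check of the argument above, here is a small Python sketch that integrates Planck's law over the visible band and over the infrared for two black-body temperatures; the band edges and temperatures are assumed round values, and the LED arithmetic from the previous paragraph is tacked on at the end:

```python
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23  # Planck constant, speed of light, Boltzmann constant (SI)

def planck(lam, T):
    """Black-body spectral radiance at wavelength lam (m) and temperature T (K)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

def band_power(T, lo, hi, n=200_000):
    """Crude numerical integral of the Planck spectrum between wavelengths lo and hi (m)."""
    lam = np.linspace(lo, hi, n)
    return np.sum(planck(lam, T)) * (lam[1] - lam[0])

for name, T in [("incandescent filament (~2700 K)", 2700), ("sunlight (~5800 K)", 5800)]:
    total = band_power(T, 100e-9, 100e-6)       # essentially all of the emission
    visible = band_power(T, 380e-9, 750e-9)
    infrared = band_power(T, 750e-9, 100e-6)
    print(f"{name}: visible ~{100 * visible / total:.0f}%, IR ~{100 * infrared / total:.0f}%")

# LED arithmetic from the previous paragraph: at ~20% efficiency,
# every watt of visible light comes with about four watts of heat.
efficiency = 0.20
print(f"heat per watt of light: {(1 - efficiency) / efficiency:.1f} W")
```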
IR is everywhere
The feeling of heat on your skin is a relatively simple matter of comparing two temperatures - the temperature of the upper layer of skin with the temperature of the lower layers. If the upper layer is warmer, we feel warm; if it's colder, we feel cold.
All objects emit IR light. All of them - and in proportion to their temperature. That's why IR is commonly associated with heat - the room around you is hot with IR radiation, the computer under your desk is hot with IR radiation, you are hot with IR radiation. That's what makes passive thermal vision work - different objects have different temperatures and different emissivity, which makes them stand out against each other on an IR sensor.
How much heat are we talking about? Let's compare to the Sun, just for fun. Sunlight delivers about 1100 W per square meter at ground level (there are plenty of different averages - this is roughly the value at noon on the equator with average cloud cover). Out of this, about 55% is infrared light and about 42% is visible (see? Even after so much IR is absorbed in the atmosphere, it still dominates at ground level :)). So let's say you get about 500 W of IR heat per square meter at ground level. Not something to sneeze at, certainly. Let's put it in human terms, though.
Take a naked human and angle him to the Sun. The human surface area on average is about two square meters, and one half of that is facing away from the Sun, so on a great day, you might absorb as much as that 500 W of IR light. Close enough for our purposes :) But you have to consider something else - the human body is also an IR emitter, and quite a good one at that. How much energy does a typical human emit when idle? About 1000 W. Yes - almost the entire incoming sunlight in the most sunlit place on Earth at noon. So why do we feel warm anyway?
Because sunlight is not the only source of radiation on Earth. Humans radiate a huge amount of energy, true - but so do our surroundings. If you close yourself in a dark room at room temperature, you'll get about 900 W back. So your net radiative loss is only 100 W, rather than 1000 W. And it so happens that the average idle heat loss of the human body is around 100 W, which is why a 25° C room with no direct sunlight feels comfortable - it's more or less a perfect balance between the inefficiencies of human metabolism and the difference of temperature between the human body and the room. Of course, this changes a lot depending on clothing and other factors. Add a 100 W light bulb, and you're outright warm :)
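As a back-of-the-envelope check of these numbers, here is a short Python sketch using the Stefan-Boltzmann law; the skin area, temperatures and emissivity are assumed round values, so the results are only meant to reproduce the rough 500 W / 1000 W / 900 W / 100 W figures above:

```python
SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W / m^2 / K^4
area = 2.0           # total skin area of a typical adult, m^2 (assumed)
emissivity = 0.98    # human skin is nearly a perfect IR emitter (assumed)
T_skin = 306.0       # ~33 degC skin temperature, K (assumed)
T_room = 298.0       # ~25 degC room, K

# Sunlight: ~55% of ~1100 W/m^2 is IR, over the ~1 m^2 of body facing the Sun
# (the text rounds this down to ~500 W).
solar_ir = 0.55 * 1100 * 1.0
emitted = emissivity * SIGMA * area * T_skin**4    # what the body radiates
received = emissivity * SIGMA * area * T_room**4   # what the room radiates back
print(f"IR from direct sunlight: ~{solar_ir:.0f} W")
print(f"body emits:   ~{emitted:.0f} W")
print(f"room returns: ~{received:.0f} W")
print(f"net radiative loss: ~{emitted - received:.0f} W")  # roughly the resting ~100 W
```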
IR photons have a very low energy
Now, this might sound counter-intuitive, but that's a quirk of human intuition rather than of the physics. For completeness: infra-red light has negligible effects beyond heating. It's not energetic enough to excite electrons or break chemical bonds. The only thing it maps well to is the random motion of atoms and molecules - which adds up to what we call heat.
On the other hand, if you take something like visible light, beyond the vibrations you also get chemical changes - electrons being bumped into excited states, (relatively weak) chemical bonds changing; in fact, that's why we see visible light in particular - it's more or less in the sweet spot of "strong enough to excite electrons, but weak enough not to destroy the photoreceptors and their proteins" (the animals that are sensitive to IR use a different mechanism than electro-chemistry). UV light can be easily absorbed, but it's strong enough to break even rather strong chemical bonds, which leads to substantial damage - that's how UV light destroys the DNA in your cells, for example (though again, there are animals that have UV senses - many insects do).
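A quick numerical illustration of that point - photon energies computed from $E = hc/\lambda$ for a few representative wavelengths (the picks are assumptions, not from the text), to compare with typical chemical bond energies of roughly 1.5-5 eV:

```python
# Photon energy E = h*c/lambda, expressed in electron-volts.
h = 6.626e-34    # Planck constant, J*s
c = 2.998e8      # speed of light, m/s
eV = 1.602e-19   # joules per electron-volt

# Representative wavelengths (assumed picks); typical chemical bonds are ~1.5-5 eV.
for name, lam_nm in [("UV-B", 300), ("blue", 450), ("red", 650),
                     ("near-IR", 1500), ("thermal IR", 10000)]:
    energy_ev = h * c / (lam_nm * 1e-9) / eV
    print(f"{name:10s} {lam_nm:>6d} nm: {energy_ev:.2f} eV per photon")
```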
So there's this weird bias in the human mind - you see all those different kinds of light, and they all have interesting properties... except for IR. It just heats stuff, and not much more. Go to even longer wavelengths (microwaves or radio waves), and you get other interesting behaviours - and a lot less direct heating, since they are less easily absorbed.
Conclusion
We mostly care about infra-red radiation in terms of heat, simply because there's so much of it everywhere, and most of the sources of visible light also involve a higher amount of infrared light. However, take a pure visible light source of enough wattage (say, a cold, high power LED bulb) and point it at yourself, and you'll feel the heat. We use a lot of high-powered visible light lasers, and they're quite obviously pretty good at heating things.
A typical photo-voltaic solar panel captures most of its power production from visible light, as do photosynthetic plants (while some plants also need UV light, that's really more of a catalyst than the primary source of energy; consider how well your house plant is doing despite getting no UV light at all). You need an energy gradient to do useful work, and that makes visible light a lot more interesting than IR for most plants - take a look at an IR photograph of trees or plants; there's quite a decent chance their leaves are actually reflecting incident IR light rather than absorbing it, simply because it's basically waste heat they do not want. That said, there are photosynthetic organisms that are different - working with IR, red or blue light, depending on their niche.
This is probably due to Planck's radiation law and Wien's displacement law for the wavelength of maximum energy emission, which show that for the temperatures of typical very hot bodies, on the order of a few thousand kelvin, the radiation energy emitted in the infrared/visible region ($\lambda > 380\,\mathrm{nm}$) is much larger than in the ultraviolet region ($\lambda < 380\,\mathrm{nm}$).
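For a rough sense of scale, Wien's displacement law gives $\lambda_{\max} = b/T$ with $b \approx 2.9\times 10^{-3}\,\mathrm{m\,K}$; assuming round temperatures, a $1000\,\mathrm{K}$ body peaks near $\lambda_{\max} \approx 2.9\,\mathrm{\mu m}$ (mid-infrared), and even a $3000\,\mathrm{K}$ filament peaks near $970\,\mathrm{nm}$, still well above the $380\,\mathrm{nm}$ boundary.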
We do. Here are two ways to demonstrate it.
First the recommended way: get a really bright white LED (e.g. a 1200 lumen bike light) and look at the spectrum, either on a data sheet or with a spectrometer. If you don't trust that, put some IR-blocking glass in front (e.g. KG1). Put your hand in the beam. You'll feel some warmth, especially outside on a cold night. A variation is to get an extremely bright single-colour visible LED. These days (2020 addition) high-power LEDs can give several watts over a very small visible wavelength range, so you can easily feel the heat from (e.g.) blue light.
Now the not-recommended way: put your hand in the beam of a visible laser of at least 50 mW (more if the beam is wide). 120 mW of 532 nm (green) into a sub-millimetre spot on the back of your hand gives quite a sting. With this sort of power you should be wearing goggles, but then you can't see the beam and can get your hand in it accidentally when aligning. But don't try this at home.
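For a sense of why such a modest power stings, here is a small sketch of the irradiance involved; the ~1 mm spot diameter is an assumption standing in for "sub-millimetre":

```python
import math

power_w = 0.120          # 120 mW green laser, as in the example above
spot_radius_m = 0.5e-3   # assume a ~1 mm diameter spot ("sub-millimetre")
irradiance = power_w / (math.pi * spot_radius_m**2)   # W/m^2

full_sunlight = 1000.0   # typical full sunlight, W/m^2
print(f"laser spot irradiance: ~{irradiance:,.0f} W/m^2")
print(f"that's roughly {irradiance / full_sunlight:.0f}x full sunlight, on a tiny patch of skin")
```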
I post this in response to the answer posted by Quantumwhisp, which explains the stronger body heating by infrared (IR) light compared to visible/ultraviolet (UV) light by an increase of the light absorption coefficient of water with increasing wavelength $\lambda$ from UV to visible and IR light (see graph in the cited answer). In my opinion, this is not correct (see my comment to the cited answer). To compare the heating effect of incident light in different wavelength regions, you have to know how much energy of the incident light is absorbed in the human skin. I found a scientific article reporting such measurements of the relative energy absorption of light in the skin of different males and females in the wavelength range from UV ($\lambda = 200\,\mathrm{nm}$) to near IR ($\lambda = 1000\,\mathrm{nm}$) (Penjweini et al. 2013), which shows a decrease of light absorption with wavelength in this region. An example of this decrease of relative light absorption with $\lambda$ (skin of a male) is seen in the graph. This shows that the absorption of incident light energy in the skin decreases with wavelength from UV to visible and near IR light, and does not increase as suggested by the absorption coefficient of water. This supports my earlier explanation that the stronger heating sensed from IR radiation is probably due to the energy emission spectrum of hot bodies, which is similar to Planck's blackbody radiation spectrum.
If I understand your question correctly, you might be confusing two different but related concepts, namely temperature and heat. When I say I feel "heat", I am mixing together the notions of temperature and heat. There is molecular motion in a substance; the more molecular motion, the higher the temperature. But this is not the same thing as heat (in the physics sense). Heat is the spontaneous transfer of energy from one system to another that can't be attributed to work done on or by the system. $\textit{The way}$ in which this transfer is made is, in our common experience, often through radiation. So in practice we could have a really hot material (high kinetic energy molecules), and some of this energy can then be released in the optical or infrared spectrum.
The Sun releases some ultraviolet rays, and while they can be dangerous to the skin, a red-hot iron is the preferred method of a sadistic torturer. The pain I feel from the red-hot iron may be foreshadowed by its red hue, but the pain itself results from the molecular motion of molecules and atoms in the iron transferring their mechanical energy to the atoms in my skin.
Most of the heat from the Sun you feel when you get outside on a sunny day IS actually the visible light. Most of the heat from the Sun arrives as short-wave radiation (mainly visible, but also UV and some short-wave IR). This is in contrast with the Earth's surface and most objects around us (including our bodies), which radiate their energy mostly as long-wave (mostly IR) radiation.
There is very little overlap between the spectra radiated by the Sun and by objects around us. This is because of the laws of black-body radiation mentioned in other answers (Wien's law and Planck's spectrum) and because of the very different temperatures.
You can compare the two spectra at https://en.wikipedia.org/wiki/Outgoing_longwave_radiation but don't forget that atmospheric gases cause absorption in various bands and modify the shape to some extent.
(Figure: Sun and Earth emission spectra on a logarithmic scale; image by Rhwentworth, https://commons.wikimedia.org/wiki/File:Sun-Earth_Logarithmic_Spectrums_with_Accurate_Scaling.svg, CC-BY-SA-4.0)
For objects with temperatures below, say, 1000 K, most of the radiation is infra-red, which is why we have come to associate thermal radiation with the IR band - but the visible radiation from the Sun is no different, and everyone can feel how hot it can make you.
N.B. the very small overlap between the two spectra is useful when determining the temperatures of objects from their radiation, because you can use a thermal camera even in the daytime and simply ignore the shortwave part of the spectrum. Similarly, IR imaging of the Earth and of clouds by satellites is useful even in the daytime.
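To put a rough number on "very little overlap", here is a small Python sketch integrating the Planck spectrum on either side of a ~4 µm cutoff for the Sun and for the Earth's surface; the temperatures and the cutoff wavelength are assumed round values:

```python
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23  # SI constants

def planck(lam, T):
    """Black-body spectral radiance at wavelength lam (m), temperature T (K)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

def fraction_below(T, cutoff, lo=100e-9, hi=1e-3, n=500_000):
    """Fraction of total emitted power at wavelengths shorter than cutoff (m)."""
    lam = np.linspace(lo, hi, n)
    spec = planck(lam, T)
    return spec[lam <= cutoff].sum() / spec.sum()

cutoff = 4e-6  # ~4 um, a common dividing line between "shortwave" and "longwave"
for name, T in [("Sun (~5800 K)", 5800), ("Earth's surface (~288 K)", 288)]:
    f = fraction_below(T, cutoff)
    print(f"{name}: {100 * f:.2f}% of its power below 4 um")
```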