Visible light is a type of electromagnetic wave with wavelengths between roughly 400 and 700 nm. Microwaves are less energetic but seem to be more dangerous than visible light. Is visible light dangerous at all, and if not, why not?
-
Related: Why do we use microwaves in microwave ovens? – StephenG - Help Ukraine Jan 07 '19 at 05:17
-
Comments are not for extended discussion; this conversation has been moved to chat. – David Z Jan 09 '19 at 09:12
13 Answers
Your question contains a premise that is false: Microwaves do not have less energy than visible light per se. They only have less energy per photon, as per the Planck–Einstein relation, $E = hf$. In other words, you can raise the power of electromagnetic radiation to a dangerous level at any wavelength, if only you generate enough photons – as your microwave oven does.
That very much includes visible light. You can easily verify this by waiting for a sunny day, getting out your magnifying glass, and using it to focus sunlight on a piece of paper. Watch it char and maybe even burn. (Make sure there's nothing around that piece of paper that can burn.) In conclusion, then, sunlight is dangerous!
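To put rough numbers on the photon picture, here is a minimal Python sketch. The 12.2 cm wavelength corresponds to a standard 2.45 GHz oven magnetron; the 1 kW power level is just a representative assumption:

```python
# Photon energy E = h*f = h*c/wavelength, and photon count N = P/E for power P.
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
power = 1000.0  # a representative 1 kW source (assumption)

for name, wavelength in [("microwave, 12.2 cm", 0.122),
                         ("green light, 550 nm", 550e-9)]:
    E = h * c / wavelength   # energy per photon, J
    N = power / E            # photons emitted per second at 1 kW
    print(f"{name}: {E:.3g} J/photon, {N:.3g} photons/s")

# The microwave photon carries ~200,000x less energy, so a 1 kW microwave
# beam simply delivers ~200,000x more photons per second than a 1 kW beam
# of green light. Total power, not energy per photon, sets the heating.
```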

-
When a physicist (or anybody who works closely with physicists) talks about the energy of any kind of radiation, they are virtually always talking about the energy per quantum. So, according to that definition of "energy", a microwave source does have substantially less energy than a visible light source, regardless of how much power either source puts out. – Solomon Slow Jan 07 '19 at 18:49
-
Comments are not for extended discussion; this conversation has been moved to chat. – David Z Jan 14 '19 at 08:05
If you stare at the Sun you’ll go blind. And if you spend a lot of time in the sun, you’re likely to get skin cancers. So visible light seems plenty dangerous to me.
Some of the damage may actually be from infrared and ultraviolet light, but these are close in frequency to visible light and very far from microwaves.
By the way, the intensity also matters, not just the frequency. In terms of photons, it matters not only how energetic each photon is, but also how many photons are arriving per second.

-
This answer could be improved by addressing the OP's comparison with microwaves and discussing ways in which microwaves are dangerous (heating) and are not (cancer). – Jan 07 '19 at 08:10
-
@BenCrowell The thing is, there's nothing really dangerous about microwaves - the only damage they do is through heating, which doesn't depend on the wavelength as long as the material absorbs most of the energy anyway. The same amount of energy from visible light does far more damage. But adding it to the answer is probably a good idea, since it seems suse doesn't understand that part. – Luaan Jan 07 '19 at 09:21
-
"And if you spend a lot of time in the sun, you're likely to get skin cancers. So visible light seems plenty dangerous to me." - Skin cancer is caused by the UV radiation the sun emits, not visible light. – marcelm Jan 07 '19 at 15:17
-
@Luaan And here I thought microwave effects do depend on wavelength. Namely, that the wavelength used in microwave ovens is specifically tuned to resonate with H2O – Hagen von Eitzen Jan 07 '19 at 22:05
-
@HagenvonEitzen Wavelength determines the type of damage. Intensity determines whether damage happens or the radiation is harmless. Remember, Wifi is microwave. – slebetman Jan 08 '19 at 01:50
-
@HagenvonEitzen The amount of energy absorbed by a given thickness of a given material depends on the wavelength. Water doesn't resonate with microwaves in the microwave oven; rather, microwave ovens exploit the fact that polar molecules will align themselves with the electric field. Keep switching the polarity, and you absorb a lot of the incident energy as kinetic energy (to be dissipated as heat). But that's not really where most of the heat comes from - it really is as simple as "the photons are absorbed". Microwaves still heat from the outside in, and still have rather small penetration. – Luaan Jan 08 '19 at 11:08
-
@HagenvonEitzen Regardless of the mechanism that transfers the energy from the magnetron to the food item, the only damage done is still through heating. This is not the case with something like UV light or even visible light - visible light can easily affect molecular bonds, and UV light is energetic enough to directly affect even double and triple bonds. What's the difference between thermal damage and this? Thermal damage doesn't care about the source of the heat, only the amount. But no matter how much red light you shine on your DNA, you're not going to do the damage UV light does. – Luaan Jan 08 '19 at 11:12
-
@ half the people responding: Any radiation strong enough to damage your skin can give you skin cancer. – lilHar Jan 08 '19 at 22:15
-
@liljoshu Can you elaborate or provide some source for that? As far as I'm aware, all visible light really would do is heat up your skin. I don't think something like a blister is going to be a cause of cancer. You would need stronger radiation; such as the UV from the sun that harms our skin. – JMac Jan 09 '19 at 12:14
-
@Hagen von Eitzen You are correct. As has been mentioned in other posts, but not these comments, UV is ionizing; visible light is not. Ionizing radiation has enough energy to force electrons to leave atoms. Visible light will never give you cancer via DNA damage as UV light can, at any intensity. It can absolutely set you on fire at high intensities (magnifying glass, lasers, etc.), but it won't damage your DNA as some UV does. – Necoras Jan 09 '19 at 20:54
-
@slebetman That's an incomplete statement. Low intensity x-rays (i.e., in a dentist's office) are carcinogenic despite their low intensity. That's why you wear a lead vest. Non-ionizing radiation (radio waves, infrared, microwaves, visible light, and some UV) is not dangerous at low intensities, but ionizing radiation is dangerous at any intensity. – Necoras Jan 09 '19 at 20:56
-
@JMac Anything that causes harm to your body increases cancer risk, period. A blister will increase cancer risk. A cut that scars increases cancer risk. Being stressed, which fatigues your cells and causes some to fail, increases cancer risk. The reason is that cancer isn't a disease, but the result of when your body fails to repair perfectly, which in turn damages the DNA, and your cells accidentally revert to a more primal state. Anytime DNA is damaged, it may become cancer; this is why it sometimes seems that "everything causes cancer" - anything that harms you increases its risk. – lilHar Jan 09 '19 at 21:53
-
@liljoshu I think that's a bit of a stretch to actually try to equate the two. What if we also account for the fact that the body requires light, and therefore without it the body would be stressed, and this could cause cancer? To try to compare something so low risk for cancer, like a blister, to something with obvious correlation, such as ionizing radiation, seems intellectually dishonest. Also, I'd point out that, for example, the National Cancer Institute goes as far as saying that visible light has not been known to cause cancer, and they only consider chronic inflammation as a risk. – JMac Jan 09 '19 at 22:06
-
@JMac You are correct that also not getting enough light causes cancer. Also, the National Cancer Institute, in their reports for public notices, dumbs it down: for most practical applications, visible light isn't intense enough to cause damage. – lilHar Jan 09 '19 at 23:02
-
@Necoras But it's the same with x-rays. Just as we today bathe ourselves in microwave radiation via Wifi, my parents bathed me in low intensity x-rays when I was growing up... via TV. Before the advent of flat screen displays, modern humans voluntarily bathed themselves daily in low intensity x-rays generated by CRT displays. I'm still around typing this message and I don't have cancer. – slebetman Jan 10 '19 at 08:27
The dose (or, in this case, the intensity) makes the poison. You're constantly exposed to microwaves since that's what 99% of wireless communication devices use, and you're also constantly exposed to visible light unless you sleep in an isolation tank. Both can be dangerous if you increase the intensity sufficiently.

-
For that matter, you are constantly exposed to microwaves from the sun and, for that matter, the CMB. Just at very low levels. – WhatRoughBeast Jan 07 '19 at 15:56
-
Let me put some numbers behind that. A standard microwave oven puts 650W of electromagnetic energy into whatever you put inside it (many ovens actually do more these days). A wifi router is limited to 4 watts EIRP of RF radiation, which falls off with distance on the familiar inverse-square law. A modern bicycle headlamp, operating from a 3-watt dynamo, uses LEDs with a thermal efficiency of about 50%, so emits about 1.5 watts of visible light energy - and that's pretty bright. Now imagine looking at 650W of visible light… – Chromatix Jan 08 '19 at 13:35
-
For that matter, constantly sleeping in an isolation tank and never seeing any light whatsoever is also dangerous, but for entirely different reasons. – gerrit Jan 08 '19 at 15:39
-
@Chromatix: It appears you're mixing energy, power and irradiance (power/area). You might have a point but you're making it with wrong units. – Eric Duminil Jan 09 '19 at 10:45
-
@EricDuminil I don't see where I'm using energy in my comment, rather than power. Irradiance probably is the more relevant measure for this question, but is harder to describe in layman's terms. Using raw power as a proxy gets the point across to within an order of magnitude. – Chromatix Jan 09 '19 at 11:29
-
@EricDuminil I'm clearly describing a continuous process of energy emission and using units of power. – Chromatix Jan 09 '19 at 12:14
-
@Chromatix: It's still clearly the wrong description for the given unit. ¯\(ツ)/¯ – Eric Duminil Jan 09 '19 at 12:19
-
@Chromatix You absolutely need to put in irradiance. "Imagine looking at 650 W of visible light" - that's actually close to the visible light irradiance of the sun, ~650 W/m^2. Clearly I can make water boil in the microwave, but water doesn't boil merely by putting it outside. – user71659 Jan 11 '19 at 00:55
Visible light is dangerous if you have the same power output as a microwave in a confined space
There are several factors to consider here.
One is that, per photon, visible light has more energy than microwave radiation. But this is misleading: a microwave oven puts about 1 kW of power into a confined space. That's a lot of power. You don't often see that much power from visible or near-visible light in a small volume: if you did, you would be just as worried about cooking yourself. In other words, if you put a 1 kW source of visible light in a small box, it would cook the stuff in the box.
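To put a rough number on that thought experiment, here is a minimal Python sketch, assuming (purely for illustration) half a litre of water in the box and perfect absorption of the kilowatt:

```python
# Time for 1 kW (fully absorbed, at any wavelength) to boil water in the box.
mass = 0.5        # kg of water standing in for "the stuff in the box" (assumption)
c_water = 4186    # specific heat of water, J/(kg*K)
delta_T = 80      # heating from 20 C to 100 C
power = 1000      # W absorbed (assumption)

energy_needed = mass * c_water * delta_T   # Q = m*c*dT
print(f"{energy_needed/1e3:.0f} kJ needed -> {energy_needed/power/60:.1f} minutes at 1 kW")
# ~167 kJ -> about 2.8 minutes, whether the kilowatt arrives as
# microwaves or as visible light.
```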
Microwaves may also be absorbed in different ways than visible light. Microwaves penetrate flesh far more deeply than light and can therefore have a more immediate effect on temperatures throughout the flesh. On the other hand, the same intensity of visible light will ultimately generate the same amount of heating (the same amount of power is being dissipated) but in a much thinner layer on the surface of the flesh. Is that less dangerous? Only if you prefer to be broiled rather than fried.
And visible light is often dangerous to various body parts when concentrated enough. Handheld laser pointers typically operate at <1 mW of power output but will leave holes in your retina if you stare at them.
Visible light is dangerous. But for a fair comparison to microwaves you need to look at the total amount of energy involved. Microwave ovens dump a lot of energy into their contents and there is little reason to think that doing the same with visible light would be much less harmful.

-
1kW source of visible light in a small box is a toaster, right? (well: mostly infra-red, but some visible light) – smci Jan 12 '19 at 22:39
-
@smci Yes, but bear in mind that traditional toasters are not closed boxes and much of the heat escapes from the box. – matt_black Jan 13 '19 at 10:44
-
@forest I'm happy to sponsor any experiments you would like to do on that, like volunteering to stare at one to prove it is harmless. – matt_black Jan 14 '19 at 14:57
-
@matt_black If you actually paid me, I'd absolutely look into a 1 mW red laser, assuming it was actually 1 mW (many lasers that claim to be <5 mW are actually quite a bit higher, creepily enough). – forest Jan 17 '19 at 06:35
-
@forest Fair point: my anecdotal stories may be based on the claimed <1mW lasers that forget to filter their IR component (many are driven by much more powerful IR lasers frequency doubled into the visible with non-linear optical crystals). – matt_black Jan 17 '19 at 21:48
Assuming the intensity is the same, microwaves are more dangerous than visible light because they penetrate the skin to a greater depth (1-2 cm; more info is in Wikipedia).
Humans have more adaptation to visible light than to microwave radiation, because they were exposed to light for millions of years. This is expressed in two ways:
- Proteins in the epidermis (outer skin layer) are more resistant to heat than those in the deeper skin layers
- There are more nerve endings in the outer layers of skin, so dangerous heating by visible light causes more pain, urging you to escape the dangerous situation
Oh, and the most obvious difference: visible light is visible. Dangerous levels of visible light (e.g. in a solar cooker), to our eyes, look blinding and obviously dangerous. Dangerous levels of microwave radiation are invisible.
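A small sketch of why the penetration depth matters, using exponential absorption, $I(z) = I_0 e^{-z/\delta}$. The 1/e depths below are illustrative round numbers, not measured values:

```python
import math

# Fraction of incident power absorbed in the first millimetre of tissue,
# from I(z) = I0 * exp(-z/delta); the 1/e depths are rough assumptions.
for name, delta_m in [("visible light", 0.0005),   # ~0.5 mm penetration (assumed)
                      ("microwaves", 0.015)]:      # ~1.5 cm penetration (assumed)
    absorbed = 1 - math.exp(-0.001 / delta_m)
    print(f"{name}: {absorbed:.0%} of the power deposited in the first 1 mm")

# Visible light dumps ~86% of its power into a thin surface layer, which
# you feel immediately; microwaves deposit only ~6% there, spreading the
# rest through centimetres of tissue before any warning sensation arrives.
```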

-
In general, anything that penetrates more is less dangerous; since it means it's less ionising. For example, gamma radiation; when it does ionise does a whole lot of damage... however it's such high energy, most of them are able to pass through the body without impacting it at all. While something like an alpha particle (sure, it's not EM, but concept applies) becomes VERY dangerous because it doesn't penetrate at all. Obviously, reflecting and passing through are the same in that the energy isn't absorbed by the organs; meaning no ionisation occurs. – UKMonkey Jan 07 '19 at 11:45
-
Wikipedia doesn't seem to support your claim. For instance, the microwave intensity found to cause cataracts in rabbits is 150 mW/cm² for 100 minutes. That is roughly equal to the intensity of natural sunlight. I believe staring at the Sun will make you blind much faster. – Dmitry Grigoryev Jan 07 '19 at 12:17
-
@UKMonkey Re "alpha is more dangerous than gamma because it penetrates less": Simply not true: "exposure to most alpha particles originating outside the body is not a serious hazard." Or maybe a Briton would trust the BBC. The same applies to light: The skin absorbs/reflects lots of it and protects underlying tissue; while microwaves can penetrate it and reach living cells. – Peter - Reinstate Monica Jan 07 '19 at 13:26
-
@PeterA.Schneider Sure - emphasis on outside the body. Inside the body is a very different story. That's because our skin has been very well adapted to protect us. The point is that you can't just say "it's more penetrating -> it's more dangerous". EM radiation isn't a gun - if it goes through you, it's harmless. Now there's a sweet spot where all the radiation is absorbed in the body, and it penetrates some distance - which is how cancer treatment works; however attempting to suggest that masters in physics is related to the media is insulting, and I'd be thankful if you could refrain. – UKMonkey Jan 07 '19 at 13:34
-
@UKMonkey I didn't mean to insult, and I didn't know you had a Master's degree in Physics.-- "Outside the body": Well, that's the setting of the question: Obviously the radiation (light or microwave) is meant to originate outside the body, and the main reason light does less harm is that it doesn't penetrate the skin much. – Peter - Reinstate Monica Jan 07 '19 at 13:45
-
@PeterA.Schneider no harm done :) and maybe my example was poor ... using alpha. – UKMonkey Jan 07 '19 at 13:47
-
I can vouch for the heating effect of visible light on skin. A 2kW theatrical followspot with an IR-reducing coating (so probably 100-200W in 15-20cm diameter) has a beam powerful enough that you wouldn't leave your hand in it for long or choose to put it back in. For pure visible light, even a couple of hundred mW of 532nm laser (parallel beam) certainly stings enough to deter longer-term testing – Chris H Jan 07 '19 at 16:25
-
@UKMonkey, If you google the phrase "percent depth dose curve," which is an important concept in radiation oncology, it may lead you to reconsider your claim that "most [gamma photons] are able to pass through the body without impacting it at all." – Solomon Slow Jan 07 '19 at 18:59
-
@UKMonkey: your reasoning is correct for scenarios in which ionization causes the damage you need worry about. If you stick your hand into intense microwaves, it is heat, not ionization which does the damage. – Menno Jan 07 '19 at 21:04
-
@DmitryGrigoryev That's not really the point. If you look at the Sun, your eye will adapt to the incident sunlight - the pupils contract, you start squinting etc. This significantly reduces the intensity of light that actually hits your retina (and lens, in the case of cataracts). Microwaves do not cause such a response, so they can do more damage than visible light to a healthy eye. This is also one reason why even a very weak laser still causes blinding - the light is focused in a small area, and there's no response from the eye to counter the intensity. – Luaan Jan 08 '19 at 11:18
-
@DmitryGrigoryev Incidentally, this is also the reason why you can usually look directly at the Sun with little danger to your eyesight (don't try this at home, though). The real danger comes when the sun is the only bright spot in your vision and you're focusing on it - such as when watching a solar eclipse without eye protection. – Luaan Jan 08 '19 at 11:21
-
The penetration depth only has an effect on what is damaged, not so much on the amount of damage. No, the main difference between visible light and microwaves (as in a microwave oven) is the power. 900 watts of visible light are probably just as dangerous. Of course, visible light will roast your skin before the heat penetrates deeper by conduction. But because its damage is confined to the skin, a much shorter exposure to 900 W of visible light will do lasting damage than to 900 W of microwaves. – cmaster - reinstate monica Jan 08 '19 at 22:19
-
@Luaan So the natural light is dangerous despite the adaptations we have to protect ourselves against it, which seems to only reinforce my point. And I'm not even talking about the eyes specifically, that's just one example. Bigger penetration depth means less intensity per unit volume, which generally means less damage, not more. – Dmitry Grigoryev Jan 09 '19 at 09:20
-
@DmitryGrigoryev No, natural light is only dangerous when there's no adaptation. There's no evolutionary pressure to adapt eyesight to eclipses (they're rare, and the damage is significant, but not really big enough to affect your reproduction or survival capability). It's exactly the same with microwaves. And I'm also saying that microwaves are generally less dangerous than visible light of the same power (assuming similar absorption). The only thing I've been explaining is one of the few cases where microwaves can do more damage than visible light. It's not surprising it's the eye. – Luaan Jan 09 '19 at 10:03
-
@Luaan Just to make myself clear, I'm arguing against the claim "microwaves are more dangerous than visible light because they penetrate the skin to a greater depth", supposedly supported by Wikipedia. And I'm not at all convinced that eye adaptations will allow rabbits to stare directly at the Sun for 100 minutes, so I'm not sure that the argument even applies to the eye. – Dmitry Grigoryev Jan 09 '19 at 10:15
-
@DmitryGrigoryev Well, one of the adaptations we have to natural light is that you close your eyes or turn them away from the Sun. If you forced rabbits to keep their eyes fully open watching straight into the Sun, that would probably cause quite a bit of damage over 100 minutes. So if you force the two to be the same, yes, visible light will blind you just as fast (and probably faster, unless absorption by the liquids in your eye is important - note how they talk about cataracts, not damage to the retina; microwaves will cause more cataracts, visible light more retina damage). – Luaan Jan 09 '19 at 10:23
The danger of electromagnetic waves is a function of the photon energy, the intensity of the source and your distance from it, and the qualitative nature of the interaction of a specific frequency with organic matter.
That latter bit is very complex. The visible spectrum, down to about infra-red, doesn't penetrate the top layer of skin, or most clothing, so for the most part its interaction is limited to heating. Strong infrared can certainly cause burns. Strong visible light can certainly cause eye damage. But very high-intensity sources of visible light are rare in daily life, and very notably, we can see them, and avoid them.
Further down in the energy spectrum (longer wavelength) you get the "millimeter waves" of airport scanners, which can penetrate clothing but not skin, and then you're into the microwaves, UHF and VHF radio waves, and then the radio waves called shortwaves (high frequency), medium waves and longwaves (low frequency). Microwaves can penetrate into flesh, and radio frequencies can entirely traverse a human body, and these can cause very, very severe deep-tissue burns. Certain frequencies can also interfere with cardiac rhythm, which can be as fatal as it sounds.
Your home wifi equipment produces microwaves on pretty much the same frequencies as your microwave oven, but at milliwatt power and dispersed in all directions. The oven dumps hundreds of watts into a small enclosed space. That's the difference.
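A rough Python sketch of the power densities involved, treating the router as an isotropic radiator; the 100 mW router power, 2 m distance, oven wattage, and food surface area are all typical assumptions, not specifications:

```python
import math

# Free-space power density S = P / (4*pi*r^2) for an isotropic source.
def power_density(power_w, distance_m):
    return power_w / (4 * math.pi * distance_m ** 2)

wifi = power_density(0.1, 2.0)   # ~100 mW router seen from 2 m (assumed figures)
print(f"wifi router at 2 m: {wifi * 1e6:.0f} microwatts per square metre")

# A ~700 W oven concentrates its output onto food with maybe 0.05 m^2
# of surface area (a crude assumption):
oven = 700 / 0.05
print(f"oven, order of magnitude: {oven:.0f} W per square metre")
# The oven's power density comes out several million times higher.
```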
And we live in a sea of radio waves from microwave (cell phone, wifi) through UHF and VHF (two-way radios, broadcast TV and FM) and lower (broadcast radio). The key is the power. If you grab the antenna of your uncle's 500-W ham radio when he keys the mic, or climb the tower of a multi-kilowatt TV station, you'll get hurt, maybe very badly. But going about your normal business, you're probably absorbing less than a milliwatt of radio energy. And the only effect is heating, so it's little different than being in a room that's very slightly warmer.
Now, moving upward in energy from the visible spectrum, you get ultraviolet, X-rays and then a vast spectrum of increasingly energetic gamma rays. Not only can they penetrate into flesh, they have a very specific dirty trick: they have enough photon energy to ionize molecules, and when that happens to our DNA and proteins, we start to have very bad days. This is a very specific capability that begins at a certain energy threshold.
Microwaves, as you've remarked, are in the opposite direction from UV, X-rays and gamma: lower photon energy, longer wavelength. They cannot duplicate the ionization danger of higher energies, no matter how intense their sources are.

-
"Certain frequencies can also interfere with cardiac rhythm" - can you elaborate on that? I'm aware of the case of pacemakers, but how would this work in the fully natural situation? If it's not resonance you're talking about, then what is it? – gerrit Jan 08 '19 at 15:44
-
It's not like if you grab a capable radio and walk amongst cardiac people, they will die off - or is it? – xray0 Jan 10 '19 at 10:37
Other answers already point out the matter of intensity. If you have a 1 kW microwave oven that cooks your chicken and so call microwaves dangerous, you can equally cook a chicken with a 1 kW visible light bulb. The difference is mostly how well and how deep the absorption goes, but the amount of energy is the same if the absorbed power is the same.
What makes the difference:
- Absorption coefficient; how WELL the light is absorbed. So, a black chicken will cook well in strong visible light, but a white chicken will require higher power, because it reflects more. Microwaves work well because they penetrate deeper before they get absorbed (due to longer wavelength), but also absorb WELL because the frequency is tuned near the resonance for water molecules.
- Resonances; if the wavelength matches exactly one of the transition frequencies for molecules/atoms in the target, most of the energy is selectively absorbed just by those molecules. So you could tune your light specifically to a transition that breaks some specific bonds, or heat just specific tissues. This can do more damage because it may change chemistry, but luckily, breaking bonds requires quite high frequencies - see the next case below. With MW and IR, you will still end up just heating the sample if you find a resonance (resonances in MW, IR and visible are mostly vibrational and rotational transitions, not bond changing, except for red-colored substances reacting to visible light, which you notice when red dyes bleach quickly in intense light).
- Ionization; if the energy of a SINGLE photon is enough to kick off an electron from a molecule/atom, then it's dangerous because it's actively affecting the chemistry (note that this gives a result similar to the resonant case above, but instead of having a precise frequency, it has way too much energy). This is what is called ionising radiation (gamma/X-rays, down to the UV range).
Note that resonance just means good absorption, nothing mystical. Water is mostly transparent to visible light because no significant vibrations of the water molecule fall in this range - most are in IR and microwave, and there is another absorption range in UV.
Rule of thumb: microwaves, IR and visible light just heat you up. If the heat is sufficient to raise the temperature into the danger zone, it's dangerous; otherwise it's harmless. Only intensity matters (watts per square meter), not the frequency. Ionizing radiation (UV/X-ray/gamma) is dangerous because of chemical damage even at low intensity.
Microwaves are NOT ionizing radiation, so wireless and mobile signals do absolutely nothing to you - and the power is way too low to heat you anyway, otherwise you'd need to charge your phone every 5 minutes.
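To see where the ionizing threshold sits, here's a short sketch converting photon energy to electron-volts across the bands. The wavelengths are representative examples, and the 3-5 eV bond-energy range is a typical textbook figure:

```python
h = 4.1357e-15   # Planck constant, eV*s
c = 2.998e8      # speed of light, m/s

bands = [("microwave, 12 cm", 0.12),
         ("infrared, 10 um", 10e-6),
         ("red light, 700 nm", 700e-9),
         ("violet light, 400 nm", 400e-9),
         ("UV-C, 250 nm", 250e-9),
         ("X-ray, 1 nm", 1e-9)]

for name, wavelength in bands:
    print(f"{name}: {h * c / wavelength:.2g} eV per photon")

# Typical chemical bonds take roughly 3-5 eV to break, so only from the
# UV upward can a single photon do chemistry; a microwave photon
# (~1e-5 eV) can contribute nothing but heat, no matter how many arrive.
```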

It's more accurate to say microwave ovens are dangerous. Then again, so is visible light.
It's a question not of photon energy, but total energy. A typical microwave oven outputs on the order of 1 kilowatt of electromagnetic radiation, which is almost entirely absorbed by the food within.
By comparison, the solar power at earth's surface, at maximum, is around 1 kilowatt per square meter. If it's cloudy, not at the equator, or not noon, it will be less. Most foods have a surface area of much less than a square meter, so the total electromagnetic radiation power received by something sitting in the sun is much less than a microwave oven.
For a fair comparison, what do you think would happen if a magnifying glass with an area of one square meter, on a very sunny day, were used to focus light on to something the size of what you'd put into a microwave oven?
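For scale, a rough Python calculation of that scenario, assuming 1 kW/m² of sunlight and a dinner-plate-sized target; both figures are just for illustration:

```python
# Concentrating one square metre of sunlight onto an oven-sized target.
solar_irradiance = 1000.0   # W/m^2, clear noon sun at the surface (assumption)
lens_area = 1.0             # m^2, the hypothetical giant magnifying glass
target_area = 0.05          # m^2, roughly a dinner plate (assumption)

collected = solar_irradiance * lens_area
print(f"{collected:.0f} W focused onto {target_area} m^2 "
      f"-> {collected / target_area:.0f} W/m^2")
# About 1 kW delivered to the food, at ~20x normal sunlight intensity:
# comparable to a microwave oven, and it will cook just as surely.
```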
There are a few more subtle differences. For example, dangerous powers of visible light are so bright you'll surely close your eyes. Furthermore, visible light penetrates less deeply, so you're likely to feel the heat and move away before it does any more than superficial damage to your skin, like a sunburn. On the other hand, microwave radiation is invisible and penetrates more deeply, so you may suffer irreversible injury before even noticing the hazard. The cornea is especially prone to microwave injury since there's no reflex to protect it, it has little thermal mass and thus heats quickly, and there's little blood flow to cool it.

-
Of course, solar ovens are a thing - and indeed, to get a decent oven, a square meter of mirrors concentrating the sunlight to the center (with the food item) is more than enough. – Luaan Jan 08 '19 at 11:27
There is a saying that "The dose makes the toxin."
Oxygen is the substance you most need a constant supply of. You will die after just a few minutes without oxygen.
But oxygen toxicity is real. Too much oxygen can harm or kill you. In fact, for billions of years all the organisms on Earth had no use for oxygen. When oxygen concentration increased in the atmosphere most life forms on Earth died off. Only the ones that adapted fast enough to tolerate and even to depend on oxygen could survive in the more oxygen rich atmosphere.
The same goes for every other necessary substance or environmental factor. And the same goes for every other dangerous substance or environmental factor. In high enough doses, even the most necessary things are deadly. In low enough doses, even the most deadly things can be harmless and maybe even useful.
Since visible light and all other frequencies of electromagnetic radiation are environmental factors, the preceding is also true for them. Too much of any frequency of electromagnetic radiation, even the most beneficial, can be harmful or deadly, and small enough exposure to even the deadliest frequencies of electromagnetic radiation, such as X-rays or gamma rays, can be harmless or even beneficial.
I remember a story by Arthur C. Clarke in which a character criticized the way that death rays in science fiction were visible to the human eye, saying that if visible light was deadly, humans couldn't live. But humans have evolved to survive the concentrations of visible light that are common on Earth. A human exposed to a concentration of visible light that was a thousand times, or a million times, or a billion times, stronger could be killed, cooked, or even instantly vaporized.
I also remember two other stories by Arthur C. Clarke, perhaps even in the same collection, where humans found plausible ways to create death rays out of visible light using the primitive technology of the 1950s and 1960s.
Some forms of radiation therapy for cancer involve using beams of X-Rays, Gamma rays, or charged particles to help kill cancer cells. So people undergoing radiation therapy are often benefited by being struck by death rays designed to kill living tissues, because the death rays are aimed at living tissues that would kill their host bodies eventually.
As we all know, antimatter is the most dangerous substance imagined by physicists. If a normal particle collides with its opposite antiparticle, both are annihilated and radiation is emitted.
You may have heard of people having PET scans for medical diagnosis. PET stands for Positron Emission Tomography. A positron is an anti-electron, and thus an antiparticle. So people who have had PET scans have survived, and perhaps benefited from, having minute amounts of antiparticles in their bodies.
So even with something as supremely deadly as antimatter, the dose makes the toxin.
A microwave oven does to food something very similar to what camp fires and stoves do to food, and requires about the same amount of energy per meal. The amount of energy received per second of being microwaved is many times the amount of energy per second in natural or artificial light used for illumination.
So being exposed to the same energy in visible light frequencies as an open fire or a stove imparts to a meal is not likely to be much better for someone than being microwaved in a microwave oven would be.

There is nothing inherently more or less dangerous about microwaves. Yes, the types of damage from microwaves vs. visible light vs. X-rays are different, but whether or not the light causes damage comes down to the same factor across the whole spectrum - intensity.
Remember, modern computer-using humans constantly volunteer to be bathed in microwaves. Wifi uses exactly the same frequency as microwave ovens. The difference between wifi and microwave ovens is the wattage - the amount of power used to generate the light - the intensity.
You can cook with visible light if you pump enough power into it - or somehow concentrate it. This is how sun ovens work and how you can burn paper with a magnifying glass. You can also cook with X-rays if they are intense enough.
Side note: most people don't realise this, but high intensity infrared light can blind you as surely as staring at the sun. Just because it is invisible to your eyes does not mean the photons do not hit your retina. High intensity infrared spotlights are sold as part of security systems for infrared-sensitive cameras (night vision).

-
"High intensity" is relative, of course; most spotlights used for "night vision cameras" are still just a few watts, which is rather small compared to sunlight. There is a danger mainly because you don't realize you're too close to the light (and there's no eye response to the increasing amount of light hitting your eyes), and if it's dark, your pupils are maximally dilated. You don't see microwaves, but above about 20 W, the heat is quite noticeable, and a 200 W light is unmistakably warm (those are usually used for heating, though, not IR camera illumination). – Luaan Jan 08 '19 at 11:33
Stand in front of a 2 kW spotlight, like those used in stage productions. You will start to feel it on your skin (or it will simply blind you). As explained elsewhere, microwaves go deeper, and a microwave oven is still delivering a kilowatt of power to your body.

-
This answer is nonscientific but very much on the mark :) Could be improved by saying "a light source with 750W optical output", which would indeed be equivalent to a home microwave oven.... – rackandboneman Jan 10 '19 at 01:54
-
you're right - I should have prefixed with "Unscientific (and maybe even flip)" – Reed Shilts Jan 14 '19 at 19:21
The trick with microwaves is that they use resonance frequencies of water. Some microwaves can be tuned to meat, vegetables or fish, since the resonance frequency can change slightly in composition. Visible light does not resonate with anything in our bodies.
Our generally photo-sensitive skin mostly reacts to UV ranges, even in the not-yet-ionizing part of the spectrum. (The wavelength also determines how deep the light can penetrate the skin!)
With enough energy, visible light could be dangerous, but the energy needed is far greater without any resonance effects (think of a child's swing).

-
Microwaves don't use resonance; it's a common misconception. See explanation e.g. here or anywhere on the Internet. – user27542 Jan 07 '19 at 10:45
-
In what way is fish water different from meat water or vegetable water? Not to mention that you can heat up butter just fine in a microwave. – Dmitry Grigoryev Jan 07 '19 at 12:04
-
@user27542 Interesting fact, didn't know. Why does visible light not do any dielectric heating? Maybe it's not resonance in the strict sense but you must at least be in the frequency ballpark? – Peter - Reinstate Monica Jan 07 '19 at 13:51
-
@DmitryGrigoryev -- butter has a lot of water in it. When you put butter in a hot pan, the bubbles are the water boiling off. – Pete Becker Jan 07 '19 at 14:09
-
@user27542 they don't use the resonant frequency of the food, but it is not true that they don't use resonance: "the oven chamber [..] is, in reality, a multimode resonant cavity where the energy is reflected from the walls to create standing wave patterns." https://ieeexplore.ieee.org/document/4181134 – Pete Kirkham Jan 07 '19 at 16:17
-
I think resonance is the wrong word here. I think the more correct phrasing is that most microwave ovens emit at a frequency that corresponds to a peak in the absorption spectrum of water. This answer is right in noting that absorption is another key factor in addition to the energy per photon and the intensity of light. – Cogitator Jan 07 '19 at 18:33
-
@Cogitator, I suppose you could call it a "peak", but it's a very, very, very broad peak, stretching from single-Hertz radio waves to the mid-infrared. – Mark Jan 07 '19 at 21:07
-
@DmitryGrigoryev Actually, heating butter is tricky in a microwave - the fats heat up from the microwaves much faster than the water (they absorb microwaves more readily than water, and have a much lower heat capacity), which can cause the butter to burn easily (chocolate is even worse). It will heat up easily, but it will also be burnt (unless you use low power, to give the heat time to dissipate to the surrounding water). It's similar to heating milk in a pot - if you do it slow enough, the water limits the temperature to safe levels; otherwise, the heat can burn the milk. – Luaan Jan 08 '19 at 11:37
-
@PeteKirkham The cavity is resonant. The point is that the photons reflected from the other side of the microwave oven will not be lost (they're in phase with the photons coming from the magnetron). This has nothing to do with water, just with using energy efficiently. Indeed, if you use two magnetrons in a microwave oven, perfectly out of phase, pretty much all the food heating effect is lost (you do get some, since the food item disrupts the "perfection" of the cavity, but it's only a fraction of the power draw of the magnetrons). You'll also overheat and destroy the magnetrons quickly:) – Luaan Jan 09 '19 at 09:55
When we say that microwaves are less energetic, we are talking about the energy in a single photon. The number of photons is also important.
A single microwave photon is utterly harmless. Its only effect is heat, and it takes a serious amount of heat to damage us. But enough heat, in any form, will kill.
Visible light has enough energy that single photons can cause chemical reactions, but only in sensitive compounds. That is what happens in our eyes. The chemicals in our eyes are carefully constructed to be extra sensitive to light and that is what makes light visible.
Ultraviolet light is worse. Here the photons carry enough energy to cause unwanted chemical reactions in most organic compounds. Sun burn and skin cancer results.
Gamma ray photons from radioactivity are even worse, but they are fortunately rare.
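A small sketch of this hierarchy, comparing each photon's energy to the thermal energy $k_BT$ at body temperature; the wavelengths are representative examples, not specific sources:

```python
# Compare photon energies with the thermal energy scale k_B*T of the body.
k_B = 8.617e-5    # Boltzmann constant, eV/K
hc = 1239.8       # h*c in eV*nm, a handy constant
kT = k_B * 310    # body temperature ~310 K -> ~0.027 eV

for name, wavelength_nm in [("microwave photon (12.2 cm)", 1.22e8),
                            ("visible photon (550 nm)", 550),
                            ("UV photon (280 nm)", 280)]:
    E = hc / wavelength_nm
    print(f"{name}: {E:.2g} eV ({E / kT:.2g} x kT)")

# A microwave photon carries far less than kT: it can only nudge molecules
# that thermal motion already jostles harder, i.e. it can only heat.
# Visible and UV photons carry ~80-170x kT, enough to flip a sensitive
# pigment molecule (vision) or break a bond (sunburn).
```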
