There is a thread over at Mark Sisson’s discussion board that got me thinking about the dangers of microwaving food. I do not know whether or not it is in fact unhealthy to cook in the microwave, but I can think of a number of pathways by which a microwave could potentially be dangerous, or at least less healthy than other means of cooking or heating up your food. To use a metaphor: cooking with a microwave is like putting millions of very thin needles into the food, where the tips of those needles repeatedly heat up to, say, a thousand degrees, but only for a split second. Intuitively it makes sense that this is much more aggressive than conventional cooking, and that it can lead to denaturation of food constituents. So I would not disregard the Heret & Blanc study (UPDATE: see this post also) out of hand – it is probably worth checking what they have done before jumping to conclusions.
Pathway 1 – Direct denaturation
What I call “direct denaturation” is the denaturation (ie the breaking down) of a molecule as a direct effect of the microwave radiation. In order to understand this we first need to understand how microwaves work – those who already do can skip the next heading.
The physics of microwaves
Microwaves heat up the food by emitting radiation (radio waves, really) at a single frequency. The frequency is chosen such that it is efficiently absorbed by water molecules, which heat up when absorbing the radiation energy. The process is very similar to – and profoundly different from – the process of heating up the roof of a car, or your skin, or anything else, in sunlight (or the process of grilling food in an infrared grill). Why is it the same? Because in all cases radiation is absorbed and transformed into heat. Why is it different? The radiation of sunlight (or an infrared grill) represents a whole spectrum of many different wavelengths. Moreover, it is in a frequency range where very few materials are transparent (or fully reflective), so most materials absorb at least some of the radiation and therefore heat up.
The microwaves used in an oven, on the other hand, are of one – and only one – frequency (or wavelength, which is equivalent). Moreover, most materials are actually “transparent” at this wavelength, ie the waves simply pass through without warming them up. What is funny, of course, is that water – which is transparent to visible light and infrared, and therefore does not heat up significantly in their case – is actually “black” with respect to those particular microwaves. This is by construction, not by accident. So water heats up, whilst most other materials are transparent and remain cold – unless of course they are in contact with the water, which then disperses its heat onto them.
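To give a sense of just how “black” water is at this wavelength, one standard quantity is the power penetration depth, ie the distance over which the microwave power drops to about a third of its value. A rough sketch, assuming the usual household oven frequency of 2.45 GHz and approximate textbook room-temperature values for water’s dielectric constants (both are assumptions; the real values vary with temperature):

```python
import math

# Assumed values: the usual household oven frequency, and approximate
# room-temperature dielectric constants of liquid water at 2.45 GHz.
C_LIGHT = 299_792_458   # speed of light, m/s
FREQ = 2.45e9           # oven frequency, Hz (assumed: the usual value)
EPS_REAL = 78.0         # real part of water's relative permittivity (approx.)
EPS_IMAG = 10.0         # imaginary (lossy) part (approx.)

wavelength = C_LIGHT / FREQ  # free-space wavelength, ~12.2 cm

# Low-loss approximation for the power penetration depth:
#   D_p ~ lambda0 * sqrt(eps') / (2 * pi * eps'')
depth = wavelength * math.sqrt(EPS_REAL) / (2 * math.pi * EPS_IMAG)
print(f"penetration depth in water: {depth * 100:.1f} cm")
```

With these (assumed) numbers the waves are mostly absorbed within the first couple of centimetres of watery food – deeper inside, the food is heated by ordinary conduction, not by the waves themselves.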
The issue is that some materials might not be transparent after all, which means they will be absorbing the energy, just like the water does. Now water is incredibly stable, and the microwaves won’t “break” it (what I mean is that they won’t break the water molecules into their constituents, hydrogen and oxygen; the water might of course evaporate, but in that case the molecules still remain intact).
However, most other molecules are not that stable, and chances are that if they absorb that kind of energy they might well break apart (ie they will be denatured, directly by the microwaves). And if they do indeed break apart, all bets are off as to the toxicity of the end-products. They might be alright, they might be unhealthy, or they might be ultra-toxic.
The problem here is that we simply don’t know. There are thousands – maybe millions – of different substances in our average item of food. Moreover, a lot of the micro stuff might be very “inconsistent” and unpredictable. Natural food especially (eg animals, plants, but also dairy and eggs) will contain many substances directly derived from the environment of the plant or animal in question, so even different batches of the same product might contain different constituents. So the hypothesis that microwaving food is safe is impossible to verify – you would literally have to microwave a portion of everything you eat and test it for every possible toxin, known or unknown, just to be sure.
Pathway 2 – Indirect denaturation, type 1 (“ultra-hot water molecules”)
The heat created by microwaves is somewhat unusual. Without going too much into the thermodynamics of it (and thereby obviously losing some accuracy): the water molecules that happen to absorb one “piece” of the radiation (please just accept that radiation comes in pieces – I don’t want to explain quantum theory here; suffice it to say that Einstein got the Nobel Prize for a related discovery) get ultra-hot. Much hotter than your target temperature of about 50-100C/120-210F – it is probably more in the thousands. Now for the water itself this does not matter, because water is very stable, and the molecule will not break into hydrogen and oxygen at those temperatures. However, this heat is transferred to the surroundings in a process called dissipation. This is really a big word for molecules bumping into each other and transferring energy amongst themselves. So if the “hot” water happens to bump into other water molecules, then after a number of bumps the energy is dissipated, and all is hunky dory. Some of the bumps might have been a bit heavy, but the water can take it.
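The “bumping” picture of dissipation can be illustrated with a toy simulation (purely illustrative – the numbers and the equal-split collision rule are my own simplifications, not real thermodynamics): one molecule starts with a huge excess of energy, random pairs collide and share their energy, and after enough bumps everything sits close to the average.

```python
import random

def dissipate(n_molecules=1000, hot_energy=1000.0, base_energy=1.0,
              n_collisions=100_000, seed=42):
    """Toy dissipation model: each collision splits the pair's energy evenly."""
    rng = random.Random(seed)
    energies = [base_energy] * n_molecules
    energies[0] = hot_energy  # the one 'ultra-hot' molecule
    for _ in range(n_collisions):
        i = rng.randrange(n_molecules)
        j = rng.randrange(n_molecules)
        if i != j:
            mean = (energies[i] + energies[j]) / 2.0
            energies[i] = energies[j] = mean
    return energies

energies = dissipate()
print(f"average: {sum(energies) / len(energies):.2f}")
print(f"hottest: {max(energies):.2f}, coldest: {min(energies):.2f}")
```

Total energy is conserved throughout; it merely spreads out until everything is near the average. But early in the process, some unlucky collision partners briefly carry hundreds of units of energy – which is exactly the “wrong place, wrong time” effect.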
But what if the first molecule that “hot” water encounters is not other water, but something more complex? Well, it is really like this other molecule being in an environment of 1000C rather than your comfortable 50C average food temperature. And at 1000C things happen – possibly nasty things. And again – we simply don’t know, because we don’t know what else is in the food. And whilst direct denaturation can only happen to substances that absorb the particular wavelength of the microwave oven, indirect denaturation can happen to any molecule that happens to be “in the wrong place, at the wrong time”.
Pathway 3 – Indirect denaturation, type 2 (“pressure cooker”)
This effect is very similar to the type 1 denaturation, but less pronounced. The point here is that a microwave oven tends not to produce a nice, even concentration of waves: there will be parts of the oven with “lots of waves” and parts with “no waves” (again, please forgive me for not explaining this in more detail; look up “standing waves” if you want to know more).
The best way to understand this phenomenon is to prepare a really nice and thick pea soup, put it in the fridge, and then microwave it at full power for 30 seconds or so. If you take it out, stir it, and let it stand for say 15 secs, it will be barely warm. Nevertheless, if you eat some just after you have taken it out (and without stirring it) you might well burn your tongue. Why? Because there are areas that are really, really hot (those with “lots of waves”) and others that haven’t changed temperature at all. Stirring evens out the differences, but if you don’t do this, the hot pockets can really burn you.
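The spacing of those hot and cold pockets can actually be estimated: in a standing wave, the hot spots (antinodes) sit half a wavelength apart. Assuming the usual household oven frequency of 2.45 GHz:

```python
C_LIGHT = 299_792_458  # speed of light, m/s
FREQ = 2.45e9          # assumed typical household oven frequency, Hz

wavelength = C_LIGHT / FREQ          # free-space wavelength
hot_spot_spacing = wavelength / 2.0  # antinodes sit half a wavelength apart

print(f"wavelength: {wavelength * 100:.1f} cm")
print(f"hot spot spacing: {hot_spot_spacing * 100:.1f} cm")
```

This works out to hot spots roughly 6 cm apart – and it is also why ovens have turntables: moving the food through the fixed wave pattern evens out the same temperature differences that stirring would otherwise have to fix.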
Again, the effect is that parts of the food have become much hotter than you thought they would. So even if your pea soup (or whatever you cooked) ends up at 60C/140F, parts of it have probably been at at least 100C/212F, ie the temperature of boiling water, with all the denaturation that happens at those temperatures. The more “solid” the food, the more pronounced this effect: imagine some vegetable with a pretty robust cell structure. If there are “lots of waves” just within one cell, then this cell will heat up accordingly. Under normal pressure conditions the water would evaporate at 100C, so it cannot become hotter than that. However, if the cell wall is strong enough, you have a local “pressure cooker”, and temperatures can locally and temporarily rise significantly above 100C.
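How much hotter than 100C can such a local “pressure cooker” get? A rough sketch using the Antoine equation for water’s vapour pressure (coefficients from standard tables, valid roughly between 99C and 374C; the choice of 2 atm as the pressure a cell might hold is purely my assumption for illustration):

```python
import math

# Antoine equation: log10(P_mmHg) = A - B / (C + T_celsius)
# Standard coefficients for water, valid roughly between 99 C and 374 C.
A, B, C = 8.14019, 1810.94, 244.485

def boiling_point_celsius(pressure_atm):
    """Temperature at which water's vapour pressure reaches the given pressure."""
    p_mmhg = pressure_atm * 760.0
    return B / (A - math.log10(p_mmhg)) - C

print(f"at 1 atm: {boiling_point_celsius(1.0):.0f} C")  # the familiar boiling point
print(f"at 2 atm: {boiling_point_celsius(2.0):.0f} C")  # inside a cell holding 2 atm
```

At normal pressure this reproduces the familiar 100C; if a cell wall could hold twice atmospheric pressure, the water inside could reach about 121C before boiling – the same principle as a kitchen pressure cooker.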
So where does this leave us? The above shows that microwave cooking is indeed different from other types of cooking, in that extreme levels of heat are generated locally (even pathway #1 can be thought of in this way), and this heat only then dissipates down to the average temperature of the food. But because, for denaturation, the peak temperature matters much more than the average temperature, it is plausible that cooking (or warming up, for that matter) with microwaves has more harmful effects than cooking with a traditional heat source.
The less “watery” the food, the more pronounced this effect. If you boiled distilled water, there would be no difference between doing it in a microwave and with conventional heat. All the aforementioned pathways become less effective the more water the food contains. This is obvious for pathway #3: you really need cells for the pressure cooker effect to work, so if there are no cells, #3 does not happen. But #2 is also pretty much impaired: if there is a lot of water around, the odds of a “hot” water molecule hitting anything other than other water molecules are very, very small, meaning that not many “denaturing” collisions can happen. They do still happen, but whatever is produced is produced in much smaller concentrations. A very similar argument goes for pathway #1: the odds of a “piece” of radiation being absorbed by anything other than water are very small if so much water is around.
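The dilution argument for pathway #2 can be made slightly more concrete. If a certain fraction of the molecules is not water and the food is well mixed (a simplification, of course – real food is anything but well mixed), the chance that an ultra-hot water molecule hits something other than water at least once while its excess energy dissipates over, say, its first ten collisions is easy to compute (the “ten collisions” figure is likewise my own assumption for illustration):

```python
def nonwater_hit_probability(water_fraction, n_collisions=10):
    """Chance of at least one collision with a non-water molecule among the
    first n collisions, assuming each partner is drawn independently."""
    return 1.0 - water_fraction ** n_collisions

# Illustrative numbers only - the 'ten collisions to dissipate' is an assumption.
for wf in (0.999, 0.99, 0.9):
    print(f"water fraction {wf}: risk per hot molecule = "
          f"{nonwater_hit_probability(wf):.3f}")
```

So in a watery bouillon (water fraction close to 1) each hot molecule has only a small chance of a “denaturing” collision, while in dense food those odds rise considerably.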
So – heating up your tea or coffee in the microwave is probably OK, and also your bouillon. With thick soups you might start having your doubts, and even more so with solid food. The one thing that is most likely to land you in trouble, even if you are just warming it up, is any food that should never be heated above say 60C/140F, and that only contains small quantities of water that are moreover separated by solid-ish cell structures. I can’t think of anything that would fulfill all of the above, but milk might already be an example where at least pathway #2 (ultra-hot water molecules) could do more damage than one might expect, considering that the milk is only heated to say 30-40C/90-100F. And of course there is the example of the blood package that was moderately heated in the microwave, rather than using conventional heat as usual, and where the patient receiving the transfusion died.