said by ahhnold: Wifi, cellphone and other radio waves are classified as non-ionizing radiation. Its main interaction effect with biological tissues is heat transfer. No peer reviewed scientific studies have shown evidence to the contrary at current usage/exposure levels.
UV is non-ionizing, and it causes cell damage and cancer unrelated to "heating". There is simulation evidence that sub-millimeter radiation can cause DNA damage due to "nonlinear resonances". Very low frequency radiation, at wavelengths of thousands of meters, has been shown to be a hazard to the human nervous system due to induced currents.
To the point others are making about modern life being surrounded by radio signals: keep in mind that EM signal strength follows the inverse square law with distance. The signal from a cellular phone held up to your ear is many orders of magnitude (hundreds to trillions or more times) more powerful than any other radio signal in your immediate environment. Cell phones transmit with a power of a watt or so; the signals they receive from towers are on the order of one quadrillionth of a watt. Proximity to the signal, more than anything else, is critical to understanding the user's exposure.
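The inverse-square point can be sketched with a quick back-of-envelope calculation. The numbers here (a ~1 W handset a couple of centimeters from the head, a hypothetical 100 kW broadcast tower 10 km away) are illustrative assumptions, not measurements:

```python
# Rough illustration of the inverse-square law for point-source radiators.
# Power density at distance r from an isotropic source is P / (4*pi*r^2).
import math

def power_density(p_watts, r_meters):
    """Power density (W/m^2) at distance r from an isotropic radiator."""
    return p_watts / (4 * math.pi * r_meters ** 2)

# Assumed figures: ~1 W handset held ~2 cm from the head, versus a
# hypothetical 100 kW broadcast tower 10 km away.
phone = power_density(1.0, 0.02)
tower = power_density(100_000.0, 10_000.0)

print(f"phone at ear:   {phone:.1f} W/m^2")
print(f"tower at 10 km: {tower:.2e} W/m^2")
print(f"ratio:          {phone / tower:.1e}")
```

Even with the tower transmitting 100,000 times more power, the phone pressed to the ear dominates by a factor of millions under these assumptions, purely because of distance.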
An isolated classroom with 30 transmitters in close quarters, each broadcasting a few hundred mW all day, represents significant exposure to microwave radiation above and beyond the typical background from radio and TV transmitters.
Finally, there are government radiation exposure standards for these devices, but they assume ridiculous configurations such as holding the phone an inch from your head while speaking, or keeping the body several inches from the antenna embedded in the display bezel. (Inverse square law strikes again.)
These conditions are routinely and consistently violated by the majority of users, who have no idea that their style of use effectively means they exceed government-defined exposure limits.
Is it harmful? Will it cause cancer? I doubt it, but I don't know. Assume it did, and the chance was small: what sample size and study duration would be needed to detect the effect in an unambiguous and statistically significant way? My guess is the answer far exceeds any effort anyone is willing to put into the question. Given that roughly 20% of the world's population will die from cancer, the SNR involved in detecting such a signal, assuming it existed, would be impossibly low. We were hardly able to detect the cancer signal even in survivors of the Hiroshima/Nagasaki atomic bombs.
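To give a feel for the sample-size problem, here is a minimal power-analysis sketch using the standard two-proportion z-test formula. The 20% baseline comes from the figure above; the 0.1% absolute excess risk is a purely hypothetical effect size chosen for illustration:

```python
# Back-of-envelope sample size needed to detect a small bump in lifetime
# cancer incidence, via the standard two-proportion z-test formula.
import math

def sample_size_per_group(p1, p2, ):
    """Subjects per group to detect p1 vs p2 (two-sided test,
    alpha = 0.05, 80% power)."""
    z_a = 1.96   # z-score for alpha = 0.05, two-sided
    z_b = 0.84   # approximate z-score for 80% power
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# Baseline 20% lifetime cancer mortality; hypothetical 0.1% absolute excess.
print(sample_size_per_group(0.20, 0.201))
```

Under these assumptions the formula calls for on the order of 2.5 million subjects per arm, followed for a lifetime, which is the point: a small effect riding on a 20% baseline demands an enormous study.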
Obviously you can never prove a negative, but in my view all of these inconclusive statistics-based studies people fall back on are worthless. Any research short of attempting to directly observe cellular damage, or a mechanism for it, will never produce a positive result even if you flat out assume there is one to be found.
Until positive evidence is found, I choose to assume either that it does not exist or that my chance of being harmed is too small to care about. I speak only for myself.