Is there any merit to the idea that non-ionizing frequencies have harmful impacts on human biological function? I thought so, but is it all "conspiracy" that gets laughed out of the room, or a legitimate scientific part of these discussions?
I'm not a physicist or biologist, but what's always made sense to me is that anytime you walk outside during the day you are bathed in broad-spectrum radiation from the sun. So anything weaker than the sun is probably safe enough. Anything a million or billion times weaker is probably a million or billion times safer. We already know when and how radios get dangerous (large transmission towers, microwave ovens, etc.) and how to mitigate that danger. Inverse square law and somesuch.
The sun is damaging because it contains ionizing radiation (radiation that is powerful enough to directly dissociate a molecule into ions). This is the UV portion of sunlight.
UV starts at roughly 800,000 GHz (800 THz).
The 6 GHz being discussed here is completely non-ionizing, not even comparable to UV.
The only concern with 6 GHz is that it can also cause dielectric heating, which is the same mechanism a microwave oven uses. But again, at 25mW, you can't even feel the heat from direct contact with the antenna, let alone a few meters away. Your exposure follows the inverse-square law [1], which means it drops in proportion to the square of the distance. So if it's not a problem at 10cm, it's 100x less of a non-problem at 1m.
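As a sanity check, here's a quick sketch of that falloff, assuming an idealized isotropic 25 mW point source (real antennas are directional, but the scaling argument is the same):

```python
import math

def power_density(p_watts, r_meters):
    """Power density (W/m^2) of an isotropic point source:
    the transmit power spread over a sphere of radius r."""
    return p_watts / (4 * math.pi * r_meters ** 2)

p = 0.025  # 25 mW transmitter

at_10cm = power_density(p, 0.10)
at_1m = power_density(p, 1.00)

# 10x the distance -> 1/100th the power density
print(round(at_10cm / at_1m))  # -> 100
```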
The evolutionary argument is that humans are adapted to the broad-spectrum radiation from the sun, but not to artificial sources, which have different magnitudes at different frequencies.
E.g., you are much less likely to get sunburned if you get plenty of natural (or artificial) infrared.
There is no such thing as artificial forms of RF. They're all wiggling photons.
If nature gave us a flute and man discovered how to make a bass guitar, then although they sound different, the only real difference is that the bass guitar wiggles air molecules more slowly than the flute does. There is zero, nil, no distinction whatsoever between a "natural" and a "synthetic" photon wiggling at a given frequency.
> you are much less likely to get sunburn if you get plenty of natural (or artificial) infrared.
I have an FCC certification vouching that I know what those words mean. Your linked diatribe says, quote, "All types of man-made EMFs/EMR". All. Antennas are polarized because that's the easiest way to build them, and the polarization has nice properties.
A standard 100 W incandescent lightbulb radiates about 100 W of unpolarized EM radiation, mostly infrared, with a few watts in the visible band around 400-750 THz. It is man-made, it is an RF emitter, it is not polarized, and it's something everyone older than about 10 has spent their entire life around.
So either the author is completely wrong in sentence number 2, or they're implying that visible light isn't RF. Either way, they're wrong, and you can ignore the rest of their claims.
> Your linked diatribe says, quote, "All types of man-made EMFs/EMR". All. Antennas are polarized because that's the easiest way to build them, and the polarization has nice properties.
You are comparing light bulbs to wireless communications? What is your point? He says "All types of man-made EMFs/EMR", not "all types of man-made energy". It is clear he does not think that light bulbs are dangerous. So now you are just being deliberately confusing to muddy the waters.
But if you bother to READ the whole article you would see he agrees with you:
"Natural EMR/EMFs (cosmic microwaves, infrared, visible light, ultraviolet, gamma rays) and several forms of artificially triggered electromagnetic emissions (such as from light bulbs with thermal filaments, gas discharge lamps, x-rays, lasers, etc.) are not polarized. "
You know we were talking about EMFs from data-communication types of man-made EMFs/EMR, but you are being ignorant on purpose, because you cannot even read anything that is new and conflicts with your ideas.
> You are comparing light bulbs to wireless communications? What is your point?
Visible light and Wi-Fi are the same physical phenomenon, just at different frequencies.
> several forms of artificially triggered electromagnetic emissions (such as from light bulbs with thermal filaments, gas discharge lamps, x-rays, lasers, etc.) are not polarized.
So, he contradicts himself.
Also:
> Natural EMR/EMFs (cosmic microwaves, infrared, visible light, ultraviolet, gamma rays) [...] are not polarized.
Oh yes they absolutely can be, and frequently are. Polarized sunglasses are specifically made to block the polarized light reflecting off lakes, snow, sand, or other surfaces. Does the author consider light reflecting off a lake to be unnatural, or is it the OK kind of polarized because it's "natural"?
I'm pretty sure their point is that certain frequencies are getting a lot more power than is naturally possible. Not that the photons are special in some way.
I'm not so sure. Even in these threads we see specious distinctions between "natural" and "man-made" EMF.
But even then, it's impossible to discuss without talking about relative strengths. Wi-Fi transmits at about 100mW at full strength. For math purposes, let's assume it's a point source broadcasting in all directions. (That's not that much of a wild assumption, either.) The surface area of a sphere with a radius of 1m is about 12.5 m^2. On average, then, the Wi-Fi RF strength at 1m away is about 0.008W/m^2.
The sun above us delivers about 1360W/m^2 of RF radiation, or approximately 170,000 times the radiation of standing a meter from a Wi-Fi router. If it's across the room, 4m away, the ratio is closer to 3,000,000:1.
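Those figures check out; here's a minimal script reproducing the arithmetic (same simplifying assumption of an isotropic point source):

```python
import math

def density(p_watts, r_meters):
    # Transmit power spread over the surface of a sphere of radius r
    return p_watts / (4 * math.pi * r_meters ** 2)

wifi = 0.1      # 100 mW Wi-Fi transmit power
solar = 1360.0  # solar constant, W/m^2

wifi_1m = density(wifi, 1.0)
print(wifi_1m)                      # ~0.008 W/m^2
print(solar / wifi_1m)              # ~170,000:1
print(solar / density(wifi, 4.0))   # ~2.7 million to 1, call it 3,000,000:1
```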
Even if our bodies responded to "man-made" radiation differently than the "good, natural" kind, there's so very little of it relatively that it can't make much of a difference. I mean, ever look at a 100W lightbulb? If Wi-Fi were at 400THz instead of 2.4GHz so that you could see it, it would be one thousandth as bright. There's just not enough power there to do anything meaningful to us.
> There's just not enough power there to do anything meaningful to us.
Unless specific frequency bands cause problems because something very specific is triggered.
Sure, wifi may only be hitting you with 1 milliwatt per square meter. But between 2.4GHz and 2.5GHz the sun only hits you with... if I did the math right, and just accounting for blackbody emissions, around 10 picowatts per square meter.
We're probably fine, but whether it's fine can't be proven with a simple physics calculation that ignores spectrum.
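For what it's worth, here's a rough Rayleigh-Jeans sketch of the sun-as-blackbody contribution in that band. It ignores the solar corona, which radiates far more at radio frequencies than a 5778 K blackbody does, so take the absolute number as a floor:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
c = 2.998e8         # speed of light, m/s
T = 5778.0          # effective blackbody temperature of the sun, K

nu = 2.45e9  # center of the 2.4-2.5 GHz band, Hz
bw = 1.0e8   # 100 MHz of bandwidth

# Rayleigh-Jeans spectral radiance, valid here since h*nu << k_B*T
B_nu = 2 * nu**2 * k_B * T / c**2  # W / m^2 / Hz / sr

# Solid angle the sun subtends as seen from Earth
omega = math.pi * (6.957e8 / 1.496e11) ** 2  # sr

flux = B_nu * omega * bw  # W/m^2 delivered in the band
print(flux)  # on the order of 1e-13 W/m^2: a fraction of a picowatt
```

The blackbody floor comes out somewhat below the ~10 pW/m^2 guessed above, but either way it's vanishingly small next to Wi-Fi's milliwatts per square meter in the same band, which is the parent's point.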
Microwave frequencies can harm biological function by heating tissue; in particular, eyeballs have lots of water and a poor ability to dissipate heat. However, very low power densities are almost certainly safe.
[edit]
Another example of non-ionizing radiation harming human tissue would be if you stick your hand in front of a cutting laser. Maybe obvious, but you asked...
Most concerns focus on the electromagnetic radiation heating your tissue. Microwave ovens operate at about 2.45 GHz, and most common frequencies can heat tissue like a microwave does, with varying efficiency. At the intensities of normal transmissions that isn't really a concern. For a time this seemed like something we might worry about with phones, since during a phone call there is an active antenna right next to your fairly sensitive brain, which might not like being heated up. But even there it turned out that the effect was too small to be of concern.
Ham radio operators do need to worry about RF exposure safety because of heating. But we are using much higher powers: 100 W is normal for HF, and 1500 W is the legal limit. A 5 W handheld next to your head is safe.
“DNA Breathing Dynamics in the Presence of a Terahertz Field” by B. S. Alexandrov, V. Gelev, A. R. Bishop, A. Usheva, K. O. Rasmussen (Submitted on 28 Oct 2009)
"Abstract: We consider the influence of a terahertz field on the breathing dynamics of double-stranded DNA. We model the spontaneous formation of spatially localized openings of a damped and driven DNA chain, and find that linear instabilities lead to dynamic dimerization, while true local strand separations require a threshold amplitude mechanism. Based on our results we argue that a specific terahertz radiation exposure may significantly affect the natural dynamics of DNA, and thereby influence intricate molecular processes involved in gene expression and DNA replication."
The book "The Body Electric: Electromagnetism and the Foundation of Life" by Robert Becker and Gary Selden is a foundational book for this research. It was first published in 1985.
In his research, Becker found that, all else being equal, lower power levels could have greater biological effects than higher power levels.
There’s none other than localized heating effects, and yes, it’s laughed out of the room.
So, obviously you don’t want to microwave your eyeballs, but you’d feel that in other nearby tissues as heat. If you don’t feel heat from a non-ionizing RF source, you’re not getting cooked. In any case, the amount of infrared coming off an incandescent lightbulb is about 3 orders of magnitude higher than the energy coming off a WiFi router antenna. If being in the room with a lightbulb is safe, so is being in the room with WiFi.
There isn’t a set of rules of physics where low-power, non-heating, non-ionizing RF is dangerous, and also where CPUs work. They’re incompatible. You can’t have both of those at the same time.
> There isn’t a set of rules of physics where low-power, non-heating, non-ionizing RF is dangerous, and also where CPUs work. They’re incompatible. You can’t have both of those at the same time.
Could you elaborate on this? It sounds like you're overgeneralizing. There are a lot of ways non-ionizing RF could potentially be "dangerous" to some kind of biological tissue; we just haven't found those ways in humans.
For one category of mechanism, there's plenty of proteins that absorb certain wavelengths and activate cellular pathways based on the amount they receive.
We honestly don’t know. Current safety standards mostly focus on preventing tissue heating, because that’s the one effect we can reliably measure and understand. But there’s a chunk of exploratory research out there looking at potential “non-thermal” effects—things like subtle shifts in cell signaling, membrane permeability, or oxidative stress—that might not show up as a measurable temperature increase.
So far, the studies that have been well-designed and replicated haven't consistently nailed down a clear causal link between non-thermal EMF exposure (within the limits regulators consider safe) and actual health problems. Still, some researchers argue that we're not accounting for all the slow-burn, cumulative effects that might be happening. It's not easy to tease out these subtle influences from the noise of environmental variables, and that makes it hard to really say we've got a handle on the whole picture. Check out Prof. Michael Levin's bioelectricity work if you want to go down a very interesting rabbit hole about what we're only recently discovering about how our biology might really work and how electricity and EMFs shape it.
With a large enough antenna and enough power you can cook your neighbor.
The ham radio licensing procedure in the US mostly focuses on this effect. Even though there's nothing conclusive I'd imagine there are other deleterious effects that aren't trivially measurable. If it can heat it up it can do other stuff too. Cooking your brain by standing too close to a high power transmission tower can't be good.
I'm an amateur extra, I would challenge any "scientist" laughing EMF dangers off to go find the nearest AM radio tower and spend 6 months in the transmission room for "science".
Without sarcasm, the studies I have found over the years ruled out cumulative effects (unlike ionizing radiation). They so far haven't been able to rule out various types of cancer, ALS, or other diseases caused by long-term exposure.
I also have an extra license. AM transmits up in the ballpark of 50,000 W. Anything at 50,000 W will toast you like a marshmallow. That has nothing to do with RF specifically, but with the sheer amount of power pumping through it.
I am plausibly a "scientist" who has done "science", and I'm not standing next to a AM radio tower for precisely the same reason I wouldn't stand next to a 50,000W light bulb, EMF be darned.
It's not on studies to rule out all the things you mention. The job of studies is to demonstrate whether EMF causes any of them. "EMF is safe" is falsifiable: if you can find one counterexample, it's untrue. And yet, after all the years we've been working with it, other than people who get cooked by sheer power levels, we don't have any proof that it causes those (or any) diseases.
Yes, there is evidence, but as usually you will get downvoted to oblivion before you can get the point across.
One of the things being pointed to is these EMFs affecting ion channels. The TRPV1 receptor is one of these channels. The TRPV1 receptor is a heat receptor but has many functions. Since this receptor is in the skin, 5G and 6G can affect it. The receptor pumps calcium into the cell, and any neurologist will tell you what that can do.