Is Wi-Fi Safe?

This is intended for people who are not wireless engineers and may know very little about how Wi-Fi works.

I will provide an overview of the actual levels of RF (radio frequency) energy used in typical Wi-Fi deployments. Sorry if the title sounds a little click-baitish; I wanted to give you some more information so that you can make a better-informed decision.

Let’s talk about power.

Wi-Fi power is typically measured in watts (W), milliwatts (mW), or decibel-milliwatts (dBm). dB expresses a ratio between two values, and dBm expresses power relative to 1 milliwatt. dBm gives us a much easier number to work with, because many of the mW values used in Wi-Fi are extremely small, as you will see. Most real-world Wi-Fi measurements are negative dBm values; the closer the measurement is to 0, the stronger the signal.
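
If you want to check the conversions in this post yourself, here is a minimal sketch of that relationship (plain Python I put together for illustration, not output from any Wi-Fi tool):

```python
import math

def mw_to_dbm(p_mw):
    """Convert power in milliwatts to dBm (decibels relative to 1 mW)."""
    return 10 * math.log10(p_mw)

def dbm_to_mw(p_dbm):
    """Convert power in dBm back to milliwatts."""
    return 10 ** (p_dbm / 10)

print(mw_to_dbm(1000))   # 1,000 mW (1 W) -> 30.0 dBm
print(dbm_to_mw(-29))    # -29 dBm -> ~0.00126 mW, a tiny fraction of a milliwatt
```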

There are governmental regulations that limit how much output power can be used on specific frequencies. In the USA, the FCC is the entity that manages the RF spectrum. There are many specific details regarding what is allowed, and many exceptions to those rules, as you can imagine with any governmental regulation. I will not go into the weeds, but rather provide a general overview.

The FCC limits the output power in the frequencies used by Wi-Fi (the 2.4 GHz and 5 GHz spectrum) to a maximum of 1 watt. Again, this is a gross generalization, but for our purposes it is the overall maximum allowed. 1 W can also be written as 1,000 mW or 30 dBm; they are all equal. The 1 W limit applies to the power coming out of the radio, before the antenna. Antennas provide gain, which increases the strength of the signal in the directions the antenna favors. The 1 W limit is referenced to an anticipated 6 dBi gain antenna, which would bring the total radiated power to 4 W, 4,000 mW, or 36 dBm. You can use higher-gain antennas by reducing the radio's output power, as long as you stay under that 4 W (36 dBm) limit, called the EIRP (Equivalent Isotropically Radiated Power).
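
To make that arithmetic concrete, here is a small illustrative sketch of the maximum-power case (my own Python, not anything from the FCC rules themselves); the key point is that values expressed in dB simply add:

```python
def dbm_to_mw(p_dbm):
    # dBm back to milliwatts
    return 10 ** (p_dbm / 10)

tx_power_dbm = 30       # 1 W of power out of the radio
antenna_gain_dbi = 6    # the antenna gain the limit anticipates
eirp_dbm = tx_power_dbm + antenna_gain_dbi   # dB values add: 36 dBm EIRP

print(eirp_dbm)                      # 36
print(dbm_to_mw(eirp_dbm) / 1000)    # ~3.98 W, i.e. the ~4 W EIRP ceiling
```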

Now to put that in perspective, the absolute maximum power we can use in Wi-Fi is 4 watts. Microwave ovens typically run in the 700 to 1,000+ watt range, and they happen to use the same frequency we use in Wi-Fi: 2.4 GHz. Can you imagine trying to warm up yesterday's pizza with a 4 W microwave? So even our maximum is a very small amount of power.

RF dissipates FAST!

In most well-designed environments we are not using maximum power levels near 1 W, but typically something closer to the 100-200 mW range, or less for high-density areas. For the purposes of this blog post I will be basing the information on a worst-case scenario.

RF energy is emitted from an antenna and radiates outward in the pattern the antenna is designed to produce. Most home devices emit the signal in the shape of a doughnut: roughly equal in all horizontal directions, with a little less signal directly above and below the antenna (the black stick-looking thing). As the energy travels outward, it dissipates very rapidly. A common analogy is a ripple in a pond. At the point where the rock hits the water, the ripple is strongest; as the waves move away from the point of impact, they decrease in strength. RF works the same way. It is strongest near the source, the antenna. In fact, the rate at which it dissipates is so rapid that we end up with measurements of less than 1 mW within just a few feet!
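
For the curious, the textbook free-space path loss formula shows why the drop-off is so steep. The sketch below is an idealized model only (my own Python, assuming no walls, no people, and perfect antennas), so real readings, like the ones in the next section, will typically come in lower still:

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Idealized free-space path loss in dB for a given distance and frequency."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

eirp_dbm = 26   # roughly the EIRP of the access point described below
for d_m in (1, 2, 4, 8):
    print(d_m, "m:", round(eirp_dbm - fspl_db(d_m, 2437), 1), "dBm")
# Every doubling of distance costs about 6 dB -- one quarter of the power.
```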

Real World Measurements

I am going to be using an enterprise-class access point (Cisco 1702i) for the test scenario below.

I have an access point (AP) configured to run at its maximum power setting, which is 22 dBm, or roughly 160 mW. It has a 4 dBi gain antenna, so the total radiated power (EIRP) works out to about 26 dBm, or roughly 400 mW.

I took a few measurements with a Fluke Aircheck, which is a device used to take spot-check measurements.  

  • The first measurement I took was at a distance of 3′.  This would be about the closest most users would be to an AP in an office environment assuming 9′ ceilings and a 6′ tall person.  I measured the signal at -29 dBm.  This is equivalent to only 0.0012589254118 mW.  As you can see, at just 3 feet away we are already at 1/1,000th of a mW, or 1/1,000,000 of a watt!
  • At a distance of 6′ away I got a measurement of about -35 dBm, or 0.0003162277660 mW. (You can see why we use dBm measurements instead of mW now.) So in just 3′ we dropped the signal by about a factor of 4. That is actually a quick way of estimating signal loss: every time the distance doubles, the signal strength drops to roughly a quarter of what it was. At 6 feet away from an AP running at max power we are measuring 0.0003 mW, and yet we started at 400 milliwatts! (The short sketch after this list works through these conversions.)
  • In fact the absolute hottest signal I could measure by placing my Aircheck physically on the AP was -8 dBm, or 0.15848931925 mW.  You can see why I am not very concerned about the amount of energy I am absorbing from Wi-Fi.
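
Here is the arithmetic behind those spot measurements, as a quick sketch (illustrative Python using the readings listed above; the code itself measures nothing):

```python
def dbm_to_mw(p_dbm):
    # dBm to milliwatts
    return 10 ** (p_dbm / 10)

readings_dbm = {"on the AP": -8, "3 feet": -29, "6 feet": -35}
for spot, dbm in readings_dbm.items():
    print(spot, dbm, "dBm =", round(dbm_to_mw(dbm), 10), "mW")

# Going from 3 feet (-29 dBm) to 6 feet (-35 dBm) is a 6 dB drop,
# and 6 dB is almost exactly a factor of 4 in power:
print(dbm_to_mw(-29) / dbm_to_mw(-35))   # ~3.98
```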

Most Wi-Fi deployments are designed with a goal of -67 dBm of coverage everywhere, and this is often achieved with power levels well below the maximum. At -67 dBm we are at 0.0000001995262 mW. Yes, that is 6 zeroes after the decimal, and yes, that is a tiny fraction of a milliwatt; you could call it roughly 0.2 nanowatts. For perspective, in a middle school environment designed for fairly high user capacity and -67 dBm coverage, the hottest signal measured anywhere in the school is about -40 dBm, or 0.0001 mW.

Distance             dBm     mW
At antenna element    26     398.10717055
0 Feet                -8       0.15848932
3 Feet               -29       0.00125893
6 Feet               -35       0.00031623
Varies               -40       0.00010000
Varies               -67       0.00000020

What about the clients?

The majority of traffic is usually downstream from the AP to the client, so more often than not it is the AP doing the transmitting. But clients do transmit too. Due to hardware limitations of the clients, many of which are battery powered, they cannot transmit at the same maximum power levels that APs use. Most personal client devices like cell phones and tablets have a maximum output power of less than 20 dBm, or 100 mW. The cellular chips in phones can go up to the 1 watt range, but the Wi-Fi chips are not that powerful. When I set my phone, a Galaxy S8, to hotspot mode, the hottest measurement I could get was -4 dBm, or 0.39810717055 mW. Still less than half of a single milliwatt. Laptops may have higher output power than phones or tablets, but we are also usually at least a couple of feet away from their antennas, which are typically located around the screen.

Summary

The simple fact that the actual amount of energy used in most Wi-Fi deployments is extremely small helps me feel far less concerned about the potential damaging effects of any Wi-Fi-based radiation. That being said, I would always offer a common-sense word of caution: give yourself a little extra space from whatever device you are using; it can't hurt. Place your home routers/Wi-Fi APs somewhere more than a foot or two away from where people typically sit. That extra foot or two drastically decreases the exposure level, and again, it doesn't hurt. Are we cooking our kids in schools with Wi-Fi deployments? I don't think so.

–Scott