Sunday, 17 February 2013

Adjacent Channel Interference

I was inspired to try a bit of experimentation following Keith Parsons' recent WiFi stress testing sessions, where he conducted testing on a number of vendor APs to observe at what point they collapsed into a heap due to traffic throughput.

I can't carry out anything as grand or detailed as Keith (I don't have his knowledge, brains or resources!), but I thought it might be fun to test some de facto rules around the use of the 2.4GHz band for WiFi.

In summary, due to the bandwidth requirements of WiFi equipment, it is recommended that a spacing of at least 22MHz is maintained between the channels being used in the 2.4GHz band. The 22MHz figure comes from the bandwidth of the older DSSS modulation scheme. The newer OFDM modulation technique requires only 20MHz of space, but for reasons of backward compatibility, the 22MHz rule persists.

The band itself is sliced up into 11, 13 or 14 channels, depending on where you are in the world. The channel centre frequencies are spaced 5MHz apart, as shown in the diagram below:



To provide the 22MHz spacing required, it is generally recommended to use channels 1, 6 and 11, which provides a 25MHz spacing between the centre frequencies of the channels in use.
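As a quick sanity check on these numbers, the channel centre frequencies (and hence the spacings used later in this test) can be derived from the standard 2.4GHz channel numbering. A minimal Python sketch (the `centre_freq_mhz` helper is my own naming, not part of any standard tooling):

```python
# 2.4 GHz Wi-Fi channel centre frequencies: channel n (1-13) sits at
# 2407 + 5*n MHz; channel 14 is a special case at 2484 MHz (Japan).

def centre_freq_mhz(channel: int) -> int:
    if channel == 14:
        return 2484
    if 1 <= channel <= 13:
        return 2407 + 5 * channel
    raise ValueError("invalid 2.4 GHz channel number")

# Centre-frequency gap between channel 11 and each nearby channel:
for ch in range(6, 12):
    gap = centre_freq_mhz(11) - centre_freq_mhz(ch)
    print(f"channel {ch}: centre {centre_freq_mhz(ch)} MHz, "
          f"gap to channel 11 = {gap} MHz")
```

Channels 1, 6 and 11 give 25MHz between adjacent centre frequencies, comfortably above the 22MHz DSSS requirement.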

If these guidelines are not observed then, due to the spectral mask of the WiFi signals, adjacent channel interference starts to occur, which causes errors across the interfering channels and is generally a very undesirable situation. So, in an ideal world, everyone plays nicely and sticks to the non-overlapping channels 1, 6 and 11. Even neighboring networks that are using the same channels will co-exist quite well, as they all play by the same rules defined in the 802.11 standard.

However, although this conventional wisdom is well established, I had never tested it for myself. So, I decided to fire up a couple of APs, a couple of iPads and an iperf server, and see exactly what happened when I created my own adjacent channel interference.

My test kit was:
  • a Cisco 2602 AP (nailed on channel 11)
  • a Cisco 1142 AP (varied across channels 6 to 11 during the experiment)
  • an iPad 2 running the iperf2 app
  • a 4th gen iPad running the iperf2 app
  • a Windows 2003 server running an iperf server
Both APs were nailed at a transmit power of around 10dBm. The test venue was just my garage at home (that's all I had!), with the kit laid out as shown below:




The distances were a bit shorter than I would have liked, but I guessed that if there were any effects to be observed, I should see them easily with all devices in such close proximity.

Each AP had its own SSID, and each of the iPads was configured to use the SSID of its nearest AP.

The 2602 AP and 4th Gen iPad were left untouched using channel 11 throughout the test. Channel 11 was selected, as it was the only channel that was not being used by any of my neighbors - see the inSSIDer screen-shot of the 2.4GHz band prior to testing:



The iPad 2 and 1142 AP were switched across channels with each test run: the first test run was on channel 6, the second on channel 7, etc. The purpose was to see the effect on throughput of the 4th Gen iPad, fixed on channel 11, as the iPad 2 got closer and closer to its channel. Both iPads ran a TCP iperf test to the iperf server on the Windows 2003 server.

Each test was run 3 times to verify that I was getting consistent results.

An initial baseline run was also performed with only the 4th gen iPad running the iperf test. This was to gauge the throughput of the iPad in the absence of any interference or competing load.

The results are shown in the table below (this shows the throughput on the 4th gen iPad - both iPads were simultaneously running an iperf test during each test run):

1142 AP Channel | Centre Freq (MHz) | Freq Diff (MHz) | Result 1 (Mbps) | Result 2 (Mbps) | Result 3 (Mbps) | Avg (Mbps)
--------------- | ----------------- | --------------- | --------------- | --------------- | --------------- | ----------
-               | n/a               | n/a             | 35.5            | 35.8            | 34.6            | 35.3
6               | 2437              | 25              | 34.7            | 34.0            | 34.7            | 34.5
7               | 2442              | 20              | 37.5            | 35.1            | 35.5            | 36.0
8               | 2447              | 15              | 2.3             | 2.7             | 2.9             | 2.6
9               | 2452              | 10              | 17.4            | 18.5            | 18.0            | 18.0
10              | 2457              | 5               | 4.0             | 2.3             | 2.3             | 2.9
11              | 2462              | 0               | 22.1            | 22.2            | 23.5            | 22.6
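As a cross-check on my arithmetic, the per-channel averages can be recomputed from the three raw results (a small Python sketch; the figures are copied straight from the table above):

```python
# Raw iperf results (Mbps) for the 4th gen iPad, keyed by the channel
# the 1142 AP was set to ("baseline" = iPad 2 not running a test).
results = {
    "baseline": (35.5, 35.8, 34.6),
    6: (34.7, 34.0, 34.7),
    7: (37.5, 35.1, 35.5),
    8: (2.3, 2.7, 2.9),
    9: (17.4, 18.5, 18.0),
    10: (4.0, 2.3, 2.3),
    11: (22.1, 22.2, 23.5),
}

for channel, runs in results.items():
    avg = round(sum(runs) / len(runs), 1)
    print(f"{channel}: avg {avg} Mbps")
```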

The results were quite surprising to be honest.

The baseline test run with just the 4th gen iPad showed a throughput of around 35Mbps, which is what you might expect for a single-stream device under good conditions. So, no surprises there.

The results were similar when the iPad 2 was fired up and set to use channels 6 and 7. Throughput was pretty much the same, which is to be expected, as we were still maintaining at least the 20MHz spacing required for OFDM modulation.

However, at the 15MHz spacing, when the iPad 2 was on channel 8, performance fell off a cliff, averaging 2.6Mbps. This was a real surprise: I expected some degradation, but not this level of impact. I repeated these tests a number of times (three runs, and again at the end of my testing) to make sure they were not some anomaly caused by a sudden local noise source or other factor, but the result remained the same.

For some reason, performance recovered quite a lot with a 10MHz spacing and dropped back off again with a 5MHz spacing.

Once we hit zero spacing (i.e. both APs on the same channel), things recovered, as both APs would be playing nicely, effectively sharing the available RF bandwidth between them.

I must admit to being left a little confused by some of the results observed. It certainly underlines the impact on performance that may be experienced by straying from the conventional wisdom of using channels 1, 6 and 11. But, it does not explain why certain spacings have such a devastating impact compared to others.

One other point to note: given the results shown above, it could be feasible to use a tighter channel plan that allows only a 20MHz spacing. This may have some merit in an OFDM-only environment, but if DSSS devices are present in neighboring buildings or floors, you will have issues. Also, if your neighbors have adopted the well-established 1-6-11 channel plan, your new-fangled 1-5-9-13 (for instance) channel plan is going to suffer some pretty bad adjacent channel interference, which, judging by the results above, is a very bad thing...
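The trade-off in that last point can be made concrete by comparing the minimum centre-frequency separation of the two plans (a quick Python sketch; `min_spacing` is my own hypothetical helper, using the standard 2407 + 5n MHz channel numbering):

```python
# Minimum centre-frequency spacing (MHz) within a 2.4 GHz channel plan.
def min_spacing(plan):
    freqs = sorted(2407 + 5 * ch for ch in plan)   # channel n -> centre freq
    return min(b - a for a, b in zip(freqs, freqs[1:]))

print(min_spacing([1, 6, 11]))     # classic three-channel plan: 25 MHz
print(min_spacing([1, 5, 9, 13]))  # four-channel OFDM-only plan: 20 MHz
```

The four-channel plan squeezes an extra cell into the band, but only by giving up the 5MHz of headroom that keeps it clear of DSSS transmissions and of neighbors on the 1-6-11 plan.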

Caveat: this testing is not definitive, nor was it done using particularly well-controlled or scientific methods. I do not guarantee the validity of anything presented in this document. However, I'd be very interested in any feedback from others as to its accuracy, based on their own testing or real-world observations. Any advice on a better approach to understanding the nuances of the results observed would also be welcome.