Saturday, 15 February 2020

Client Offsets: The Six Inch Square Challenge


Designing WLANs is a pretty tricky business at the best of times, but designing a WLAN that works well for all clients is harder still: clients vary wildly in form factor, wireless chipset, antenna count and a plethora of other factors, which means they all see the wireless world in their own annoyingly unique way.

The upshot of this is that when designing a WLAN we have to spend time figuring out how well (or badly) each client type we need to support sees our proposed wireless network. In simple terms, some will "hear" it more clearly than others thanks to better antennas, build quality and receive sensitivity. A wise man once said that we should design our networks for our "most important, least capable" devices (...don't let him hug you, he WILL squish you). Once we've identified that device, we have to work out the offset between our client of choice and our survey equipment or RF modelling software.

To derive this "offset", which attempts to account for the signal level difference between our client and our preferred design method/tool, we usually take a few comparative signal measurements. This allows us to apply an offset, in dB, to our survey/design report so that coverage heat-maps will be more representative of how the world will look to our important client.
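As a quick sketch of the idea, the offset calculation might look like this in Python (all the readings below are invented for illustration):

```python
# Hypothetical paired readings in dBm: survey NIC vs. target client,
# taken at the same spots. Values are invented for illustration.
survey_rssi = [-52, -58, -61, -65, -70]
client_rssi = [-57, -62, -68, -70, -77]

def client_offset(survey, client):
    """Median per-spot difference (client - survey), in dB.

    The median is used rather than the mean so a single outlier
    reading doesn't skew the offset.
    """
    diffs = sorted(c - s for s, c in zip(survey, client))
    n = len(diffs)
    mid = n // 2
    return diffs[mid] if n % 2 else (diffs[mid - 1] + diffs[mid]) / 2

offset = client_offset(survey_rssi, client_rssi)
print(f"Apply a {offset:+.1f} dB offset to the survey heat-map")
```

With the made-up numbers above, the client hears the network about 5 dB worse than the survey NIC, so the heat-map would be shifted down by that amount.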

The snag is that, as Keith Parsons and others have noted, even the same model of client can vary due to manufacturing tolerances when you test reported signal levels under identical conditions. For example, if you get hold of ten Samsung S10s and test them all in the same way, the tolerances of their internal RF components mean the reported signal levels can vary by a few dB.

As you can see, getting this offset figure is not an exact science.

Six Inch Square Challenge


A colleague recently challenged me to undertake an exercise looking at how wireless clients see the world based on yet another aspect I hadn't considered too deeply before: spatial orientation and position. I was aware that things would vary a little depending on whether the device was oriented vertically or horizontally, but I had never looked into it in any real depth.

The exercise (which I've termed the "Six Inch Square Challenge" as people love a meme) investigates how the RSSI level reported by a client will vary with small positional and rotational changes (hint: it may be more surprising than you expect).

It starts by taking a six inch square of paper and fixing it to a surface (e.g. a table top). A test AP is then set up in line of sight, at some distance from the paper square (I'd recommend at least 5 metres). A wireless client is placed on one corner of the square and the test AP's RSSI measured using a wireless app on the client. The client is then rotated through 360 degrees in 45 degree increments, with the RSSI recorded at each point. It's best to allow a reasonable period (e.g. 30 secs) for the RSSI reading to settle at each point.

Once one corner of the square is completed for the 360 degree rotation, the client is moved to the other 3 corners of the square in turn and the whole measuring process repeated.

The entire measuring exercise is repeated 3 times to verify that the measurements are consistent. The picture below shows the rudimentary setup I used (a spice rack turntable and a butchered cardboard box with some packing tape). I designated the four corners of the paper square as north-west, north-east, south-west and south-east for obvious reasons.

[Photo: the test rig - a spice rack turntable and a cardboard box held together with packing tape]

My test client was a Samsung S7 running the Ubiquiti WiFiman app (as it just happened to be on my phone). As you can see in the picture, the phone is angled at around 45 degrees from horizontal.

At each test point, I found that the app needed around 30 seconds before it would settle on a relatively consistent RSSI level (give or take one dB). I found that it was best to keep at least 3 feet away from the client to avoid some variations that my proximity seemed to cause at times.
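For anyone repeating the exercise, the bookkeeping behind the test can be sketched as follows (the RSSI values are invented for illustration, not my actual measurements):

```python
import statistics

CORNERS = ["NW", "NE", "SW", "SE"]   # corners of the paper square
ANGLES = range(0, 360, 45)           # 45-degree rotation increments
REPEATS = 3                          # full runs of the whole exercise

# readings[(corner, angle)] -> one settled RSSI (dBm) per repeat run.
readings = {(c, a): [] for c in CORNERS for a in ANGLES}

def record(corner, angle, rssi_dbm):
    readings[(corner, angle)].append(rssi_dbm)

# Example: three repeat runs at the SE corner, 135 degrees (invented values).
for rssi in (-63.1, -62.7, -63.4):
    record("SE", 135, rssi)

mean_rssi = statistics.mean(readings[("SE", 135)])
spread = max(readings[("SE", 135)]) - min(readings[("SE", 135)])
print(f"SE/135: mean {mean_rssi:.1f} dBm, repeat spread {spread:.1f} dB")
```

Four corners times eight rotation angles gives 32 test points per run; comparing the spread across the three repeat runs at each point is a quick sanity check that the readings are settling properly.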

The results of the tests are shown below (and available for download from here):

[Table: RSSI readings by corner and rotation angle, vertical orientation]

At first glance, it's just a whole pile of numbers swimming before your eyes. But once you zoom in on a few specific areas, there are some interesting details in there:

  • I found the range of RSSI readings across a 360 degree rotation very surprising. In the south-east location there was a 7.3 dB variation just from rotating the phone through 360 degrees.
  • The variation in RSSI from turning through just 45 degrees was, in some instances, surprisingly high. For example, take a look at 135 to 180 degrees in the south-east position: a 6.7 dB difference.
  • The variation from a positional change of just a few inches, at corresponding rotational positions, was also very surprising. For instance, take a look at 315 degrees for the north-east and south-east positions: a difference of 4.3 dB across just six inches of table.
If you dig through the figures, I'm sure you can find your own insights.
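To make that digging concrete, here's a short Python sketch of the range and step-delta calculations behind the bullets above. The RSSI values are invented to mirror the kind of spread in my results; they are not the actual data:

```python
# Illustrative only: RSSI (dBm) by rotation angle at one corner.
se_readings = {0: -61.0, 45: -62.4, 90: -64.1, 135: -60.9,
               180: -67.6, 225: -65.0, 270: -63.3, 315: -62.2}

# Total variation over a full 360-degree rotation.
rotation_range = max(se_readings.values()) - min(se_readings.values())

# Change between each adjacent pair of 45-degree steps.
angles = sorted(se_readings)
step_deltas = {
    (a, b): abs(se_readings[b] - se_readings[a])
    for a, b in zip(angles, angles[1:])
}
worst_step = max(step_deltas, key=step_deltas.get)

print(f"Range over full rotation: {rotation_range:.1f} dB")
print(f"Largest 45-degree jump: {worst_step} = {step_deltas[worst_step]:.1f} dB")
```

The same two calculations, run per corner, surface the kind of outliers picked out in the bullet points.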

I also repeated the same test run with the phone in a horizontal position and received a completely different set of results:

[Table: RSSI readings by corner and rotation angle, horizontal orientation]
I'm sure you can take a look for yourself and pick out some interesting nuggets. It was quite an eye-opener for me (e.g. a 10 dB variation in the south-east position!)

Conclusion

 

As with everything else in Wi-Fi, deriving hard and fast rules from the data shown by these tests is challenging.

For me, the key takeaway is to be more aware of the variations, and the causes of those variations, that even small changes in measuring technique can introduce when trying to determine survey/design offsets. Repeating measurements seems prudent to ensure you haven't hit a location or orientation with a particularly unusual, non-"typical" RSSI reading; moving just a few inches, or varying the orientation by a few degrees, may change things.

It could be argued that you should use the lowest achieved RSSI measurement to cater for the worst case scenario, but this can turn into a downward spiral of "what-ifs" once you start to consider things like hands around handsets, bodies between the phone and the AP...the list goes on.

The best you can do with this information is be aware that small positional variations can really skew things if you're not alert to them. I'd advise being sensible and pragmatic: make sure a few consistent readings can be achieved in your measurement location when trying to determine a useful offset.



Friday, 7 February 2020

Wiperf: A wireless client performance probe mode on the WLAN Pi

I've had a number of occasions when it would have been really useful to deploy a wireless client device on a WLAN to monitor performance over time from a client perspective. Too often, when troubleshooting a wireless network, everything looks fine from the data provided by your infrastructure kit, but the user experience is a whole different story. Unfortunately, when this requirement has arisen, persuading anyone in the organization in which I was working to invest in a wireless probe-type solution has been an uphill battle.

Around 18 months ago, while working on an issue I could not progress, I had no choice but to roll my own rudimentary client probe solution. It was a Python script installed on a Raspberry Pi acting as a client on a particular SSID that was having issues. It reported wireless connectivity data and a few client tests (e.g. speedtest & ping) to a Google spreadsheet. The same code also ran on a WLAN Pi, so that I could have two probes running! The data they provided was invaluable in giving me an insight into what was going on over time on the network that was suffering issues.

After that particular “gig” was complete, I moth-balled my probes, but was hugely impressed with the insights they had provided.

Around 4 months ago, my friend Kristian Roberts dropped me a note on Slack asking about using the WLAN Pi as a client probe to gather connectivity data and forward it in to Splunk for reporting. In his day-job, he already used Splunk to do some pretty fancy reporting based on various inputs from his wireless network kit.

I pulled out my old Python scripts and started looking at how I could quickly modify them to send data in to Splunk. Unfortunately, when you look at code you wrote a few months earlier, you start to see how you could have written it better and come up with new ideas.

Anyhow, here we are, four months later, and I’ve got to the point of having a package that I’m relatively happy to share. There have been many evenings of testing, tweaking and brainstorming, but we finally have the newly named “wiperf” package ready to go.

Wiperf Overview 


In brief, wiperf is a series of Python modules, included in the WLAN Pi image, that flip the WLAN Pi into a wireless client mode, join an SSID and run a series of configured network tests. The tests include Ookla Speedtest, iperf3, ping, DNS, DHCP and HTTP. Once the tests have run, the results are sent to a Splunk server, which stores the data and allows the creation of some very nice historical reports (see the example below).
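To give a flavour of what a client probe does under the hood, here is a minimal sketch of turning a ping summary into a results record. This is illustrative only; wiperf's actual implementation differs:

```python
import re

def parse_ping_summary(output):
    """Pull avg RTT (ms) and packet loss (%) out of Linux `ping` output.

    A sketch of the kind of parsing a client probe does before
    forwarding results to a reporting back-end.
    """
    loss = re.search(r"(\d+(?:\.\d+)?)% packet loss", output)
    rtt = re.search(r"= [\d.]+/([\d.]+)/", output)  # min/avg/max/mdev
    return {
        "packet_loss_pct": float(loss.group(1)) if loss else None,
        "avg_rtt_ms": float(rtt.group(1)) if rtt else None,
    }

# Canned sample of a ping summary, so the sketch runs without a network.
sample = (
    "4 packets transmitted, 4 received, 0% packet loss, time 3004ms\n"
    "rtt min/avg/max/mdev = 10.1/12.3/15.8/2.1 ms\n"
)
print(parse_ping_summary(sample))
```

Each configured test (speedtest, iperf3, DNS and so on) produces a small record like this, which is then shipped off to the reporting server.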

[Screenshot: example Splunk historical report dashboard]
Setting up the WLAN Pi to run in wiperf mode is very easy. There are just a couple of configuration files to edit for the wireless connection details and the test configurations. Then, it’s just a case of flipping the WLAN Pi into wiperf mode via the front panel buttons. To find out the full details of setting up the WLAN Pi, start here.
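As an illustration of the kind of edit involved, the wireless connection details follow the familiar wpa_supplicant format. The SSID and passphrase below are placeholders, and the exact file names and layout on the WLAN Pi may differ, so treat this purely as a sketch:

```
# Example wireless connection block (placeholder values)
network={
    ssid="MyTestSSID"
    psk="MyPassphrase"
    key_mgmt=WPA-PSK
}
```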

Building a Splunk server platform is not as bad as you might expect. You can install it on Windows, Mac or Linux, and it’s pretty much the usual download-the-executable, next...next...next install wizard you’d expect of many applications. Although Splunk is generally a paid-for product, we can run on the free tier as our data volumes are so low.
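For the curious, data lands in Splunk via its HTTP Event Collector (HEC): a JSON envelope POSTed to the collector endpoint with a token in the Authorization header. The envelope shape below (time/sourcetype/event) is standard HEC; the field names inside the event are invented for illustration and are not wiperf's actual schema:

```python
import json
import time

def build_hec_payload(results, sourcetype="wiperf_probe"):
    """Shape a test-result dict into a Splunk HEC event envelope."""
    return json.dumps({
        "time": time.time(),          # epoch timestamp of the test run
        "sourcetype": sourcetype,     # lets Splunk route/parse the event
        "event": results,             # the actual test results
    })

payload = build_hec_payload({"test": "speedtest", "download_mbps": 74.2})
# Sending would be an HTTP POST to
#   https://<splunk-host>:8088/services/collector/event
# with header "Authorization: Splunk <HEC token>".
print(payload)
```

Keeping the payload construction separate from the HTTP send makes it easy to test, and to swap in a different reporting back-end later.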

Once Splunk is installed, a few reporting dashboards need to be configured in Splunk and then you’re good to go. Thanks to the fantastic help I got from Kristian, there are a series of pre-canned report files we’ve built that can easily be used to create your dashboards.

You can find the full details of how to set up Splunk and create your reporting dashboards in the following document I created: Splunk Install and Config Guide.


Conclusion 


Wiperf is a great tactical tool for setting up a wireless client probe when you’re looking at issues on a network or maybe want to do a little performance trending.

It is a little clunky to set up compared to the full-blown paid-for options out there, but it’s a useful tool to have in your bag. It’s not a scalable performance monitoring solution, but I think it has quite a lot of value in demonstrating the additional client-level insight that a commercial offering provides. This will hopefully encourage decision makers to invest in a more scalable, fit-for-purpose commercial solution that will no doubt provide an excellent return on investment.

I hope you have fun tinkering with wiperf and it provides some useful insights for you. Please take the time to consult the various documents I have created below to help you on your journey with exploring wiperf.

References