How We Rate and Review Products
March 19, 2003
For the New Year, and because of the ever-increasing importance of wireless LANs for homes, small businesses, and the enterprise, 802.11 Planet is embarking on a new method of doing reviews.
Contributing editor Joe Moran is taking on the mantle of reviews guru. He'll be putting 802.11a, b, and g products through their paces in his home and office, using tools like Chariot from NetIQ or AiroPeek NX from WildPackets as appropriate to get quantifiable results. We'll show you the results of our tests and how they compare from product to product. Plus, we'll provide an ever-growing features table of products as we add new reviews in each category.
Each product we review will receive a numerical rating on a five-point scale, where five is excellent and one is a product so poor we can't find any redeeming qualities. To come up with this final rating, Joe will be evaluating products based on the following criteria:
Initial installation and setup
How easy or difficult is it to deploy the product?
Ease of use/configuration; quality of included software
How easily can the product be configured and re-configured? Does the interface report correct and useful information? How easy are firmware and other upgrades? Here we will also evaluate intangibles, such as whether displays are clear, cords are long enough, and so on.
Documentation; quality and quantity
Is documentation provided in written and/or electronic format? Are quick-start materials provided? Is all documentation comprehensive and clearly written?
Interoperability
We do basic interoperability testing, such as whether products from two vendors work together and, if so, whether the performance is acceptable. If the product is already certified for Wi-Fi interoperability by the Wi-Fi Alliance, we'll tell you.
Performance
We place the WLAN access point or router inside an office at the end of a hallway that's approximately 125 feet long. At the far end of this hallway, a right-angle corridor about 25 feet long leads to another room, yielding approximately 150 feet of total distance from the WLAN device.
Performance testing is conducted using v4.3 of the Chariot network performance analyzer from NetIQ. Three separate test scripts, which measure TCP throughput, network response time, and UDP streaming, are run individually; a 500 kbps stream is used for the last test. The tests are initiated via the WLAN side of the network on an IBM ThinkPad notebook running Windows 2000 Professional.
We start by testing performance with the notebook in the same room as the access point, the two devices about 10 feet apart. We then repeat the tests at a 25-foot distance, and at 25-foot increments thereafter, up to the 150-foot maximum.
We also repeat all of the 10-foot tests with WEP turned on to determine whether enabling the encryption feature incurs any performance penalty.
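To make the test plan above concrete, here is a minimal sketch that enumerates every run our methodology calls for. Chariot is a commercial GUI tool, so this does not use its API; the script names are our own shorthand, and the code simply builds the matrix of distance, test script, and WEP setting.

```python
# Illustrative sketch of the review test matrix; script names are shorthand,
# not Chariot identifiers.

SCRIPTS = ["tcp_throughput", "response_time", "udp_streaming_500kbps"]

def build_test_matrix():
    """Return the list of (distance_ft, script, wep_enabled) runs."""
    # Same-room test at 10 feet, then 25-foot increments out to 150 feet.
    distances = [10] + list(range(25, 151, 25))
    runs = [(d, s, False) for d in distances for s in SCRIPTS]
    # Repeat the 10-foot tests with WEP on to gauge the encryption penalty.
    runs += [(10, s, True) for s in SCRIPTS]
    return runs

matrix = build_test_matrix()
print(len(matrix))  # 7 distances x 3 scripts, plus 3 WEP-enabled runs = 24
```

In practice each of these runs maps to one Chariot script execution, and the WEP-on results are compared against the WEP-off results at the same 10-foot distance.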
Value
How does the product compare with similar products in the same price category, especially considering the features provided?
We hope you'll find the new review methodology useful, and as always, we welcome your feedback.