Yesterday’s metrics fall short for today’s smart homes
Quality of Service (QoS) measures have been used for many years to evaluate the performance of connections to devices in the home. However, given the growing diversity of IoT devices, QoS now answers the wrong question. Service Providers need to be less concerned with the raw performance of the broadband connection, and more concerned with the happiness of their customers. After all, it is customer satisfaction that drives new subscriptions, deepens penetration, and prevents churn.
Quality of Experience (QoE) is a new metric developed using a wide range of data from Plume’s Cloud that does a much better job of expressing “device happiness” and therefore customer happiness. QoE has a number of fundamental differences compared to traditional QoS:
- QoE considers the entire path from the internet to the device in the home, particularly paths that include one or more hops of Wi-Fi. These days most devices are connected by Wi-Fi, and Wi-Fi is often the biggest problem area on the path.
- QoE considers many more network performance factors, including data rates, signal strengths, packet error rates, interference, channel utilization, broadband throughput, and the topology of the network in question (number of hops, channel sharing, etc.).
- QoE considers the needs of each device in the home, factoring in the type of device, its networking capabilities, its current and historical data usage, and the requirements of the specific device in its specific environment.
The complexity brought on by the realization of Smart Home 2.0 requires a metric that considers the needs of each device. Consider a home with a Wi-Fi-connected thermostat and a Wi-Fi-connected video streaming device. Perhaps the thermostat is at the far end of the house, and can only achieve 5 Mb/s throughput. On the other hand, the video streaming device might be in a location where it can achieve 15 Mb/s. Traditional QoS methods would indicate that the thermostat has poor QoS, but the video streaming device has good QoS. But true customer happiness is likely reversed. The thermostat would be perfectly happy with even just 1 Mb/s throughput, but the video streaming device can’t even support 4K TV reliably with the throughput it gets. By considering the needs of each individual device, QoE gives the more informed answer: the thermostat is fine, but the video device requires attention.
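The thermostat-versus-streamer comparison above can be sketched in a few lines of code. This is a minimal illustration, not Plume's actual scoring model: the per-device throughput needs and the single QoS cutoff are assumed values chosen to mirror the example (25 Mb/s is a commonly cited figure for reliable 4K streaming).

```python
# Hypothetical sketch: a need-aware "device happiness" check versus a
# device-agnostic QoS threshold. All numbers are illustrative assumptions.

DEVICE_NEEDS_MBPS = {
    "thermostat": 1.0,     # assumed: a thermostat is happy with ~1 Mb/s
    "4k_streamer": 25.0,   # assumed: reliable 4K streaming needs ~25 Mb/s
}

QOS_THRESHOLD_MBPS = 10.0  # assumed: one cutoff applied to every device

def qos_ok(throughput_mbps: float) -> bool:
    """Traditional QoS: the same threshold for every device."""
    return throughput_mbps >= QOS_THRESHOLD_MBPS

def qoe_ok(device_type: str, throughput_mbps: float) -> bool:
    """QoE-style check: compare what the link delivers to what this device needs."""
    return throughput_mbps >= DEVICE_NEEDS_MBPS[device_type]

# The thermostat at 5 Mb/s and the streamer at 15 Mb/s from the example above:
print(qos_ok(5.0), qoe_ok("thermostat", 5.0))     # QoS flags it; QoE says it's fine
print(qos_ok(15.0), qoe_ok("4k_streamer", 15.0))  # QoS passes it; QoE says it's struggling
```

The two checks disagree on both devices, which is exactly the reversal the paragraph above describes.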
Example use cases for QoE illustrate why a more accurate measure is so critical:
- Guiding decisions on network configuration: Frequency channels and multi-hop paths should be chosen with the intent to improve performance of unhappy devices; additional resources do not need to be allocated to happy devices that happen to have modest connections.
- Identifying the need for Wi-Fi extenders: Save your extender expenses for homes in which devices are truly struggling, not homes with adequate connections to devices with low needs.
- Finding customers experiencing difficulties: Customers complain and churn when devices work poorly. Whether a device works well depends equally on what the device needs and what the network provides.
- Enabling support personnel to identify real problems: Generic, single-dimension measures of performance fail to catch many devices that are struggling.
How different is QoE from traditional approaches? Traditional approaches search for devices with weak Wi-Fi signals (e.g., below −75 dBm) or high levels of interference (e.g., above 50%). The following anonymized data from a sample of U.S. households managed by the Plume Cloud quantifies the difference for two exemplary cases:
Video streaming devices
- 10% of video-streaming devices were truly unhappy (had poor QoE)
- 3.7% had a weak signal and 2.6% suffered high interference by traditional methods, for a total of 6.3% of video-streaming devices with poor QoS by traditional methods
- But only about half of the video-streaming devices identified by traditional methods were actually unhappy (3.4% had a poor QoE rating)
- 46% of the video-streaming devices identified as poor QoS by traditional methods were actually fine
- 66% of the video-streaming devices that were truly unhappy were not detected by traditional methods
IoT Devices (including sensors, doorbells, security cameras, thermostats, smoke alarms, light bulbs, etc.)
- Only 1.3% of IoT Devices were truly unhappy (had poor QoE)
- 6.8% had a weak signal and 2.5% had high interference by traditional methods, for a total of 9.3% of devices with poor QoS by traditional methods
- But only about one in thirty of the IoT Devices identified by traditional methods were actually unhappy (0.3% of all IoT Devices)
- For IoT Devices, QoS is incredibly inaccurate: it misses 77% of the IoT Devices that are truly unhappy, while flagging 9.3% of all IoT Devices as unhappy when nearly all of those are fine
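The headline numbers in the two lists above follow directly from the quoted percentages, if we treat the traditional QoS flag as a classifier for "truly unhappy." A quick back-of-envelope check (the precision/recall framing is ours, not from the text):

```python
# Sanity-check the quoted percentages by treating the traditional QoS flag
# as a classifier for poor QoE. Inputs are the figures quoted above.

def classifier_stats(pct_flagged, pct_truly_unhappy, pct_flagged_and_unhappy):
    precision = pct_flagged_and_unhappy / pct_flagged     # flagged devices that are truly unhappy
    recall = pct_flagged_and_unhappy / pct_truly_unhappy  # truly unhappy devices that get flagged
    return precision, recall

# Video-streaming devices: 6.3% flagged, 10% truly unhappy, 3.4% both.
p, r = classifier_stats(6.3, 10.0, 3.4)
print(f"video: {1 - p:.0%} of flagged devices are fine, {1 - r:.0%} of unhappy devices are missed")
# → reproduces the 46% false alarms and 66% misses quoted above

# IoT devices: 9.3% flagged, 1.3% truly unhappy, 0.3% both.
p, r = classifier_stats(9.3, 1.3, 0.3)
print(f"iot:   {1 - p:.0%} of flagged devices are fine, {1 - r:.0%} of unhappy devices are missed")
# → reproduces the ~77% of unhappy IoT devices that QoS misses
```

The same three inputs per device class recover every derived figure in the lists, which is a useful way to see that the "too many false alarms" and "too many misses" problems are two sides of the same device-agnostic threshold.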
What’s the explanation for these discrepancies?
The causes become clear when you look at the device types. Traditional methods identify too few unhappy video streaming devices because a QoS metric considers neither the high needs of video devices nor the other factors affecting their performance. Traditional methods flag too many IoT devices because they don’t recognize that most IoT devices have modest needs.
How can Plume help?
The benefits of QoE are obvious. It’s less obvious how to obtain the information necessary to make efficient decisions. Plume is in a unique position to do deep analysis on a large amount of data to understand users’ experiences: with tens of millions of homes under management, all connected to cloud-based data storage and analysis. We use novel methods to identify device types accurately and in detail, and we employ sophisticated machine learning algorithms to correlate device and network behaviors with customer complaints. Wrapping this into a QoE scoring methodology, Plume enables Internet Service Providers to understand their customers, and focus their resources and attention in the right places.