Improvements to Vyopta’s “Quality” Metrics

As of January 22nd, 2020, we have released significant improvements to our quality metrics. The metrics that we report for quality have been updated to be more precise, to encompass more data points, and ultimately to give you more confidence in taking action on your data. Here’s what has changed:

  • Audio, Video, and Presentation channels now have individualized quality thresholds. This eliminates some “false positive” quality reporting previously caused by issues on the Presentation channel: minor quality issues on that channel do not generally degrade the user experience, but they were being reported at the same level as Audio and Video channel issues.
  • Latency is now factored into the quality calculation when the source provides it; when latency is not available in the incoming source data, it is not factored in.
  • All quality thresholds (Good / Fair / Bad) have been updated to more accurately reflect true user experiences.
  • We have improved the reporting of Zoom quality metrics. Previously, Zoom quality issues were under-reported in CPM Analytics but over-reported in CPM Monitoring. The new metrics for Zoom more accurately reflect the user quality experience.
  • Metrics use the following Good / Fair / Bad thresholds (a sketch of how per-channel thresholds might be applied follows the table):

[Table: Good / Fair / Bad thresholds by metric]

CPM Analytics and Monitoring are reporting results using the new metrics effective Jan 22, 2020.
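
To make the per-channel idea concrete, here is a minimal Python sketch of how individualized channel thresholds and optional latency could combine into a single Good / Fair / Bad rating. The channel names, threshold values, and helper functions below are illustrative placeholders only; the actual thresholds Vyopta uses are the ones shown in the table above.

```python
# Minimal illustrative sketch only. The channel names, jitter/latency threshold
# values, and helper functions are placeholders, NOT Vyopta's actual thresholds
# (those are shown in the table above).
from typing import Optional

# Hypothetical per-channel jitter thresholds in ms: (fair_limit, bad_limit).
# The Presentation channel tolerates more jitter before it counts against quality.
JITTER_THRESHOLDS_MS = {
    "audio":        (30, 60),
    "video":        (40, 80),
    "presentation": (100, 200),
}

LATENCY_THRESHOLDS_MS = (150, 300)  # hypothetical (fair_limit, bad_limit)

RANK = {"Good": 0, "Fair": 1, "Bad": 2}


def rate(value: float, fair_limit: float, bad_limit: float) -> str:
    """Map a measured value onto Good / Fair / Bad."""
    if value >= bad_limit:
        return "Bad"
    if value >= fair_limit:
        return "Fair"
    return "Good"


def channel_quality(channel: str, jitter_ms: float,
                    latency_ms: Optional[float] = None) -> str:
    """Rate one channel; latency is only factored in when the source provides it."""
    ratings = [rate(jitter_ms, *JITTER_THRESHOLDS_MS[channel])]
    if latency_ms is not None:          # latency available in the incoming data
        ratings.append(rate(latency_ms, *LATENCY_THRESHOLDS_MS))
    return max(ratings, key=RANK.get)   # overall rating is the worst factored metric


# The same jitter that would rate "Bad" on the Audio channel can rate "Good"
# on the Presentation channel under its own, looser threshold.
print(channel_quality("audio", jitter_ms=90))                    # Bad
print(channel_quality("presentation", jitter_ms=90))             # Good
print(channel_quality("video", jitter_ms=50, latency_ms=200))    # Fair
```

This is why looser Presentation thresholds remove “false positives”: the same measurement is no longer judged against the stricter Audio/Video limits.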

 

What you will see at the transition

  • All customers will experience differing levels of quality reporting changes given the new definitions. The numbers and percentages of calls, meetings, and participant experiences reported at each threshold are likely to change. If you have alerts based on quality, the frequency of those alerts will also change.
    • The magnitude of the change is very specific to your configuration and the levels of quality that currently exist across your infrastructure and endpoints. 
  • As a point of general guidance, comparing the same calls before and after the changes, you should see fewer Bad calls, meetings, and participants in your reporting and alerting (see the worked example after this list).
    • Bad calls may be lower by 25%-40% (median 33%)
    • Bad meetings may be lower by 1%-15% (median 7%)
    • Bad participants may be lower by 1%-20% (median 7%)
  • This is primarily because the prior rules reported an unnecessarily high number of Bad calls due to jitter on the Presentation channel that did not actually reflect a truly Bad experience.
    • Exceptions include customers with Zoom in their configuration. In those cases, the decrease in Bad calls may be smaller, or the percentage of Bad calls reported might even be higher, because Zoom quality issues are now reported more accurately (and therefore at a higher rate).
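
As a rough worked example of what those ranges could mean for your own totals, the short Python sketch below applies the guidance percentages to some invented “before” counts; substitute the figures from your own reports.

```python
# Rough estimate of how earlier "Bad" counts might look under the new metrics,
# using the guidance ranges above. The previous_counts figures are invented;
# substitute your own report totals. Zoom-heavy deployments may fall outside
# these ranges (see the note above).
EXPECTED_DECREASE = {            # (low, median, high) fractional decrease
    "Bad calls":        (0.25, 0.33, 0.40),
    "Bad meetings":     (0.01, 0.07, 0.15),
    "Bad participants": (0.01, 0.07, 0.20),
}

previous_counts = {"Bad calls": 300, "Bad meetings": 40, "Bad participants": 120}

for metric, old in previous_counts.items():
    low, median, high = EXPECTED_DECREASE[metric]
    print(f"{metric}: {old} before -> roughly "
          f"{round(old * (1 - high))}-{round(old * (1 - low))} after "
          f"(median estimate {round(old * (1 - median))})")
```

For instance, 300 Bad calls before the change would map to roughly 180-225 afterward, with a median estimate of about 201.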

What you should do next:

  1. Run any regular quality reports and compare them to prior reports to understand the changes for your unique configuration.
  2. Review any quality alerting based on your report analysis and adjust alerting levels as needed (see the sketch below for one way to estimate the adjustment).
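
If your alerts fire on counts of Bad calls, one simple and purely illustrative way to pick a starting point for a new threshold is to scale the old one by the change you observe between a report run before the transition and a comparable report run afterward. All of the numbers in this sketch are hypothetical.

```python
# Rebase a count-based quality alert after the change by scaling the old alert
# threshold by the ratio of new to old Bad counts over comparable periods.
# All numbers here are hypothetical.
old_bad_calls_per_week = 300      # from a report run before Jan 22, 2020
new_bad_calls_per_week = 200      # a comparable period under the new metrics
old_alert_threshold = 50          # alert fired when Bad calls exceeded this

scale = new_bad_calls_per_week / old_bad_calls_per_week
suggested_threshold = round(old_alert_threshold * scale)
print(f"Suggested new alert threshold: {suggested_threshold}")   # -> 33
```

Treat the result as a starting point only; validate it against a few weeks of reporting under the new metrics before settling on final alert levels.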

We understand that changes to the reporting of core metrics such as quality are challenging, especially when that reporting has been shared and made actionable across our customers’ organizations. At the same time, we are committed to advancing Unified Communication quality reporting across an industry with no common standards, where each vendor can report quality data in unique ways that may or may not reflect your needs in delivering a reliable end-user UC experience.

 
