At Vyopta, we understand how important it is to have a good user experience when using collaboration tools. That's why we introduced our new User Experience Score. This score is designed to provide you with a better understanding of the quality of your user experience, and to help you take action on your data.
The User Experience Score is calculated by a proprietary algorithm that takes into account many factors, such as average call quality metrics, average quality of meetings, number of rejoins, meeting failures, and weighted volume of issues. This score is dynamic, so it will evolve as collaboration infrastructure evolves.
The Experience Score is available in the users dataset at an individual level, but is more valuable when viewed at scale. You can create a dashboard panel that visualizes the average Experience Score over time, broken down by user department or any other factor. You can also create a panel that tracks the Experience Score of VIP users.
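To illustrate the kind of aggregation a dashboard panel might perform, here is a minimal sketch that averages user-level Experience Scores per department. The record layout and field names are assumptions for demonstration, not Vyopta's actual schema or API.

```python
from collections import defaultdict

# Hypothetical user records; field names are illustrative only.
records = [
    {"user": "alice", "department": "Sales", "experience_score": 87},
    {"user": "bob", "department": "Sales", "experience_score": 72},
    {"user": "carol", "department": "Engineering", "experience_score": 91},
]

def average_score_by_department(records):
    """Group user-level Experience Scores and average them per department."""
    by_dept = defaultdict(list)
    for r in records:
        by_dept[r["department"]].append(r["experience_score"])
    return {dept: sum(scores) / len(scores) for dept, scores in by_dept.items()}

print(average_score_by_department(records))
```

The same grouping could be keyed on any other attribute, such as location or VIP status, to drive the panels described above.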
While the inputs and algorithm can change over time, here is a list of the factors currently considered:
- Average call quality metrics – When an individual joins a meeting or call, do they consistently have poor quality connections? Factors like packet loss, latency, jitter, etc., are considered in aggregate for each user.
- Average quality of meetings the user joined – This differs slightly from the above. For example, if a user has high quality connections but other people on the call have bandwidth issues that result in choppy audio, that will negatively impact the user’s experience even if the problem isn’t on their end.
- Number of rejoins – Some problems result in rejoining a meeting – “I can’t hear you, I’m going to drop and rejoin” – and those are often not captured in quality metrics. Note that the Experience Score factors in circumstances where rejoins may be expected, such as for breaks in very long meetings.
- Meeting failures – Very similar to rejoins, except that the whole meeting restarts instead of one person rejoining. For example, in a one-on-one meeting: “I can’t hear you, let’s drop and I’ll call you right back”.
- Weighted volume of issues – The quantity of issues is important for power users, as the impact on time and productivity scales with volume even if average quality isn’t remarkable. Additionally, issues are weighted based on the impact they have on experience. For example, if a user can’t hear the speaker due to a poor quality audio stream, that is given more weight than a presentation stream with a low frame rate.
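The factors above can be pictured as a weighted combination of normalized inputs. Vyopta's actual algorithm is proprietary, so the weights, factor names, and normalization below are purely illustrative assumptions:

```python
# Illustrative only: the real algorithm, weights, and factor names are
# proprietary. Each factor here is assumed normalized to 0..1, higher = better.
FACTOR_WEIGHTS = {
    "avg_call_quality": 0.30,     # packet loss, latency, jitter in aggregate
    "avg_meeting_quality": 0.25,  # quality of meetings the user joined
    "rejoin_penalty": 0.15,       # unexpected rejoins
    "failure_penalty": 0.15,      # whole-meeting restarts
    "issue_volume": 0.15,         # weighted count of issues
}

def experience_score(factors):
    """Combine normalized factor values into a single 0..100 score."""
    raw = sum(FACTOR_WEIGHTS[name] * value for name, value in factors.items())
    return round(raw * 100, 1)

score = experience_score({
    "avg_call_quality": 0.9,
    "avg_meeting_quality": 0.8,
    "rejoin_penalty": 1.0,   # no unexpected rejoins
    "failure_penalty": 1.0,  # no meeting failures
    "issue_volume": 0.7,
})
print(score)
```

A weighted-sum model like this makes it easy to see why a high volume of low-impact issues can still drag a power user's score down, even when their average call quality looks fine.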