Downsampling in Tech Insights Monitoring Metric Trends and KPIs


What is downsampling? Downsampling is the process of reducing the sampling rate of a set of data. With downsampling, the end result (a KPI or trend in Tech Insights Monitoring, for example) does not display every underlying data point; instead, it plots one value for every N samples of the metric's dataset, where N is chosen to control how many data points are plotted in total. For example, a trend with Category = Calls and Metric = Call Streams viewed over a "Last 24 Hours" time window plots data points at 10-minute increments. The same metric viewed over a "Last 4 Hours" window plots data points at 1-minute increments. The underlying 1-minute data exists for any time window, but to help performance and keep the chart feasible, the "Last 24 Hours" trend is downsampled. The end result is that the metric for "Last 24 Hours" plots the aggregation (max, min, sum, average, or latest) of the underlying data in each 10-minute window.
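To make the bucketing concrete, here is a minimal Python sketch (illustrative only, not Vyopta's actual implementation) that collapses 1-minute samples into one plotted point per 10-minute window using a chosen aggregation:

```python
from collections import defaultdict
from statistics import mean

# Supported reducers; "latest" keeps the last sample seen in each bucket.
AGGREGATIONS = {
    "max": max,
    "min": min,
    "sum": sum,
    "average": mean,
    "latest": lambda values: values[-1],
}

def downsample(samples, bucket_minutes=10, aggregation="latest"):
    """Reduce time-ordered (minute, value) pairs to one point per bucket."""
    agg = AGGREGATIONS[aggregation]
    buckets = defaultdict(list)
    for minute, value in samples:
        buckets[minute // bucket_minutes].append(value)
    # One plotted point per bucket, timestamped at the start of the bucket.
    return [(bucket * bucket_minutes, agg(values))
            for bucket, values in sorted(buckets.items())]

# 24 hours of 1-minute samples (1,440 points) collapse to 144 plotted points.
raw = [(minute, minute % 7) for minute in range(24 * 60)]
print(len(downsample(raw, bucket_minutes=10, aggregation="average")))  # 144
```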


Why is it needed? To illustrate the need for downsampling, consider a "Last 30 Days" time window for a Tech Insights Monitoring trend. Without downsampling, displaying one data point per minute would mean plotting 30 days × 24 hours × 60 minutes = 43,200 data points, far more than the number of horizontal pixels available on a typical screen, so plotting every point individually simply is not possible. With large data volumes there is often a technical requirement to downsample results just to render the dataset at all. There are also serious application performance implications when working with datasets that large, and downsampling drastically improves performance.


When does Vyopta utilize downsampling? If the time window in Tech Insights Monitoring is greater than 4 hours, downsampling is in effect. That means downsampling applies to any selection of "Last 24 Hours" or longer in the "LAST HOURS/DAYS" section of the time picker, and to any selection in the "CUSTOM DATE/TIME" section of the picker where the end time is more than 4 hours after the start time.
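Expressed as a rule of thumb, the condition is simply "window longer than 4 hours." The hypothetical helper below (not an actual Vyopta API) captures it:

```python
from datetime import datetime, timedelta

def is_downsampled(start: datetime, end: datetime) -> bool:
    """Downsampling is in effect whenever the selected window exceeds 4 hours."""
    return end - start > timedelta(hours=4)

# Exactly 4 hours: no downsampling; 24 hours: downsampled.
print(is_downsampled(datetime(2024, 1, 1, 8, 0), datetime(2024, 1, 1, 12, 0)))  # False
print(is_downsampled(datetime(2024, 1, 1, 0, 0), datetime(2024, 1, 2, 0, 0)))   # True
```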


Best practices related to downsampling:

  • Using “Last 4 Hours” as the time window is the best way to get the largest window with 1-minute granularity on the x-axis and no downsampling in effect. 
    • To maintain a 4-hour window while navigating to different points in time in the past, simply hit the pause button while in “Last 4 Hours” and move back/forward in 4-hour increments with the rewind/fast-forward buttons. 
  • Use broad time windows (anything beyond 4 hours) to look for general themes and outliers in the data. Once a point of interest is found, home in on that point in time with a 4-hour window (or less) around it.
  • When using a broad time window (anything beyond 4 hours), consider using a different aggregation than Vyopta’s standard “latest” function so you can avoid potentially arbitrary “latest” values. 
    • In general, “Average” is a great aggregation to use for large time windows (see the sketch after this list). 
  • Please note that if an aggregation is not specified or available in a trend/KPI configuration, the Vyopta default is to use “latest”.
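As a quick illustration of why “Average” is usually a better choice than “latest” over broad windows (the values below are hypothetical), compare the two aggregations on a single 10-minute bucket:

```python
from statistics import mean

# Ten hypothetical 1-minute samples that make up one 10-minute bucket.
bucket = [52, 55, 61, 58, 57, 60, 63, 59, 56, 3]  # final sample happens to dip

print(round(mean(bucket), 1))  # 52.4 -> "Average" reflects the whole window
print(bucket[-1])              # 3    -> "latest" reports only the final sample
```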