Suggestion:<br><br>Pete, I suggest that the timeline graph should NOT be averaged when changing the period. The histogram should represent maximum values to make it easier to troubleshoot and find the hotspots. An option to apply an average (in the form of a line, as you implemented on the upper chart) would be groovy. I, too, really want the "scale" of the packet loss to stay steady at the maximum value within any given sample.<br><br>Why doesn't the scale of the latency change with the packet loss? It seems like they behave differently. Thanks.<br><br>
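To make the request concrete, here is a rough sketch (in Python, with purely hypothetical names, not your implementation) of what I mean by max-based buckets with an optional averaged overlay line:<br><br>
<pre>
# Sketch of the requested behavior: when the period widens, each histogram
# bucket shows the MAX of its raw samples (so hotspots stay visible), while a
# separate averaged series is kept available for an optional overlay line.
# Function and variable names are hypothetical, for illustration only.

def downsample(samples, bucket_size):
    """Collapse raw samples into buckets, keeping both max and mean per bucket."""
    buckets = []
    for i in range(0, len(samples), bucket_size):
        chunk = samples[i:i + bucket_size]
        buckets.append({
            "max": max(chunk),               # bar height: worst value in the bucket
            "avg": sum(chunk) / len(chunk),  # optional overlay line
        })
    return buckets

# Example: 1-second packet-loss samples collapsed into 5-second buckets.
loss_samples = [0, 0, 0, 40, 0, 0, 0, 0, 0, 0]
print(downsample(loss_samples, 5))
# The 40% spike stays visible as bucket 0's max, even though its average is only 8%.
</pre>
<br><br>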