PL graph

Posted by: DougHo

PL graph - 02/01/01 09:30 PM

I have my PL% for graph set to 80% (but I'm not sure that matters for this report). I had one period of 100% loss. When I set the graph to any period up to and including 24 hours, the red line extends fully to the top (80%).

However, when I change the period to 48 hours, the red line only extends halfway to the top.

I assume that is a bug - let me know if you need the saved file.

If it matters, I'm on Win2K at 1280x1024 x lots of colors.
Posted by: Pete Ness

Re: PL graph - 02/02/01 12:14 AM

Actually, changing the scale of a graph changes the number of samples included in any one pixel. Because the packet loss is a percentage, if the "pixels" surrounding your 100% packet loss have no packet loss (or less than 100% packet loss), then when you average these in (by doubling the number of samples in any one pixel), the packet loss percentage may change. This is really evident as you move from a small scale (say, 30 minutes) up to a large scale (24 or 48 hours) - you'll see the packet loss pixels changing height because of this averaging.
Posted by: DougHo

Re: PL graph - 02/02/01 12:38 AM

I realize that the width changes with each different period length. But the height stays constant (and correct) from 60 seconds to 24 hours. The height error is at 48 hours (not tall enough).

If your design really is a "halfway" height at 48 hours (why?), then you should double the right-hand scale label (for example, in my case it would change from 80% to 160%).
Posted by: Pete Ness

Re: PL graph - 02/02/01 12:46 AM

Some example graphs may help (shoot 'em off to me in e-mail and I'll include them in my reply if you like).

Let's say that at the 24-hour scale, 20 samples are included in every "pixel" width, and that you have 20 samples in a row that were lost. As you increase your time scale from 1 hour to 24 hours, the number of samples in any pixel increases - from roughly 1 at 1 hour to 20 at 24 hours. Throughout this scaling, 100% of the packets in a pixel width are lost, so packet loss shows at full height.

Now, let's pop the scale up one more - to 48 hours. At 48 hours, there are 40 samples in any pixel. Because we only lost 20 samples, the other 20 were successful - so only 50% of the packets sent out in that pixel's time period were lost. We never drop below 100% loss until that point, because up until then we'd always lost at least as many samples as were averaged into a single pixel width.

I may be misunderstanding your question - and some pictures might help me a lot. Feel free to correct my misunderstanding in that case.
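The averaging Pete describes can be sketched in a few lines of Python. This is an illustration of the general technique, not PingPlotter's actual code; the function name and sample layout are made up for the example:

```python
# Each on-screen "pixel" averages the packet-loss samples that fall in
# its bin, so a burst of lost packets dilutes once the bin outgrows it.

def pixel_loss(samples, per_pixel):
    """Average packet loss (0.0-1.0) for each pixel-width bin of samples."""
    return [
        sum(samples[i:i + per_pixel]) / per_pixel
        for i in range(0, len(samples), per_pixel)
    ]

# 80 samples: one burst of 20 consecutive lost packets (1 = lost, 0 = ok).
samples = [0] * 20 + [1] * 20 + [0] * 40

# "24-hour" scale: 20 samples per pixel - the burst fills a pixel, 100% loss.
print(max(pixel_loss(samples, 20)))   # 1.0

# "48-hour" scale: 40 samples per pixel - the burst averages down to 50%.
print(max(pixel_loss(samples, 40)))   # 0.5
```

The tallest red pixel drops from full height to half height the moment the bin becomes wider than the loss burst, which matches what Doug sees going from 24 to 48 hours.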
Posted by: DougHo

Re: PL graph - 02/03/01 05:31 PM

It sounds like you understand it. I guess the point I'm trying to argue is that it seems important to know about any time epoch that had 100% packet loss. When you plot your red line using multiple epochs and it "averages down", I miss that critical info.
Posted by: Anonymous

Re: PL graph - 03/24/02 09:46 AM

Suggestion:

Pete, I suggest that the timeline graph should NOT be averaged when changing the period. The histogram should represent maximum values to make it easier to troubleshoot and find the hotspots. An option to apply an average (in the form of a line, as you implemented on the upper chart) would be groovy. I, too, really want the "scale" of the packet loss to hold steady at the maximum value within any given sample.

Why doesn't the scale of the latency change with the packet loss? They seem to behave differently. Thanks.
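The max-based aggregation suggested here is a small change to the same sketch (again illustrative, not PingPlotter's code): each pixel keeps the worst loss in its bin instead of the average, so a 100%-loss epoch stays at full height at any time scale.

```python
# Max-based downsampling: each pixel shows the worst (maximum) packet
# loss among its samples, so a total outage never "averages down".

def pixel_loss_max(samples, per_pixel):
    """Worst-case packet loss (0.0-1.0) for each pixel-width bin."""
    return [
        max(samples[i:i + per_pixel])
        for i in range(0, len(samples), per_pixel)
    ]

# Same 80 samples with a 20-sample burst of 100% loss.
samples = [0] * 20 + [1] * 20 + [0] * 40

# Even with 40 samples per pixel, the burst still shows at full height.
print(max(pixel_loss_max(samples, 40)))   # 1
```

The trade-off is that a single lost packet also paints its whole pixel at full loss, which is why showing the average as a separate line alongside the max, as suggested above, is a reasonable compromise.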