My apologies if this has already been addressed elsewhere.

With the graph's resolution/display period set to any value greater than 60 minutes, the auto-scaled Y value (latency) is always some inaccurate, averaged-looking value rather than the true maximum of the displayed time frame. Aside from the undesirable, arbitrary discontinuity from the accurately scaled Y maximum shown at periods of 60 minutes or less, this seems like a radically inaccurate way to set the graph scale (toy illustration below). What's the story here?
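
Here's a minimal sketch of what I suspect is happening; this is my guess, not the actual implementation. If the backend consolidates raw samples into coarse buckets by averaging, a single latency spike gets flattened, and a Y scale derived from the averaged series badly understates the true maximum:

```python
import random

# Toy illustration: averaging raw samples into a coarse bucket
# flattens peaks, so a Y axis scaled to the averaged series
# understates the true maximum of the window.

random.seed(42)
# One "hour" of per-minute latency samples (ms), with one spike.
samples = [random.uniform(10, 20) for _ in range(60)]
samples[30] = 250.0  # a single latency spike

# At a coarse display period, the whole hour collapses into one bucket.
avg_val = sum(samples) / len(samples)
max_val = max(samples)

print(f"true max (what I'd expect the Y scale to track): {max_val:.1f} ms")
print(f"bucket average (what the scale appears to track): {avg_val:.1f} ms")
```

With these numbers the averaged bucket comes out around 19 ms while the true maximum is 250 ms, which matches the kind of discrepancy I'm seeing once the period crosses 60 minutes.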