Tuesday, February 09, 2010

Rolling averages considered harmful

Well, if not actually harmful, it's possible to be misled by them. Here's an example (yes I am making this up) that illustrates something kind of anomalous.

Suppose we're measuring, oh, new cases of the flu, or complaints about potholes in the street, something like this. Now consider the following figures taken over a 7-week period:

week     # of cases   change
week#1   20           (N/A)
week#2   30           +10
week#3   40           +10
week#4   50           +10
week#5   46           -4
week#6   42           -4
week#7   41           -1
That table tells the story that the number of flu cases (or whatever) got worse and peaked in week#4. In weeks 5, 6, and 7 we saw the number of incoming cases decrease.

Now watch what happens if we ask for a 4-week rolling average to be reported weekly. The table below shows the average over the most recent four weeks, i.e. the current week and the three before it:

week     # cases   change   4-week rolling average   change in average
week#1   20        (N/A)    (N/A)                    (N/A)
week#2   30        +10      (N/A)                    (N/A)
week#3   40        +10      (N/A)                    (N/A)
week#4   50        +10      35                       (N/A)
week#5   46        -4       41.5                     +6.5
week#6   42        -4       44.5                     +3.0
week#7   41        -1       44.75                    +0.25
Note that in weeks 5, 6, and 7, the 4-week rolling average is still increasing, even though the number of new cases is headed downward. The reason is that a 4-week average rises whenever the new week's count is higher than the count from four weeks earlier, and that's exactly what happens here: the low ramp-up weeks (20, 30, 40) are rolling out of the window while higher counts roll in. This sort of thing can cause executives to worry needlessly, but the bad part is that the short-attention-span crowd can then order wild-goose chases. "You guys say the number of cases is down week-on-week, but our 4-week rolling average is still trending up! Somebody had better find out why!"
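
If you want to double-check the arithmetic, here's a quick Python sketch that reproduces the rolling-average column from the made-up numbers above:

    # Weekly counts from the table above (made-up data).
    cases = [20, 30, 40, 50, 46, 42, 41]

    # 4-week rolling average: mean of the current week and the three
    # before it. Not defined until week#4, when four weeks of data exist.
    rolling = [sum(cases[i - 3:i + 1]) / 4 for i in range(3, len(cases))]

    print(rolling)
    # [35.0, 41.5, 44.5, 44.75] -- still climbing through week#7,
    # even though weekly cases fell 50 -> 46 -> 42 -> 41 after week#4.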

So if you're tasked with weekly reports of a 4-week rolling average, you might want to watch out for anomalies like this.
