Amazon Web Services AWS CloudWatch – the difference between the Maximum and Average metrics in CloudWatch.


I asked myself why my CloudWatch monitoring shows such a big difference between Maximum and Average.

Let's just look at some facts first:

CloudWatch monitors every 5 minutes by default; if you enable detailed monitoring, it monitors every 1 minute.

For the sake of this explanation, we assume 5-minute monitoring and a graph covering 1 hour, which gives 12 datapoints.

So Average would do this:

( 2 + 3 + 5 + 7 + 4 + 6 + 3 + 8 + 9 + 4 + 10 + 1 ) / 12 = 5.1666

So 5.1666 would be shown on the graph.

Maximum would show the highest of those datapoints, 10.

And Sum, of course, would be:

2 + 3 + 5 + 7 + 4 + 6 + 3 + 8 + 9 + 4 + 10 + 1 = 62
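The three statistics above can be sketched in a few lines of Python; this just reproduces the arithmetic on the example datapoints, it is not a call to the CloudWatch API:

```python
# The 12 example datapoints: one value per 5-minute period over 1 hour.
datapoints = [2, 3, 5, 7, 4, 6, 3, 8, 9, 4, 10, 1]

average = sum(datapoints) / len(datapoints)  # the Average statistic
maximum = max(datapoints)                    # the Maximum statistic
total = sum(datapoints)                      # the Sum statistic

print(f"Average: {average:.4f}")  # 5.1667
print(f"Maximum: {maximum}")      # 10
print(f"Sum: {total}")            # 62
```

This makes the gap visible: a single 5-minute spike to 10 barely moves the Average (5.17) but fully determines the Maximum, which is why the two graphs can look so different.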