For a rain gauge with a 0.2 mm tipping-bucket resolution, the minimum averaging interval needed to keep the precipitation-rate error at 5 % is determined as follows:
A 5 % relative error corresponds to 1 / 0.05 = 20 tips of the tipping-bucket mechanism, i.e. 20 × 0.2 mm = 4 mm of accumulated rain.
The minimum averaging period required to estimate precipitation intensity with an error below 5 % is therefore:
Light rain - (<2.5 mm/h), with 12 or fewer bucket tips per hour, requires a minimum of 100 minutes to achieve a 5 % error in rain intensity.
Moderate rain - (2.5 to 7.5 mm/h), with 13 to 37 bucket tips per hour, requires a minimum of 33 to 100 minutes to achieve a 5 % error in rain intensity.
Heavy rain - (7.6 to 50 mm/h), with 38 to 250 bucket tips per hour, requires a minimum of 5 to 33 minutes to achieve a 5 % error in rain intensity.
Violent rain - (>50 mm/h), with more than 250 bucket tips per hour, requires 5 minutes or less to achieve a 5 % error in rain intensity.
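The intervals above follow directly from the 20-tip requirement: the gauge must accumulate 20 × 0.2 mm = 4 mm of rain, and the time that takes depends only on the rain rate. A minimal sketch of the calculation (the function name and defaults are illustrative, not from a specific library):

```python
def min_interval_minutes(rate_mm_per_h, resolution_mm=0.2, max_error=0.05):
    """Minimum averaging interval (minutes) for a tipping-bucket gauge.

    A relative error of `max_error` requires 1 / max_error tips; each tip
    represents `resolution_mm` of rain, so the interval is the time needed
    to accumulate that depth at the given rain rate.
    """
    tips_needed = 1.0 / max_error              # 20 tips for a 5 % error
    depth_mm = tips_needed * resolution_mm     # 4 mm of accumulated rain
    return depth_mm / rate_mm_per_h * 60.0     # hours -> minutes

# Boundary rates of the intensity classes listed above:
for rate in (2.5, 7.5, 50.0):
    print(f"{rate:5.1f} mm/h -> {min_interval_minutes(rate):5.1f} min")
```

At the class boundaries this yields about 96 minutes at 2.5 mm/h, 32 minutes at 7.5 mm/h, and 4.8 minutes at 50 mm/h, which the text rounds to 100, 33, and 5 minutes respectively.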