Some events are characterized by their severity and duration. For example, the seriousness of an ozone alert may depend on its duration and the level of ozone during the alert. In communications, the seriousness of network problems depends on the rate at which messages are blocked (fail to reach their destination) and the persistence of a high blocking rate.

In statistical terms, a random mean *M(t)* changes with time *t* but is usually near a background level, such as zero. Occasionally, however, *M(t)* jumps to a higher level and stays high, but not necessarily constant, for a random length of time before returning to its background level. When *M(t)* remains above a severity threshold *T* for a duration *D*, an *event* at thresholds *(T, D)* is said to occur; the higher the thresholds, the more serious the consequences of the event. To detect events and estimate event rates for a range of severity and persistence thresholds, observations *Y(t)* are taken. For example, *M(t)* might be the probability that a message sent at time *t* is blocked, and *Y(t)* the completion status of a message sent at time *t*.
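The event definition above can be sketched in a few lines. The following is only an illustrative scan, not the paper's detection algorithm, and it assumes *M(t)* is observed directly rather than through Bernoulli data; the function name and thresholds are hypothetical:

```python
def detect_events(m, T, D):
    """Return (start, end) index pairs of runs where m[t] > T
    for at least D consecutive time steps."""
    events = []
    start = None
    for t, value in enumerate(m):
        if value > T:
            if start is None:
                start = t  # a run above the severity threshold begins
        else:
            if start is not None and t - start >= D:
                events.append((start, t - 1))  # run was long enough: an event
            start = None
    # handle a run that is still open at the end of the record
    if start is not None and len(m) - start >= D:
        events.append((start, len(m) - 1))
    return events

# Background level 0.0 with one sustained jump and one brief spike:
m = [0.0, 0.0, 0.6, 0.7, 0.65, 0.0, 0.8, 0.0]
print(detect_events(m, T=0.5, D=3))  # → [(2, 4)]
```

The brief spike at index 6 exceeds the severity threshold but not the duration threshold, so only the sustained excursion counts as an event at *(T, D)* = (0.5, 3).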

This paper shows how a sequence of Bernoulli data collected over time can be used to detect events and to estimate event rates for wide ranges of thresholds on duration and severity. The algorithm for detecting events is simple but, unlike past detection schemes, does not rely on aggregating the data. Our estimator for rate as a function of severity and persistence is model-free, except for one step in a sampling bias correction, and practical even if there are hundreds of millions of observations. Confidence intervals on rates are built using a kind of partial bootstrapping that is suitable for very large sets of data. This work has been applied to data on the reliability of communications networks.
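To make the setting concrete, the toy simulation below draws Bernoulli observations *Y(t)* around a step change in the blocking probability *M(t)* and smooths them with a simple moving average. This is only a naive baseline for intuition: the paper's detector notably avoids this kind of aggregation, and its estimator and bias correction are not shown here. The window size and level values are arbitrary choices for the sketch:

```python
import random

random.seed(0)
# M(t): background blocking probability, a sustained jump, then a return.
m = [0.05] * 200 + [0.7] * 100 + [0.05] * 200
# Y(t): 1 if the message sent at time t is blocked, 0 if it completes.
y = [1 if random.random() < p else 0 for p in m]

w = 25  # moving-average window (arbitrary for this illustration)
m_hat = [sum(y[t - w:t]) / w for t in range(w, len(y) + 1)]

# The sustained jump is visible in the smoothed estimate even though
# each individual observation is only a 0 or a 1.
print(max(m_hat) > 0.5)
```

In this aggregated view, the choice of window trades off noise against the shortest excursion that can be resolved; part of the appeal of the paper's approach is estimating rates across many severity and persistence thresholds without fixing such a window.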
