The laws of iterated and triple logarithms for extreme values of regenerative processes

We analyze almost sure asymptotic behavior of extreme values of a regenerative process. We show that under certain conditions a properly centered and normalized running maximum of a regenerative process satisfies a law of the iterated logarithm for the $\limsup$ and a law of the triple logarithm for the $\liminf$. This complements a previously known result of Glasserman and Kou [Ann. Appl. Probab. 5(2) (1995), 424--445]. We apply our results to several queuing systems and a birth and death process.


Introduction and main results
Various problems related to the asymptotic behavior of extreme values of regenerative processes are of considerable practical interest and have attracted a lot of attention in the probability community. For example, extremes in queuing systems and of birth and death processes have been investigated in [2,3,6,13,20], to name but a few. The analysis carried out in the above papers is mostly based on the classical theory of extreme values for independent identically distributed (i.i.d.) random variables. A survey of early results in this direction can be found, among others, in [3]. In the recent paper [22] a slightly different approach to the asymptotic analysis of extreme values of regenerative processes, based on nonlinear time transformations, has been proposed.
The aforementioned works were mostly aimed at the derivation of weak limit theorems for extremes of regenerative processes. In this article, by contrast, we are interested in the almost sure (a.s.) behavior of general regenerative processes, and in particular of regenerative processes appearing in queuing and birth-death systems. Our main results, formulated in Theorems 1 and 2 below, provide the laws of iterated and triple logarithms for the running maximum of regenerative processes. A distinguishing feature of our results is the different scaling required for the lim sup and the lim inf. Under the assumption that the maximum of a regenerative process over its regeneration cycle has an exponential right tail, this type of behavior has already been observed in [11], see Proposition 3.2 therein. Our theorems generalize the aforementioned result and cover, for example, regenerative processes with Weibull-like tails of the maximum over a regeneration cycle. As in many other papers dealing with extremes of regenerative processes, our approach relies on analyzing the a.s. behavior of the running maximum of i.i.d. random variables. In this respect, let us also mention the papers [16,17,19] dealing with the a.s. growth rate of the running maximum, see Section 3.5 in [8] for a survey.
Before formulating the results we introduce the necessary definitions. Let us recall, see [4], that a positive measurable function $U$ defined in some neighbourhood of $+\infty$ is called regularly varying at $+\infty$ with index $\kappa \in \mathbb{R}$ if $U(x) = x^{\kappa} V(x)$, and the function $V$ is slowly varying at $+\infty$, that is,
\[
\lim_{x\to+\infty} \frac{V(tx)}{V(x)} = 1 \quad \text{for every fixed } t > 0.
\]
Given a function $H \colon \mathbb{R}\to\mathbb{R}$ we denote by $H^{-1}$ its generalized inverse defined by
\[
H^{-1}(y) := \inf\{x \in \mathbb{R} \colon H(x) \ge y\}. \tag{1}
\]
The following definition is of crucial importance for our main results.
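To make the generalized inverse (1) concrete, here is a small numerical sketch (an illustration only; the helper `generalized_inverse`, the sample function and the tolerances are our own choices and not part of the results): for a nondecreasing $H$ it approximates the smallest $x$ with $H(x) \ge y$ by bisection.

```python
# Numerical sketch of the generalized inverse
# H^{-1}(y) = inf{x : H(x) >= y} for a nondecreasing H.
# The bracketing interval and tolerance below are illustrative assumptions.

def generalized_inverse(H, y, lo=-1e6, hi=1e6, tol=1e-9):
    """Bisection: smallest x (up to tol) with H(x) >= y, for nondecreasing H."""
    if H(lo) >= y:
        return lo
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if H(mid) >= y:
            hi = mid
        else:
            lo = mid
    return hi

# Example: H(x) = x**3 is strictly increasing, so H^{-1}(8) should be 2.
x = generalized_inverse(lambda t: t ** 3, 8.0)
```

For strictly increasing $H$, as in the example, this coincides with the usual inverse.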
Definition 1. We say that a function $H \colon \mathbb{R}\to\mathbb{R}$ satisfies condition (U) if the following holds:
1. $\lim_{x\to+\infty} H(x) = +\infty$;
2. the function $H$ is eventually nondecreasing and differentiable;
3. the derivative $h := H'$ is regularly varying at $+\infty$ with index $\kappa$.
Note that the assumption of regular variation of $h$ implies that $h$ is eventually positive. Thus, $H$ is eventually strictly increasing and the generalized inverse $H^{-1}$ defined by (1) eventually coincides with the usual inverse.

Let $X = (X(t))_{t\ge 0}$ be a regenerative random process, that is,
\[
X(t) = \xi_k(t - S_{k-1}), \quad t \in [S_{k-1}, S_k), \quad k \in \mathbb{N},
\]
where $S_0 := 0$, $S_k := T_1 + \cdots + T_k$ for $k \in \mathbb{N}$, and $(T_k, \xi_k(\cdot))_{k\in\mathbb{N}}$ is a sequence of independent copies of a pair $(T, \xi(\cdot))$, see, for example, [21, Part II, Chapter 2] and [9, Chapter 11, §8]. The points $(S_k)$ are called regeneration epochs and the interval $[S_{k-1}, S_k)$ is called the $k$-th regeneration cycle. Put $\alpha_T := \mathbb{E}T$. For $t \ge 0$, set
\[
\bar X(t) := \sup_{0\le s\le t} X(s),
\]
and note that $\bar X(T_1)$ is the maximum of the process $X$ on the first period of regeneration. Let $F$ be the distribution function of $\bar X(T_1)$, that is,
\[
F(x) := \mathbb{P}(\bar X(T_1) \le x) =: 1 - \exp(-R(x)).
\]
Note also that it is always possible to write a decomposition
\[
R(x) = R_0(x) + R_1(x), \tag{2}
\]
where $R_0 \colon \mathbb{R}\to\mathbb{R}$ and $R_1 \colon \mathbb{R}\to\mathbb{R}$ are real-valued functions and $R_1$ is such that
\[
\sup_{x\in\mathbb{R}} |R_1(x)| \le C_1 < \infty. \tag{3}
\]
Here and hereafter we denote by $C, C_1, C_2$, etc. some positive constants which may vary from place to place and may depend on the parameters of the process $X(\cdot)$.
We are ready to formulate our first result.
Theorem 1. Let $(X(t))_{t\ge 0}$ be a regenerative random process. Assume that there exists a decomposition (2) such that (3) holds and the function $R_0$ satisfies condition (U). Suppose further that $\alpha_T < \infty$. For large enough $x \in \mathbb{R}$, let $r_0$ be the derivative of $R_0$. Then
\[
\limsup_{t\to\infty} \frac{r_0(A_0(t))\,(\bar X(t) - A_0(t))}{\log\log t} = 1 \quad \text{a.s.} \tag{4}
\]
and
\[
\liminf_{t\to\infty} \frac{r_0(A_0(t))\,(\bar X(t) - A_0(t))}{\log\log\log t} = -1 \quad \text{a.s.}, \tag{5}
\]
where
\[
A_0(t) := R_0^{-1}\bigl(\log(t/\alpha_T)\bigr).
\]

Our next result is a counterpart of Theorem 1 for discrete processes taking values in some lattice in $\mathbb{R}$. Such processes are important, among other fields, in queuing theory. Assume that
\[
\mathbb{P}(X(t) \in \{0,1,2,3,\ldots\}) = 1, \quad t \ge 0, \tag{6}
\]
and, for $k = 0,1,2,3,\ldots$, put
\[
q(k) := \mathbb{P}(\bar X(T_1) > k) =: \exp(-R(k)).
\]
Similarly to (2) and (3) we can write a decomposition
\[
R(k) = R_0(k) + R_1(k), \quad k = 0,1,2,\ldots, \tag{7}
\]
where $R_0 \colon \mathbb{R}\to\mathbb{R}$ and $R_1 \colon \mathbb{R}\to\mathbb{R}$ are real-valued functions and $R_1$ is such that
\[
\sup_{k} |R_1(k)| \le C_1 < \infty. \tag{8}
\]

Theorem 2. Let $(X(t))_{t\ge 0}$ be a regenerative random process such that (6) holds. Assume that there exists a decomposition (7) such that (8) is fulfilled and the function $R_0$ satisfies condition (U). Suppose also that $\alpha_T < \infty$.
(i) If the asymptotic relation (9) holds, then
\[
\limsup_{t\to\infty} \frac{r_0(A_0(t))\,(\bar X(t) - A_0(t))}{\log\log t} = 1 \quad \text{a.s.}
\]
(ii) If the asymptotic relation (11) holds, then
\[
\liminf_{t\to\infty} \frac{r_0(A_0(t))\,(\bar X(t) - A_0(t))}{\log\log\log t} = -1 \quad \text{a.s.}
\]
The functions $A_0$ and $r_0$ were defined in Theorem 1.

Remark 1.
In the discrete setting we assume that there exist extensions of the sequences $(R_0(k))$ and $(R_1(k))$ to functions defined on the whole real line, with the extension of $R_0$ being smooth. While such an assumption might look artificial, it keeps the exposition uniform and allows us to treat the continuous and discrete settings with the same class of functions satisfying condition (U).
The article is organized as follows. In Section 2 we collect and prove some auxiliary results needed in the proofs of our main theorems; the proofs themselves are given in Section 3. In Section 4 we apply Theorems 1 and 2 to some queuing systems and birth-death processes.

Preliminaries
Let us consider a sequence $(\xi_k)_{k\in\mathbb{N}}$ of independent copies of a random variable $\xi$ with the distribution function $F_\xi(x) = \mathbb{P}(\xi \le x) =: 1 - \exp(-R_\xi(x))$. Put
\[
z_n := \max_{1\le i\le n} \xi_i, \quad n \in \mathbb{N}.
\]
The following result was proved in [1], see Theorem 1 therein.

Lemma 1. Assume that the function $R_\xi$ satisfies condition (U). Then
\[
\limsup_{n\to\infty} \frac{r_\xi(a(n))\,(z_n - a(n))}{\log\log n} = 1 \quad \text{a.s.}
\]
and
\[
\liminf_{n\to\infty} \frac{r_\xi(a(n))\,(z_n - a(n))}{\log\log\log n} = -1 \quad \text{a.s.},
\]
where $a(n) := R_\xi^{-1}(\log n)$ and, for large enough $x \in \mathbb{R}$, $r_\xi(x) := R_\xi'(x)$.
The proof of Lemma 1, given in [1], consists of two steps. Firstly, the claim is established for the standard exponential distribution $\tau^e$, that is, assuming $\mathbb{P}(\xi \le x) = \mathbb{P}(\tau^e \le x) = 1 - \exp(-x)$, $x \ge 0$. In the second step the claim is proved for an arbitrary $R_\xi$ using regular variation and the representation
\[
z_n = R_\xi^{-1}(z_n^e), \quad n \in \mathbb{N}. \tag{16}
\]
Here and hereafter $z_n^e := \max_{1\le i\le n} \tau_i^e$ and $(\tau_i^e)_{i\in\mathbb{N}}$ are independent copies of $\tau^e$. We need the following generalization of Lemma 1.
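The reduction to exponential maxima described above can be illustrated by a small simulation (purely illustrative: the Weibull-type choice $R_\xi(x) = x^2$, the sample size and the seed are our own). Since $R_\xi^{-1}$ is nondecreasing, applying it to the maximum of standard exponentials yields the maximum of a sample with tail $\exp(-R_\xi(x))$, and $z_n^e$ concentrates near $\log n$.

```python
import math
import random

random.seed(7)

n = 100_000
# z_n^e: maximum of n standard exponential random variables.
z_e = max(random.expovariate(1.0) for _ in range(n))

# For R_xi(x) = x**2 (a Weibull-type tail, our illustrative choice),
# R_xi^{-1}(y) = sqrt(y), so sqrt(z_e) has the law of the maximum of
# n variables with P(xi > x) = exp(-x**2).
z = math.sqrt(z_e)

# z_n^e is close to log n, hence z_n is close to sqrt(log n).
print(z_e, math.log(n), z, math.sqrt(math.log(n)))
```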

Lemma 2.
Assume that the law of $\xi$ is such that the function $R_\xi$ possesses a decomposition (2) with $R_1$ satisfying (3) and $R_0$ satisfying condition (U). Then
\[
\limsup_{n\to\infty} \frac{r_0(a_0(n))\,(z_n - a_0(n))}{\log\log n} = 1 \quad \text{a.s.} \tag{17}
\]
and
\[
\liminf_{n\to\infty} \frac{r_0(a_0(n))\,(z_n - a_0(n))}{\log\log\log n} = -1 \quad \text{a.s.}, \tag{18}
\]
where $a_0(n) := R_0^{-1}(\log n)$ and $r_0(x) := R_0'(x)$.

To prove Lemma 2 we need the following simple result, see Theorem 3.1 in [5].

Lemma 3. It holds $\lim_{n\to\infty} z_n^e/\log n = 1$ a.s.

Proof of Lemma 2. Fix a sequence of standard exponential random variables $(\tau_i^e)_{i\in\mathbb{N}}$ and assume without loss of generality that the sequence $(z_n)_{n\in\mathbb{N}}$ is constructed from $(\tau_i^e)_{i\in\mathbb{N}}$ via formula (16). The subsequent proof is divided into two steps.

STEP 1. Suppose additionally that the function $R_0$ is everywhere nondecreasing, differentiable, and $R_0(-\infty) = 0$.
Let $C_1$ be a constant such that (3) holds. From the definition of the function $R_\xi^{-1}$ and decomposition (2) we obtain
\[
R_0(x) - C_1 \le R_\xi(x) \le R_0(x) + C_1, \quad x \in \mathbb{R},
\]
and thereupon
\[
R_0^{-1}(y - C_1) \le R_\xi^{-1}(y) \le R_0^{-1}(y + C_1), \quad y \in \mathbb{R}.
\]
Hence, by monotonicity of $R_0^{-1}$, we have
\[
R_0^{-1}(z_n^e - C_1) \le z_n \le R_0^{-1}(z_n^e + C_1) = R_0^{-1}(z_n^e - C_1) + 2C_1\,\tilde r_0\bigl(z_n^e + C_1(2\theta_n - 1)\bigr),
\]
where the equality follows from the mean value theorem for differentiable functions, $\tilde r_0(x) := (R_0^{-1}(x))'$ and $0 \le \theta_n \le 1$. Thus, from Lemma 3 we deduce
\[
\lim_{n\to\infty} \frac{\tilde r_0\bigl(z_n^e + C_1(2\theta_n - 1)\bigr)}{\tilde r_0(\log n)} = 1 \quad \text{a.s.}
\]
STEP 2. Let $\tilde R_0 \colon \mathbb{R}\to\mathbb{R}$ and $\tilde R_1 \colon \mathbb{R}\to\mathbb{R}$ be arbitrary nondecreasing differentiable functions such that $\tilde R_0(-\infty) = 0$ and, for all sufficiently large $x$, $\tilde R_0(x) = R_0(x)$ and $\tilde R_1(x) = R_1(x)$. Put $\tilde R := \tilde R_0 + \tilde R_1$. The functions $\tilde R_0$, $\tilde R_1$ and $\tilde R$ satisfy all the assumptions of Step 1. Thus, if we set
\[
\tilde z_n := \tilde R^{-1}(z_n^e), \quad n \in \mathbb{N},
\]
then the sequence $(\tilde z_n)_{n\in\mathbb{N}}$ satisfies (17) and (18) with the same normalizing functions $r_0(a_0(n))$ and $a_0(n)$. The latter holds true since for sufficiently large $x > 0$ we have $\tilde R^{-1}(x) = R_\xi^{-1}(x)$. It remains to note that the asymptotics of $(z_n)$ and $(\tilde z_n)$ are the same. Indeed, since $\lim_{n\to\infty} z_n^e = +\infty$ a.s., there exists an a.s. finite random index $n_0$ such that $z_n = \tilde z_n$ for $n \ge n_0$, and we see that both (17) and (18) hold for $(z_n)$ as well. This finishes the proof of Lemma 2.
The next lemma is a counterpart of Lemma 2 for discrete distributions. Assume that $\xi$ has the distribution
\[
\mathbb{P}(\xi = k) = p_k, \quad k = 0,1,2,\ldots,
\]
where $p_k \ge 0$ and $\sum_{k\ge 0} p_k = 1$.

Lemma 4. Assume that the sequence $(R(k))$ defined by $\mathbb{P}(\xi > k) =: \exp(-R(k))$ possesses a decomposition (7) with $R_1$ satisfying (8) and $R_0$ satisfying condition (U). Then:
(i) if (9) holds, then $(z_n)$ satisfies equality (17);
(ii) if (11) holds, then $(z_n)$ also satisfies (18).
The next simple lemma is probably known; however, we prefer to give an elementary few-line proof.
Lemma 5. For arbitrary $p > 1$ and $b \in \mathbb{R}$ it holds
\[
\lim_{n\to\infty} \frac{\sum_{k=1}^{n} k^b p^k}{n^b p^n} = \frac{p}{p-1}.
\]
Proof. By the Stolz-Cesàro theorem we have
\[
\lim_{n\to\infty} \frac{\sum_{k=1}^{n} k^b p^k}{n^b p^n} = \lim_{n\to\infty} \frac{n^b p^n}{n^b p^n - (n-1)^b p^{n-1}} = \lim_{n\to\infty} \frac{1}{1 - (1 - 1/n)^b p^{-1}} = \frac{p}{p-1}.
\]
The proof is complete.
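A limit of this Stolz-Cesàro type is easy to check numerically. Below, with the illustrative choices $p = 2$ and $b = 1.5$ (our own parameters), the ratio $\sum_{k=1}^{n} k^b p^k / (n^b p^n)$ approaches $p/(p-1) = 2$; rescaling each summand by the last term keeps the computation free of overflow.

```python
# Numerical check of the limit
#   lim_{n->oo} (sum_{k=1}^n k**b * p**k) / (n**b * p**n) = p / (p - 1),
# here with the illustrative choices p = 2, b = 1.5.

p, b = 2.0, 1.5

def ratio(n):
    # Normalize every term by the last one to avoid overflow:
    # sum_k k^b p^k / (n^b p^n) = sum_k (k/n)^b * p^(k-n).
    return sum((k / n) ** b * p ** (k - n) for k in range(1, n + 1))

r = ratio(400)
print(r, p / (p - 1))
```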

Proofs of Theorems 1 and 2
Proof of Theorem 1. Let us start with the proof of equality (4). To this end, we introduce the following notation:
\[
Y_k := \sup_{S_{k-1} \le t < S_k} X(t), \qquad Z_n := \max_{1\le k\le n} Y_k, \quad n \in \mathbb{N}.
\]
Since $(S_k)$ are the moments of regeneration of the process $(X(t))_{t\ge 0}$, $(Y_k)$ are i.i.d. random variables. Furthermore, it is clear that the sequence $(Y_k)$ satisfies the conditions of Lemma 2. Therefore,
\[
\limsup_{n\to\infty} \frac{r_0(a_0(n))\,(Z_n - a_0(n))}{L_2(n)} = 1 \quad \text{a.s.}, \tag{25}
\]
where $L_2(n) := \log\log n$.
Denote by $N$ the counting process for the sequence $(S_k)$, that is,
\[
N(t) := \#\{k \in \mathbb{N} \colon S_k \le t\}, \quad t \ge 0.
\]
Since $\lim_{t\to\infty} N(t) = +\infty$ a.s. and $N(t)$ runs through all positive integers, from (25) we obtain
\[
\limsup_{t\to\infty} \frac{r_0(a_0(N(t)))\,(Z_{N(t)} - a_0(N(t)))}{L_2(N(t))} = 1 \quad \text{a.s.}
\]
By the strong law of large numbers for $N$ we have
\[
\lim_{t\to\infty} \frac{N(t)}{t} = \frac{1}{\alpha_T} \quad \text{a.s.},
\]
whence, as $t \to \infty$,
\[
N(t) = \frac{t}{\alpha_T}\,(1 + o(1)) \quad \text{a.s.}
\]
In what follows $o(1)$ is a random function which converges to zero a.s. as $t \to \infty$.
It remains to note that
\[
Z_{N(t)} \le \bar X(t) \le Z_{N(t)+1}, \quad t \ge 0.
\]
Summarizing, this gives equality (4). The proof of relation (5) utilizes equality (18) of Lemma 2 and is similar. We omit the details.
The derivation of Theorem 2 is based on Lemma 4 and basically repeats the proof of Theorem 1. We leave the details to the reader.
Suppose that under the assumptions of Theorem 1 the parameter $t$ runs over a countable set $T := \{t_0 = 0 < t_1 < t_2 < \cdots\}$ such that $\lim_{i\to\infty} t_i = +\infty$. The set $T$ can be random and the process $X$ can depend on $T$. Assume that $\mathbb{P}(S_i \in T) = 1$ for all $i \in \mathbb{N}$.
Put $X_i := X(t_i)$ and $\bar X_n := \max_{0\le i\le n} X_i$. Assume that the extreme values of the process $X$ are attained at the points of the set $T$. More precisely, for all $t \ge 0$,
\[
\bar X(t) = \max_{i \colon t_i \le t} X_i.
\]
In what follows the next proposition will be useful.

Applications
Example 1 (Queuing system GI/G/1). Let us consider a single-channel queuing system with customers arriving at $0 = t_0 < t_1 < t_2 < \cdots < t_i < \cdots$. Let $0 = W_0, W_1, W_2, \ldots, W_i, \ldots$ be the actual waiting times of the customers. Thus, at time $t = 0$ the first customer arrives and its service starts immediately. Denote by $\zeta_i = t_i - t_{i-1}$, $i \in \mathbb{N}$, the interarrival times between successive customers, and let $\eta_i$, $i \in \mathbb{N}$, denote the service time of the $i$-th customer. Suppose that $(\zeta_i)$ and $(\eta_i)$ are independent sequences of i.i.d. random variables. In the standard notation, this queuing system has the type GI/G/1, see [12,14]. Let $W(t)$ be the waiting time of the last customer in the queue at time $t \ge 0$, that is,
\[
W(t) := W_n \quad \text{for } t \in [t_n, t_{n+1}), \quad n = 0,1,2,\ldots,
\]
and $W(t_n) = W(t_n+) = W_n$.
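The waiting times evolve according to the classical Lindley recursion (standard for GI/G/1; we state it in the common indexing where $\eta_n$ is the service time of the customer arriving at $t_n$, a harmless shift of the indexing above):

```latex
W_0 = 0, \qquad W_{n+1} = \max\bigl(W_n + \eta_n - \zeta_{n+1},\, 0\bigr), \quad n = 0, 1, 2, \ldots
```

Thus $(W_n)$ is a random walk with steps $\eta_n - \zeta_{n+1}$ reflected at the origin, which is the source of the busy/idle regeneration structure exploited in this example.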
Denote $\mathbb{E}\zeta_i = a$, $\mathbb{E}\eta_i = b$ and assume that both expectations are finite. Further, we impose the following conditions on $\zeta_i$ and $\eta_i$:
\[
b < a \tag{36}
\]
and, for some $\gamma > 0$, it holds
\[
\mathbb{E}\exp\bigl(\gamma(\eta_1 - \zeta_1)\bigr) = 1. \tag{37}
\]
Under these assumptions the evolution of the queuing system can be decomposed into busy periods, when a customer is in service, interleaved with idle periods, when the system is empty. Let us introduce the regeneration moments $(S_k)$ of the process $W$ as follows: $S_0 = 0$ and, for $i \in \mathbb{N}$, $S_i$ is the arrival time of a new customer at the end of the $i$-th idle period. Let $T_i$ be the duration of the $i$-th regeneration cycle, and $\bar W(T_1)$ be the maximum waiting time during the first regeneration cycle. It is known, see [3] and [13], that under conditions (36) and (37), we have
\[
\mathbb{P}(\bar W(T_1) > x) = C\exp(-\gamma x)\,(1 + o(1)), \quad x \to \infty.
\]
Condition (36) also guarantees that the average duration of the $i$-th regeneration cycle is finite, that is, $\alpha_T = \mathbb{E}T_i < \infty$, see [14, Chapter 14, §3, Theorem 3.2]. Thus, if we set $X(t) = W(t)$, $R_0(x) = \gamma x$, $R_1(x) = \log C + o(1)$ and $r_0(x) = \gamma$, then from Theorem 1 and Proposition 1 we derive the following corollary.

Corollary 1. Assume that the queuing system GI/G/1 satisfies conditions (36) and (37). Then
\[
\limsup_{t\to\infty} \frac{\gamma\bigl(\bar W(t) - \gamma^{-1}\log(t/\alpha_T)\bigr)}{\log\log t} = 1 \quad \text{a.s.} \tag{38}
\]
and
\[
\liminf_{t\to\infty} \frac{\gamma\bigl(\bar W(t) - \gamma^{-1}\log(t/\alpha_T)\bigr)}{\log\log\log t} = -1 \quad \text{a.s.} \tag{39}
\]

Remark 2.
(i) Suppose that
\[
\mathbb{P}(\zeta_1 \le x) = 1 - \exp(-\lambda x) \quad \text{and} \quad \mathbb{P}(\eta_1 \le x) = 1 - \exp(-\mu x), \quad x \ge 0,
\]
that is, we consider the queuing system M/M/1. Assume further that $\rho := \lambda/\mu < 1$. It is easy to check that conditions (36) and (37) are fulfilled, and therefore equalities (38) and (39) hold with $\gamma = \mu - \lambda = \mu(1 - \rho)$.

Example 2 (Queuing system M/M/m). Let us now consider a queuing system with $m$ servers in which customers arrive according to a Poisson process with intensity $\lambda$, and the service times are independent copies of a random variable $\eta$ with the exponential distribution
\[
\mathbb{P}(\eta \le x) = 1 - \exp(-\mu x), \quad x \ge 0.
\]
In the standard notation, this queuing system has the type M/M/m, see [12,14]. We impose the following assumption on the parameters $\lambda$ and $\mu$ ensuring the existence of a stationary regime:
\[
\rho := \frac{\lambda}{m\mu} < 1. \tag{40}
\]
For $t \ge 0$, let $Q(t)$ denote the length of the queue at time $t$, that is, the total number of customers in service or pending. Set
\[
\bar Q(t) := \max_{0\le s\le t} Q(s), \quad t \ge 0.
\]
In the same way as in Example 1, one can introduce regeneration moments $(S_k)$ for the process $Q$: $S_0 := 0$ and, for $i \in \mathbb{N}$, $S_i$ is the arrival time of a new customer after the $i$-th busy period. Let $T_i$ be the duration of the $i$-th regeneration cycle and $\bar Q(T_1)$ be the maximum length of the queue in the first regeneration cycle. Put
\[
q(k) := \mathbb{P}(\bar Q(T_1) > k) =: \exp(-R(k)), \quad k = 0,1,2,\ldots. \tag{41}
\]
In the recent paper [7] the authors established that the function $R$ in (41) satisfies conditions (7) and (8) with a function $R_0$ satisfying condition (U). Using Theorem 2 we infer relations (42) and (43).

Remark 3. Relations (42) and (43) have been proved in [7] by direct calculations. Let us note that in the case $m = \infty$, which has also been treated in [7], the asymptotics of $\bar Q(t)$ is of a completely different form, see also [18].
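For the M/M/1 system of Remark 2(i), the logarithmic growth of the running maximum of the waiting times is easy to observe in simulation (a sketch only: the parameters $\lambda = 1$, $\mu = 2$, the horizon and the seed are arbitrary choices of ours, and the comparison with $\log t/\gamma$ is heuristic rather than a verification of Corollary 1).

```python
import math
import random

random.seed(1)

lam, mu = 1.0, 2.0          # arrival and service rates, rho = 1/2
gamma = mu - lam            # exponential decay rate of the waiting-time tail

n = 200_000
w = 0.0                     # current waiting time
running_max = 0.0           # running maximum of waiting times
t = 0.0                     # elapsed time
for _ in range(n):
    zeta = random.expovariate(lam)   # interarrival time
    eta = random.expovariate(mu)     # service time
    # Lindley recursion for the waiting time of the next customer.
    w = max(w + eta - zeta, 0.0)
    t += zeta
    running_max = max(running_max, w)

# Heuristically, the running maximum grows like log(t) / gamma.
print(running_max, math.log(t) / gamma)
```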
Example 3 (Birth and death processes). Let $X = (X(t))_{t\ge 0}$ be a birth and death process with parameters
\[
\lambda_n = \lambda n + a, \quad \mu_n = \mu n, \qquad \lambda, \mu, a > 0, \quad n = 0,1,2,\ldots,
\]
see [14, Ch. 7, §6]. That is, $(X(t))_{t\ge 0}$ is a time-homogeneous Markov process such that, for $t \ge 0$, given $\{X(t) = n\}$ the probability of transition to state $n+1$ over a small period of time $\delta$ is $(\lambda n + a)\delta + o(\delta)$, $n = 0,1,2,3,\ldots$, and the probability of transition to $n-1$ is $\mu n\delta + o(\delta)$, $n = 1,2,3,\ldots$. The parameter $a$ can be interpreted as the infinitesimal intensity of population growth due to immigration. The birth-death process $X$ is usually called the process with linear growth and immigration. We assume that $X(0) = 0$ and
\[
\rho := \lambda/\mu < 1. \tag{45}
\]
Put
\[
\pi_0 := 1, \qquad \pi_n := \frac{\lambda_0 \lambda_1 \cdots \lambda_{n-1}}{\mu_1 \mu_2 \cdots \mu_n}, \quad n \in \mathbb{N}.
\]
It is not difficult to check that condition (45) ensures
\[
\sum_{n\ge 0} \pi_n < \infty \tag{46}
\]
and
\[
\sum_{n\ge 0} \frac{1}{\lambda_n \pi_n} = \infty. \tag{47}
\]
Under conditions (46) and (47), see [14] and [15], there exists a stationary regime, that is,
\[
\lim_{t\to\infty} \mathbb{P}(X(t) = n) = \frac{\pi_n}{\sum_{k\ge 0} \pi_k}, \quad n = 0,1,2,\ldots.
\]
Further, $X$ is a regenerative process with regeneration moments $(S_k)$, where $S_0 = 0$ and $S_i$, $i \in \mathbb{N}$, is the moment of the $i$-th return to state $0$. An explicit expression for $\mathbb{E}T_k$, where $T_k = S_k - S_{k-1}$ is the duration of the $k$-th regeneration cycle, is known, see Eq. (32) in [22]. If (45) holds, then $\alpha_T < \infty$, see [14]. We are interested in the asymptotic behavior of the extreme values
\[
\bar X(t) := \max_{0\le s\le t} X(s), \quad t \ge 0.
\]
Let us show how to apply Theorem 2 to the asymptotic analysis of $\bar X(t)$. Firstly, we need to evaluate accurately the sequence $(R(n))$ defined by
\[
q(n) := \mathbb{P}(\bar X(T_1) > n) =: \exp(-R(n)).
\]
The function $R_0(x) = -x\log\rho - \frac{a}{\lambda}\log x$ is increasing for $x \ge x_0 = -\frac{a}{\lambda\log\rho}$ and, furthermore, satisfies condition (U).
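For this $R_0$ the normalizing quantities entering Theorem 2 are asymptotically explicit. Differentiating,

```latex
r_0(x) = R_0'(x) = -\log\rho - \frac{a}{\lambda x} \;\longrightarrow\; -\log\rho, \qquad x \to \infty,
```

and, since $R_0(x) \sim -x\log\rho$ as $x \to \infty$, the centering satisfies $R_0^{-1}(\log n) \sim \log n/(-\log\rho)$. In other words, the normalization is asymptotically the same as for a purely geometric tail with ratio $\rho$; the term $-(a/\lambda)\log x$ only contributes a logarithmic correction to the centering.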
Thus, from Theorem 2 we infer the following.
Let us finally mention, without proof, a statement which follows easily from equations (48), (56) and Theorem 2 in [22].