1 Introduction
Since the seminal work of Ornstein and Uhlenbeck [22], the Ornstein–Uhlenbeck (OU) process has become a cornerstone of stochastic modeling. Originally introduced to describe the velocity of a particle subject to friction in statistical physics, its mathematical tractability and mean-reverting structure have made it a canonical model across disciplines, from physics and biology to financial mathematics. In finance, OU-type dynamics underpin, among others, the Vasicek model for interest rates [23], the Heston model for stochastic volatility [14], and the Lévy-driven volatility models of Barndorff-Nielsen and Shephard [5, 4].
OU-type processes are particularly valued for their ability to capture relaxation phenomena in physics and mean-reverting quantities in finance, such as interest rates or volatility. The classical Brownian and fractional Brownian cases laid the foundation for a vast literature, but they also revealed limitations: Gaussianity and light tails are often inconsistent with empirical data. This has motivated a surge of interest in OU processes driven by more general types of noise. For instance, Barndorff-Nielsen and Shephard [5] pioneered Lévy-driven OU processes for stochastic volatility, while more recent works have emphasized roughness and long memory as essential features of volatility dynamics.
In parallel, several recent contributions have explored OU-type dynamics driven by noises more exotic than Brownian or fractional Brownian motion, extending the classical Gaussian and Lévy settings. Examples include OU processes constructed in the white-noise framework via generalized fractional operators by Beghin, Cristofaro and Mishura [6], models based on generalized grey Brownian motion considered by Bock, Demestre and da Silva [9], and OU-type systems driven by generalized grey incomplete gamma noise, recently investigated by Bock and Cristofaro [8]. These works illustrate the growing interest in OU dynamics beyond the classical paradigms and further motivate the study of OU processes driven by new types of noise, in particular the non-Gaussian Hermite Ornstein–Uhlenbeck process considered in the present paper.
Modeling of volatility remains one of the central challenges in quantitative finance. The breakthrough work of Gatheral, Jaisson and Rosenbaum [13] demonstrated that volatility exhibits rough behavior, well described by fractional OU processes with Hurst parameter H close to zero. Yet Gaussian or Lévy-driven models, while capturing some stylized facts, remain insufficient for certain asset classes, such as commodities or cryptocurrencies, where empirical evidence points to heavy tails, long memory, and non-Gaussian dependence structures. In these contexts, richer models are required.
A promising direction is provided by Hermite processes, which are non-Gaussian, self-similar processes living in the qth Wiener chaos. When $q=1$, the Hermite process reduces to fractional Brownian motion, but for $q\ge 2$ it becomes genuinely non-Gaussian while still retaining long memory and self-similarity. Recent work by Assaad, Diez and Tudor [2] has highlighted the potential of generalized Hermite processes as flexible models for log-volatility, with self-similarity indices spanning the entire interval $(0,1)$. These processes thus offer a natural framework for modeling assets that require advanced non-Gaussian features.
In this paper, we take a further step by considering an Ornstein–Uhlenbeck process driven by a “Hermite–Ornstein–Uhlenbeck (HOU) noise”; the resulting process will be called the OU-HOU process (the Ornstein–Uhlenbeck process driven by a Hermite–Ornstein–Uhlenbeck process). This construction is particularly appealing because it introduces multiscale mean reversion: the outer OU captures long-term dynamics, while the inner Hermite-driven OU injects correlated, non-Gaussian fluctuations. Such nested structures are natural in applications where both short-term noise and long-term equilibrium forces coexist, including statistical physics (colored noise models), neuroscience (correlated membrane potential fluctuations), and finance (multiscale volatility). For related works on OU-type processes driven by standard OU or fractional OU processes, we refer to [7] or [12].
First of all, let us introduce $\left({Z_{t}^{H,q}},t\ge 0\right)$ as the Hermite process of order $q\ge 1$ and with self-similarity index $H\in \left(\frac{1}{2},1\right)$ in the following way (precise details are given in Section 2):
\[\begin{aligned}{}{Z_{t}^{H,q}}& =d(H)\hspace{-0.1667em}{\int _{{\mathbb{R}^{q}}}}\hspace{-0.1667em}\left({\int _{0}^{t}}\hspace{-0.1667em}\hspace{-0.1667em}du{(u-{y_{1}})_{+}^{-\left(\frac{1}{2}+\frac{1-H}{q}\right)}}\dots {(u-{y_{q}})_{+}^{-\left(\frac{1}{2}+\frac{1-H}{q}\right)}}\right)\\ {} & \hspace{1em}\times dB({y_{1}})\dots dB({y_{q}})\end{aligned}\]
where $(B(y),y\in \mathbb{R})$ is a two-sided Brownian motion and the iterated stochastic integral above is a multiple integral with respect to B. Formally, in this paper we are interested in the stochastic process $({X_{t}},t\ge 0)$ defined by the dynamics
with the initial condition ${X_{0}}={x_{0}}\in \mathbb{R}$, where $\theta \gt 0$ and the noise process $({V_{t}},t\ge 0)$ is a Hermite–Ornstein–Uhlenbeck process, i.e., for every $t\ge 0$,
where $\left({Z_{t}^{H,q}},t\ge 0\right)$ is a Hermite process of order $q\ge 1$ with self-similarity index $H\in \left(\frac{1}{2},1\right)$. In particular, V satisfies the Langevin equation
with the vanishing initial condition, ${V_{0}}=0$.
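To fix ideas, the nested dynamics (1)–(3) can be discretized by a straightforward Euler scheme. The sketch below is ours, not part of the paper: it uses the $q=1$ (fractional Brownian) member of the Hermite class as driver, since exact simulation of Hermite noise with $q\ge 2$ is considerably more involved; the function names and grid sizes are illustrative choices.

```python
import numpy as np

def fbm_path(n, H, rng=None):
    """Sample a fractional Brownian motion (the q = 1 Hermite process) on
    the grid k/n, k = 0, ..., n, via Cholesky factorization of its exact
    covariance 0.5 (s^{2H} + t^{2H} - |s - t|^{2H})."""
    rng = np.random.default_rng(rng)
    t = np.arange(1, n + 1) / n
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s**(2 * H) + u**(2 * H) - np.abs(s - u)**(2 * H))
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))   # tiny jitter for stability
    return np.concatenate([[0.0], L @ rng.standard_normal(n)])

def ou_hou_path(Z, theta, x0=0.0):
    """Euler scheme for the HOU noise V of (2)-(3) (unit drift) and the
    OU-HOU process X of (1), both driven by the increments of Z."""
    n = len(Z) - 1
    dt = 1.0 / n
    V = np.zeros(n + 1)
    X = np.zeros(n + 1)
    X[0] = x0
    for k in range(n):
        dZ = Z[k + 1] - Z[k]
        V[k + 1] = V[k] - V[k] * dt + dZ                      # dV = -V dt + dZ
        X[k + 1] = X[k] - theta * X[k] * dt - V[k] * dt + dZ  # dX = -θX dt - V dt + dZ
    return V, X

Z = fbm_path(1000, H=0.7, rng=0)
V, X = ou_hou_path(Z, theta=2.0)
```

With $q\ge 2$ one would replace `fbm_path` by a sampler of the Hermite (e.g. Rosenblatt) process; the recursion in `ou_hou_path` is unchanged.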
The main objective of this work is to develop consistent estimators for both the drift parameter θ and the Hurst parameter H, based on discrete-time observations of the solution to (1). Our methodology relies on quadratic variations, building on earlier results for Hermite processes (Tudor and Viens [21]) and Hermite–Ornstein–Uhlenbeck processes (Assaad and Tudor [1]). These works showed that, in the Rosenblatt case ($q=2$), quadratic variations converge to a non-Gaussian limit, complicating inference. To address this, we adapt the modified quadratic variation method recently introduced by Ayache and Tudor [3], which cancels the long-memory contribution of the second chaos and restores a Gaussian CLT. This enables the construction of estimators for H that are not only consistent but also asymptotically normal, allowing for classical statistical inference. The estimation of θ is also based on quadratic variations, exploiting the special behavior of this statistic when $q=2$.
The paper is organized as follows. In Section 2 we present the Hermite processes and the associated Ornstein–Uhlenbeck processes. We also introduce the OU-HOU process and we discuss its main properties. In Section 3 we derive the behavior of the quadratic variation of the OU-HOU process and then, in Section 4, the results are applied to derive an estimator for the drift parameter θ when the Hurst parameter H is known. In Section 5, we also estimate the Hurst parameter via the modified quadratic variations.
2 Hermite processes and Hermite–Ornstein–Uhlenbeck processes
We begin by introducing the Hermite process and the associated Ornstein–Uhlenbeck process. Let $\left({Z_{t}^{H,q}},t\ge 0\right)$ be the Hermite process of order $q\ge 1$ and with self-similarity index $H\in \left(\frac{1}{2},1\right)$. For each $t\ge 0$, the random variable ${Z_{t}^{H,q}}$ can be expressed as a multiple stochastic integral with respect to the two-sided Brownian motion $(B(y),y\in \mathbb{R})$ in the following way:
where for every ${y_{1}},\dots ,{y_{q}}\in \mathbb{R}$,
(4)
\[\begin{aligned}{}{Z_{t}^{H,q}}& =d(H){\int _{{\mathbb{R}^{q}}}}dB({y_{1}})\cdots dB({y_{q}})\\ {} & \hspace{1em}\left({\int _{0}^{t}}du{(u-{y_{1}})_{+}^{-\left(\frac{1}{2}+\frac{1-H}{q}\right)}}\dots {(u-{y_{q}})_{+}^{-\left(\frac{1}{2}+\frac{1-H}{q}\right)}}\right)\\ {} & ={I_{q}}({L_{t}^{H,q}}),\hspace{1em}t\ge 0,\end{aligned}\]
\[ {L_{t}^{H,q}}({y_{1}},\dots ,{y_{q}})=d(H){\int _{0}^{t}}du{(u-{y_{1}})_{+}^{-\left(\frac{1}{2}+\frac{1-H}{q}\right)}}\dots {(u-{y_{q}})_{+}^{-\left(\frac{1}{2}+\frac{1-H}{q}\right)}}.\]
We denote by ${I_{q}}$ the multiple stochastic integral of order $q\ge 1$ with respect to B (see, e.g., [17] or [15] for the definition). In (4), $d(H)$ is a strictly positive constant chosen so that $\mathbf{E}{\left({Z_{t}^{H,q}}\right)^{2}}={t^{2H}}$ for every $t\ge 0$. We recall that ${Z^{H,q}}$ is an H-self-similar process with stationary increments and long memory. Its sample paths are Hölder continuous of order δ, for every $\delta \in (0,H)$. The class of Hermite processes contains the fractional Brownian motion, which is obtained for $q=1$ in (4) and which is the only Gaussian process in this class. The Hermite process of order $q=2$ is the so-called Rosenblatt process. We refer to the monographs [18] or [20] for a more detailed presentation of Hermite and related processes. The value at time 1 of a Hermite process ${Z^{H,q}}$ is called a Hermite random variable. We will use in this work the Wiener integral with respect to a Hermite process. If $f:\mathbb{R}\to \mathbb{R}$ is a deterministic function that satisfies
then one can construct the stochastic integral ${\textstyle\int _{\mathbb{R}}}f(s)d{Z_{s}^{H,q}}$. It is called the Hermite–Wiener integral and it enjoys the isometry property: if f, g satisfy (5), then
and we recall that the integral $d{Z^{H,q}}$ above can be interpreted in the Hermite–Wiener or Riemann–Stieltjes senses. The process X will be called the Ornstein–Uhlenbeck process driven by a Hermite–Ornstein–Uhlenbeck process (OU-HOU process).
\[ \mathbf{E}{\int _{\mathbb{R}}}f(s)d{Z_{s}^{H,q}}{\int _{\mathbb{R}}}g(s)d{Z_{s}^{H,q}}=H(2H-1){\int _{\mathbb{R}}}{\int _{\mathbb{R}}}f(u)g(v)|u-v{|^{2H-2}}dudv.\]
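For a quick sanity check of this isometry, take $f=g=\mathbf{1}_{[0,t]}$, so that the left-hand side equals $\mathbf{E}{({Z_{t}^{H,q}})^{2}}={t^{2H}}$. The snippet below is an illustration of ours: it evaluates the right-hand side numerically, integrating out v in closed form and applying the midpoint rule in u.

```python
import numpy as np

def isometry_rhs(t, H, n=4000):
    """Right-hand side of the isometry for f = g = 1_{[0,t]}:
    H(2H-1) * integral over [0,t]^2 of |u-v|^{2H-2} du dv.
    The inner v-integral is computed in closed form, the outer
    u-integral by the midpoint rule; the result should match
    E (Z_t^{H,q})^2 = t^{2H}."""
    u = (np.arange(n) + 0.5) * (t / n)                       # midpoints of [0, t]
    inner = (u**(2*H - 1) + (t - u)**(2*H - 1)) / (2*H - 1)  # = int_0^t |u-v|^{2H-2} dv
    return H * (2*H - 1) * np.mean(inner) * t

print(isometry_rhs(1.0, H=0.7))  # ≈ 1.0 = 1^{2H}
```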
The Hermite–Ornstein–Uhlenbeck (HOU) process (or the Ornstein–Uhlenbeck process driven by a Hermite noise) $({V_{t}},t\ge 0)$ is given by the formula (2) and it solves the stochastic differential equation (3). The stochastic integral in (2) is a well-defined Hermite–Wiener integral since its integrand satisfies (5). It can be also interpreted as a Riemann-Stieltjes integral with respect to the Hermite process ${Z^{H,q}}$ (see [2] or [20]). We also know that for every $T\gt 0$ and for every $p\ge 1$, we have (see also the proof of Proposition 1)
and for $0\le s\lt t\le T$,
Let us now discuss the Ornstein–Uhlenbeck process driven by a HOU process (or the OU-HOU process). This stochastic process, denoted $({X_{t}},t\ge 0)$, is the unique solution to the model (1) and it can be written as
The stochastic integral above is well-defined as a pathwise Riemann–Stieltjes integral. This is because the paths of the integrator V are δ-Hölder continuous, for each $\delta \in (0,H)$, as a consequence of the inequality (7). We can also write, for every $t\ge 0$,
(8)
\[ {X_{t}}=-{\int _{0}^{t}}{e^{-\theta (t-s)}}{V_{s}}ds+{\int _{0}^{t}}{e^{-\theta (t-s)}}d{Z_{s}^{H,q}}.\]
Another useful representation of the OU-HOU process is the following, see, e.g., [12]. For every $t\ge 0$,
where ${X^{(1)}}$ and ${X^{(\theta )}}$ are HOU processes with drift parameters 1 and θ, respectively, i.e.,
for $t\ge 0$. We also notice that ${X_{t}^{(\theta )}}$ is a random variable belonging to the qth Wiener chaos. Indeed, from the definition of the Wiener integral with respect to the Hermite process (see, e.g., [20]), for $t\ge 0$,
Therefore,
From (9) and the analysis of the HOU process (see, e.g., [16], [19] or [20]), we can easily deduce the following properties of the OU-HOU process.
(10)
\[ {X_{t}^{(\theta )}}={\int _{0}^{t}}{e^{-\theta (t-s)}}d{Z_{s}^{H,q}}\hspace{1em}\text{and}\hspace{1em}{X_{t}^{(1)}}={\int _{0}^{t}}{e^{-(t-s)}}d{Z_{s}^{H,q}},\]
\[\begin{aligned}{}{X_{t}^{(\theta )}}& =d(H){\int _{{\mathbb{R}^{q}}}}dB({y_{1}})\dots dB({y_{q}})\\ {} & \hspace{1em}\left({\int _{0}^{t}}du{e^{-\theta (t-u)}}{(u-{y_{1}})_{+}^{-\left(\frac{1}{2}+\frac{1-H}{q}\right)}}\dots {(u-{y_{q}})_{+}^{-\left(\frac{1}{2}+\frac{1-H}{q}\right)}}\right)\\ {} & ={I_{q}}({h_{\theta ,t}}),\end{aligned}\]
where, for every ${y_{1}},\dots ,{y_{q}}\in \mathbb{R}$,
(11)
\[ {h_{\theta ,t}}({y_{1}},\dots ,{y_{q}})=d(H){\int _{0}^{t}}du{e^{-\theta (t-u)}}{(u-{y_{1}})_{+}^{-\left(\frac{1}{2}+\frac{1-H}{q}\right)}}\dots {(u-{y_{q}})_{+}^{-\left(\frac{1}{2}+\frac{1-H}{q}\right)}}.\]
(12)
\[ {X_{t}}={I_{q}}({g_{\theta ,t}})\hspace{1em}\text{with}\hspace{2.5pt}{g_{\theta ,t}}=\frac{1}{1-\theta }{h_{1,t}}-\frac{\theta }{1-\theta }{h_{\theta ,t}}.\]
Proof.
Let $T\gt 0$ and let ${X^{(\theta )}}$ be the HOU process given by (10). From Proposition 3.3 in [20], we know that for all $p\ge 1$,
Thus the bound (13) follows directly from (9). To see point 2, we have, for every $0\le s\lt t\le T$,
\[ {X_{t}^{(\theta )}}-{X_{s}^{(\theta )}}=-\theta {\int _{s}^{t}}{X_{u}^{(\theta )}}du+{Z_{t}^{H,q}}-{Z_{s}^{H,q}}.\]
Thus, for every $p\ge 1$, by using (15),
\[\begin{aligned}{}\mathbf{E}{\left|{X_{t}^{(\theta )}}-{X_{s}^{(\theta )}}\right|^{p}}& \le {C_{T,p,\theta }}\left(\mathbf{E}{\left|{\int _{s}^{t}}{X_{u}^{(\theta )}}du\right|^{p}}+\mathbf{E}{\left|{Z_{t}^{H,q}}-{Z_{s}^{H,q}}\right|^{p}}\right)\\ {} & \le {C_{T,p,\theta }}\left(|t-s{|^{p}}+|t-s{|^{pH}}\right)\\ {} & \le {C_{T,p,\theta }}|t-s{|^{pH}}.\end{aligned}\]
□
3 Quadratic variation of the OU-HOU process
For the stochastic process $({X_{t}},t\ge 0)$ given by (1), we use the notation
where ${t_{i}}=\frac{i}{N}$ for $i=0,1,\dots ,N$. This is known in the literature as the (centered and renormalized) quadratic variation of the process X associated to the uniform partition of the unit interval $[0,1]$.
(16)
\[ {V_{N}}(X)=\frac{1}{N}{\sum \limits_{i=0}^{N-1}}\left[\frac{{\left({X_{{t_{i+1}}}}-{X_{{t_{i}}}}\right)^{2}}}{{N^{-2H}}}-1\right],\]
Our estimators for the parameters θ and H that appear in the model (1)–(3) will be constructed in terms of the sequence $({V_{N}}(X),N\ge 1)$ given by (16), where X is the OU-HOU process from (1) or (8). The asymptotic properties of these estimators are related to the limit behavior of ${V_{N}}(X)$ as $N\to \infty $. We therefore analyze its asymptotic behavior.
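For discrete observations ${X_{{t_{0}}}},\dots ,{X_{{t_{N}}}}$, the statistic (16) is a one-line computation; a minimal sketch in our own notation:

```python
import numpy as np

def quadratic_variation(X, H):
    """Centered, renormalized quadratic variation V_N(X) of (16), for a path
    observed at t_i = i/N, i = 0, ..., N (uniform partition of [0, 1])."""
    N = len(X) - 1
    dX = np.diff(X)
    return np.mean(dX**2 * N**(2 * H) - 1.0)

# For a flat path every increment vanishes and each bracket equals -1:
print(quadratic_variation(np.ones(101), H=0.7))  # -1.0
```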
We will state and prove a general result concerning the almost sure convergence of random variables in the Wiener chaos. It shows that convergence in ${L^{2}}(\Omega )$ at an explicit power rate ${N^{-\gamma }}$, $\gamma \gt 0$, of a sequence of random variables in a fixed chaos implies almost sure convergence (without the need to pass to a subsequence).
Lemma 1.
Let $p\ge 1$. Consider the sequence $({F_{N}},N\ge 1)$ where ${F_{N}}={I_{p}}({f_{N}})$ with ${f_{N}}\in {L^{2}}({\mathbb{R}^{p}})$ for every $N\ge 1$. Let $F={I_{p}}(f)$ with $f\in {L^{2}}({\mathbb{R}^{p}})$. Assume that, for N large enough, there exists a constant $C\ge 0$ such that
with $\gamma \gt 0$. Then the sequence $({F_{N}},N\ge 1)$ converges almost surely, as $N\to \infty $, to F.
Proof.
By (17) and the hypercontractivity property of random variables in Wiener chaos (see, e.g., [15]), we have for every $p\ge 1$,
for N sufficiently large. Let $0\lt \delta \lt \frac{\gamma }{2}$. We have by (18), for any $p\ge 1$,
and the above series is convergent for p large. The Borel–Cantelli lemma gives the desired almost sure convergence. □
We now deduce the asymptotic behavior of the quadratic variation of the OU-HOU process. A key element of the proof is the behavior of the quadratic variation of the Hermite process ${Z^{H,q}}$. We denote, for $N\ge 1$,
By Proposition 4.1 in [11], we know that, for all $q\ge 2$,
where ${K_{H,q}}$ is an explicit strictly positive constant depending on H and q, and ${Z^{{H^{\prime }},2}}$ is a Rosenblatt process (i.e., a Hermite process of order $q=2$) with self-similarity index ${H^{\prime }}=\frac{2(H-1)}{q}+1$. In particular, the sequence $\left({K_{H,q}}{N^{\frac{2-2H}{q}}}{V_{N}}({Z^{H,q}}),N\ge 1\right)$ converges in ${L^{2}}(\Omega )$ to ${Z_{1}^{{H^{\prime }},2}}$.
(19)
\[ {V_{N}}({Z^{H,q}})=\frac{1}{N}{\sum \limits_{i=0}^{N-1}}\left[\frac{{\left({Z_{{t_{i+1}}}^{H,q}}-{Z_{{t_{i}}}^{H,q}}\right)^{2}}}{{N^{-2H}}}-1\right],\]
(20)
\[ \mathbf{E}{\left|{K_{H,q}}{N^{\frac{2-2H}{q}}}{V_{N}}({Z^{H,q}})-{Z_{1}^{{H^{\prime }},2}}\right|^{2}}\le C\left\{\begin{array}{l@{\hskip10.0pt}l}{N^{\frac{4-4H}{q}-1}},\hspace{1em}& \text{if}\hspace{2.5pt}H\in \left(\frac{1}{2},\frac{3}{4}\right),\\ {} {N^{-\frac{1}{2}}}\log (N),\hspace{1em}& \text{if}\hspace{2.5pt}H=\frac{3}{4},\\ {} {N^{2H-2}},\hspace{1em}& \text{if}\hspace{2.5pt}H\in \left(\frac{3}{4},1\right),\end{array}\right.\]
From (20), we obtain the following result.
Proposition 2.
Let ${V_{N}}(X)$ be given by (16), where X is the OU-HOU process given by (1). Then, for every $q\ge 2$,
\[ {K_{H,q}}{N^{\frac{2-2H}{q}}}{V_{N}}(X){\to _{N\to \infty }}{Z_{1}^{{H^{\prime }},2}}\hspace{1em}\textit{almost surely and in}\hspace{2.5pt}{L^{2}}(\Omega ),\]
where ${Z_{1}^{{H^{\prime }},2}}$ is a Rosenblatt random variable with the Hurst parameter ${H^{\prime }}=\frac{2(H-1)}{q}+1$ and ${K_{H,q}}$ is the constant from (20). Moreover,
(21)
\[ \mathbf{E}{\left|{K_{H,q}}{N^{\frac{2-2H}{q}}}{V_{N}}(X)-{Z_{1}^{{H^{\prime }},2}}\right|^{2}}\le C\left\{\begin{array}{l@{\hskip10.0pt}l}{N^{\frac{4-4H}{q}-1}},\hspace{1em}& \textit{if}\hspace{2.5pt}H\in \left(\frac{1}{2},\frac{3}{4}\right),\\ {} {N^{-\frac{1}{2}}}\log (N),\hspace{1em}& \textit{if}\hspace{2.5pt}H=\frac{3}{4},\\ {} {N^{2H-2}},\hspace{1em}& \textit{if}\hspace{2.5pt}H\in \left(\frac{3}{4},1\right).\end{array}\right.\]
Proof.
We notice that, by (1) and (3),
where
From (6) and (13), it is immediate that for every $T\gt 0$, $0\le s\lt t\le T$ and $p\ge 1$, with a constant C that may depend on T, p and θ,
(22)
\[\begin{aligned}{}{X_{t}}& =-\theta {\int _{0}^{t}}{X_{s}}ds-{\int _{0}^{t}}{V_{s}}ds+{Z_{t}^{H,q}}\\ {} & ={Y_{t}}+{Z_{t}^{H,q}},\end{aligned}\]
(24)
\[\begin{aligned}{}\mathbf{E}|{Y_{t}}-{Y_{s}}{|^{p}}& \le C\left(\mathbf{E}{\left|{\int _{s}^{t}}{X_{u}}du\right|^{p}}+\mathbf{E}{\left|{\int _{s}^{t}}{V_{u}}du\right|^{p}}\right)\\ {} & \le C|t-s{|^{p-1}}{\int _{s}^{t}}\underset{0\le u\le T}{\sup }(\mathbf{E}|{X_{u}}{|^{p}}+\mathbf{E}|{V_{u}}{|^{p}})du\\ {} & \le C|t-s{|^{p}}.\end{aligned}\]
Before expanding ${V_{N}}(X)$, we express explicitly the effect of the decomposition ${X_{t}}={Y_{t}}+{Z_{t}^{H,q}}$ from Equation (22). For each i, we write
with the notations
and
To estimate the term ${A_{1,N}}$, we notice that, by (24), we have, for every $p\ge 1$,
\[ {X_{{t_{i+1}}}}-{X_{{t_{i}}}}=({Y_{{t_{i+1}}}}-{Y_{{t_{i}}}})+({Z_{{t_{i+1}}}^{H,q}}-{Z_{{t_{i}}}^{H,q}}).\]
Hence
\[ {({X_{{t_{i+1}}}}-{X_{{t_{i}}}})^{2}}={({Z_{{t_{i+1}}}^{H,q}}-{Z_{{t_{i}}}^{H,q}})^{2}}+{({Y_{{t_{i+1}}}}-{Y_{{t_{i}}}})^{2}}+2({Y_{{t_{i+1}}}}-{Y_{{t_{i}}}})({Z_{{t_{i+1}}}^{H,q}}-{Z_{{t_{i}}}^{H,q}}),\]
and therefore
\[ {V_{N}}(X)={V_{N}}({Z^{H,q}})+{N^{2H-1}}{\sum \limits_{i=0}^{N-1}}{({Y_{{t_{i+1}}}}-{Y_{{t_{i}}}})^{2}}+2{N^{2H-1}}{\sum \limits_{i=0}^{N-1}}({Y_{{t_{i+1}}}}-{Y_{{t_{i}}}})({Z_{{t_{i+1}}}^{H,q}}-{Z_{{t_{i}}}^{H,q}}).\]
Multiplying by ${K_{H,q}}{N^{\frac{2-2H}{q}}}$ and subtracting ${Z_{1}^{{H^{\prime }},2}}$ yield
(25)
\[\begin{aligned}{}& {K_{H,q}}{N^{\frac{2-2H}{q}}}{V_{N}}(X)-{Z_{1}^{{H^{\prime }},2}}\\ {} & \hspace{1em}={K_{H,q}}{N^{\frac{2-2H}{q}}}{V_{N}}({Z^{H,q}})-{Z_{1}^{{H^{\prime }},2}}+{K_{H,q}}{N^{\frac{2-2H}{q}}}{N^{2H-1}}{\sum \limits_{i=0}^{N-1}}{\left({Y_{{t_{i+1}}}}-{Y_{{t_{i}}}}\right)^{2}}\\ {} & \hspace{2em}+2{K_{H,q}}{N^{\frac{2-2H}{q}}}{N^{2H-1}}{\sum \limits_{i=0}^{N-1}}\left({Y_{{t_{i+1}}}}-{Y_{{t_{i}}}}\right)\left({Z_{{t_{i+1}}}^{H,q}}-{Z_{{t_{i}}}^{H,q}}\right)\\ {} & \hspace{1em}={K_{H,q}}{N^{\frac{2-2H}{q}}}{V_{N}}({Z^{H,q}})-{Z_{1}^{{H^{\prime }},2}}+{A_{1,N}}+{A_{2,N}},\end{aligned}\]
(26)
\[ {A_{1,N}}={K_{H,q}}{N^{\frac{2-2H}{q}}}{N^{2H-1}}{\sum \limits_{i=0}^{N-1}}{\left({Y_{{t_{i+1}}}}-{Y_{{t_{i}}}}\right)^{2}}\]
(27)
\[ {A_{2,N}}=2{K_{H,q}}{N^{\frac{2-2H}{q}}}{N^{2H-1}}{\sum \limits_{i=0}^{N-1}}\left({Y_{{t_{i+1}}}}-{Y_{{t_{i}}}}\right)\left({Z_{{t_{i+1}}}^{H,q}}-{Z_{{t_{i}}}^{H,q}}\right).\]
\[\begin{aligned}{}\mathbf{E}{\left|{A_{1,N}}\right|^{p}}& ={K_{H,q}^{p}}{N^{\frac{(2-2H)p}{q}}}{N^{(2H-1)p}}\mathbf{E}{\left|{\sum \limits_{i=0}^{N-1}}{({Y_{{t_{i+1}}}}-{Y_{{t_{i}}}})^{2}}\right|^{p}}\\ {} & \le C{N^{\frac{(2-2H)p}{q}}}{N^{(2H-1)p}}{N^{p-1}}{\sum \limits_{i=0}^{N-1}}\mathbf{E}|{Y_{{t_{i+1}}}}-{Y_{{t_{i}}}}{|^{2p}}\\ {} & \le C{N^{\frac{(2-2H)p}{q}}}{N^{(2H-1)p}}{N^{p-1}}{N^{1-2p}}.\end{aligned}\]
Thus
and this converges to zero as $N\to \infty $. Concerning the summand ${A_{2,N}}$, let us notice that, by using (10),
\[ {Y_{t}}=-\frac{\theta }{1-\theta }{\int _{0}^{t}}{X_{s}^{(1)}}ds+\frac{{\theta ^{2}}}{1-\theta }{\int _{0}^{t}}{X_{s}^{(\theta )}}ds-{\int _{0}^{t}}{V_{s}}ds,\]
where ${X^{(1)}}$, ${X^{(\theta )}}$ and V are all HOU processes. By using the proof of Proposition 2 in [3], we deduce that
The bound (21) is obtained by plugging the estimates (20), (28) and (29) into (25). The almost sure convergence is a consequence of Lemma 1, since all the summands on the right-hand side of (25) belong to the qth Wiener chaos. □
We will focus in the next sections on the case $q=2$ (the Rosenblatt process case). The particularity of this case is that the limit of the quadratic variation ${V_{N}}(X)$ is exactly the driving noise that appears in (1). Let us write separately the result obtained in Proposition 2 for $q=2$.
Corollary 1.
Let $q=2$, $\theta \gt 0$ and let $({X_{t}},t\ge 0)$ be given by (1). Then
\[ {K_{H,2}}{N^{1-H}}{V_{N}}(X){\to _{N\to \infty }}{Z_{1}^{H,2}}\hspace{1em}\textit{almost surely and in}\hspace{2.5pt}{L^{2}}(\Omega ),\]
where ${Z_{1}^{H,2}}$ is the value at time 1 of the stochastic process ${Z^{H,2}}$ which appears as the integrator in (2). Moreover,
\[ \mathbf{E}{\left|{K_{H,2}}{N^{1-H}}{V_{N}}(X)-{Z_{1}^{H,2}}\right|^{2}}\le C\left\{\begin{array}{l@{\hskip10.0pt}l}{N^{1-2H}},\hspace{1em}& \textit{if}\hspace{2.5pt}H\in \left(\frac{1}{2},\frac{3}{4}\right),\\ {} {N^{-\frac{1}{2}}}\log (N),\hspace{1em}& \textit{if}\hspace{2.5pt}H=\frac{3}{4},\\ {} {N^{2H-2}},\hspace{1em}& \textit{if}\hspace{2.5pt}H\in \left(\frac{3}{4},1\right).\end{array}\right.\]
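As a small bookkeeping aid (not part of the paper), the ${L^{2}}(\Omega )$ rate exponent of (21), which for $q=2$ reduces to the bound of Corollary 1, can be encoded as follows; note the change of regime at $H=\frac{3}{4}$.

```python
def l2_rate_exponent(H, q=2):
    """Exponent a such that the L2 bound (21) is of order N^a
    (with an extra log N factor at H = 3/4); q = 2 gives Corollary 1."""
    if 0.5 < H < 0.75:
        return (4 - 4 * H) / q - 1   # equals 1 - 2H when q = 2
    if H == 0.75:
        return -0.5                  # up to the log N factor
    if 0.75 < H < 1.0:
        return 2 * H - 2
    raise ValueError("H must lie in (1/2, 1)")
```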
4 Estimation of the drift parameter
The main goal of this section is to derive an estimator for the drift parameter of the OU-HOU process in the Rosenblatt case ($q=2$). We will assume for the moment that the Hurst parameter H is known (it will be estimated in the next section). We will suppose in the sequel that we have access to the discrete observations of the process $({X_{t}},t\ge 0)$ on the uniform partition of the interval $[0,1]$. We will start by describing the construction of our estimator and then derive its asymptotic properties.
4.1 The definition and the consistency of the estimator
The procedure to derive an estimator for θ is inspired by the work [10]. Let $H\in \left(\frac{1}{2},1\right)$. From (1) and (3), we can write, for every $t\ge 0$,
Approximating Lebesgue integrals by Riemann sums, the relation (31) leads to the definition of the estimator
\[\begin{aligned}{}{X_{t}}& =-\theta {\int _{0}^{t}}{X_{s}}ds-{\int _{0}^{t}}{V_{s}}ds+{Z_{t}^{H,q}}\\ {} & =-\theta {\int _{0}^{t}}{X_{s}}ds-{\int _{0}^{t}}{X_{s}}ds-\theta {\int _{0}^{t}}ds\left({\int _{0}^{s}}{X_{u}}du\right)+{Z_{t}^{H,q}}\\ {} & =-\theta {\int _{0}^{t}}{X_{s}}(1+t-s)ds-{\int _{0}^{t}}{X_{s}}ds+{Z_{t}^{H,q}}.\end{aligned}\]
In particular, for $t=1$,
Assume that $q=2$. In this case, we construct an estimator for the drift parameter θ in (1) by using the result in Corollary 1. From (30), with ${V_{N}}(X)$ given by (16),
\[\begin{aligned}{}{K_{H,2}}{N^{1-H}}{V_{N}}(X)-{X_{1}}& =\theta {\int _{0}^{1}}{X_{s}}(2-s)ds+{\int _{0}^{1}}{X_{s}}ds\\ {} & \hspace{1em}+{K_{H,2}}{N^{1-H}}{V_{N}}(X)-{Z_{1}^{H,2}}.\end{aligned}\]
Since, by Corollary 1,
\[ {K_{H,2}}{N^{1-H}}{V_{N}}(X)-{Z_{1}^{H,2}}\underset{N\to \infty }{\to }0\hspace{1em}\text{almost surely,}\]
we can write (where ${a_{N}}\sim {b_{N}}$ means that ${a_{N}}-{b_{N}}$ converges to zero almost surely)
(31)
\[ {K_{H,2}}{N^{1-H}}{V_{N}}(X)-{X_{1}}\sim \theta {\int _{0}^{1}}{X_{s}}(2-s)ds+{\int _{0}^{1}}{X_{s}}ds.\]
From the previous results, we deduce the consistency of the above estimator.
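Solving the approximate relation (31) for θ, with the Lebesgue integrals replaced by Riemann sums, suggests the sketch below. It is our own rendering: `K` stands for the constant ${K_{H,2}}$ of Corollary 1, which we pass in explicitly rather than reproduce, and the demo path is a placeholder rather than an OU-HOU sample.

```python
import numpy as np

def theta_hat(X, H, K):
    """Drift estimator suggested by (31): Riemann sums replace the Lebesgue
    integrals, X holds observations at t_i = i/N (i = 0, ..., N), and
    K = K_{H,2} is the constant of Corollary 1, supplied by the caller."""
    N = len(X) - 1
    t = np.arange(1, N + 1) / N                 # right endpoints t_{i+1}
    dX = np.diff(X)
    V_N = np.mean(dX**2 * N**(2 * H) - 1.0)     # quadratic variation (16)
    numerator = K * N**(1 - H) * V_N - X[-1] - np.mean(X[1:])
    denominator = np.mean(X[1:] * (2.0 - t))
    return numerator / denominator

rng = np.random.default_rng(0)
X_obs = np.cumsum(rng.standard_normal(1001)) / 30.0  # placeholder path
est = theta_hat(X_obs, H=0.6, K=1.0)
```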
Proposition 3.
Let $H\in \left(\frac{1}{2},1\right)$ and let ${\widehat{\theta }_{N}}$ be given by (32). Then ${\widehat{\theta }_{N}}$ is strongly consistent, i.e.,
Proof.
The result is obtained from Corollary 1, and from the almost sure convergences
and
□
As a straightforward corollary, we deduce that the limit ${\textstyle\int _{0}^{1}}{X_{s}}(2-s)ds$ is nonzero almost surely, which will be important for the estimation procedure.
4.2 The limit distribution of the estimator
In this section, we derive the limit distribution of the estimator ${\widehat{\theta }_{N}}$ of the drift parameter θ. As mentioned above, we assume for now that the Hurst parameter H is known.
We start with a technical lemma which is needed for the proof of the main result of this section.
Lemma 2.
Let $H\in \left(\frac{1}{2},1\right)$ and let, for $N\ge 1$,
Then
In particular, the sequence $({U_{N}},N\ge 1)$ converges to ${\textstyle\int _{0}^{1}}{X_{s}}(2-s)ds$ in ${L^{2}}(\Omega )$.
Proof.
We can write
\[\begin{aligned}{}& {U_{N}}-{\int _{0}^{1}}{X_{s}}(2-s)ds\\ {} & \hspace{1em}={\sum \limits_{i=0}^{N-1}}{\int _{{t_{i}}}^{{t_{i+1}}}}\left({X_{{t_{i+1}}}}(2-{t_{i+1}})-{X_{s}}(2-s)\right)ds\\ {} & \hspace{1em}={\sum \limits_{i=0}^{N-1}}{\int _{{t_{i}}}^{{t_{i+1}}}}({X_{{t_{i+1}}}}-{X_{s}})(2-{t_{i+1}})ds-{\sum \limits_{i=0}^{N-1}}{\int _{{t_{i}}}^{{t_{i+1}}}}{X_{s}}({t_{i+1}}-s)ds.\end{aligned}\]
Using the elementary inequality $|a-b{|^{2}}\le 2({a^{2}}+{b^{2}})$, we obtain
\[ \mathbf{E}{\left|{U_{N}}-{\int _{0}^{1}}{X_{s}}(2-s)ds\right|^{2}}\le 2\mathbf{E}{\left|{A_{N}}\right|^{2}}+2\mathbf{E}{\left|{B_{N}}\right|^{2}},\]
where ${A_{N}}$ and ${B_{N}}$ denote the two sums above. Next, for each i, we use the Cauchy–Schwarz inequality to control the square of the integral
and summing over i gives
Similarly,
Since $2\gt 2H$ and ${t_{i+1}}-s\le 1/N$, we have ${({t_{i+1}}-s)^{2}}\le {({t_{i+1}}-s)^{2H}}$, so the second sum satisfies the same bound as the first one and the stated estimate follows. □
Theorem 1.
Assume $H\in \left(\frac{1}{2},\frac{2}{3}\right)$ and let ${\widehat{\theta }_{N}}$ be given by (32). Then
\[ {E_{H,2}}{N^{H-\frac{1}{2}}}\left({\int _{0}^{1}}{X_{s}}(2-s)ds\right)\left({\widehat{\theta }_{N}}-\theta \right){\to _{N\to \infty }^{(d)}}Z,\]
where ${E_{H,2}}\gt 0$ is an explicit constant depending only on H, and Z is a standard normal random variable.
Proof.
First we prove that, with ${U_{N}}$ given by (33),
where ${E_{H,2}}\gt 0$ is an explicit constant depending only on H, and Z is a standard normal random variable. By using (32), with the expression of ${X_{1}}$ taken from (30), we can write, for every integer $N\ge 1$,
Taking into account the decomposition (25), we get
By using the estimates (28) and (29) with $p=1$, $q=2$, we obtain
and this converges to zero as $N\to \infty $ for $H\lt \frac{3}{4}$. By writing
(35)
\[ {E_{H,2}}{N^{H-\frac{1}{2}}}{U_{N}}\left({\widehat{\theta }_{N}}-\theta \right){\to _{N\to \infty }^{(d)}}Z,\]
(36)
\[\begin{aligned}{}{\widehat{\theta }_{N}}-\theta & =\frac{{K_{H,2}}{N^{1-H}}{V_{N}}(X)-{Z_{1}^{H,2}}}{\frac{1}{N}{\textstyle\textstyle\sum _{i=0}^{N-1}}{X_{{t_{i+1}}}}(2-{t_{i+1}})}+\frac{{\textstyle\textstyle\int _{0}^{1}}{X_{s}}ds-\frac{1}{N}{\textstyle\textstyle\sum _{i=0}^{N-1}}{X_{{t_{i+1}}}}}{\frac{1}{N}{\textstyle\textstyle\sum _{i=0}^{N-1}}{X_{{t_{i+1}}}}(2-{t_{i+1}})}\\ {} & \hspace{1em}+\theta \frac{{\textstyle\textstyle\int _{0}^{1}}{X_{s}}(2-s)ds-\frac{1}{N}{\textstyle\textstyle\sum _{i=0}^{N-1}}{X_{{t_{i+1}}}}(2-{t_{i+1}})}{\frac{1}{N}{\textstyle\textstyle\sum _{i=0}^{N-1}}{X_{{t_{i+1}}}}(2-{t_{i+1}})}\\ {} & :=\frac{{K_{H,2}}{N^{1-H}}{V_{N}}(X)-{Z_{1}^{H,2}}}{{U_{N}}}+\frac{{T_{1,N}}}{{U_{N}}}+\frac{{T_{2,N}}}{{U_{N}}}.\end{aligned}\]
\[ {U_{N}}\left({\widehat{\theta }_{N}}-\theta \right)={K_{H,2}}{N^{1-H}}{V_{N}}({Z^{H,q}})-{Z_{1}^{H,2}}+{A_{1,N}}+{A_{2,N}}+{T_{1,N}}+{T_{2,N}},\]
with ${A_{1,N}}$, ${A_{2,N}}$ defined by (26) and (27), respectively. It has been shown in Proposition 3 in [1] (see also Theorem 3.3 in [21]) that, for $H\in \left(\frac{1}{2},\frac{2}{3}\right)$,
\[ {E_{H,2}}{N^{H-\frac{1}{2}}}\left({K_{H,2}}{N^{1-H}}{V_{N}}({Z^{H,q}})-{Z_{1}^{H,2}}\right){\to _{N\to \infty }^{(d)}}N(0,1),\]
where ${E_{H,2}}\gt 0$ is a constant. To deduce the limit (35), it remains to show that, for $i=1,2$, with ${T_{1,N}}$, ${T_{2,N}}$ defined by (36), and with ${A_{i,N}}$ given by (26), (27),
(37)
\[ {N^{H-\frac{1}{2}}}{A_{i,N}}\hspace{1em}\text{and}\hspace{1em}{N^{H-\frac{1}{2}}}{T_{i,N}}{\to _{N\to \infty }}0\hspace{1em}\text{in probability.}\]
\[ {\int _{0}^{1}}{X_{s}}ds-\frac{1}{N}{\sum \limits_{i=0}^{N-1}}{X_{{t_{i+1}}}}={\sum \limits_{i=0}^{N-1}}{\int _{{t_{i}}}^{{t_{i+1}}}}({X_{s}}-{X_{{t_{i+1}}}})ds,\]
we get
\[\begin{aligned}{}\mathbf{E}{\left|{N^{H-\frac{1}{2}}}{T_{1,N}}\right|^{2}}& =\mathbf{E}{\left|{N^{H-\frac{1}{2}}}\left({\int _{0}^{1}}{X_{s}}ds-\frac{1}{N}{\sum \limits_{i=0}^{N-1}}{X_{{t_{i+1}}}}\right)\right|^{2}}\\ {} & \le C{N^{2H-1}}{\sum \limits_{i=0}^{N-1}}{\int _{{t_{i}}}^{{t_{i+1}}}}\mathbf{E}{({X_{s}}-{X_{{t_{i+1}}}})^{2}}ds.\end{aligned}\]
By (14), we obtain
\[ \mathbf{E}{\left|{N^{H-\frac{1}{2}}}{T_{1,N}}\right|^{2}}\le C{N^{2H-1}}\times {N^{-2H}}=C{N^{-1}}.\]
Next, by (34),
\[\begin{aligned}{}& \mathbf{E}{\left|{N^{H-\frac{1}{2}}}{T_{2,N}}\right|^{2}}\\ {} & \hspace{1em}=\mathbf{E}{\left|{N^{H-\frac{1}{2}}}\left({\int _{0}^{1}}{X_{s}}(2-s)ds-\frac{1}{N}{\sum \limits_{i=0}^{N-1}}{X_{{t_{i+1}}}}(2-{t_{i+1}})\right)\right|^{2}}\\ {} & \hspace{1em}\le C{N^{2H-1}}\times {N^{-2H}}=C{N^{-1}},\end{aligned}\]
and therefore (37) also holds for $i=2$. This concludes the proof of (35). To get the stated result, we write
\[\begin{aligned}{}& {E_{H,2}}{N^{H-\frac{1}{2}}}{\int _{0}^{1}}{X_{s}}(2-s)ds\left({\widehat{\theta }_{N}}-\theta \right)\\ {} & \hspace{1em}={E_{H,2}}{N^{H-\frac{1}{2}}}{U_{N}}\left({\widehat{\theta }_{N}}-\theta \right)\\ {} & \hspace{2em}+{E_{H,2}}{N^{H-\frac{1}{2}}}\left({\int _{0}^{1}}{X_{s}}(2-s)ds-{U_{N}}\right)\left({\widehat{\theta }_{N}}-\theta \right).\end{aligned}\]
We show that the last summand from above goes to zero in probability as $N\to \infty $. By Lemma 2, ${N^{H-\frac{1}{2}}}\left({\textstyle\int _{0}^{1}}{X_{s}}(2-s)ds-{U_{N}}\right){\to _{N\to \infty }}0$ in ${L^{2}}(\Omega )$ and, by Proposition 3, ${\widehat{\theta }_{N}}-\theta $ converges to zero almost surely as N tends to infinity. □
5 Modified quadratic variation and Gaussian estimators for the Hurst parameter
In this section, we deal with the estimation of the Hurst parameter of the OU-HOU process. We propose an approach that avoids a Rosenblatt-type estimator for the Hurst index, which can be difficult to use in real applications. Instead, we adapt the modified quadratic variation method of Ayache and Tudor [3] to the Ornstein–Uhlenbeck process driven by Hermite–Ornstein–Uhlenbeck (OU-HOU) noise. We first construct the statistic, then prove a central limit theorem (CLT) with an explicit rate in the Wasserstein distance by using known results for the Hermite process. From these results, we derive a strongly consistent and asymptotically normal estimator of the Hurst parameter H and discuss the implications for the estimation of θ.
5.1 The modified quadratic variation
The construction proposed in [3] is based on some special increments of the Hermite process along the dyadic partition of the interval $[0,1]$. We now introduce the dyadic discretization that will be used to construct localized increments of the OU-HOU process. This discretization is designed to isolate the diagonal singularity of the Hermite kernel and to produce independent contributions across disjoint windows.
Definition 1 (Dyadic anchors and index sets).
Fix parameters $\beta \in (0,1)$ and $\gamma \in (0,\beta )$. For each integer $N\ge 1$, define
(38)
\[ {L_{N}}:=\Big\{0,1,\dots ,\big\lfloor {2^{N(1-\beta )}}\big\rfloor -1\Big\},\hspace{2em}{L_{N,\gamma }}:={L_{N}}\cap \Big[1,\hspace{0.1667em}\big\lfloor {2^{N\gamma }}\big\rfloor \Big].\]Thus ${L_{N}}$ indexes the admissible anchors at resolution ${2^{-N}}$, and ${L_{N,\gamma }}$ selects the first $\lfloor {2^{N\gamma }}\rfloor $ of them. Note that $|{L_{N,\gamma }}|\asymp {2^{N\gamma }}$ as $N\to \infty $. For $l\in {L_{N}}$, the corresponding dyadic anchor is denoted by ${e_{l,N,\beta }}$.
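For concreteness, the index sets of Definition 1 can be computed directly from (38). The following Python sketch is an illustration only (the function name is ours, not part of the paper); it builds ${L_{N}}$ and ${L_{N,\gamma }}$ for given parameters.

```python
# Index sets from (38); an illustrative sketch, not the paper's code.
def index_sets(N, beta, gamma):
    assert 0 < gamma < beta < 1
    # L_N = {0, 1, ..., floor(2^{N(1-beta)}) - 1}
    L_N = list(range(int(2 ** (N * (1 - beta)))))
    # L_{N,gamma} = L_N intersected with [1, floor(2^{N gamma})]
    bound = int(2 ** (N * gamma))
    L_N_gamma = [l for l in L_N if 1 <= l <= bound]
    return L_N, L_N_gamma
```

For instance, with $N=10$, $\beta =0.5$, $\gamma =0.25$ one gets $|{L_{N}}|=32$ and ${L_{N,\gamma }}=\{1,\dots ,5\}$, consistent with $|{L_{N,\gamma }}|\asymp {2^{N\gamma }}$.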
Definition 2 (Localized increments).
For each $l\in {L_{N,\gamma }}$, define the increment of the OU-HOU process X of length ${2^{-N}}$ anchored at ${e_{l,N,\beta }}$ by
\[ \Delta {X_{l,N}}:={X_{{e_{l,N,\beta }}+{2^{-N}}}}-{X_{{e_{l,N,\beta }}}}.\]
Similarly, define the corresponding increment of the Hermite driver ${Z^{H,q}}$ by
\[ \Delta {Z_{l,N}}:={Z_{{e_{l,N,\beta }}+{2^{-N}}}^{H,q}}-{Z_{{e_{l,N,\beta }}}^{H,q}}.\]
By construction, if $l\ne k$ then the intervals $[{e_{l,N,\beta }},\hspace{0.1667em}{e_{l,N,\beta }}+{2^{-N}}]$ and $[{e_{k,N,\beta }},\hspace{0.1667em}{e_{k,N,\beta }}+{2^{-N}}]$ are disjoint.
With these localized increments, we construct the modified quadratic variation of the OU-HOU process X. We set, for each $N\ge 1$,
(42)
\[\begin{aligned}{}{V_{N,\mathit{mod}}}(X)& =\frac{{2^{2HN}}}{\sqrt{|{L_{N,\gamma }}|}}\sum \limits_{l\in {L_{N,\gamma }}}\left({(\Delta {X_{l,N}})^{2}}-\mathbf{E}{(\Delta {Z_{l,N}})^{2}}\right)\\ {} & =\frac{{2^{2HN}}}{\sqrt{|{L_{N,\gamma }}|}}\sum \limits_{l\in {L_{N,\gamma }}}\left({(\Delta {X_{l,N}})^{2}}-{2^{-2HN}}\right).\end{aligned}\]
Similarly, we define the modified quadratic variation of the Hermite process
\[ {V_{N,\mathit{mod}}}({Z^{H,q}})=\frac{{2^{2HN}}}{\sqrt{|{L_{N,\gamma }}|}}\sum \limits_{l\in {L_{N,\gamma }}}\left({(\Delta {Z_{l,N}})^{2}}-{2^{-2HN}}\right).\]
We denote by $\| h{\| _{\mathit{Lip}}}$ the Lipschitz norm of h, given by $\| h{\| _{\mathit{Lip}}}={\sup _{x\ne y}}\frac{|h(x)-h(y)|}{|x-y|}$.
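As a sanity check on the normalization in (42), here is a minimal Python transcription of the statistic, taking the localized increments as input (how these increments are obtained from data is a separate matter; the function name is ours):

```python
import math

# Modified quadratic variation (42); a minimal sketch, not the paper's code.
def v_mod(increments, N, H):
    """increments: list of Delta X_{l,N} for l in L_{N,gamma}."""
    m = len(increments)                       # |L_{N,gamma}|
    scale = 2 ** (2 * H * N) / math.sqrt(m)   # 2^{2HN} / sqrt(|L_{N,gamma}|)
    # centered sum of squared increments; 2^{-2HN} = E (Delta Z_{l,N})^2
    return scale * sum(d * d - 2 ** (-2 * H * N) for d in increments)
```

If every increment had squared size exactly ${2^{-2HN}}$, the statistic would vanish identically; its fluctuations around zero are what the results below quantify.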
We will use the Wasserstein metric to evaluate the distance between probability distributions. Let us recall its definition. Let
\[ \mathcal{A}=\{h:\mathbb{R}\to \mathbb{R},h\hspace{2.5pt}\text{is Lipschitz continuous with}\hspace{2.5pt}\| h{\| _{\mathit{Lip}}}\le 1\}\]
and let F, G be random variables such that $h(F),h(G)\in {L^{1}}(\Omega )$ for every $h\in \mathcal{A}$. Then the Wasserstein distance between the probability distributions of F and G is defined by
(43)
\[ {d_{W}}({P_{F}},{P_{G}})=\underset{h\in \mathcal{A}}{\sup }\left|\mathbf{E}h(F)-\mathbf{E}h(G)\right|.\]The behavior of the sequence $({V_{N,\mathit{mod}}}({Z^{H,q}}),N\ge 1)$ has been analyzed in [3]. We have the following result:
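On the real line, the Wasserstein distance (43) between two empirical measures with the same number of atoms reduces to the average gap between order statistics (a standard fact, not specific to this paper). A short Python sketch, with names of our choosing:

```python
# W1 distance between two empirical measures with equally many atoms;
# an illustrative sketch using the order-statistics representation.
def wasserstein_1d(xs, ys):
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)
```

In particular, shifting one sample by a constant c moves the empirical measure by exactly $|c|$ in this metric, matching the definition via 1-Lipschitz test functions.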
and for N large enough,
From the above result, we deduce the behavior of the modified variation of the OU-HOU process.
Proposition 4.
Proof.
We follow the idea of the proof of Proposition 2, based on the decomposition (20). From the formula (20), we get
\[ {V_{N,\mathit{mod}}}(X)={V_{N,\mathit{mod}}}({Z^{H,q}})+{B_{1,N}}+{B_{2,N}},\]
where
\[ {B_{1,N}}=\frac{{2^{2HN}}}{\sqrt{|{L_{N,\gamma }}|}}\sum \limits_{l\in {L_{N,\gamma }}}{(\Delta {Y_{l,N}})^{2}},\]
and
\[ {B_{2,N}}=2\frac{{2^{2HN}}}{\sqrt{|{L_{N,\gamma }}|}}\sum \limits_{l\in {L_{N,\gamma }}}(\Delta {Y_{l,N}})(\Delta {Z_{l,N}}),\]
with the notation $\Delta {Y_{l,N}}:={Y_{{e_{l,N,\beta }}+{2^{-N}}}}-{Y_{{e_{l,N,\beta }}}}$, where Y is given by (23). Next, the asymptotic behavior of ${V_{N,\mathit{mod}}}({Z^{H,q}})$ is given by (44) and (45). On the other hand, by using the calculations in the proof of Proposition 7 in [3], we can prove that
(46)
\[ \mathbf{E}|{B_{1,N}}|\le C{2^{(2H-2)N+\frac{N\gamma }{2}}}\hspace{1em}\text{and}\hspace{1em}\mathbf{E}|{B_{2,N}}|\le C{2^{(H-1)N+\frac{N\gamma }{2}}}.\]
The estimate (46), combined with (44) and (45), gives the conclusion. □
5.2 On the estimation of the Hurst and drift parameters
Using Proposition 4 and a standard procedure, we can define an estimator for the Hurst index of the OU-HOU process that solves the stochastic differential equation (1). That is, we let
\[ {S_{N}}(X)=\frac{1}{|{L_{N,\gamma }}|}\sum \limits_{l\in {L_{N,\gamma }}}{(\Delta {X_{l,N}})^{2}},\hspace{1em}N\ge 1,\]
and
(47)
\[ {\widehat{H}_{N}}(X)=-\frac{\log {S_{N}}(X)}{2N\log (2)},\hspace{1em}N\ge 1.\]
Since the OU-HOU process has a structure similar to that of the standard HOU process (in the sense that both can be written as the sum of the Hermite process and another process with nicer sample paths, see (20) and (24)), we can follow Section 6 in [3] to obtain the asymptotic properties of the estimator (47). We will have that ${\widehat{H}_{N}}(X)$ is strongly consistent, i.e.,
(48)
\[ {\widehat{H}_{N}}(X){\to _{N\to \infty }}H\hspace{1em}\text{almost surely},\]
and
\[ 2N\log (2)\sqrt{|{L_{N,\gamma }}|}\left(H-{\widehat{H}_{N}}(X)\right){\to _{N\to \infty }^{(d)}}N\left(0,\mathbf{E}|{Z_{1}^{H,q}}{|^{4}}-1\right).\]
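To illustrate the estimator numerically, the following Python sketch simulates the driver in the simplest case $q=1$, where the Hermite process is fractional Brownian motion, via exact Cholesky sampling, and evaluates the Hurst estimate from increments of length ${2^{-N}}$ indexed by ${L_{N,\gamma }}$. Two caveats: the anchor choice ${e_{l,N,\beta }}=l\hspace{0.1667em}{2^{-N}}$ used here is hypothetical and for illustration only (the paper uses the anchors of Definition 1), and the statistic is applied to the driver itself rather than to a simulated OU-HOU path. All function names are ours.

```python
# Numerical illustration for q = 1 (the Hermite driver is fractional
# Brownian motion).  The anchors e_{l,N,beta} = l * 2^{-N} below are a
# hypothetical simplification; function names are not from the paper.
import math
import random

def fbm_cholesky(times, H, rng):
    """Exact fBm sample at strictly positive times via Cholesky factorization."""
    n = len(times)
    # fBm covariance: 0.5 * (s^{2H} + t^{2H} - |s - t|^{2H})
    cov = [[0.5 * (s ** (2 * H) + t ** (2 * H) - abs(s - t) ** (2 * H))
            for s in times] for t in times]
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            acc = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(cov[i][i] - acc)
            else:
                L[i][j] = (cov[i][j] - acc) / L[j][j]
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]

def hurst_estimate(N, gamma, H_true, seed=0):
    """H-hat from squared increments of length 2^{-N}, l in L_{N,gamma}."""
    rng = random.Random(seed)
    m = int(2 ** (N * gamma))                        # |L_{N,gamma}|
    times = [(l + 1) * 2.0 ** (-N) for l in range(m + 1)]
    path = fbm_cholesky(times, H_true, rng)
    # S_N: mean of squared increments; here E S_N = 2^{-2HN} exactly
    S = sum((path[l + 1] - path[l]) ** 2 for l in range(m)) / m
    return -math.log2(S) / (2 * N)
```

With $N=10$ and $\gamma =0.5$ this uses only $32$ increments, yet the estimate typically lands within a few hundredths of the true H, in line with the strong consistency stated above.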
From the above considerations and the expression of the estimator (32), we can deduce a new estimator for the drift parameter θ in the model (1) when the Hurst parameter is unknown: we simply plug the Hurst estimator (47) into (32). That is, we set, for every $N\ge 1$,
We observe that the estimator (49) can be computed from observations of the OU-HOU process X at discrete times. In practice, we need the data ${X_{{t_{i}}}}$, $i=1,\dots ,N$, with ${t_{i}}=\frac{i}{N}$, together with the observations of X at the dyadic anchors given by (39). By following the proofs in [21], one sees that the constant ${K_{H,2}}$ depends continuously on the Hurst parameter H. Thus, taking into account (48) and Proposition 3, we deduce that ${\widehat{\theta }_{1,N}}$ given by (49) is a consistent estimator of θ when H is unknown, i.e., it converges in probability, as $N\to \infty $, to the drift parameter θ.