We study the time-inhomogeneous autoregressive model AR(1), that is, a process of the form

The classical autoregressive model

However, in real applications, we cannot always guarantee that all

Thus, we arrive at the model
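The displayed form of the model is not reproduced in this excerpt; assuming the standard time-inhomogeneous AR(1) recursion X_n = α_n X_{n-1} + W_n with time-varying coefficients α_n and i.i.d. Gaussian innovations W_n, a minimal simulation sketch (all names here are illustrative) looks like:

```python
import random

def simulate_inhomogeneous_ar1(alphas, x0=0.0, seed=0):
    """Simulate X_n = alpha_n * X_{n-1} + W_n, where the coefficients
    alpha_n vary with time and W_n are i.i.d. standard normal."""
    rng = random.Random(seed)
    x = x0
    path = [x]
    for a in alphas:
        x = a * x + rng.gauss(0.0, 1.0)
        path.append(x)
    return path

# Alternating coefficients make the chain essentially inhomogeneous.
alphas = [0.9 if n % 2 == 0 else 0.5 for n in range(100)]
path = simulate_inhomogeneous_ar1(alphas)
print(len(path))  # 101
```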

Usually, when studying the recurrence of Markov chains, we use techniques developed in the classical theory based on a drift condition (see, for example, [
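The specific drift condition referenced above is not reproduced in this excerpt. As a hedged numerical illustration of a classical geometric drift inequality E[V(X_1) | X_0 = x] ≤ λV(x) + b for a single homogeneous AR(1) step with V(y) = |y| + 1 (the function V, the coefficient, and the constants below are illustrative choices, not the paper's):

```python
import math
import random

def drift_check(alpha=0.5, x=10.0, n_samples=100_000, seed=1):
    """Monte Carlo estimate of E[V(X_1) | X_0 = x] for one AR(1) step
    X_1 = alpha * X_0 + W with standard normal W and V(y) = |y| + 1."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x1 = alpha * x + rng.gauss(0.0, 1.0)
        total += abs(x1) + 1.0
    return total / n_samples

alpha, x = 0.5, 10.0
lhs = drift_check(alpha, x)
# Analytic bound: E|alpha*x + W| <= alpha*|x| + E|W|, so with
# V(y) = |y| + 1 we get E[V(X_1)|X_0=x] <= alpha*V(x) + b,
# where b = E|W| + 1 = sqrt(2/pi) + 1.
bound = alpha * (abs(x) + 1.0) + math.sqrt(2.0 / math.pi) + 1.0
print(lhs <= bound)  # True
```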

The general theory of inhomogeneous Markov chains is much more involved than its homogeneous counterpart. One of the most popular tools in this area is the coupling method (see the classical books by T. Lindvall [

Summarizing the methods described above, we can outline the plan for this paper. First, we use the modified drift condition from [

This paper is organized as follows. Section

Throughout this paper, we assume that all random variables are defined on a common probability space

By

Next, we need a notation for a process conditioned on

In Section 4 and later we deal with a pair of inhomogeneous Markov chains

In this section we deal with a single time-inhomogeneous Markov chain

The main result of this section is presented in the following theorem.

The main tool in the proof is the drift condition from the paper [

For

Put

Condition 1, along with inequality (

Condition 1 in Theorem

Since random variables

The following immediate corollary could be useful in practical applications.

In the homogeneous theory, geometric recurrence implies the corresponding chain’s positivity and geometric ergodicity. In the time-inhomogeneous case, however, this implication fails in general, since essentially inhomogeneous chains (those that are not asymptotically homogeneous) usually do not have a stationary distribution.

In this section we consider a set of independent random variables

Our goal is to demonstrate that

We define

It is important to emphasize that the one-step transition probabilities of the two chains do not coincide when

Our goal is to find conditions which ensure proximity of

For this purpose we construct a coupling for chains
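The coupling constructed in the paper is built from minorization kernels whose details are elided from this excerpt. As a simpler, hypothetical illustration of the coupling idea for autoregressive chains, a synchronous coupling drives two AR(1) chains with the same innovations, so the distance between them contracts deterministically (coefficients and starting points below are illustrative):

```python
import random

def synchronous_coupling(alphas, x0, y0, seed=2):
    """Run two chains X_n = alpha_n X_{n-1} + W_n and
    Y_n = alpha_n Y_{n-1} + W_n with the SAME innovations W_n."""
    rng = random.Random(seed)
    x, y = x0, y0
    for a in alphas:
        w = rng.gauss(0.0, 1.0)
        x, y = a * x + w, a * y + w
    return x, y

alphas = [0.8] * 50
x, y = synchronous_coupling(alphas, x0=5.0, y0=-5.0)
# The noise cancels in the difference, so
# |X_n - Y_n| = (prod of alphas) * |x0 - y0| = 0.8**50 * 10,
# which is about 1.4e-4.
print(abs(x - y))
```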

Assume that there exists a sequence of real numbers

Going forward we will require Condition

Second, we introduce substochastic kernels

Note that by definition of

We denote the residual substochastic kernels by

Here,
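The substochastic and residual kernels above are defined on a general state space; their exact definitions are elided from this excerpt. As a hedged finite-state sketch of the same decomposition, two probability vectors split into a common substochastic part (the piece on which the chains can be coupled) and residual substochastic parts, with the common mass equal to one minus the total variation distance:

```python
def split_common_and_residual(p, q):
    """Split probability vectors p, q into a common substochastic part
    m_i = min(p_i, q_i) and residual parts p - m and q - m.
    The mass of m equals 1 - TV(p, q)."""
    m = [min(pi, qi) for pi, qi in zip(p, q)]
    res_p = [pi - mi for pi, mi in zip(p, m)]
    res_q = [qi - mi for qi, mi in zip(q, m)]
    return m, res_p, res_q

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
m, rp, rq = split_common_and_residual(p, q)
print(sum(m))            # ≈ 0.7: coupling succeeds with this probability
print(sum(rp), sum(rq))  # ≈ 0.3 each: the residual masses coincide
```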

To prove the main result, we will need some regularity conditions on

Denote

The motivation for this condition is as follows. We may think of

Let us now discuss the condition on

It is easy to show that a similar motivation remains valid for inhomogeneous autoregressive models. Assume

Generally speaking, Condition

In the discussion above, we have shown that we may expect

Finally, we would like to mention that Δ is in fact

Assuming Condition

We define the Markov chain

It is straightforward to verify that the marginal distributions of the process

We will use the canonical probability

Let us denote by

We will also need a special notation for the sets

By

Recall the process

We can demonstrate how to calculate

It is possible to calculate all constants involved in (

Using the standard coupling approach, we first obtain

Let us denote a kernel

We note that since

We apply this equality to the last decoupling time and derive:

Next, we want to replace

Everywhere in this section we assume that conditions of Theorem

We start by introducing an important result that allows us to connect the expectation of the exponential moment of

Constants

From [

Next we introduce

Going forward we will use the notation from Theorem

For all

Note that it is important in the preceding derivations that

We use Lemma

Now, we consider a situation when the pair

For any

Note that we cannot apply Lemma

Now we can apply Lemma

Now combining (

Finally, we look at the situation when the pair

Using the first-entrance-last-exit decomposition and the Markov property, we obtain the following representation of the probability of interest

Using the same arguments as in Lemma

It is critical that the probability from Lemma

We proceed by induction. Let

Consider now

So, we have shown that for all positive integers

In what follows we will use a simplified notation

In this lemma we use the following notation, which is intended to simplify the subsequent formulas:

It is easy to see that

Assume first that

Now, using the same arguments as in the proof of Lemma

Substituting this upper bound into (

So, finally we get the estimate

Now we use (

Finally we can show that

As in the previous lemma, we denote by

Let us introduce

Before moving to

Now we have

We can use the result of Theorem

Finally we can write

In this appendix we want to demonstrate how to calculate the constants involved in the bound in Theorem

First, we note that the bound in Theorem

We start with

We start with calculating

Finally, we should calculate

We say that a sequence of Markov kernels

There exists a sequence of measurable functions

The sequence

The theorem is proven in [