The Gaussian-Volterra process with a linear kernel is considered, its properties are established, and the projection coefficients are calculated explicitly; that is, one of the possible prediction problems related to Gaussian processes is solved.

Starting with the famous fractional Brownian motion (fBm), models in which the noise is represented by Gaussian-Volterra processes (GVp's) have become very popular, and they are even more popular now. Being non-Markov, such processes have, in some sense, a memory, and the phenomenon of memory is observed in almost all real processes: in economics, finance, cellular and other types of communication, in neural networks, and in other areas. In this connection, GVp's have been studied in many papers, including [

In general, a GVp is a process of the form
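In the notation standard for such processes, with a Wiener process $W$ and a Volterra kernel $K(t,s)$ that is square integrable in $s$ for each $t$, such a representation reads

```latex
X_t = \int_0^t K(t,s)\, dW_s, \qquad t \ge 0 .
```

A GVp is therefore a centered Gaussian process adapted to the filtration generated by $W$.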

Second, the kernels (

The paper is organized as follows: in Section

Let

1) Obviously,

2) Representation (

3) On the one hand, according to (

On the other hand, according to the theorem of normal correlation,
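Recall the statement of this theorem: if $(X, Y)$ is a jointly Gaussian vector and the covariance matrix of $Y$ is nondegenerate, then

```latex
\mathbb{E}[X \mid Y] = \mathbb{E}X
  + \operatorname{Cov}(X, Y)\,\bigl(\operatorname{Cov}(Y, Y)\bigr)^{-1}\,(Y - \mathbb{E}Y),
```

so the conditional expectation is an affine function of $Y$.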

Equalities (

4) Self-similarity and nonstationarity of the process itself immediately follow from (
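Recall that a process $X = \{X_t,\ t \ge 0\}$ is self-similar with index $H > 0$ if, for every $a > 0$,

```latex
\{X_{at}\}_{t \ge 0} \stackrel{d}{=} \{a^{H} X_t\}_{t \ge 0},
```

where $\stackrel{d}{=}$ denotes equality of all finite-dimensional distributions.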

5) If the increments are consecutive, i.e.

In the general case, when

6) Indeed,

Formula (

Let

Also, according to (

Let us write down the covariance matrix of the increments, indexed from 1 to

Evidently, the matrix
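As a numerical illustration of such an increment covariance matrix, the following sketch uses the hypothetical linear kernel $K(t,s) = t - s$ (the integrated Wiener process), which is not necessarily the kernel considered in the paper; only the bilinearity identity for covariances of increments is used.

```python
import numpy as np

# Illustrative linear Volterra kernel K(t, s) = t - s (integrated Wiener
# process), used here only as a stand-in example.  For X_t = int_0^t K(t,s) dW_s
# the covariance is R(s, t) = int_0^{min(s,t)} (t - u)(s - u) du
#                           = min(s,t)^2 * (3*max(s,t) - min(s,t)) / 6.
def R(s, t):
    m, M = min(s, t), max(s, t)
    return m * m * (3.0 * M - m) / 6.0

def increment_covariance(ts):
    """Covariance matrix of the increments X_{t_i} - X_{t_{i-1}}, i = 1..n, via
    Cov = R(t_i,t_j) - R(t_i,t_{j-1}) - R(t_{i-1},t_j) + R(t_{i-1},t_{j-1})."""
    n = len(ts) - 1
    G = np.empty((n, n))
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            G[i - 1, j - 1] = (R(ts[i], ts[j]) - R(ts[i], ts[j - 1])
                               - R(ts[i - 1], ts[j]) + R(ts[i - 1], ts[j - 1]))
    return G

G = increment_covariance([0.0, 1.0, 2.0, 3.0, 4.0])
np.linalg.cholesky(G)  # succeeds: the matrix is symmetric positive definite
```

For this kernel the first diagonal entry equals $\operatorname{Var}(X_1) = \int_0^1 (1-u)^2\,du = 1/3$, which gives a quick sanity check.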

Write the determinant

Next, we subtract the penultimate row from the last row and then expand the determinant along the last column, arriving at

The last determinant has size
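As an illustration of how such a three-term recurrence computes a determinant, the following sketch (for a generic tridiagonal matrix, an assumption made only for this example) cross-checks the recurrence against direct evaluation:

```python
import numpy as np

def tridiag_det(a, b, c):
    """Determinant of a tridiagonal matrix with diagonal a, superdiagonal b,
    and subdiagonal c, via the three-term recurrence
    D_k = a_k * D_{k-1} - b_{k-1} * c_{k-1} * D_{k-2},  D_0 = 1, D_1 = a_1."""
    n = len(a)
    D = [1.0, a[0]]
    for k in range(1, n):
        D.append(a[k] * D[-1] - b[k - 1] * c[k - 1] * D[-2])
    return D[-1]

# Cross-check against direct computation on an illustrative 4x4 matrix.
a = [2.0, 2.0, 2.0, 2.0]   # diagonal
b = [-1.0, -1.0, -1.0]     # superdiagonal
c = [-1.0, -1.0, -1.0]     # subdiagonal
M = np.diag(a) + np.diag(b, 1) + np.diag(c, -1)
assert np.isclose(tridiag_det(a, b, c), np.linalg.det(M))
```

For this particular matrix the recurrence yields $D_4 = 5$, in agreement with the well-known formula $\det = n + 1$ for the discrete Laplacian.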

Now we find a general direct formula for determinant

A sequence satisfying such a recurrence relation is considered in [

In this case, according to [

Find

We chose the pair

From (

Furthermore,

Finally, inequality

Even in our simple case, it is not entirely trivial to establish that the matrix

It is interesting to compare the direct calculation of the determinant

Now, by the recurrent formula (

Finally, check the same values obtained by the final formula (

In particular, for

Now, calculate the determinant by (

Thus, all three methods of calculation gave the same results, which also confirms the validity of the formulas.

Suppose that we need to calculate the determinant

Similarly to (

Applying formula (

Let us check the latter formula for

Now we consider the problem of projection of

We multiply both sides of (

1) If

2), 3) The system of linear equations (

Thus, we solve the equation
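In matrix form, the projection coefficients are the solution of a linear system whose left-hand side is the increment covariance matrix. A minimal numerical sketch, again with the hypothetical kernel $K(t,s) = t - s$ (not necessarily the paper's kernel):

```python
import numpy as np

# Hypothetical covariance of the integrated Wiener process (illustrative only):
# R(s, t) = min(s,t)^2 * (3*max(s,t) - min(s,t)) / 6.
def R(s, t):
    m, M = min(s, t), max(s, t)
    return m * m * (3.0 * M - m) / 6.0

def cov_incr(u0, u1, v0, v1):
    """Cov(X_{u1} - X_{u0}, X_{v1} - X_{v0}) by bilinearity of the covariance."""
    return R(u1, v1) - R(u1, v0) - R(u0, v1) + R(u0, v0)

ts = [0.0, 1.0, 2.0, 3.0]   # observed increments over [0,1], [1,2], [2,3]
T = 4.0                     # predict the future increment X_T - X_{t_n}
n = len(ts) - 1
Gamma = np.array([[cov_incr(ts[i - 1], ts[i], ts[j - 1], ts[j])
                   for j in range(1, n + 1)] for i in range(1, n + 1)])
c = np.array([cov_incr(ts[n], T, ts[i - 1], ts[i]) for i in range(1, n + 1)])
# Normal equations Gamma a = c give the projection coefficients; the residual
# X_T - X_{t_n} - sum_i a_i (X_{t_i} - X_{t_{i-1}}) is uncorrelated with the data.
a = np.linalg.solve(Gamma, c)
```

This mirrors the derivation above: multiplying the projection representation by each observed increment and taking expectations produces exactly the system `Gamma a = c`.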

According to Corollary

Therefore, we need to find

First, let

Applying formula (

Let us check the latter formula for

Thus, we have obtained the same results by all three methods.

In the case

First of all, we investigate the case

Thus,

Further, consider

Now, analogously, we subtract the penultimate row from the last row and expand the determinant along the last row. That is,

Note that (

Next, we study the case

Therefore, we obtain by (

Thus, we can rewrite the last equality in the form

Now we use formulas (

If

Consequently, by the second equality of (

In order to check the obtained results, we calculate the projection coefficients for some values of

By formulas from Theorem

On the other hand, we can solve the system of equations (

If

In the case

As we see, the results coincide.