Preprint

On Nearly Sasakian and Nearly Kähler Statistical Manifolds


A peer-reviewed article of this preprint also exists.

Submitted: 18 May 2023

Posted: 18 May 2023
Abstract
In this paper, we introduce the notions of nearly Sasakian and nearly Kähler statistical structures, together with a non-trivial example. The conditions for a real hypersurface in a nearly Kähler statistical manifold to admit a nearly Sasakian statistical structure are given. We also study invariant and anti-invariant statistical submanifolds of nearly Sasakian statistical manifolds. Finally, some conditions under which such a submanifold of a nearly Sasakian statistical manifold is itself a nearly Sasakian statistical manifold are given.
Subject: Computer Science and Mathematics  -   Mathematics

1. Introduction

Information geometry is a well-known theory that studies spaces of probability measures. Nowadays, this interdisciplinary field, a combination of differential geometry and statistics, plays an important role in various sciences. For instance, a manifold learning theory in a hypothesis space consisting of models is developed in [12]; the semi-Riemannian metric of this hypothesis space is uniquely derived from the information geometry of the probability distributions. In [2], Amari also presented geometrical and statistical ideas for investigating neural networks with hidden units or unobservable variables. For more applications of this geometry in other sciences, we refer to [5,8].
Suppose that $\zeta$ is an open subset of $\mathbb{R}^n$ and $\chi$ is a sample space with parameter $\xi = (\xi^1, \dots, \xi^n)$. A statistical model S is the set of probability density functions defined by
$$S = \Big\{\, p(y;\xi) : \int_\chi p(y;\xi)\, dy = 1,\; p(y;\xi) > 0,\; \xi \in \zeta \subseteq \mathbb{R}^n \Big\}.$$
The Fisher information matrix $g(\xi) = [g_{ls}(\xi)]$ on S is given by
$$g_{ls}(\xi) := \int_\chi \partial_l\ell_\xi\, \partial_s\ell_\xi\; p(y;\xi)\, dy = E_p[\partial_l\ell_\xi\, \partial_s\ell_\xi],$$
where $E_p[\cdot]$ is the expectation with respect to $p(y;\xi)$, $\ell_\xi = \ell(y;\xi) := \log p(y;\xi)$ and $\partial_l := \frac{\partial}{\partial\xi^l}$. The space S together with the Fisher information matrix is a statistical manifold.
In 1920, Fisher was the first to propose (1) as a mathematical measure of information (see [9]). Note that $(S, g)$ is a Riemannian manifold if all components of g are finite and g is positive definite; then g is called the Fisher metric on S. Using g, an affine connection with respect to $p(y;\xi)$ is described by
$$\Gamma_{ls,k} = g(\nabla_{\partial_l}\partial_s, \partial_k) := E_p[(\partial_l\partial_s\ell_\xi)\,\partial_k\ell_\xi].$$
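As a concrete illustration (our own numerical sketch, not part of the original text), the Fisher information matrix of the univariate normal family $p(y;\mu,\sigma)$ can be approximated by evaluating the defining integral on a grid and compared with its known closed form $\operatorname{diag}(1/\sigma^2,\, 2/\sigma^2)$:

```python
import numpy as np

# Numerical sketch: the Fisher information g_ls = E_p[(d_l log p)(d_s log p)]
# for the normal family p(y; mu, sigma), approximated by a Riemann sum.
def fisher_matrix(mu, sigma, n_grid=200001, half_width=12.0):
    y = np.linspace(mu - half_width * sigma, mu + half_width * sigma, n_grid)
    dy = y[1] - y[0]
    p = np.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    scores = [(y - mu) / sigma ** 2,                      # d log p / d mu
              (y - mu) ** 2 / sigma ** 3 - 1.0 / sigma]   # d log p / d sigma
    return np.array([[np.sum(scores[l] * scores[s] * p) * dy
                      for s in range(2)] for l in range(2)])

g = fisher_matrix(mu=0.5, sigma=2.0)
print(g)  # close to diag(1/sigma^2, 2/sigma^2) = diag(0.25, 0.5)
```

For this family the off-diagonal entries vanish, so the Fisher metric is diagonal in the coordinates $(\mu, \sigma)$.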
Nearly Kähler structures on Riemannian manifolds were introduced by Gray [10] to describe a special class of almost Hermitian structures in every even dimension. As an odd-dimensional counterpart of nearly Kähler manifolds, nearly Sasakian manifolds were introduced by Blair, Yano and Showers in [7]. They showed that a normal nearly Sasakian structure is Sasakian and that a hypersurface of a nearly Kähler manifold is nearly Sasakian if and only if it is quasi-umbilical with respect to the (almost) contact form. In particular, $S^5$ properly imbedded in $S^6$ inherits a nearly Sasakian structure which is not Sasakian.
A statistical manifold can be considered as an extension of a Riemannian manifold in which the compatibility condition on the Riemannian metric is relaxed to a more general one. Applying this idea, we construct a suitable nearly Sasakian structure on statistical structures and define nearly Sasakian statistical manifolds.
The purpose of this paper is to present nearly Sasakian and nearly Kähler structures on statistical manifolds and to show the relation between these two geometric notions. To achieve this goal, the notions and properties of statistical manifolds are recalled in Section 2. In Section 3, we describe nearly Sasakian structures on statistical manifolds and give some of their properties. In Section 4, we investigate nearly Kähler structures on statistical manifolds; in this context, the conditions for a real hypersurface in a nearly Kähler statistical manifold to admit a nearly Sasakian statistical structure are given. Section 5 is devoted to the study of (anti-)invariant statistical submanifolds of nearly Sasakian statistical manifolds. Some conditions under which an invariant submanifold of a nearly Sasakian statistical manifold is itself a nearly Sasakian statistical manifold are given at the end.

2. Preliminaries

For an n-dimensional manifold N, consider $(U, x^i)$, $i = 1, \dots, n$, as a local chart around a point $x \in U$. With respect to the coordinates $(x^i)$ on N, the local vector fields $\partial/\partial x^i|_p$ form a frame of $T_pN$.
An affine connection $\nabla$ is called a Codazzi connection if the Codazzi equation
$$(\nabla_{X_1}g)(X_2,X_3) = (\nabla_{X_2}g)(X_1,X_3) \;\big(= (\nabla_{X_3}g)(X_1,X_2)\big)$$
holds for any $X_1, X_2, X_3 \in \Gamma(TN)$, where
$$(\nabla_{X_1}g)(X_2,X_3) = X_1\, g(X_2,X_3) - g(\nabla_{X_1}X_2, X_3) - g(X_2, \nabla_{X_1}X_3).$$
The triplet $(N, g, \nabla)$ is called a statistical manifold if $\nabla$ is a statistical connection, i.e., a torsion-free Codazzi connection. Moreover, the affine connection $\nabla^*$, the (dual) conjugate connection of $\nabla$ with respect to g, is determined by
$$X_1\, g(X_2,X_3) = g(\nabla_{X_1}X_2, X_3) + g(X_2, \nabla^*_{X_1}X_3).$$
Denoting by $\nabla^g$ the Levi-Civita connection on N, one can see that $\nabla^g = \tfrac{1}{2}(\nabla + \nabla^*)$ and
$$\nabla^* g = -\nabla g.$$
Thus $(N, g, \nabla^*)$ also forms a statistical manifold. In particular, the torsion-free Codazzi connection $\nabla$ reduces to the Levi-Civita connection $\nabla^g$ if $\nabla g = 0$.
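The conjugacy of $\nabla$ and $\nabla^*$ can be checked numerically. The following sketch (our own illustration, using the classical exponential and mixture connections of the normal family as the dual pair) verifies the coordinate form $\partial_l g_{sk} = \Gamma_{ls,k} + \Gamma^*_{lk,s}$:

```python
import numpy as np

# Numerical sketch: for the normal family p(y; mu, sigma), the exponential
# connection Gamma_{ls,k} = E_p[(d_l d_s l)(d_k l)] and the mixture connection
# Gamma*_{lk,s} = E_p[(d_l d_k l + (d_l l)(d_k l))(d_s l)] are conjugate with
# respect to the Fisher metric:  d_l g_{sk} = Gamma_{ls,k} + Gamma*_{lk,s}.
mu0, sig0 = 0.3, 1.5
y = np.linspace(mu0 - 15 * sig0, mu0 + 15 * sig0, 200001)
dy = y[1] - y[0]

def tensors(mu, sig):
    p = np.exp(-(y - mu) ** 2 / (2 * sig ** 2)) / (sig * np.sqrt(2 * np.pi))
    E = lambda f: np.sum(f * p) * dy                    # expectation E_p[.]
    d1 = [(y - mu) / sig ** 2,                          # first derivatives of log p
          (y - mu) ** 2 / sig ** 3 - 1 / sig]
    d2 = [[-1 / sig ** 2 + 0 * y, -2 * (y - mu) / sig ** 3],
          [-2 * (y - mu) / sig ** 3, -3 * (y - mu) ** 2 / sig ** 4 + 1 / sig ** 2]]
    g = np.array([[E(d1[s] * d1[k]) for k in range(2)] for s in range(2)])
    Ge = np.array([[[E(d2[l][s] * d1[k]) for k in range(2)]
                    for s in range(2)] for l in range(2)])    # Gamma_{ls,k}
    Gm = np.array([[[E((d2[l][k] + d1[l] * d1[k]) * d1[s]) for k in range(2)]
                    for s in range(2)] for l in range(2)])    # Gamma*_{lk,s}
    return g, Ge, Gm

g, Ge, Gm = tensors(mu0, sig0)
eps = 1e-5
dg = np.stack([(tensors(mu0 + eps, sig0)[0] - tensors(mu0 - eps, sig0)[0]) / (2 * eps),
               (tensors(mu0, sig0 + eps)[0] - tensors(mu0, sig0 - eps)[0]) / (2 * eps)])
print(np.abs(dg - (Ge + Gm)).max())  # numerically negligible
```

Here the metric derivative is approximated by central finite differences, while the connection coefficients come from the defining expectations; the two sides agree to numerical precision.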
A (1,2)-tensor field K on a statistical manifold $(N, g, \nabla)$ is defined by
$$K_{X_1}X_2 = \nabla_{X_1}X_2 - \nabla^g_{X_1}X_2;$$
from (2) and (3) we have
$$K = \nabla - \nabla^g = \tfrac{1}{2}(\nabla - \nabla^*).$$
Hence, it follows that K satisfies
$$K_{X_1}X_2 = K_{X_2}X_1, \qquad g(K_{X_3}X_2, X_1) = g(X_2, K_{X_3}X_1).$$
The curvature tensor $R^\nabla$ of a torsion-free linear connection $\nabla$ is defined by
$$R^\nabla(X_1,X_2) = \nabla_{X_1}\nabla_{X_2} - \nabla_{X_2}\nabla_{X_1} - \nabla_{[X_1,X_2]},$$
for any $X_1, X_2 \in \Gamma(TN)$. On a statistical structure $(\nabla, g)$, denote the curvature tensor of $\nabla$ by $R^\nabla$, or R for short, and similarly write $R^*$ for $R^{\nabla^*}$. It is obvious that
$$R(X_1,X_2) = -R(X_2,X_1),$$
$$R^*(X_1,X_2) = -R^*(X_2,X_1).$$
Moreover, setting $R(X_1,X_2,X_3,X_4) = g(R(X_1,X_2)X_3, X_4)$, we can see that
$$R(X_1,X_2,X_3,X_4) = -R^*(X_1,X_2,X_4,X_3),$$
$$R(X_1,X_2)X_3 + R(X_2,X_3)X_1 + R(X_3,X_1)X_2 = 0,$$
$$R^*(X_1,X_2)X_3 + R^*(X_2,X_3)X_1 + R^*(X_3,X_1)X_2 = 0.$$
The statistical curvature tensor field S of the statistical structure $(\nabla, g)$ is given by
$$S(X_1,X_2)X_3 = \tfrac{1}{2}\{R(X_1,X_2)X_3 + R^*(X_1,X_2)X_3\}.$$
By the definition of $R^\nabla$, it follows that
$$S(X_1,X_2,X_3,X_4) = -S(X_2,X_1,X_3,X_4), \quad S(X_1,X_2,X_3,X_4) = -S(X_1,X_2,X_4,X_3), \quad S(X_1,X_2,X_3,X_4) = S(X_3,X_4,X_1,X_2),$$
where $S(X_1,X_2,X_3,X_4) = g(S(X_1,X_2)X_3, X_4)$.
The Lie derivative of the metric tensor g in a statistical manifold $(N, g, \nabla)$, for any $X_1, X_2, v \in \Gamma(TN)$, is given by
$$(\pounds_v g)(X_1,X_2) = g(\nabla^g_{X_1}v, X_2) + g(X_1, \nabla^g_{X_2}v) = g(\nabla_{X_1}v, X_2) - g(K_{X_1}v, X_2) + g(X_1, \nabla_{X_2}v) - g(X_1, K_{X_2}v).$$
The vector field v is said to be a Killing vector field, or infinitesimal isometry, if $\pounds_v g = 0$. Hence, using the above equation and (8), it follows that
$$g(\nabla_{X_1}v, X_2) + g(X_1, \nabla_{X_2}v) = 2g(K_{X_1}v, X_2).$$
Similarly, (7) implies
$$g(\nabla^*_{X_1}v, X_2) + g(X_1, \nabla^*_{X_2}v) = -2g(K_{X_1}v, X_2).$$
The curvature tensor $R^g$ of a Riemannian manifold (N, g) admitting a Killing vector field v satisfies
$$R^g(X_1,v)X_2 = \nabla^g_{X_1}\nabla^g_{X_2}v - \nabla^g_{\nabla^g_{X_1}X_2}v,$$
for any $X_1, X_2, v \in \Gamma(TN)$ [6].

3. Nearly Sasakian Statistical Manifolds

An almost contact manifold is a (2n+1)-dimensional differentiable manifold N equipped with an almost contact structure $(F, v, u)$, where F is a tensor field of type (1,1), v a vector field and u a 1-form such that
$$F^2 = -I + u \otimes v, \qquad Fv = 0, \qquad u(v) = 1.$$
Also, N will be called an almost contact metric manifold if it admits a pseudo-Riemannian metric g with the following condition:
$$g(FX_1, FX_2) = g(X_1, X_2) - u(X_1)u(X_2), \qquad X_1, X_2 \in \Gamma(TN).$$
Moreover, as in the almost contact case, (19) yields $u = g(\cdot, v)$ and $g(\cdot, F\cdot) = -g(F\cdot, \cdot)$.
Theorem 1. 
The statistical curvature tensor field S of a statistical manifold $(N, g, \nabla)$ with an almost contact metric structure $(F, v, u, g)$ such that the vector field v is Killing satisfies the equation
$$2S(X_1,v)X_2 = \nabla_{X_1}\nabla_{X_2}v - \nabla_{\nabla_{X_1}X_2}v + \nabla^*_{X_1}\nabla^*_{X_2}v - \nabla^*_{\nabla^*_{X_1}X_2}v,$$
for any X 1 , X 2 Γ ( T N ) .
Proof. 
According to (10), (12) and () we can write
$$R^*(X_2,X_3,X_1,v) = -R^*(X_3,X_1,X_2,v) - R^*(X_1,X_2,X_3,v) = R(X_3,X_1,v,X_2) + R(X_1,X_2,v,X_3) = -R(X_1,X_3,v,X_2) - R(X_2,X_1,v,X_3).$$
Applying (9) in the above equation we find
$$R^*(X_2,X_3,X_1,v) = g(-\nabla_{X_1}\nabla_{X_3}v + \nabla_{X_3}\nabla_{X_1}v + \nabla_{[X_1,X_3]}v, X_2) + g(-\nabla_{X_2}\nabla_{X_1}v + \nabla_{X_1}\nabla_{X_2}v + \nabla_{[X_2,X_1]}v, X_3).$$
Since v is Killing, differentiating
$$g(\nabla_{X_2}v, X_3) + g(X_2, \nabla_{X_3}v) = 2g(K_{X_2}v, X_3),$$
with respect to X 1 , we obtain
$$2X_1\, g(K_{X_3}X_2, v) = (\nabla_{X_1}g)(\nabla_{X_3}v, X_2) + g(\nabla_{X_1}\nabla_{X_3}v, X_2) + g(\nabla_{X_3}v, \nabla_{X_1}X_2) + (\nabla_{X_1}g)(\nabla_{X_2}v, X_3) + g(\nabla_{X_1}\nabla_{X_2}v, X_3) + g(\nabla_{X_2}v, \nabla_{X_1}X_3).$$
Setting the last equation in (20), it follows
$$R^*(X_2,X_3,X_1,v) = 2g(\nabla_{X_1}\nabla_{X_2}v, X_3) - 2g(\nabla_{\nabla_{X_1}X_2}v, X_3) + 2(\nabla_{X_1}g)(\nabla_{X_3}v, X_2) + 2g(K_{X_3}v, \nabla_{X_1}X_2) - 2X_1\, g(K_{X_3}X_2, v) - 2g(K_{X_1}v, [X_3,X_2]) + 2X_3\, g(K_{X_1}X_2, v) + 2g(K_{X_2}v, [X_1,X_3]) - 2X_2\, g(K_{X_1}X_3, v) + 2g(K_{X_3}v, \nabla_{X_2}X_1) + R(X_2,X_3,v,X_1).$$
As $(\nabla_{X_1}g)(\nabla_{X_3}v, X_2) = -2g(K_{X_1}\nabla_{X_3}v, X_2)$ and using (12) in the above equation, we get
$$R(X_2,X_3,v,X_1) = -g(\nabla_{X_1}\nabla_{X_2}v, X_3) + g(\nabla_{\nabla_{X_1}X_2}v, X_3) + 2g(K_{X_1}X_2, \nabla_{X_3}v) - g(K_{X_3}v, \nabla_{X_1}X_2) - g(K_{X_2}v, [X_1,X_3]) + X_1\, g(K_{X_3}X_2, v) + g(K_{X_1}v, [X_3,X_2]) - X_3\, g(K_{X_1}X_2, v) + X_2\, g(K_{X_1}X_3, v) - g(K_{X_3}v, \nabla_{X_2}X_1).$$
Similarly, we find
$$R^*(X_2,X_3,v,X_1) = -g(\nabla^*_{X_1}\nabla^*_{X_2}v, X_3) + g(\nabla^*_{\nabla^*_{X_1}X_2}v, X_3) - 2g(K_{X_1}X_2, \nabla^*_{X_3}v) + g(K_{X_3}v, \nabla^*_{X_1}X_2) + g(K_{X_2}v, [X_1,X_3]) - X_1\, g(K_{X_3}X_2, v) - g(K_{X_1}v, [X_3,X_2]) + X_3\, g(K_{X_1}X_2, v) - X_2\, g(K_{X_1}X_3, v) + g(K_{X_3}v, \nabla^*_{X_2}X_1).$$
Adding the previous relations and using (7) and (15), we have the assertion. □
An almost contact metric manifold $(N, F, v, u, g)$ is called a nearly Sasakian manifold if
$$(\nabla^g_{X_1}F)X_2 + (\nabla^g_{X_2}F)X_1 = 2g(X_1,X_2)v - u(X_1)X_2 - u(X_2)X_1,$$
for any X 1 , X 2 Γ ( T N ) [7]. In such manifolds, the vector field v is Killing. Moreover, a tensor field h of type ( 1 , 1 ) is determined by
$$\nabla^g_{X_1}v = -FX_1 + hX_1.$$
The last equation immediately implies that h is skew-symmetric and
$$hF = -Fh, \qquad hv = 0, \qquad u \circ h = 0,$$
and
$$\nabla^g_vh = \nabla^g_vF = Fh = \tfrac{1}{3}\pounds_vF.$$
Moreover, Olszak proved the following formulas in [11]:
$$R^g(FX_1,X_2,X_3,X_4) + R^g(X_1,FX_2,X_3,X_4) + R^g(X_1,X_2,FX_3,X_4) + R^g(X_1,X_2,X_3,FX_4) = 0,$$
$$R^g(FX_1,FX_2,FX_3,FX_4) = R^g(X_1,X_2,X_3,X_4) - R^g(v,X_2,X_3,X_4)u(X_1) + R^g(v,X_1,X_3,X_4)u(X_2),$$
$$R^g(v,X_1)X_2 = g(X_1 - h^2X_1, X_2)v - u(X_2)(X_1 - h^2X_1),$$
$$R^g(FX_1,FX_2)v = 0,$$
for any $X_1, X_2, X_3, X_4 \in \Gamma(TN)$.
Lemma 1. 
For a manifold N with a statistical structure $(\nabla, g)$ and an almost contact metric structure $(F, v, u, g)$, the following holds:
$$\nabla_{X_1}FX_2 - F\nabla^*_{X_1}X_2 + \nabla_{X_2}FX_1 - F\nabla^*_{X_2}X_1 = (\nabla^g_{X_1}F)X_2 + (\nabla^g_{X_2}F)X_1 + K_{X_1}FX_2 + K_{X_2}FX_1 + 2FK_{X_1}X_2,$$
for any X 1 , X 2 Γ ( T N ) .
Proof. (6) and (7) imply
$$\nabla_{X_1}FX_2 - F\nabla^*_{X_1}X_2 + \nabla_{X_2}FX_1 - F\nabla^*_{X_2}X_1 = \nabla^g_{X_1}FX_2 + K_{X_1}FX_2 - F\nabla^g_{X_1}X_2 + FK_{X_1}X_2 + \nabla^g_{X_2}FX_1 + K_{X_2}FX_1 - F\nabla^g_{X_2}X_1 + FK_{X_2}X_1 = (\nabla^g_{X_1}F)X_2 + (\nabla^g_{X_2}F)X_1 + K_{X_1}FX_2 + K_{X_2}FX_1 + 2FK_{X_1}X_2.$$
Hence, the proof is complete. □
Definition 1. 
A nearly Sasakian statistical structure on N is a quintuple $(\nabla, g, F, v, u)$ consisting of a statistical structure $(\nabla, g)$ and a nearly Sasakian structure $(g, F, v, u)$ satisfying
$$K_{X_1}FX_2 + K_{X_2}FX_1 = -2FK_{X_1}X_2,$$
for any X 1 , X 2 Γ ( T N ) .
A nearly Sasakian statistical manifold is a manifold which admits a nearly Sasakian statistical structure.
Remark 1. 
The tuple $(N, \nabla^*, g, F, v, u)$ is also a nearly Sasakian statistical manifold if $(N, \nabla, g, F, v, u)$ is a nearly Sasakian statistical manifold. In this case, from Lemma 1 and Definition 1, we have
$$\nabla^*_{X_1}FX_2 - F\nabla_{X_1}X_2 + \nabla^*_{X_2}FX_1 - F\nabla_{X_2}X_1 = (\nabla^g_{X_1}F)X_2 + (\nabla^g_{X_2}F)X_1,$$
for any X 1 , X 2 Γ ( T N ) .
Theorem 2. 
If $(N, \nabla, g)$ is a statistical manifold and $(g, F, v)$ an almost contact metric structure on N, then $(\nabla, g, F, v)$ is a nearly Sasakian statistical structure on N if and only if the following formulas hold:
$$\nabla_{X_1}FX_2 - F\nabla^*_{X_1}X_2 + \nabla_{X_2}FX_1 - F\nabla^*_{X_2}X_1 = 2g(X_1,X_2)v - u(X_1)X_2 - u(X_2)X_1,$$
$$\nabla^*_{X_1}FX_2 - F\nabla_{X_1}X_2 + \nabla^*_{X_2}FX_1 - F\nabla_{X_2}X_1 = 2g(X_1,X_2)v - u(X_1)X_2 - u(X_2)X_1,$$
for any X 1 , X 2 Γ ( T N ) .
Proof. 
Let $(N, \nabla, g, F, v)$ be a nearly Sasakian statistical manifold. Applying (21), Lemma 1 and Definition 1, we get (28). Also, () follows from Remark 1. Conversely, using (7) and subtracting the relations (28) and (), we obtain (27). □
Example 1. 
Let us consider the 3-dimensional unit sphere $S^3$ in the two-dimensional complex space $\mathbb{C}^2$. As $S^3$ is isomorphic to the Lie group SU(2), take $\{e_1, e_2, e_3\}$ as the basis of the Lie algebra $\mathfrak{su}(2)$ of SU(2) given by
$$e_1 = \frac{\sqrt{2}}{2}\begin{pmatrix} i & 0 \\ 0 & -i \end{pmatrix}, \qquad e_2 = \frac{\sqrt{2}}{2}\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}, \qquad e_3 = \frac{1}{2}\begin{pmatrix} 0 & i \\ i & 0 \end{pmatrix}.$$
So, the Lie brackets are described by
$$[e_1, e_2] = 2e_3, \qquad [e_2, e_3] = e_1, \qquad [e_1, e_3] = -e_2.$$
The Riemannian metric g on $S^3$ is defined by
$$g(e_1,e_2) = g(e_1,e_3) = g(e_2,e_3) = 0, \qquad g(e_1,e_1) = g(e_2,e_2) = g(e_3,e_3) = 1.$$
Assume that $v = e_3$ and u is the 1-form described by $u(X_1) = g(X_1, v)$ for any $X_1 \in \Gamma(TS^3)$. Considering F as the (1,1)-tensor field determined by $F(e_1) = e_2$, $F(e_2) = -e_1$ and $F(v) = 0$, the above equations imply that $(S^3, F, v, u, g)$ is an almost contact metric manifold. Using Koszul's formula, it follows that $\nabla^g_{e_i}e_j = 0$, $i,j = 1,2,3$, except
$$\nabla^g_{e_1}e_2 = v = -\nabla^g_{e_2}e_1, \qquad \nabla^g_{e_1}v = -e_2, \qquad \nabla^g_{e_2}v = e_1.$$
According to the above equations we see that
$$(\nabla^g_{e_i}F)e_j + (\nabla^g_{e_j}F)e_i = 0 = 2g(e_i,e_j)v - u(e_i)e_j - u(e_j)e_i, \qquad i,j = 1,2,3,$$
except for
$$(\nabla^g_{e_1}F)e_1 + (\nabla^g_{e_1}F)e_1 = 2v = 2g(e_1,e_1)v - u(e_1)e_1 - u(e_1)e_1,$$
$$(\nabla^g_{e_1}F)v + (\nabla^g_vF)e_1 = -e_1 = 2g(e_1,v)v - u(e_1)v - u(v)e_1,$$
$$(\nabla^g_{e_2}F)e_2 + (\nabla^g_{e_2}F)e_2 = 2v = 2g(e_2,e_2)v - u(e_2)e_2 - u(e_2)e_2,$$
$$(\nabla^g_{e_2}F)e_3 + (\nabla^g_{e_3}F)e_2 = -e_2 = 2g(e_2,e_3)v - u(e_2)e_3 - u(e_3)e_2,$$
which shows that $(g, F, v, u)$ is a nearly Sasakian structure on $S^3$. Setting
$$K(e_1,e_1) = -e_1, \qquad K(e_1,e_2) = K(e_2,e_1) = e_2, \qquad K(e_2,e_2) = e_1,$$
while the other cases are zero, one sees that K satisfies (8). From (6), it follows that
$$\nabla_{e_1}e_1 = -e_1, \quad \nabla_{e_1}e_2 = e_3 + e_2, \quad \nabla_{e_1}e_3 = -e_2, \quad \nabla_{e_2}e_1 = e_2 - e_3, \quad \nabla_{e_2}e_2 = e_1, \quad \nabla_{e_2}e_3 = e_1.$$
So, we obtain $(\nabla_{e_i}g)(e_j, e_k) = 0$, $i,j,k = 1,2,3$, except
$$(\nabla_{e_1}g)(e_1,e_1) = 2, \qquad (\nabla_{e_1}g)(e_2,e_2) = (\nabla_{e_2}g)(e_1,e_2) = (\nabla_{e_2}g)(e_2,e_1) = -2.$$
Hence ( , g ) is a statistical structure on S 3 . Moreover, the equations
$$K_{e_1}F(e_1) + K_{e_1}F(e_1) = 2e_2 = -2FK_{e_1}e_1, \quad K_{e_1}F(e_2) + K_{e_2}F(e_1) = 2e_1 = -2FK_{e_1}e_2, \quad K_{e_2}F(e_2) + K_{e_2}F(e_2) = -2e_2 = -2FK_{e_2}e_2,$$
hold. Therefore ( S 3 , , g , F , v , u ) is a nearly Sasakian statistical manifold.
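The computations in Example 1 can be checked mechanically. The following sketch (our own verification, not part of the example) confirms the bracket relations of the chosen matrices and the compatibility condition of Definition 1 on the basis fields:

```python
import numpy as np

# Check of Example 1: the su(2) bracket relations of the chosen basis matrices
# and the compatibility K_{X1}F(X2) + K_{X2}F(X1) = -2 F K_{X1}X2.
e1m = (np.sqrt(2) / 2) * np.array([[1j, 0], [0, -1j]])
e2m = (np.sqrt(2) / 2) * np.array([[0, 1], [-1, 0]], dtype=complex)
e3m = 0.5 * np.array([[0, 1j], [1j, 0]])
br = lambda a, b: a @ b - b @ a
assert np.allclose(br(e1m, e2m), 2 * e3m)
assert np.allclose(br(e2m, e3m), e1m)
assert np.allclose(br(e1m, e3m), -e2m)

# Identify (e1, e2, v = e3) with the standard basis of R^3.
e1, e2, v = np.eye(3)
F = np.array([[0., -1, 0], [1, 0, 0], [0, 0, 0]])  # F e1 = e2, F e2 = -e1, F v = 0
K = np.zeros((3, 3, 3))
K[0, 0] = -e1
K[0, 1] = K[1, 0] = e2
K[1, 1] = e1

def Kv(x, w):
    # K_x w, extended bilinearly from the table of Example 1
    return np.einsum('a,b,abc->c', x, w, K)

for x in np.eye(3):
    for w in np.eye(3):
        assert np.allclose(Kv(x, F @ w) + Kv(w, F @ x), -2 * F @ Kv(x, w))
print("Example 1 relations verified")
```

Since both sides of the compatibility condition are bilinear in the arguments, checking it on the basis fields suffices.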
Proposition 1. 
For a nearly Sasakian statistical manifold $(N, \nabla, g, F, v, u)$, the following conditions hold:
$$i)\ FK_vv = 0, \quad ii)\ FK_{FX_1}v = 0, \quad iii)\ K_vX_1 = u(X_1)K_vv, \quad iv)\ \nabla_{X_1}v = \nabla^g_{X_1}v + u(X_1)K_vv, \quad v)\ \nabla^*_{X_1}v = \nabla^g_{X_1}v - u(X_1)K_vv,$$
for any X 1 Γ ( T N ) .
Proof. 
Setting $X_1 = X_2 = v$ in (27), it follows (i). For $X_2 = v$ in (27), we have
$$K_{FX_1}v = -2FK_{X_1}v.$$
Replacing $X_1$ by $FX_1$ in the last equation and using (18), we get
$$K_{X_1}v = u(X_1)K_vv + 2FK_{FX_1}v.$$
Applying F yields
$$FK_{X_1}v = -2K_{FX_1}v + 2u(K_{FX_1}v)v.$$
(30) and the last equation imply
$$3K_{FX_1}v = 4u(K_{FX_1}v)v,$$
which gives us $FK_{FX_1}v = 0$, so (ii) holds. This and (31) yield (iii). From (6), (7) and (iii) we have (iv) and (v). □
Corollary 1. 
A nearly Sasakian statistical manifold satisfies the following
$$u(X_2)K_{X_1}K_vv = u(X_1)K_{X_2}K_vv = u(K_{X_1}X_2)K_vv,$$
for any X 1 , X 2 Γ ( T N ) .
Proof. (6) and (30) imply
$$F^2(\nabla_{X_1}v - \nabla^g_{X_1}v) = 0,$$
which gives us
$$\nabla_{X_1}v = \nabla^g_{X_1}v + g(\nabla_{X_1}v, v)v.$$
Similarly
$$\nabla^*_{X_1}v = \nabla^g_{X_1}v + g(\nabla^*_{X_1}v, v)v.$$
Then, subtracting the above two equations and using $g(\nabla_{X_1}v, v) + g(\nabla^*_{X_1}v, v) = X_1g(v,v) = 0$, it follows that
$$K_{X_1}v = g(\nabla_{X_1}v, v)v,$$
which gives us $K_vv = g(\nabla_vv, v)v$. Thus we obtain
$$u(X_2)K_{X_1}K_vv = u(X_2)g(\nabla_vv, v)K_{X_1}v = u(X_1)u(X_2)g(\nabla_vv, v)K_vv = u(X_1)K_{X_2}K_vv.$$
Moreover, (iii) implies
$$u(K_{X_1}X_2)K_vv = g(K_{X_1}X_2, v)K_vv = g(K_{X_1}v, X_2)K_vv = u(X_1)u(X_2)g(\nabla_vv, v)K_vv.$$
So the assertion follows. □
Corollary 2. 
In a nearly Sasakian statistical manifold N, let $X_1 \in \Gamma(TN)$ with $X_1 \perp v$. Then
(1)
$$K_{X_1}v = 0,$$
(2)
$$\nabla_{X_1}v = \nabla^*_{X_1}v = \nabla^g_{X_1}v.$$
Proposition 2. 
On a nearly Sasakian statistical manifold, the following holds
$$g(\nabla_{X_1}v, X_2) + g(\nabla_{X_2}v, X_1) = 2u(X_1)u(X_2)g(K_vv, v),$$
for any X 1 , X 2 Γ ( T N ) .
Proof. 
Since v is a Killing vector field in a nearly Sasakian manifold (see [7]), we have
$$g(\nabla^g_{X_1}v, X_2) + g(\nabla^g_{X_2}v, X_1) = 0.$$
Substituting (6) into the above equation, we have the assertion. □
Lemma 2. 
Let $(N, \nabla, g, F, v)$ be a nearly Sasakian statistical manifold. Then the statistical curvature tensor field satisfies
$$S(v,X_1)X_2 = g(X_1 - h^2X_1, X_2)v - u(X_2)(X_1 - h^2X_1),$$
for any X 1 , X 2 Γ ( T N ) .
Proof. 
According to (6), (7) and Theorem 1, we can write
$$\nabla_{X_1}\nabla_{X_2}v - \nabla_{\nabla_{X_1}X_2}v = \nabla_{X_1}\nabla^g_{X_2}v + \nabla_{X_1}(u(X_2)K_vv) - \nabla^g_{\nabla_{X_1}X_2}v - u(\nabla_{X_1}X_2)K_vv = K_{X_1}\nabla^g_{X_2}v + \nabla^g_{X_1}\nabla^g_{X_2}v + (\nabla_{X_1}u)(X_2)K_vv + u(X_2)\big(K_{X_1}K_vv + \nabla^g_{X_1}K_vv\big) - \nabla^g_{\nabla^g_{X_1}X_2}v - \nabla^g_{K_{X_1}X_2}v.$$
Applying (17) in the above equation, we have
$$\nabla_{X_1}\nabla_{X_2}v - \nabla_{\nabla_{X_1}X_2}v = R^g(X_1,v)X_2 + K_{X_1}\nabla^g_{X_2}v + (\nabla_{X_1}u)(X_2)K_vv + u(X_2)\big(K_{X_1}K_vv + \nabla^g_{X_1}K_vv\big) - \nabla^g_{K_{X_1}X_2}v.$$
We conclude similarly that
$$\nabla^*_{X_1}\nabla^*_{X_2}v - \nabla^*_{\nabla^*_{X_1}X_2}v = R^g(X_1,v)X_2 - K_{X_1}\nabla^g_{X_2}v - (\nabla^*_{X_1}u)(X_2)K_vv + u(X_2)\big(K_{X_1}K_vv - \nabla^g_{X_1}K_vv\big) + \nabla^g_{K_{X_1}X_2}v.$$
The above two equations imply
$$\nabla_{X_1}\nabla_{X_2}v - \nabla_{\nabla_{X_1}X_2}v + \nabla^*_{X_1}\nabla^*_{X_2}v - \nabla^*_{\nabla^*_{X_1}X_2}v = 2R^g(X_1,v)X_2 - 2u(K_{X_1}X_2)K_vv + 2u(X_2)K_{X_1}K_vv,$$
from this and Theorem 1, we have
$$S(X_1,v)X_2 = R^g(X_1,v)X_2 - u(K_{X_1}X_2)K_vv + u(X_2)K_{X_1}K_vv.$$
Thus the assertion follows from the last equation, () and Corollary 1. □
Corollary 3. 
On a nearly Sasakian statistical manifold N, the following holds
$$S(X_1,X_2)v = -g(X_1 - h^2X_1, X_2)v + u(X_2)(X_1 - h^2X_1) + g(X_2 - h^2X_2, X_1)v - u(X_1)(X_2 - h^2X_2),$$
$$S(FX_1,FX_2)v = 0,$$
for any X 1 , X 2 Γ ( T N ) .
Proof. 
We have
$$S(X_1,X_2)v = -S(v,X_1)X_2 - S(X_2,v)X_1.$$
Applying Lemma 2 in the last equation, it follows (32). To prove (33), put $FX_1$ and $FX_2$ in place of $X_1$ and $X_2$ in the above equation and use the skew-symmetric property of h:
$$S(FX_1,FX_2)v = -g(FX_1 - h^2FX_1, FX_2)v + g(FX_2 - h^2FX_2, FX_1)v = 0.$$
Proposition 3. 
The statistical curvature tensor field S of a nearly Sasakian statistical manifold N, satisfies the following
$$S(FX_1,X_2,X_3,X_4) + S(X_1,FX_2,X_3,X_4) + S(X_1,X_2,FX_3,X_4) + S(X_1,X_2,X_3,FX_4) = 0,$$
$$S(FX_1,FX_2,FX_3,FX_4) = S(X_1,X_2,X_3,X_4) + u(X_2)R^g(v,X_1,X_3,X_4) - u(X_1)R^g(v,X_2,X_3,X_4),$$
for any X 1 , X 2 , X 3 , X 4 Γ ( T N ) .
Proof. 
Applying (7) in (15), it follows
$$S(X_1,X_2)X_3 = R^g(X_1,X_2)X_3 + [K_{X_1}, K_{X_2}]X_3.$$
Thus using (36) and (23), we can write
$$S(FX_1,X_2,X_3,X_4) + S(X_1,FX_2,X_3,X_4) + S(X_1,X_2,FX_3,X_4) + S(X_1,X_2,X_3,FX_4) = g(K_{FX_1}K_{X_2}X_3 - K_{X_2}K_{FX_1}X_3 + K_{X_1}K_{FX_2}X_3 - K_{FX_2}K_{X_1}X_3 + K_{X_1}K_{X_2}FX_3 - K_{X_2}K_{X_1}FX_3, X_4) + g(K_{X_1}K_{X_2}X_3 - K_{X_2}K_{X_1}X_3, FX_4).$$
On the other hand, (27) implies
$$g(K_{X_1}FX_2 + K_{X_2}FX_1, X_3) = 2g(K_{X_1}X_2, FX_3),$$
which gives us
$$g(K_{FX_1}K_{X_2}X_3 - K_{X_2}K_{FX_1}X_3 + K_{X_1}K_{FX_2}X_3 - K_{FX_2}K_{X_1}X_3 + K_{X_1}K_{X_2}FX_3 - K_{X_2}K_{X_1}FX_3, X_4) + g(K_{X_1}K_{X_2}X_3 - K_{X_2}K_{X_1}X_3, FX_4) = 2g(K_{X_2}X_3, FK_{X_1}X_4) - 2g(K_{X_1}X_3, FK_{X_2}X_4) + 2g(FK_{X_2}X_3, K_{X_1}X_4) - 2g(FK_{X_1}X_3, K_{X_2}X_4) = 0.$$
Putting the above equation in (37), we get (34). Considering X 1 = F X 1 in (34) and using (18), it follows
$$-S(X_1,X_2,X_3,X_4) + u(X_1)S(v,X_2,X_3,X_4) + S(FX_1,FX_2,X_3,X_4) + S(FX_1,X_2,FX_3,X_4) + S(FX_1,X_2,X_3,FX_4) = 0.$$
Similarly, setting X 2 = F X 2 , X 3 = F X 3 and X 4 = F X 4 , respectively, we have
$$S(FX_1,FX_2,X_3,X_4) - S(X_1,X_2,X_3,X_4) + u(X_2)S(X_1,v,X_3,X_4) + S(X_1,FX_2,FX_3,X_4) + S(X_1,FX_2,X_3,FX_4) = 0,$$
$$S(FX_1,X_2,FX_3,X_4) + S(X_1,FX_2,FX_3,X_4) - S(X_1,X_2,X_3,X_4) + u(X_3)S(X_1,X_2,v,X_4) + S(X_1,X_2,FX_3,FX_4) = 0,$$
and
$$S(FX_1,X_2,X_3,FX_4) + S(X_1,FX_2,X_3,FX_4) + S(X_1,X_2,FX_3,FX_4) - S(X_1,X_2,X_3,X_4) + u(X_4)S(X_1,X_2,X_3,v) = 0.$$
Adding (38) and (39) and subtracting the sum of (40) and (41), we obtain
$$2S(FX_1,FX_2,X_3,X_4) - 2S(X_1,X_2,FX_3,FX_4) + u(X_1)S(v,X_2,X_3,X_4) + u(X_2)S(X_1,v,X_3,X_4) - u(X_3)S(X_1,X_2,v,X_4) - u(X_4)S(X_1,X_2,X_3,v) = 0.$$
Replacing X 1 and X 2 by F X 1 and F X 2 , we can rewrite the last equation as
$$2S(F^2X_1,F^2X_2,X_3,X_4) - 2S(FX_1,FX_2,FX_3,FX_4) - u(X_3)S(FX_1,FX_2,v,X_4) - u(X_4)S(FX_1,FX_2,X_3,v) = 0.$$
Applying (33) in the above equation, we get
$$S(F^2X_1,F^2X_2,X_3,X_4) = S(FX_1,FX_2,FX_3,FX_4).$$
On the other hand using (18), it is seen that
$$S(F^2X_1,F^2X_2,X_3,X_4) = S(X_1,X_2,X_3,X_4) - u(X_2)S(X_1,v,X_3,X_4) - u(X_1)S(v,X_2,X_3,X_4).$$
According to (32), we have
$$R^g(v,X_1,X_3,X_4) = R^g(X_3,X_4,v,X_1) = S(X_3,X_4,v,X_1) = S(v,X_1,X_3,X_4).$$
The above three equations imply (35). □
Corollary 4. 
The tensor field K in a nearly Sasakian statistical manifold N, satisfies the relation
$$F[K_{FX_2}, K_{FX_1}]F = [K_{X_1}, K_{X_2}],$$
for any X 1 , X 2 Γ ( T N ) .
Proof. 
Using () and (36), we obtain
$$S(FX_1,FX_2,FX_3,FX_4) - S(X_1,X_2,X_3,X_4) - u(X_2)R^g(v,X_1,X_3,X_4) + u(X_1)R^g(v,X_2,X_3,X_4) = g(K_{FX_1}K_{FX_2}FX_3 - K_{FX_2}K_{FX_1}FX_3, FX_4) - g(K_{X_1}K_{X_2}X_3 - K_{X_2}K_{X_1}X_3, X_4) = g\big(F[K_{FX_2},K_{FX_1}]FX_3 - [K_{X_1},K_{X_2}]X_3, X_4\big).$$
Comparing with relation (35) yields the assertion. □
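As an illustration (our own check, not part of the proof), Corollary 4 can be verified on the tensors of Example 1, where K is supported on $\mathrm{span}\{e_1, e_2\}$ and F rotates that plane:

```python
import numpy as np

# Checking Corollary 4, F [K_{F X2}, K_{F X1}] F = [K_{X1}, K_{X2}],
# on the tensors of Example 1.
F = np.array([[0., -1, 0], [1, 0, 0], [0, 0, 0]])
K = np.zeros((3, 3, 3))
K[0, 0] = [-1, 0, 0]
K[0, 1] = K[1, 0] = [0, 1, 0]
K[1, 1] = [1, 0, 0]

def Kop(x):
    # the endomorphism K_x as a matrix acting on column vectors
    return np.einsum('a,abc->cb', x, K)

comm = lambda P, Q: P @ Q - Q @ P
for x1 in np.eye(3):
    for x2 in np.eye(3):
        lhs = F @ comm(Kop(F @ x2), Kop(F @ x1)) @ F
        rhs = comm(Kop(x1), Kop(x2))
        assert np.allclose(lhs, rhs)
print("Corollary 4 holds on Example 1")
```

Both sides are bilinear in $X_1, X_2$, so checking the identity on the basis fields is sufficient.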
A statistical manifold is called conjugate symmetric if the curvature tensors of the connections $\nabla$ and $\nabla^*$ are equal, i.e.,
$$R(X_1,X_2)X_3 = R^*(X_1,X_2)X_3,$$
for all X 1 , X 2 , X 3 Γ ( T N ) .
Corollary 5. 
Let $(N, \nabla, g, F, v)$ be a conjugate symmetric nearly Sasakian statistical manifold. Then the following hold:
$$R(FX_1,FX_2,FX_3,FX_4) - R(X_1,X_2,X_3,X_4) = u(X_2)R(X_3,X_4,v,X_1) - u(X_1)R(X_3,X_4,v,X_2),$$
$$R(X_1,X_2)v = R^g(X_1,X_2)v, \qquad R(FX_1,FX_2)v = 0,$$
for any X 1 , X 2 , X 3 , X 4 Γ ( T N ) .

4. Hypersurfaces in Nearly Kähler Statistical Manifolds

Let $\tilde N$ be a smooth manifold. A pair $(\tilde g, J)$ is said to be an almost Hermitian structure on $\tilde N$ if
$$J^2 = -\mathrm{Id}, \qquad \tilde g(JX_1, JX_2) = \tilde g(X_1, X_2),$$
for any $X_1, X_2 \in \Gamma(T\tilde N)$. Let $\tilde\nabla^g$ denote the Riemannian connection of $\tilde g$. Then J is Killing if and only if
$$(\tilde\nabla^g_{X_1}J)X_2 + (\tilde\nabla^g_{X_2}J)X_1 = 0.$$
In this case, the pair $(\tilde g, J)$ is called a nearly Kähler structure, and if J is integrable, the structure is Kählerian [7].
Lemma 3. 
Let $(\tilde\nabla, \tilde g)$ be a statistical structure, and $(\tilde g, J)$ a nearly Kähler structure on $\tilde N$. We have the following formula:
$$\tilde\nabla_{X_1}JX_2 - J\tilde\nabla^*_{X_1}X_2 + \tilde\nabla_{X_2}JX_1 - J\tilde\nabla^*_{X_2}X_1 = (\tilde\nabla^g_{X_1}J)X_2 + (\tilde\nabla^g_{X_2}J)X_1 + \tilde K_{X_1}JX_2 + \tilde K_{X_2}JX_1 + 2J\tilde K_{X_1}X_2,$$
for any $X_1, X_2 \in \Gamma(T\tilde N)$, where $\tilde K$ is given as in (8) for $(\tilde\nabla, \tilde g)$.
Definition 2. 
A nearly Kähler statistical structure on $\tilde N$ is a triple $(\tilde\nabla, \tilde g, J)$, where $(\tilde\nabla, \tilde g)$ is a statistical structure, $(\tilde g, J)$ is a nearly Kähler structure on $\tilde N$, and the following equality is satisfied:
$$\tilde K_{X_1}JX_2 + \tilde K_{X_2}JX_1 = -2J\tilde K_{X_1}X_2,$$
for any X 1 , X 2 Γ ( T N ˜ ) .
Let N be a hypersurface of a statistical manifold $(\tilde N, \tilde g, \tilde\nabla, \tilde\nabla^*)$. Considering $\mathbf{n}$ and g, respectively, as a unit normal vector field and the induced metric on N, the following relations hold:
$$\tilde\nabla_{X_1}X_2 = \nabla_{X_1}X_2 + h(X_1,X_2)\mathbf{n}, \qquad \tilde\nabla_{X_1}\mathbf{n} = -AX_1 + \tau(X_1)\mathbf{n},$$
$$\tilde\nabla^*_{X_1}X_2 = \nabla^*_{X_1}X_2 + h^*(X_1,X_2)\mathbf{n}, \qquad \tilde\nabla^*_{X_1}\mathbf{n} = -A^*X_1 + \tau^*(X_1)\mathbf{n},$$
for any $X_1, X_2 \in \Gamma(TN)$. It follows that
$$g(AX_1, X_2) = h^*(X_1,X_2), \qquad g(A^*X_1, X_2) = h(X_1,X_2), \qquad \tau(X_1) + \tau^*(X_1) = 0.$$
Furthermore, the second fundamental form $h^g$ is related to the Levi-Civita connections $\tilde\nabla^g$ and $\nabla^g$ by
$$\tilde\nabla^g_{X_1}X_2 = \nabla^g_{X_1}X_2 + h^g(X_1,X_2)\mathbf{n}, \qquad \tilde\nabla^g_{X_1}\mathbf{n} = -A^gX_1,$$
where $g(A^gX_1, X_2) = h^g(X_1,X_2)$.
Remark 2. 
Let ( N ˜ , g ˜ , J ) be a nearly Kähler manifold, and N a hypersurface with a unit normal vector field n . Let g be the induced metric on N, and consider v, u and F , respectively as a vector field, a 1-form and a tensor of type ( 1 , 1 ) on N such that
$$v = -J\mathbf{n},$$
$$JX_1 = FX_1 + u(X_1)\mathbf{n},$$
for any X 1 Γ ( T N ) . Then ( g , F , v ) is an almost contact metric structure on N [7].
Lemma 4. 
Let $(\tilde N, \tilde\nabla, \tilde g, J)$ be a nearly Kähler statistical manifold. If $(N, g, F, v)$ is a hypersurface with the induced almost contact metric structure as in Remark 2, and $(\nabla, g)$ the induced statistical structure on N as in (42), then the following hold:
$$i)\ FAv = 0, \qquad ii)\ g(AX_1, v) = u(Av)u(X_1), \qquad iii)\ AX_1 = \nabla_vFX_1 - F\nabla^*_vX_1 - F\nabla^*_{X_1}v + u(X_1)Av,$$
$$iv)\ \tau(X_1) = g(\nabla^*_{X_1}v, v) - g(X_1, \nabla_vv) - u(X_1)\tau(v),$$
$$v)\ \nabla_{X_1}FX_2 - F\nabla^*_{X_1}X_2 + \nabla_{X_2}FX_1 - F\nabla^*_{X_2}X_1 = -2g(AX_1,X_2)v + u(X_2)AX_1 + u(X_1)AX_2,$$
$$vi)\ g(\nabla_{X_1}v, X_2) + g(\nabla_{X_2}v, X_1) = g(FA^*X_1, X_2) + g(FA^*X_2, X_1) - u(X_1)\tau(X_2) - u(X_2)\tau(X_1),$$
for any X 1 , X 2 Γ ( T N ) .
Proof. 
According to Definition 2 and (45), we can write
$$0 = \tilde\nabla_{X_1}Jv - \tilde\nabla_{X_1}\mathbf{n} = J\tilde\nabla^*_{X_1}v - \tilde\nabla_vJX_1 + J\tilde\nabla^*_vX_1 - \tilde\nabla_{X_1}\mathbf{n}.$$
Applying (42), () and () in the above equation, we have
$$0 = J(\nabla^*_{X_1}v + g(AX_1,v)\mathbf{n}) - \tilde\nabla_v(FX_1 + u(X_1)\mathbf{n}) + J(\nabla^*_vX_1 + g(Av,X_1)\mathbf{n}) + AX_1 - \tau(X_1)\mathbf{n} = F\nabla^*_{X_1}v - g(AX_1,v)v - \nabla_vFX_1 + u(X_1)Av + F\nabla^*_vX_1 - g(Av,X_1)v + AX_1 + \{u(\nabla^*_{X_1}v) - g(A^*v, FX_1) - v(u(X_1)) - u(X_1)\tau(v) + u(\nabla^*_vX_1) - \tau(X_1)\}\mathbf{n}.$$
The vanishing of the tangential part yields
$$AX_1 = \nabla_vFX_1 - F\nabla^*_vX_1 - F\nabla^*_{X_1}v + 2g(AX_1,v)v - u(X_1)Av.$$
Setting X 1 = v in the above equation, it follows
$$Av = u(Av)v,$$
hence $FAv = 0$, which is (i); from this, (ii) follows because $0 = g(FAv, FX_1) = g(Av, X_1) - u(Av)u(X_1)$. From (48) and (49) we have (iii). The vanishing of the normal part of (47), together with (i) and
$$v(u(X_1)) = g(\nabla^*_vX_1, v) + g(X_1, \nabla_vv),$$
we get (iv). As
$$\tilde\nabla_{X_1}JX_2 - J\tilde\nabla^*_{X_1}X_2 + \tilde\nabla_{X_2}JX_1 - J\tilde\nabla^*_{X_2}X_1 = 0,$$
thus (42), (), (45) and () imply
$$\nabla_{X_1}FX_2 - u(X_2)AX_1 - F(\nabla^*_{X_1}X_2) + g(AX_1,X_2)v + \nabla_{X_2}FX_1 - u(X_1)AX_2 - F(\nabla^*_{X_2}X_1) + g(AX_2,X_1)v + \{g(A^*X_1, FX_2) + g(\nabla_{X_1}v, X_2) + u(X_2)\tau(X_1) + g(A^*X_2, FX_1) + g(X_1, \nabla_{X_2}v) + u(X_1)\tau(X_2)\}\mathbf{n} = 0.$$
From the above equation, (v) and (vi) follow. □
Remark 3. 
In the setting of Lemma 4, analogous equations hold for the dual connection $\nabla^*$. For example, equation (i) becomes
$$FA^*v = 0.$$
We refer to this equation as (i)* for brevity if there is no danger of confusion.
Theorem 3. 
Let $(\tilde N, \tilde\nabla, \tilde g, J)$ be a nearly Kähler statistical manifold and $(N, \nabla, g, F, v)$ an almost contact metric statistical hypersurface in $\tilde N$ given by (42), (), (45) and (). Then $(N, \nabla, g, F, v)$ is a nearly Sasakian statistical manifold if and only if
$$AX_1 = -X_1 + u(X_1)(Av + v),$$
$$A^*X_1 = -X_1 + u(X_1)(A^*v + v),$$
for any X 1 Γ ( T N ) .
Proof. 
Let $(\nabla, g, F, v)$ be a nearly Sasakian statistical structure on N. According to Definition 1 we have
$$\nabla_{X_1}FX_2 - F\nabla^*_{X_1}X_2 + \nabla_{X_2}FX_1 - F\nabla^*_{X_2}X_1 = 2g(X_1,X_2)v - u(X_1)X_2 - u(X_2)X_1,$$
which gives us
$$\nabla_vFX_1 - F\nabla^*_{X_1}v - F\nabla^*_vX_1 = u(X_1)v - X_1.$$
Putting the last equation in part (iii) of Lemma 4, we obtain (50). Similarly, we can prove (). Conversely, let the shape operators satisfy (50). Then part (v) of Lemma 4 yields
$$\nabla_{X_1}FX_2 - F\nabla^*_{X_1}X_2 + \nabla_{X_2}FX_1 - F\nabla^*_{X_2}X_1 = -2g(-X_1 + u(X_1)(Av + v), X_2)v + u(X_2)(-X_1 + u(X_1)(Av + v)) + u(X_1)(-X_2 + u(X_2)(Av + v)) = 2g(X_1,X_2)v - u(X_1)X_2 - u(X_2)X_1.$$
In the same way, (v) * and () imply
$$\nabla^*_{X_1}FX_2 - F\nabla_{X_1}X_2 + \nabla^*_{X_2}FX_1 - F\nabla_{X_2}X_1 = 2g(X_1,X_2)v - u(X_1)X_2 - u(X_2)X_1.$$
According to the above equations and Theorem 2, the proof is complete. □

5. Submanifolds of Nearly Sasakian Statistical Manifolds

Let N be an n-dimensional submanifold of an almost contact metric statistical manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$. We denote the induced metric on N by g. For all $U_1 \in \Gamma(TN)$ and $\zeta \in \Gamma(TN^\perp)$, we put $\tilde FU_1 = FU_1 + \bar FU_1$ and $\tilde F\zeta = F\zeta + \bar F\zeta$, where $FU_1, F\zeta \in \Gamma(TN)$ and $\bar FU_1, \bar F\zeta \in \Gamma(TN^\perp)$. If $\tilde F(T_pN) \subseteq T_pN$ and $\tilde F(T_pN) \subseteq T_pN^\perp$ for any $p \in N$, then N is called $\tilde F$-invariant and $\tilde F$-anti-invariant, respectively.
Proposition 4. 
[13] Any $\tilde F$-invariant submanifold N imbedded in an almost contact metric manifold $(\tilde N, \tilde\nabla, g, \tilde F, \tilde v, \tilde u)$ in such a way that the vector field $\tilde v$ is always tangent to N has the induced almost contact metric structure $(g, F, v, u)$.
For any U 1 , U 2 Γ ( T N ) , the corresponding Gauss formulas are given by
$$\tilde\nabla_{U_1}U_2 = \nabla_{U_1}U_2 + h(U_1,U_2), \qquad \tilde\nabla^*_{U_1}U_2 = \nabla^*_{U_1}U_2 + h^*(U_1,U_2).$$
It is proved that $(\nabla, g)$ and $(\nabla^*, g)$ are statistical structures on N, and that h and $h^*$ are symmetric and bilinear. The mean curvature vector field with respect to $\tilde\nabla$ is described by
$$H = \frac{1}{n}\,\mathrm{trace}(h).$$
The submanifold N is a $\tilde\nabla$-totally umbilical submanifold if $h(U_1,U_2) = g(U_1,U_2)H$ for all $U_1, U_2 \in \Gamma(TN)$. Also, the submanifold N is called $\tilde\nabla$-autoparallel if $h(U_1,U_2) = 0$ for any $U_1, U_2 \in \Gamma(TN)$. The submanifold N is said to be dual-autoparallel if it is both $\tilde\nabla$- and $\tilde\nabla^*$-autoparallel, i.e., $h(U_1,U_2) = h^*(U_1,U_2) = 0$ for any $U_1, U_2 \in \Gamma(TN)$. If $h^g(U_1,U_2) = 0$ for any $U_1, U_2 \in \Gamma(TN)$, the submanifold N is called totally geodesic. Moreover, the submanifold N is called $\tilde\nabla$-minimal ($\tilde\nabla^*$-minimal) if $H = 0$ ($H^* = 0$).
For any $U_1\in\Gamma(TN)$ and $\zeta\in\Gamma(TN^\perp)$, the Weingarten formulas are
\[
\tilde\nabla_{U_1}\zeta = -A_\zeta U_1 + D_{U_1}\zeta, \qquad \tilde\nabla^*_{U_1}\zeta = -A^*_\zeta U_1 + D^*_{U_1}\zeta,
\]
where $D$ and $D^*$ are the normal connections on $\Gamma(TN^\perp)$, and the tensor fields $h$, $h^*$, $A$ and $A^*$ satisfy
\[
g(A_\zeta U_1, U_2) = g(h^*(U_1,U_2),\zeta), \qquad g(A^*_\zeta U_1, U_2) = g(h(U_1,U_2),\zeta).
\]
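The pairing of $h$ with $A^*$ (rather than $A$) in the last relation can be checked directly from the metric duality of $\tilde\nabla$ and $\tilde\nabla^*$. The following supplementary derivation, in the notation above, assumes the conventional minus sign on the shape operators in the Weingarten formulas:

```latex
% For tangent U_1, U_2 and normal \zeta we have g(U_2, \zeta) = 0, so
% differentiating with the dual pair (\tilde\nabla, \tilde\nabla^*) gives:
\begin{align*}
0 = U_1\, g(U_2,\zeta)
  &= g(\tilde\nabla_{U_1} U_2,\ \zeta) + g(U_2,\ \tilde\nabla^{*}_{U_1}\zeta) \\
  &= g\big(\nabla_{U_1} U_2 + h(U_1,U_2),\ \zeta\big)
   + g\big(U_2,\ -A^{*}_{\zeta} U_1 + D^{*}_{U_1}\zeta\big) \\
  &= g\big(h(U_1,U_2),\ \zeta\big) - g\big(A^{*}_{\zeta} U_1,\ U_2\big),
\end{align*}
% since \nabla_{U_1} U_2 is tangent and D^{*}_{U_1}\zeta is normal.
% Hence g(A^{*}_{\zeta} U_1, U_2) = g(h(U_1,U_2), \zeta); the companion
% relation g(A_{\zeta} U_1, U_2) = g(h^{*}(U_1,U_2), \zeta) follows by
% exchanging the roles of \tilde\nabla and \tilde\nabla^*.
```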
Also, the Levi-Civita connections $\nabla^g$ and $\tilde\nabla^g$ are related to the second fundamental form $h^g$ by
\[
\tilde\nabla^g_{U_1}U_2 = \nabla^g_{U_1}U_2 + h^g(U_1,U_2), \qquad \tilde\nabla^g_{U_1}\zeta = -A^g_\zeta U_1 + D^g_{U_1}\zeta,
\]
where $g(A^g_\zeta U_1, U_2) = g(h^g(U_1,U_2),\zeta)$.
On a statistical submanifold $(N,\nabla,g)$ of a statistical manifold $(\tilde N,\tilde\nabla,g)$, for any tangent vector fields $U_1,U_2\in\Gamma(TN)$, we consider the difference tensor $K$ on $N$ given by
\[
2K_{U_1}U_2 = \nabla_{U_1}U_2 - \nabla^*_{U_1}U_2.
\]
From (7), (52) and the above equation, it follows that
\[
2\tilde K_{U_1}U_2 = 2K_{U_1}U_2 + h(U_1,U_2) - h^*(U_1,U_2).
\]
More precisely, for the tangential part and the normal part we have
\[
(\tilde K_{U_1}U_2)^\top = K_{U_1}U_2, \qquad (\tilde K_{U_1}U_2)^\perp = \tfrac12\big(h(U_1,U_2) - h^*(U_1,U_2)\big),
\]
respectively. Similarly, for $U_1\in\Gamma(TN)$ and $\zeta\in\Gamma(TN^\perp)$ we have
\[
\tilde K_{U_1}\zeta = (\tilde K_{U_1}\zeta)^\top + (\tilde K_{U_1}\zeta)^\perp,
\]
where
\[
(\tilde K_{U_1}\zeta)^\top = \tfrac12\big(A^*_\zeta U_1 - A_\zeta U_1\big), \qquad (\tilde K_{U_1}\zeta)^\perp = \tfrac12\big(D_{U_1}\zeta - D^*_{U_1}\zeta\big).
\]
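The tangential and normal parts of $\tilde K_{U_1}\zeta$ follow by substituting the Weingarten formulas into the definition of the difference tensor; a short derivation, again assuming the minus-sign convention on the shape operators:

```latex
\begin{align*}
2\tilde K_{U_1}\zeta
  &= \tilde\nabla_{U_1}\zeta - \tilde\nabla^{*}_{U_1}\zeta \\
  &= \big({-A_{\zeta}U_1} + D_{U_1}\zeta\big)
   - \big({-A^{*}_{\zeta}U_1} + D^{*}_{U_1}\zeta\big) \\
  &= \underbrace{\big(A^{*}_{\zeta}U_1 - A_{\zeta}U_1\big)}_{\text{tangential}}
   + \underbrace{\big(D_{U_1}\zeta - D^{*}_{U_1}\zeta\big)}_{\text{normal}} .
\end{align*}
```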
Now suppose that $(N,g)$ is a submanifold of a nearly Sasakian statistical manifold $(\tilde N,\tilde\nabla,g,\tilde F,\tilde v,\tilde u)$. As the tensor field $\tilde h$ of type $(1,1)$ on $\tilde N$ is defined by $\tilde\nabla^g\tilde v = \tilde F + \tilde h$, we can set $\tilde h U_1 = hU_1 + \bar h U_1$ and $\tilde h\zeta = h\zeta + \bar h\zeta$, where $hU_1, h\zeta\in\Gamma(TN)$ and $\bar h U_1, \bar h\zeta\in\Gamma(TN^\perp)$, for any $U_1\in\Gamma(TN)$ and $\zeta\in\Gamma(TN^\perp)$. Furthermore, if $\tilde h(T_pN)\subseteq T_pN$ and $\tilde h(T_pN)\subseteq T_pN^\perp$, then $N$ is called $\tilde h$-invariant and $\tilde h$-anti-invariant, respectively.
Proposition 5. 
Let $N$ be a submanifold of a nearly Sasakian statistical manifold $(\tilde N,\tilde\nabla,g,\tilde F,\tilde v,\tilde u)$, where the vector field $\tilde v$ is normal to $N$. Then
\[
g(\tilde F U_1, U_2) = g(U_1, \tilde h U_2), \qquad U_1,U_2\in\Gamma(TN).
\]
Moreover,
i) $N$ is an $\tilde h$-anti-invariant submanifold if and only if $N$ is an $\tilde F$-anti-invariant submanifold.
ii) If $\tilde h = 0$, then $N$ is an $\tilde F$-anti-invariant submanifold.
iii) If $N$ is an $\tilde h$-invariant and $\tilde F$-invariant submanifold, then $hU_1 = -FU_1$ for any $U_1\in\Gamma(TN)$.
Proof. 
Using (22) and Proposition 1, for any $U_1,U_2\in\Gamma(TN)$ we can write
\[
g(\tilde F U_1 + \tilde h U_1, U_2) = g(\tilde\nabla^g_{U_1}\tilde v, U_2) = g(\tilde\nabla_{U_1}\tilde v, U_2).
\]
Now (53) and the above equation imply
\[
g(\tilde F U_1 + \tilde h U_1, U_2) = g(-A_{\tilde v}U_1 + D_{U_1}\tilde v, U_2) = -g(A_{\tilde v}U_1, U_2) = -g(\tilde v, h^*(U_1,U_2)).
\]
As $h^*$ is symmetric and the operators $\tilde F$ and $\tilde h$ are skew-symmetric, the above equation yields
\[
g(\tilde F U_1 + \tilde h U_1, U_2) = g(\tilde F U_2 + \tilde h U_2, U_1) = -g(\tilde F U_1 + \tilde h U_1, U_2).
\]
Hence $g(\tilde F U_1 + \tilde h U_1, U_2) = 0$, which gives (57). If $N$ is an $\tilde h$-anti-invariant submanifold, we have $g(U_1,\tilde h U_2)=0$; thus (i) follows from (57). Similarly, we obtain (ii) and (iii). □
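For part (iii), the omitted step can be sketched as follows (a derivation in the notation above):

```latex
% If N is \tilde F-invariant and \tilde h-invariant, then for tangent U_1
% both \tilde F U_1 = F U_1 and \tilde h U_1 = h U_1 are tangent to N.
% The identity g(\tilde F U_1 + \tilde h U_1, U_2) = 0, which holds for
% every tangent U_2, therefore forces the tangent field F U_1 + h U_1
% to vanish:
\[
F U_1 + h U_1 = 0, \qquad \text{i.e.} \qquad h U_1 = -F U_1 .
\]
```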
Lemma 5. 
Let $(N,\nabla,g)$ be an $\tilde F$-anti-invariant statistical submanifold of a nearly Sasakian statistical manifold $(\tilde N,\tilde\nabla,g,\tilde F,\tilde v,\tilde u)$ such that the structure $(F,v,u)$ on $N$ is given by Proposition 4.
i) If $\tilde v$ is tangent to $N$, then
\[
\nabla_{U_1}v = u(U_1)K_v v = \nabla^*_{U_1}v, \qquad h(U_1,v) = \bar F U_1 + \bar h U_1 = h^*(U_1,v), \qquad U_1\in\Gamma(TN).
\]
ii) If $\tilde v$ is normal to $N$, then
\[
A_{\tilde v} = 0 = A^*_{\tilde v}, \qquad D_{U_1}\tilde v = \bar F U_1 + \bar h U_1 = D^*_{U_1}\tilde v, \qquad U_1\in\Gamma(TN).
\]
Proof. 
Applying (22), (52) and Proposition 1, and using $\tilde K_v v = K_v v = g(\nabla_v v, v)v$, we have
\[
\bar F U_1 + \bar h U_1 + u(U_1)K_v v = \tilde\nabla^g_{U_1}v + u(U_1)K_v v = \tilde\nabla_{U_1}v = \nabla_{U_1}v + h(U_1,v).
\]
Thus the normal part gives $h(U_1,v) = \bar F U_1 + \bar h U_1$ and the tangential part gives $\nabla_{U_1}v = u(U_1)K_v v$. Similarly, we get their dual parts, and hence (i) holds. If $\tilde v$ is normal to $N$, from (22) and (53) it follows that
\[
\bar F U_1 + \bar h U_1 = \tilde\nabla^g_{U_1}\tilde v = \tilde\nabla_{U_1}\tilde v = -A_{\tilde v}U_1 + D_{U_1}\tilde v.
\]
Considering the normal and tangential components of the last equation, we get (ii). Since $\tilde\nabla_{U_1}\tilde v = \tilde\nabla^g_{U_1}\tilde v = \tilde\nabla^*_{U_1}\tilde v$, we have the dual part of the assertion. □
Lemma 6. 
Let $(N,\nabla,g)$ be an $\tilde F$-invariant and $\tilde h$-invariant statistical submanifold of a nearly Sasakian statistical manifold $(\tilde N,\tilde\nabla,g,\tilde F,\tilde v,\tilde u)$. Then, for any $U_1\in\Gamma(TN)$, if
i) $\tilde v$ is tangent to $N$, then
\[
\nabla_{U_1}v = FU_1 + hU_1 + u(U_1)K_v v, \qquad \nabla^*_{U_1}v = FU_1 + hU_1 - u(U_1)K_v v, \qquad h(U_1,v) = 0 = h^*(U_1,v);
\]
ii) $\tilde v$ is normal to $N$, then
\[
A_{\tilde v}U_1 = -FU_1 - hU_1 = A^*_{\tilde v}U_1, \qquad D_{U_1}\tilde v = 0 = D^*_{U_1}\tilde v.
\]
Proof. 
The relations are proved in the same way as in the proof of Lemma 5. □
Theorem 4. 
On a nearly Sasakian statistical manifold $(\tilde N,\tilde\nabla,g,\tilde F,\tilde v,\tilde u)$, if $N$ is an $\tilde F$-anti-invariant $\tilde\nabla$-totally umbilical statistical submanifold of $\tilde N$ and $\tilde v$ is tangent to $N$, then $N$ is $\tilde\nabla$-minimal in $\tilde N$.
Proof. 
According to Lemma 5, $h(v,v) = 0$. As $N$ is a $\tilde\nabla$-totally umbilical submanifold and $g(v,v)=1$, it follows that
\[
0 = h(v,v) = g(v,v)H = H,
\]
which gives the assertion. □
Theorem 5. 
Let $N$ be an $\tilde F$-invariant submanifold of a nearly Sasakian statistical manifold $(\tilde N,\tilde\nabla,g,\tilde F,\tilde v,\tilde u)$, where the vector field $\tilde v$ is tangent to $N$. If
\[
h^g(U_1, FU_2) = \tilde F h^g(U_1,U_2),
\]
\[
h(U_1,FU_2) - h^*(U_1,FU_2) = \tilde F h^*(U_1,U_2) - \tilde F h(U_1,U_2),
\]
for all $U_1,U_2\in\Gamma(TN)$, then $(\nabla,g,F,v,u)$ forms a nearly Sasakian statistical structure on $N$.
Proof. 
According to Proposition 4, $N$ has the induced almost contact metric structure $(g,F,v,u)$. Also, (52) shows that $(\nabla,g)$ is a statistical structure on $N$. Applying (54), we can write
\[
\tilde\nabla^g_{U_1}\tilde F U_2 = \nabla^g_{U_1}FU_2 + h^g(U_1,FU_2) = (\nabla^g_{U_1}F)U_2 + F\nabla^g_{U_1}U_2 + h^g(U_1,FU_2).
\]
As $h^g$ is symmetric, from (58) we have $h^g(FU_1,U_2) = h^g(U_1,FU_2)$. Hence the above equation implies
\[
\tilde\nabla^g_{U_1}\tilde F U_2 + \tilde\nabla^g_{U_2}\tilde F U_1 = (\nabla^g_{U_1}F)U_2 + (\nabla^g_{U_2}F)U_1 + F\nabla^g_{U_1}U_2 + F\nabla^g_{U_2}U_1 + 2h^g(U_1,FU_2).
\]
On the other hand, since $\tilde N$ has a nearly Sasakian structure, we have
\begin{align*}
\tilde\nabla^g_{U_1}\tilde F U_2 + \tilde\nabla^g_{U_2}\tilde F U_1
&= (\tilde\nabla^g_{U_1}\tilde F)U_2 + (\tilde\nabla^g_{U_2}\tilde F)U_1 + \tilde F\big(\tilde\nabla^g_{U_1}U_2 + \tilde\nabla^g_{U_2}U_1\big)\\
&= (\tilde\nabla^g_{U_1}\tilde F)U_2 + (\tilde\nabla^g_{U_2}\tilde F)U_1 + \tilde F\big(\nabla^g_{U_1}U_2 + \nabla^g_{U_2}U_1 + 2h^g(U_1,U_2)\big)\\
&= 2g(U_1,U_2)v - u(U_1)U_2 - u(U_2)U_1 + \tilde F\big(\nabla^g_{U_1}U_2 + \nabla^g_{U_2}U_1 + 2h^g(U_1,U_2)\big)\\
&= 2g(U_1,U_2)v - u(U_1)U_2 - u(U_2)U_1 + F\nabla^g_{U_1}U_2 + F\nabla^g_{U_2}U_1 + 2\tilde F h^g(U_1,U_2).
\end{align*}
Now (58) and the above two equations yield
\[
(\nabla^g_{U_1}F)U_2 + (\nabla^g_{U_2}F)U_1 = 2g(U_1,U_2)v - u(U_1)U_2 - u(U_2)U_1.
\]
Thus $(N,\nabla^g,g,F,v,u)$ satisfies the nearly Sasakian condition. For the nearly Sasakian statistical manifold $\tilde N$, using (27) we have
\[
\tilde K_{U_1}\tilde F U_2 + \tilde K_{U_2}\tilde F U_1 = -2\tilde F\tilde K_{U_1}U_2,
\]
for any $U_1,U_2\in\Gamma(TN)$. Applying (56) in the last equation, it follows that
\begin{align*}
K_{U_1}FU_2 &+ \tfrac12\big(h(U_1,FU_2) - h^*(U_1,FU_2)\big) + K_{U_2}FU_1 + \tfrac12\big(h(U_2,FU_1) - h^*(U_2,FU_1)\big)\\
&= -2FK_{U_1}U_2 + \tilde F h^*(U_1,U_2) - \tilde F h(U_1,U_2).
\end{align*}
From the above equation and (59), we get
\[
K_{U_1}FU_2 + K_{U_2}FU_1 = -2FK_{U_1}U_2.
\]
Therefore $(N,\nabla,g,F,v,u)$ is a nearly Sasakian statistical manifold. This completes the proof. □
Proposition 6. 
Let $N$ be an $\tilde F$-invariant and $\tilde h$-invariant statistical submanifold of a nearly Sasakian statistical manifold $(\tilde N,\tilde\nabla,g,\tilde F,\tilde v,\tilde u)$ such that $\tilde v$ is tangent to $N$. Then
\[
(\tilde\nabla_{U_1}h)(U_2,v) = (\tilde\nabla^*_{U_1}h)(U_2,v) = (\tilde\nabla^g_{U_1}h)(U_2,v) = -h(U_2, FU_1 + hU_1),
\]
and
\[
(\tilde\nabla_{U_1}h^*)(U_2,v) = (\tilde\nabla^*_{U_1}h^*)(U_2,v) = (\tilde\nabla^g_{U_1}h^*)(U_2,v) = -h^*(U_2, FU_1 + hU_1),
\]
for any $U_1,U_2\in\Gamma(TN)$.
Proof. 
We have
\[
(\tilde\nabla_{U_1}h)(U_2,v) = \tilde\nabla_{U_1}h(U_2,v) - h(\nabla_{U_1}U_2, v) - h(U_2, \nabla_{U_1}v),
\]
for any $U_1,U_2\in\Gamma(TN)$. According to Proposition 1, part (i) of Lemma 6 and the above equation, we have
\[
(\tilde\nabla_{U_1}h)(U_2,v) = -h(U_2, \nabla_{U_1}v) = -h\big(U_2, FU_1 + hU_1 + u(U_1)K_v v\big) = -h(U_2, FU_1 + hU_1).
\]
Similarly, other parts are obtained. □
Corollary 6. 
Let $N$ be an $\tilde F$-invariant and $\tilde h$-invariant statistical submanifold of a nearly Sasakian statistical manifold $(\tilde N,\tilde\nabla,g,\tilde F,\tilde v,\tilde u)$. If $\tilde v$ is tangent to $N$, then the following conditions are equivalent:
i) $h$ and $h^*$ are parallel with respect to the connection $\tilde\nabla$;
ii) $N$ is dual-autoparallel.

Author Contributions

Writing—original draft, S.U., E.P. and L.N.; Writing—review & editing, R.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research work was funded by Institutional Fund Projects under grant no. (IFPIP: 1184-130-1443). The authors gratefully acknowledge the technical and financial support provided by the Ministry of Education and King Abdulaziz University, DSR, Jeddah, Saudi Arabia.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. D. V. Alekseevsky, V. Cortés, A. S. Galaev and T. Leistner, Cones over pseudo-Riemannian manifolds and their holonomy, Journal für die Reine und Angewandte Mathematik, 635 (2009), 23–69.
  2. S. Amari, Information geometry of the EM and em algorithms for neural networks, Neural Networks, 8(9) (1995), 1379–1408.
  3. H. Baltazar and A. Da Silva, On static manifolds and related critical spaces with cyclic parallel Ricci tensor, Advances in Geometry, 21(3) (2021), 443–450.
  4. A. Barros, R. Diógenes and E. Ribeiro Jr., Bach-flat critical metrics of the volume functional on 4-dimensional manifolds with boundary, J. Geom. Anal., 25 (2015), 2698–2715.
  5. M. Belkin, P. Niyogi and V. Sindhwani, Manifold regularization: a geometric framework for learning from labeled and unlabeled examples, Journal of Machine Learning Research, 7 (2006), 2399–2434.
  6. D. E. Blair, Riemannian geometry of contact and symplectic manifolds, Progr. Math. 203, Birkhäuser, Basel (2002).
  7. D. E. Blair, D. K. Showers and K. Yano, Nearly Sasakian structures, Kodai Math. Sem. Rep., 27(1–2) (1976), 175–180.
  8. A. Caticha, Geometry from information geometry, arXiv:1512.09076.
  9. R. A. Fisher, On the mathematical foundations of theoretical statistics, Phil. Trans. Roy. Soc. London, 222 (1922), 309–368.
  10. A. Gray, Nearly Kähler manifolds, J. Differ. Geom., 4 (1970), 283–309.
  11. Z. Olszak, Nearly Sasakian manifolds, Tensor (N.S.), 33(3) (1979), 277–286.
  12. K. Sun and S. Marchand-Maillet, An information geometry of statistical manifold learning, Proceedings of the 31st International Conference on Machine Learning (ICML-14), (2014), 1–9.
  13. K. Yano and S. Ishihara, Invariant submanifolds of almost contact manifolds, Kōdai Math. Sem. Rep., 21 (1969), 350–364.