
This version is not peer-reviewed

Orthogonal matrices whose cube is symmetric form a common class of matrices with important properties, and we focus on this class. Using the close relationship between the eigenvalues of a matrix and the traces of its powers, we obtain an algorithm for all of its possible distinct eigenvalues and their multiplicities. The calculation formula is expressed only in terms of the traces of the matrix and its powers, avoiding the characteristic polynomial. The method is simple and practical. Furthermore, a new essential characterization for the sum of a pair of orthogonal matrices to be orthogonal is given as well.


Submitted:

04 May 2023

Posted:

08 May 2023


Subject: Computer Science and Mathematics - Applied Mathematics

A scalar $\lambda$ is called an eigenvalue of an $n\times n$ complex matrix A if there is a nontrivial solution x of $Ax=\lambda x$ [1]. The eigenvalues of a matrix A are the roots of $\det(A-\lambda E)=0$ and so are difficult to evaluate in general [2,3]. "Computer software such as Mathematica and Maple can use symbolic calculations to find the characteristic polynomial of a moderate-sized matrix" [4]. But there is no formula or finite algorithm to solve the characteristic equation of a general $n\times n$ matrix for $n\ge 5$, and the best numerical methods for finding eigenvalues avoid the characteristic polynomial entirely [4]. Is there a way to represent each eigenvalue of a matrix by some simple numerical features, avoiding the characteristic polynomial?

In this paper, we focus on the spectrum of orthogonal matrices whose cube is symmetric. This is a common matrix class: such a matrix need not be symmetric itself, but its cube is. It is difficult to obtain the eigenvalues of an orthogonal matrix by solving for the roots of its characteristic polynomial, and the trace of a matrix is a simple but useful tool here. Smith [5] obtained the exact eigenvalues of a $3\times 3$ real symmetric matrix expressed by traces, although the expression is somewhat complicated. Lin et al. [6] discussed the solutions to the trace equation of orthogonal matrices whose square is symmetric, with all eigenvalues real or purely imaginary. Chen et al. [7] obtained the spectrum of $3\times 3$ orthogonal matrices whose trace is an integer, and Chen et al. [8] gave an explicit expression for the spectrum of a $3\times 3$ orthogonal matrix in terms of its trace.

Inspired by the close relationship between the possible eigenvalues of a matrix and the traces of its powers, we discuss the properties of orthogonal matrices whose cube is symmetric, and in Section 2 we obtain their exact eigenvalues expressed by the traces of their powers. The numerical examples in Section 3 show that our calculation method is simple and practical: it uses only matrix traces and avoids solving characteristic polynomials. As an application, we draw a conclusion on the judgment of orthogonal similarity in Section 4 and obtain a condition for the sum of two orthogonal matrices to be orthogonal.

Throughout this paper, we denote the transpose, determinant, and trace of a square matrix A by $A^T$, $\left|A\right|$, and $\mathrm{tr}\left(A\right)$ respectively. $E_n$ represents the identity matrix of order n, abbreviated as E when convenient. Denote the imaginary unit by $i\in\mathbb{C}$, that is, $i^2=-1$. Furthermore, $\mathrm{O}^{n\times n}$ and $\mathrm{SO}^{n\times n}$ stand for the sets of $n\times n$ orthogonal matrices and symmetric orthogonal matrices, respectively.

As we know, the eigenvalues of an orthogonal matrix are always 1, $-1$, or pairs of conjugate complex roots. Now let the multiplicities of the eigenvalues 1 and $-1$ be t and s respectively, and let the number of pairs of conjugate complex (non-real) roots be k. We shall first introduce the standard form of an orthogonal matrix A, denoted $\mathrm{O}_A$.

[1] For an orthogonal matrix A of order n, there exists $Q\in {\mathrm{O}}^{n\times n}$ such that

$$Q^{-1}AQ=\mathrm{diag}\left(E_t,-E_s,W_1,\cdots,W_k\right),\tag{1}$$

and $A\in\mathrm{SO}^{n\times n}\iff k=0$, where $W_j=\begin{pmatrix}a_j&b_j\\-b_j&a_j\end{pmatrix}$, $a_j^2+b_j^2=1$, $-1<a_j<1$, $a_j,b_j\in\mathbb{R}$, $j=1,2,\cdots,k$.

The set of eigenvalues of a matrix A is also called the spectrum of A, denoted $\sigma\left(A\right)$. We have noticed some details for the matrix $F_2=\frac{1}{2}\begin{pmatrix}1&\sqrt{3}\\-\sqrt{3}&1\end{pmatrix}$, listed as follows; they are easy to verify by direct calculation.

Let $F_2=\frac{1}{2}\begin{pmatrix}1&\sqrt{3}\\-\sqrt{3}&1\end{pmatrix}$, then

- (1)
- $\sigma\left(F_2\right)=\left\{\frac{1}{2}(1\pm\sqrt{3}i)\right\}$, and $\sigma\left(-F_2\right)=\left\{-\frac{1}{2}(1\pm\sqrt{3}i)\right\}$;
- (2)
- $F_2^2=-F_2^T,\ F_2^3=-E_2,\ \left(-F_2\right)^3=E_2$;
- (3)
- $\mathrm{tr}F_2=1,\ \mathrm{tr}F_2^2=-1,\ \mathrm{tr}F_2^3=-2,\ \mathrm{tr}\left(-F_2\right)^3=2$.

In view of Lemma 2, we see that $\frac{1}{2}(1\pm\sqrt{3}i)$ are eigenvalues of A iff $F_2$ or $F_2^T$ is a sub-block of the standard form $\mathrm{O}_A$. In fact, if $\frac{1}{2}(1\pm\sqrt{3}i)\in\sigma\left(A\right)$, then there is a sub-block $W=\begin{pmatrix}a&b\\-b&a\end{pmatrix}$ with these eigenvalues. Therefore $\mathrm{tr}W=2a=1$, so $a=\frac{1}{2}$. Then $b=\pm\frac{\sqrt{3}}{2}$ by $\left|W\right|=1$, so $W=F_2$ or $W=F_2^T$.

The correspondence between sub-blocks of the standard form ${\mathrm{O}}_{A}$ and the eigenvalues will be an important fact for our later discussion.

Based on the background in Section 2, we consider how to calculate the eigenvalues of an orthogonal matrix whose cube is symmetric quickly. We first show a simple fact.

For an orthogonal matrix A, $A^3$ is symmetric if and only if

$$\sigma\left(A\right)\subseteq\left\{1,\,-1,\,\frac{1}{2}(1+\sqrt{3}i),\,\frac{1}{2}(1-\sqrt{3}i),\,-\frac{1}{2}(1+\sqrt{3}i),\,-\frac{1}{2}(1-\sqrt{3}i)\right\}.$$

For the necessity, if every possible eigenvalue of A is 1, $-1$, $\frac{1}{2}(1\pm\sqrt{3}i)$, or $-\frac{1}{2}(1\pm\sqrt{3}i)$, then $E_t$, $-E_s$, $F_2$, $-F_2$ are the only possible distinct sub-blocks of the standard form $\mathrm{O}_A$. By (1), we can assume

$$\mathrm{O}_A=\mathrm{diag}\left(E_t,-E_s,\underbrace{F_2,\cdots,F_2}_{k_1},\underbrace{-F_2,\cdots,-F_2}_{k_2}\right).$$

Combining with Lemma 2 yields

$$\mathrm{O}_{A^2}=\mathrm{diag}\left(E_{t+s},\underbrace{-F_2^T,\cdots,-F_2^T}_{k_1},\underbrace{-F_2^T,\cdots,-F_2^T}_{k_2}\right),\quad n=t+s+2k_1+2k_2,$$

and, since $F_2^3=-E_2$ and $\left(-F_2\right)^3=E_2$, also $\mathrm{O}_{A^3}=\mathrm{diag}\left(E_t,-E_s,-E_{2k_1},E_{2k_2}\right)$, so $A^3$ is symmetric.

For the sufficiency, as $A^3\in\mathrm{SO}^{n\times n}$, we have $\sigma\left(A^3\right)\subseteq\{1,-1\}$, so every eigenvalue of A is a root of $x^3-1$ or $x^3+1$. As

$$x^3-1=(x-1)\left[x+\frac{1}{2}(1+\sqrt{3}i)\right]\left[x+\frac{1}{2}(1-\sqrt{3}i)\right],\qquad x^3+1=(x+1)\left[x-\frac{1}{2}(1+\sqrt{3}i)\right]\left[x-\frac{1}{2}(1-\sqrt{3}i)\right],$$

every possible eigenvalue of A is 1, $-1$, $\frac{1}{2}(1\pm\sqrt{3}i)$, or $-\frac{1}{2}(1\pm\sqrt{3}i)$. □

Now we can obtain the standard form of an orthogonal matrix A whose cube is symmetric.

For an orthogonal matrix A, if its cube is symmetric, then there exists an orthogonal matrix Q such that

$$\mathrm{O}_A=Q^{-1}AQ=\mathrm{diag}\left(E_t,-E_s,\underbrace{F_2,\cdots,F_2}_{k_1},\underbrace{-F_2,\cdots,-F_2}_{k_2}\right).\tag{2}$$

Besides, the parameters t, s, $k_1$, $k_2$ are as follows:

$$t=\frac{1}{6}\left(n+\mathrm{tr}A^3+2\,\mathrm{tr}A^2+2\,\mathrm{tr}A\right),\qquad s=\frac{1}{6}\left(n-\mathrm{tr}A^3+2\,\mathrm{tr}A^2-2\,\mathrm{tr}A\right),$$
$$k_1=\frac{1}{6}\left(n-\mathrm{tr}A^3-\mathrm{tr}A^2+\mathrm{tr}A\right),\qquad k_2=\frac{1}{6}\left(n+\mathrm{tr}A^3-\mathrm{tr}A^2-\mathrm{tr}A\right).\tag{3}$$

As $A^3\in\mathrm{SO}^{n\times n}$, by Theorem 1 every possible eigenvalue of A is 1, $-1$, $\frac{1}{2}(1\pm\sqrt{3}i)$, or $-\frac{1}{2}(1\pm\sqrt{3}i)$; then Eq.(2) follows by Lemma 2 and Theorem 1.

By Lemma2, we see that

$$\mathrm{tr}A=t-s+k_1-k_2,\tag{4}$$

$$\mathrm{tr}A^2=t+s-k_1-k_2,\tag{5}$$

$$\mathrm{tr}A^3=t-s-2k_1+2k_2,\tag{6}$$

$$n=t+s+2k_1+2k_2.\tag{7}$$

Adding Eqs.(4) and (5) gives $t-k_2=\frac{1}{2}\left(\mathrm{tr}A^2+\mathrm{tr}A\right)$, and adding Eqs.(6) and (7) gives $t+2k_2=\frac{1}{2}\left(n+\mathrm{tr}A^3\right)$. Then

$$k_2=\frac{1}{6}\left(n+\mathrm{tr}A^3-\mathrm{tr}A^2-\mathrm{tr}A\right).\tag{8}$$

From Eqs.(5) and (7), it follows that

$$k_1+k_2=t+s-\mathrm{tr}A^2=\left(t+s+2k_1+2k_2\right)-2\left(k_1+k_2\right)-\mathrm{tr}A^2=n-2\left(k_1+k_2\right)-\mathrm{tr}A^2.$$

Therefore,

$$k_1+k_2=\frac{1}{3}\left(n-\mathrm{tr}A^2\right).$$

Combining with Eq.(8) yields

$$k_1=\frac{1}{3}\left(n-\mathrm{tr}A^2\right)-\frac{1}{6}\left(n+\mathrm{tr}A^3-\mathrm{tr}A^2-\mathrm{tr}A\right)=\frac{1}{6}\left(n-\mathrm{tr}A^3-\mathrm{tr}A^2+\mathrm{tr}A\right).$$

Consequently, we have

$$t=\frac{1}{2}\left(\mathrm{tr}A^2+\mathrm{tr}A\right)+k_2=\frac{1}{6}\left(n+\mathrm{tr}A^3+2\,\mathrm{tr}A^2+2\,\mathrm{tr}A\right),\qquad s=n-2\left(k_1+k_2\right)-t=\frac{1}{6}\left(n-\mathrm{tr}A^3+2\,\mathrm{tr}A^2-2\,\mathrm{tr}A\right).$$

So Eq.(3) is established. □

Theorem 2 indicates that the exact eigenvalues and multiplicities of an orthogonal matrix whose cube is symmetric can be obtained by Eq.(3), which involves only the traces of the matrix and its powers and avoids calculating the characteristic polynomial.
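The algorithm of Theorem 2 is straightforward to implement. The sketch below is our own illustration (assuming numpy; the helper name `spectrum_params` is hypothetical): it evaluates Eq.(3) from traces alone.

```python
import numpy as np

F2 = 0.5 * np.array([[1.0, np.sqrt(3)], [-np.sqrt(3), 1.0]])

def spectrum_params(A):
    """Eq.(3): multiplicities (t, s, k1, k2) of the eigenvalues 1, -1,
    (1 ± √3 i)/2, -(1 ± √3 i)/2 of an orthogonal A whose cube is symmetric,
    computed from traces only -- no characteristic polynomial."""
    n = A.shape[0]
    tr1, tr2, tr3 = (np.trace(np.linalg.matrix_power(A, p)) for p in (1, 2, 3))
    t = round((n + tr3 + 2 * tr2 + 2 * tr1) / 6)
    s = round((n - tr3 + 2 * tr2 - 2 * tr1) / 6)
    k1 = round((n - tr3 - tr2 + tr1) / 6)
    k2 = round((n + tr3 - tr2 - tr1) / 6)
    return t, s, k1, k2

# A = diag(1, -1, F2) has eigenvalues 1, -1, (1 ± √3 i)/2:
A = np.block([
    [np.diag([1.0, -1.0]), np.zeros((2, 2))],
    [np.zeros((2, 2)), F2],
])
print(spectrum_params(A))  # (1, 1, 1, 0)
```

Rounding to the nearest integer absorbs floating-point error, since Theorem 2 guarantees the four parameters are integers.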

In 1962, while solving the similarity problem of the symplectic square under the symplectic group, Hua Loo-Keng obtained the following necessary and sufficient condition: two orthogonal matrices are similar if and only if their characteristic matrices have the same elementary factors [9, Theorem 1]. This is equivalent to Lemma 1, namely that the orthogonal similarity of two orthogonal matrices is determined by their eigenvalues.

For orthogonal matrices $A,B$ whose cubes are both symmetric, $A$ and $B$ are orthogonally similar if and only if $\mathrm{tr}A=\mathrm{tr}B$, $\mathrm{tr}A^2=\mathrm{tr}B^2$, and $\mathrm{tr}A^3=\mathrm{tr}B^3$.

For the necessity, if $A,B$ are orthogonally similar, it is obvious that $\mathrm{tr}A=\mathrm{tr}B$, and likewise for the traces of their squares and cubes.

For the sufficiency, Theorem 2 shows that the standard form of an orthogonal matrix whose cube is symmetric is determined by Eq.(3), which involves only the traces of the matrix, its square, and its cube. Consequently, when $\mathrm{tr}A=\mathrm{tr}B$, $\mathrm{tr}A^2=\mathrm{tr}B^2$, and $\mathrm{tr}A^3=\mathrm{tr}B^3$, $A$ and $B$ are orthogonally similar. □

We first consider an example from [10, Example 6.2].

Let $D=\frac{1}{8}\begin{pmatrix}6&2\sqrt{3}&-\sqrt{6}-\sqrt{2}&-\sqrt{6}+\sqrt{2}\\-2\sqrt{3}&6&-\sqrt{6}+\sqrt{2}&\sqrt{6}+\sqrt{2}\\\sqrt{6}-\sqrt{2}&\sqrt{6}+\sqrt{2}&6&-2\sqrt{3}\\\sqrt{6}+\sqrt{2}&-\sqrt{6}+\sqrt{2}&2\sqrt{3}&6\end{pmatrix}$; then D is asymmetric. By [10], it belongs to the subgroup $\overline{G}$, composed of regular hexagons, of the symmetric group G, and it is also orthogonal. By calculating, it has

$$D^2=\frac{1}{8}\begin{pmatrix}2&2\sqrt{3}&-\sqrt{6}-3\sqrt{2}&-\sqrt{6}+3\sqrt{2}\\-2\sqrt{3}&2&-\sqrt{6}+3\sqrt{2}&\sqrt{6}+3\sqrt{2}\\\sqrt{6}-3\sqrt{2}&\sqrt{6}+3\sqrt{2}&2&-2\sqrt{3}\\\sqrt{6}+3\sqrt{2}&-\sqrt{6}+3\sqrt{2}&2\sqrt{3}&2\end{pmatrix}.$$

We see that $D^3$ is symmetric, while computing the eigenvalues of D directly would be very complicated.

Since $D^2\notin\mathrm{SO}^{4\times 4}$ and $D^3\in\mathrm{SO}^{4\times 4}$, with $\mathrm{tr}D=3$, $\mathrm{tr}D^2=1$, $\mathrm{tr}D^3=0$, Eq.(3) gives $t=2$, $k_1=1$, $s=k_2=0$. Combined with Eq.(2), this yields

$$\mathrm{O}_D=\mathrm{diag}\left(1,1,\frac{1}{2}\begin{pmatrix}1&\sqrt{3}\\-\sqrt{3}&1\end{pmatrix}\right),\qquad \sigma\left(D\right)=\left\{1,1,\frac{1}{2}(1\pm\sqrt{3}i)\right\}.$$
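As a numerical cross-check (our own illustration, assuming numpy; not part of [10]), the traces of D and the symmetry of $D^3$ can be verified directly:

```python
import numpy as np

s2, s3, s6 = np.sqrt(2), np.sqrt(3), np.sqrt(6)
D = np.array([
    [6,       2*s3,     -s6 - s2, -s6 + s2],
    [-2*s3,   6,        -s6 + s2,  s6 + s2],
    [s6 - s2, s6 + s2,   6,       -2*s3],
    [s6 + s2, -s6 + s2,  2*s3,     6],
]) / 8

assert np.allclose(D.T @ D, np.eye(4))   # D is orthogonal
D3 = np.linalg.matrix_power(D, 3)
assert np.allclose(D3, D3.T)             # D³ is symmetric
traces = [round(np.trace(np.linalg.matrix_power(D, p))) for p in (1, 2, 3)]
print(traces)  # [3, 1, 0], so Eq.(3) gives t = 2, k1 = 1, s = k2 = 0
```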

From [10, example 8.1], there are four orthogonal matrices as follows,

$$\sigma_1=\begin{pmatrix}0&0&1&0\\1&0&0&0\\0&1&0&0\\0&0&0&1\end{pmatrix},\qquad \sigma_3=\begin{pmatrix}0&0&1&0\\0&1&0&0\\0&0&0&1\\1&0&0&0\end{pmatrix}.$$

By checking, $\sigma_1^2,\sigma_2^2,\sigma_3^2,\sigma_4^2$ are pairwise different and all asymmetric. It is easy to verify that

$$\sigma_j^3=E_4,\qquad \mathrm{tr}\sigma_j=\mathrm{tr}\sigma_j^2=1,\qquad \mathrm{tr}\sigma_j^3=4,\qquad j=1,2,3,4.$$

Therefore $\sigma_1,\sigma_2,\sigma_3,\sigma_4$ are orthogonal matrices whose cubes are symmetric.

Let $B=\begin{pmatrix}-\frac{1}{2}&\frac{\sqrt{3}}{2}&0&0\\-\frac{\sqrt{3}}{2}&-\frac{1}{2}&0&0\\0&0&1&0\\0&0&0&1\end{pmatrix}=\mathrm{diag}\left(-F_2^T,E_2\right)$; then Lemma 2 yields

$$B^2=\mathrm{diag}\left(\left(-F_2^T\right)^2,E_2\right)=\mathrm{diag}\left(-F_2,E_2\right),\qquad B^3=\mathrm{diag}\left(\left(-F_2^T\right)^3,E_2\right)=E_4.$$

So $\mathrm{tr}B=\mathrm{tr}B^2=1$ and $\mathrm{tr}B^3=4$. Then $\sigma_1,\sigma_2,\sigma_3,\sigma_4,B$ are pairwise orthogonally similar: they have the same orthogonal standard form and the same spectrum.
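The equal-trace criterion of Theorem 3 can be checked mechanically; the snippet below (our own sketch, assuming numpy) verifies it for $\sigma_1$, $\sigma_3$, and B as printed above:

```python
import numpy as np

# σ1 and σ3 as printed above; B = diag(-F2ᵀ, E2)
sigma1 = np.array([[0, 0, 1, 0], [1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1]], dtype=float)
sigma3 = np.array([[0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 0, 1], [1, 0, 0, 0]], dtype=float)
B = np.block([
    [np.array([[-0.5, np.sqrt(3) / 2], [-np.sqrt(3) / 2, -0.5]]), np.zeros((2, 2))],
    [np.zeros((2, 2)), np.eye(2)],
])

for M in (sigma1, sigma3, B):
    assert np.allclose(np.linalg.matrix_power(M, 3), np.eye(4))   # M³ = E4, so M³ is symmetric
    print([round(np.trace(np.linalg.matrix_power(M, p))) for p in (1, 2, 3)])
# each line prints [1, 1, 4]: equal traces, so all are orthogonally similar by Theorem 3
```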

From [11, Theorem 3.3],

$$A=\begin{pmatrix}0&a_{12}&0\\0&0&a_{23}\\a_{31}&0&0\end{pmatrix},\qquad B=\begin{pmatrix}0&0&b_{13}\\b_{21}&0&0\\0&b_{32}&0\end{pmatrix}$$

are the orthogonal matrices of order 3 with only three non-zero elements and all diagonal elements zero. By calculation, it has

$$A^2=\begin{pmatrix}0&0&a_{12}a_{23}\\a_{23}a_{31}&0&0\\0&a_{31}a_{12}&0\end{pmatrix},$$

and

$$A^3=\mathrm{diag}\left(a_{12}a_{23}a_{31},a_{23}a_{31}a_{12},a_{31}a_{12}a_{23}\right)=\left|A\right|E_3,\tag{9}$$

$$B^3=\mathrm{diag}\left(b_{13}b_{32}b_{21},b_{21}b_{13}b_{32},b_{32}b_{21}b_{13}\right)=\left|B\right|E_3.\tag{10}$$

It shows that $A,B$ are orthogonal with symmetric cubes, and

$$\mathrm{tr}A=\mathrm{tr}B=\mathrm{tr}A^2=\mathrm{tr}B^2=0,\qquad \mathrm{tr}A^3=3\left|A\right|,\qquad \mathrm{tr}B^3=3\left|B\right|,$$

which fits the setting of Theorem 2.

As we know, if $A,B$ are orthogonally similar, then $\left|A\right|=\left|B\right|$. Here the converse also holds. In fact, if $\left|A\right|=\left|B\right|$, then $\mathrm{tr}A^3=\mathrm{tr}B^3$ by Eqs.(9)–(10), which indicates that $A,B$ satisfy the condition of Theorem 3; namely, $A,B$ are orthogonally similar.

Furthermore, if $\left|A\right|=1$, then Eq.(9) yields $\mathrm{tr}A^3=3$. Then from Eq.(3), it follows that $t=1$, $k_2=1$, $s=k_1=0$, so

$$\mathrm{O}_A=\mathrm{diag}\left(1,-\frac{1}{2}\begin{pmatrix}1&\sqrt{3}\\-\sqrt{3}&1\end{pmatrix}\right),\qquad \sigma\left(A\right)=\left\{1,-\frac{1}{2}(1\pm\sqrt{3}i)\right\}.$$

If $\left|A\right|=-1$, then Eq.(9) yields $\mathrm{tr}A^3=-3$, and Eq.(3) gives $s=1$, $k_1=1$, $t=k_2=0$. Hence

$$\mathrm{O}_A=\mathrm{diag}\left(-1,\frac{1}{2}\begin{pmatrix}1&\sqrt{3}\\-\sqrt{3}&1\end{pmatrix}\right),\qquad \sigma\left(A\right)=\left\{-1,\frac{1}{2}(1\pm\sqrt{3}i)\right\}.$$
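For instance (a sketch of ours, assuming numpy), taking $a_{12}=a_{23}=a_{31}=1$ gives $\left|A\right|=1$, and the numerically computed spectrum matches the first case above:

```python
import numpy as np

# Zero-diagonal orthogonal matrix of order 3 with a12 = a23 = a31 = 1, so |A| = 1
A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])
assert np.allclose(np.linalg.matrix_power(A, 3), np.eye(3))  # A³ = |A| E3, Eq.(9)
eig = np.linalg.eigvals(A)
print(sorted(np.round(eig.real, 6).tolist()))  # [-0.5, -0.5, 1.0]
# real parts -0.5, -0.5, 1: σ(A) = {1, -(1 ± √3 i)/2}, matching O_A = diag(1, -F2)
```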

As everyone knows, the sum of orthogonal matrices is not necessarily orthogonal. For example, $A=B=\begin{pmatrix}1&0\\0&1\end{pmatrix}$ are both orthogonal, yet $A+B=\begin{pmatrix}2&0\\0&2\end{pmatrix}$ is not orthogonal. So it is natural to ask under what conditions the sum of two orthogonal matrices is orthogonal. It is an interesting question.

[12] Let $A,B$ be orthogonal. Then $A+B$ is orthogonal if and only if $A,B$ are orthogonal matrices of order $2n$ and $B=APFP^T$, where P is an arbitrary orthogonal matrix, $F=\mathrm{diag}\left(E_t,-E_s,-F_2^T,\cdots,-F_2^T\right)$, and $F_2=\frac{1}{2}\begin{pmatrix}1&\sqrt{3}\\-\sqrt{3}&1\end{pmatrix}$.

There are some mistakes in Proposition 1. If $t\ne 0$ or $s\ne 0$, then $A^T(A+B)=E+PFP^T=P\,\mathrm{diag}\left(2E_t,0_s,E_2-F_2^T,\cdots,E_2-F_2^T\right)P^T$; that is, the orthogonal matrix $A^T(A+B)$ would have eigenvalue 2 or 0, which is a contradiction.

[13] Let $A,B$ be orthogonal. Then $A+B$ is orthogonal if and only if $A,B$ are orthogonal matrices of order $2n$ and $B=APFP^T$, where P is an arbitrary orthogonal matrix, $F=\mathrm{diag}\left(-F_2^T,\cdots,-F_2^T\right)$, and $F_2=\frac{1}{2}\begin{pmatrix}1&\sqrt{3}\\-\sqrt{3}&1\end{pmatrix}$.

Proposition 2 is not rigorous, since P cannot be arbitrary. Take $n=1$ and suppose $P_1=E_2$; then $B=AP_1\left(-F_2^T\right)P_1^T$ gives $A^TB=-F_2^T$. If instead we take $P_2=\mathrm{diag}(1,-1)$, then $B=AP_2\left(-F_2^T\right)P_2^T$ gives $A^TB=-F_2$. The two conclusions are contradictory for the same pair $A,B$.

As an application, we will give a new characterization for the sum of orthogonal matrices being orthogonal.

Let $A,B$ be orthogonal. Then $A+B$ is orthogonal if and only if n is an even number and $A^TB$ is a tripotent matrix with no real eigenvalue, namely, $\left(A^TB\right)^3=E_n$ and $\sigma\left(A^TB\right)\cap\mathbb{R}=\emptyset$.

For the necessity, as $A+B$ is orthogonal, then $(A+B)^T(A+B)=E_n$, namely, $2E_n+A^TB+B^TA=E_n$. Therefore,

$$A^TB+\frac{1}{2}E_n=-\left(A^TB+\frac{1}{2}E_n\right)^T$$

is a real anti-symmetric matrix, whose eigenvalues are purely imaginary. Now suppose an eigenvalue of $A^TB+\frac{1}{2}E_n$ is $c_ji$, where $i$ is the imaginary unit of the complex field $\mathbb{C}$ and $c_j\in\mathbb{R}$. Then the corresponding eigenvalue of $A^TB=-\frac{1}{2}E_n+\left(A^TB+\frac{1}{2}E_n\right)$ is $-\frac{1}{2}+c_ji$. Since $A^TB$ is orthogonal, $\left|-\frac{1}{2}+c_ji\right|=1$, namely $\frac{1}{4}+c_j^2=1$, so $c_j=\pm\frac{\sqrt{3}}{2}$. This indicates that the eigenvalues of $A^TB$ are only $-\frac{1}{2}\left(1\pm\sqrt{3}i\right)$. From Eq.(1) and Lemma 2, it follows that

$$Q^T\left(A^TB\right)Q=\mathrm{O}_{A^TB}=\mathrm{diag}\left(W_1,\cdots,W_{n/2}\right),\qquad W_j=-F_2\ \text{or}\ -F_2^T,\tag{11}$$

where $Q\in\mathrm{O}^{n\times n}$. This shows that n is an even number and $A^TB$ has no real eigenvalue. Since $\left(-F_2\right)^3=\left(-F_2^T\right)^3=E_2$ by Lemma 2, we get

$$\left(A^TB\right)^3=Q\,\mathrm{O}_{A^TB}^3\,Q^T=E_n,$$

which completes the proof of the necessity.

For the sufficiency, since $\left(A^TB\right)^3=E_n$, it follows that $x^3-1$ is an annihilating polynomial of $A^TB$. As $x^3-1=(x-1)\left[x+\frac{1}{2}(1+\sqrt{3}i)\right]\left[x+\frac{1}{2}(1-\sqrt{3}i)\right]$ and $A^TB$ has no real eigenvalue, the eigenvalues of $A^TB$ are only $-\frac{1}{2}\left(1\pm\sqrt{3}i\right)$, and Eq.(11) follows, so n is even. With $H_2=\begin{pmatrix}0&1\\-1&0\end{pmatrix}$, each block of Eq.(11) is $-F_2=-\frac{1}{2}E_2-\frac{\sqrt{3}}{2}H_2$ or $-F_2^T=-\frac{1}{2}E_2+\frac{\sqrt{3}}{2}H_2$. By Lemma 2, it follows that

$$A^T(A+B)=E_n+A^TB=Q\,\mathrm{diag}\left(\frac{1}{2}E_2\mp\frac{\sqrt{3}}{2}H_2,\cdots,\frac{1}{2}E_2\mp\frac{\sqrt{3}}{2}H_2\right)Q^T,$$

where each block $\frac{1}{2}E_2\mp\frac{\sqrt{3}}{2}H_2$ equals $F_2^T$ or $F_2$. Hence $A^T(A+B)$ is orthogonal, and therefore $A+B=A\left[A^T(A+B)\right]$, being a product of orthogonal matrices, is orthogonal. □
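Theorem 4 yields a direct numerical test. In the sketch below (our own illustration, assuming numpy; the helper name `sum_is_orthogonal` is hypothetical), the tripotent and no-real-eigenvalue conditions are checked together:

```python
import numpy as np

F2 = 0.5 * np.array([[1.0, np.sqrt(3)], [-np.sqrt(3), 1.0]])

def sum_is_orthogonal(A, B, tol=1e-9):
    """Theorem 4: A+B is orthogonal iff n is even, (AᵀB)³ = E,
    and AᵀB has no real eigenvalue."""
    n = A.shape[0]
    C = A.T @ B
    tripotent = np.allclose(np.linalg.matrix_power(C, 3), np.eye(n), atol=tol)
    no_real = np.all(np.abs(np.linalg.eigvals(C).imag) > tol)
    return n % 2 == 0 and tripotent and no_real

A = np.eye(2)
B = -F2                                          # AᵀB = -F2, eigenvalues -(1 ± √3 i)/2
print(sum_is_orthogonal(A, B))                   # True
print(np.allclose((A + B).T @ (A + B), np.eye(2)))  # True: A+B is indeed orthogonal
print(sum_is_orthogonal(np.eye(2), np.eye(2)))   # False: A+B = 2E fails the eigenvalue test
```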

For a $2n\times 2n$ orthogonal matrix A, there exists an orthogonal matrix B of the same order such that $A+B$ is orthogonal.

By Lemma 1 and Lemma 2, we can take

$$B=AQ\,\mathrm{diag}\left(\underbrace{-F_2,\cdots,-F_2}_{n}\right)Q^T,\tag{12}$$

where Q is an arbitrary orthogonal matrix of order $2n$. Then B is orthogonal, being a product of orthogonal matrices. Hence

$$A+B=A\left[E_{2n}+Q\,\mathrm{diag}\left(-F_2,\cdots,-F_2\right)Q^T\right]=A\left[Q\,\mathrm{diag}\left(E_2-F_2,\cdots,E_2-F_2\right)Q^T\right],$$

and since $E_2-F_2=F_2^T$ is orthogonal, $A+B$ is orthogonal as well. □

Corollary 1 indicates that there are many pairs of orthogonal matrices whose sum is orthogonal as well; we can construct many orthogonal matrices of the form of Eq.(12). We consider the following example.

Let A be an orthogonal matrix of order 4, and let $Q_1=\mathrm{diag}\left(\begin{pmatrix}0&1\\1&0\end{pmatrix},E_2\right)$ and $Q_2=\mathrm{diag}\left(F_2,-F_2\right)$, both of which are orthogonal. Since

$$\begin{pmatrix}0&1\\1&0\end{pmatrix}\left(-F_2\right)\begin{pmatrix}0&1\\1&0\end{pmatrix}=-F_2^T,\qquad F_2\left(-F_2\right)F_2^T=\left(-F_2\right)\left(-F_2\right)\left(-F_2\right)^T=-F_2,$$

by Eq.(12) it follows that

$$B_1=A\left[Q_1\,\mathrm{diag}\left(-F_2,-F_2\right)Q_1^T\right]=A\,\mathrm{diag}\left(-F_2^T,-F_2\right),\qquad B_2=A\left[Q_2\,\mathrm{diag}\left(-F_2,-F_2\right)Q_2^T\right]=A\,\mathrm{diag}\left(-F_2,-F_2\right).$$

Then

$$B_1-B_2=A\,\mathrm{diag}\left(-F_2^T+F_2,\,-F_2+F_2\right)=A\,\mathrm{diag}\left(\sqrt{3}\begin{pmatrix}0&1\\-1&0\end{pmatrix},\,0_2\right)\neq 0,$$

that is, $B_1\neq B_2$: different choices of Q yield different partners B.
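The construction of Eq.(12) is easy to implement. In this sketch (our own illustration, assuming numpy; the helper name `partner` is hypothetical), a random orthogonal Q produces a partner B with $A+B$ orthogonal:

```python
import numpy as np

F2 = 0.5 * np.array([[1.0, np.sqrt(3)], [-np.sqrt(3), 1.0]])

def partner(A, Q):
    """Eq.(12): given orthogonal A of order 2n and any orthogonal Q,
    B = A Q diag(-F2, ..., -F2) Qᵀ makes A + B orthogonal (Corollary 1)."""
    n = A.shape[0] // 2
    blocks = np.kron(np.eye(n), -F2)   # block diagonal diag(-F2, ..., -F2)
    return A @ Q @ blocks @ Q.T

rng = np.random.default_rng(0)
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # a random orthogonal A
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # a random orthogonal Q
B = partner(A, Q)
S = A + B
print(np.allclose(S.T @ S, np.eye(4)))  # True
```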

From Theorem 4, we easily obtain [14, Theorem 3] as follows.

Let $A,B$ be orthogonal matrices of order n, then the following statements are equivalent:

- (1)
- $A+B$ is orthogonal;
- (2)
- $A^TB+\frac{1}{2}E_n$ is real anti-symmetric;
- (3)
- the eigenvalues of ${A}^{T}B$ are $-\frac{1}{2}(1\pm \sqrt{3}i)$.

This gives several equivalent characterizations for the sum of two orthogonal matrices to be orthogonal.

Investigations, Meixiang Chen, Zhongpeng Yang and Zhixing Lin; writing—review and editing, Meixiang Chen; Formal analysis, Zhixing Lin; Project administration, Meixiang Chen. All the authors contributed equally to all the parts of this work. All authors have read and agreed to the published version of the manuscript.

This research was funded by the National Natural Science Foundation of China No. 61772292, the Natural Science Foundation of Fujian Province No. 2021J011103, and by the Science and Technology Project of Putian City No. 2022SZ3001ptxy05.

Not applicable.

Not applicable.

The authors wish to thank the anonymous referees for their detailed and very helpful comments and suggestions that improved this article.

The authors declare no conflict of interest.

- Horn, R. A., Johnson, C. R. Matrix Analysis, 2nd ed. New York: Cambridge University Press, 2013.
- Wolkowicz, H., Styan, G. P. H. Bounds for eigenvalues using traces. Linear Algebra and its Applications, 1980, 29: 471–506.
- Wolkowicz, H., Styan, G. P. H. More bounds for eigenvalues using traces. Linear Algebra and its Applications, 1980, 31: 1–17.
- Lay, D. C. Linear Algebra and its Applications, 5th ed. New Jersey: Pearson Education, Inc., 2019: 281.
- Smith, O. K. Eigenvalues of a symmetric 3×3 matrix. Communications of the ACM, 1961, 4(4): 168.
- Lin, Z. X., Yang, Z. P., Chen, M. X., et al. Some researches on orthogonal solutions to a class of matrix trace equations. 2020, 46(2): 115–121.
- Chen, M. X., Yang, Z. P., Yan, Y. Y., et al. Spectrum of 3×3 orthogonal matrices whose traces are integers. Journal of Beihua University (Natural Science), 2018, 19(2): 158–163.
- Chen, M. X., Yang, Z. P., Li, Y. X., et al. The determination of the spectrum of 3×3 orthogonal matrices and its applications. Journal of Fujian Normal University (Natural Science Edition), 2020, 36(4): 1–8.
- Hua, L. K. Symplectic similarity of symplectic square matrix. Acta Scientiarum Naturalium Universitatis Sunyatseni, 1962, (4): 1–12.
- Zhang, S. G., Lin, H. L. Algorithms for symmetric groups of simplexes. Applied Mathematics and Computation, 2007, 188: 1610–1634.
- Yang, J. The diagonalization of some orthogonal matrices. Journal of Southwest University for Nationalities (Natural Science Edition), 2011, 37(6): 889–894.
- Liu, Z. M. Discussion on the properties of orthogonal matrix. Journal of Chongqing Normal University (Natural Science Edition), 2000, (S1): 162–164.
- Ren, F. T., Wang, Y. X., Yang, S. L. On the necessary and sufficient conditions for the sum of orthogonal matrices to be orthogonal matrices. Journal of Mathematics (China), 1999, (05): 48–49.
- Jiang, L. Some notes on the row simplicity of matrix and the sum of two orthogonal matrices. Journal of Mathematics for Technology, 1998, 14(2): 134–137.


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).


The Spectrum of the Orthogonal Matrix with its Cube being Symmetric. Meixiang Chen et al., 2023.
