Preprint
Article

This version is not peer-reviewed.

Tree-Vertex Graph: A New Hierarchical Graph Class

Submitted:

26 December 2025

Posted:

29 December 2025


Abstract
A hypergraph generalizes an ordinary graph by allowing an edge to connect any nonempty subset of the vertex set. By iterating the powerset operation one step further, one obtains nested (higherorder) vertex objects and, consequently, a finite SuperHyperGraph whose vertices and edges may themselves be set-valued at multiple levels. Thus, many hierarchical graph structures exist in the literature. Moreover, not only in graph theory but also in broader fields—such as through concepts like Decision Trees and Tree Soft Sets—it is well known that tree structures are effective tools for representing hierarchical concepts. In this paper, we define a new class of graphs called Tree-Vertex Graphs. In this framework, a tree structure is imposed on the vertex set, and the edge set is defined in a manner consistent with the tree-structured vertex set. The tree structure therefore serves as a key concept for representing hierarchical graphs.

1. Preliminaries

This section introduces the notation and basic terminology.

1.1. SuperHyperGraphs

Graph theory studies vertices and edges, analyzing connectivity, structure, and algorithms for modeling relationships in mathematics, computer science, and applications [1]. A hypergraph generalizes an ordinary graph by allowing an edge to connect any nonempty subset of the vertex set. Hence, hypergraphs provide a direct language for modeling multiway interactions [2,3]. In particular, such structures have played an important role in recent years in methods such as neural networks [3,4,5,6,7].
By iterating the powerset operation one step further, one obtains nested (higher-order) vertex objects and, consequently, a finite SuperHyperGraph whose vertices and edges may themselves be set-valued at multiple levels [8,9]. Such hierarchical representations are useful, for instance, in molecular design, complex-network analysis, neural networks, and related applications [10,11,12,13,14,15,16]. Moreover, as related concepts, research has also been progressing on notions such as Directed SuperHyperGraphs [17,18] and MetaSuperHyperGraphs [19].
Throughout, the index n in PS^n(·) and in an n-SuperHyperGraph is always taken to be a nonnegative integer.
Definition 1
(Base set). A base set S is the underlying universe of discourse:
S = { x | x is an admissible object in the context under consideration }.
All sets appearing in PS(S) and in the iterated powersets PS^n(S) are ultimately built from elements of S.
Definition 2
(Powerset). (see [20,21,22]) For a set S, the powerset of S is
PS(S) = { A | A ⊆ S }.
In particular, ∅ ∈ PS(S) and S ∈ PS(S).
Definition 3
(Hypergraph [23,24]). A hypergraph is a pair H = ( V , E ) such that:
  • V is a finite set (the vertices), and
  • E is a finite family of nonempty subsets of V (the hyperedges).
Thus, a hyperedge may involve more than two vertices, capturing genuinely multiway relations.
Definition 4
(Iterated powerset and flattening). Let V_0 be a finite nonempty set. Define PS^0(V_0) := V_0 and PS^{k+1}(V_0) := PS(PS^k(V_0)) for k ≥ 0. For each k ≥ 0, define the flattening map
Flat_k : PS^k(V_0) \ {∅} → PS(V_0) \ {∅}
recursively by
Flat_0(x) := {x}  (x ∈ V_0),  Flat_{k+1}(X) := ⋃_{Y ∈ X} Flat_k(Y)  (X ∈ PS^{k+1}(V_0) \ {∅}).
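As a quick illustration, the recursion defining Flat_k transcribes directly into code. The following Python sketch (illustrative, not part of the formal development) represents nested set-objects as frozensets:

```python
def flat(k, x):
    """Flat_k from Definition 4.  A depth-0 object is a bare base vertex;
    a depth-(k+1) object is a nonempty frozenset of depth-k objects."""
    if k == 0:
        return frozenset({x})                      # Flat_0(x) = {x}
    # Flat_{k+1}(X) = union of Flat_k(Y) over Y in X
    return frozenset().union(*(flat(k - 1, y) for y in x))

# A depth-2 object {{a, b}, {c}} over the base set V0 = {a, b, c}:
X = frozenset({frozenset({"a", "b"}), frozenset({"c"})})
print(sorted(flat(2, X)))    # ['a', 'b', 'c']
```

The recursion bottoms out at bare base vertices, so flat(k, ·) accepts exactly the depth-k objects of PS^k(V_0) \ {∅}.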
Definition 5
(n-SuperHyperGraph). (see [25]) Let V_0 be a finite, nonempty base set. Define
PS^0(V_0) := V_0,  PS^{k+1}(V_0) := PS(PS^k(V_0))  (k ∈ ℕ).
For n ≥ 0, an n-SuperHyperGraph on V_0 is a pair
SHG^(n) = (V, E)
satisfying
V ⊆ PS^n(V_0) and E ⊆ PS(V) \ {∅}.
Elements of V are called n-supervertices, and elements of E are called n-superedges (i.e., each n-superedge is a nonempty subset of V).
As a reference, a high-level comparison of Graphs, Hypergraphs, and SuperHyperGraphs is presented in Table 1.

1.2. Rooted Tree

A rooted tree is a connected acyclic graph with a distinguished root, where each non-root node has exactly one parent.
Definition 6
(Rooted tree). A rooted tree is a triple T = (N, F, r) where N is a finite node set, r ∈ N is the root, and F ⊆ N × N is a set of parent–child arcs such that every node u ∈ N \ {r} has a unique parent and the underlying undirected graph is connected and acyclic. For u ∈ N, let Ch(u) denote its set of children and let Leaf(T) ⊆ N denote the set of leaves.

2. Main Results

This section presents the main results of this paper.

2.1. Depth-n Tree-Vertex Graph

A depth-n Tree-Vertex Graph is a rooted tree of depth n whose nodes represent hierarchical units via nested iterated-powerset labels, with edges only within each level.
Definition 7
(Depth-n Tree-Vertex Graph with level edges). Let V_0 be a finite, nonempty set of base vertices and let n ∈ ℕ. A depth-n Tree-Vertex Graph (TVG) on V_0 is a quadruple
TVG^(n) = (V_0, T, η, {E^(k)}_{k=0}^{n}),
where:
(i)
T = (N, F, r) is a rooted tree whose leaves have depth 0 and whose root has depth n. For each k ∈ {0, …, n} define the level set
N_k := { u ∈ N | depth(u) = k }.
(ii)
η is a nested labeling (level-typed label map)
η : N → ⋃_{k=0}^{n} PS^k(V_0) \ {∅}
satisfying:
(a)
(Level-typing) For every u ∈ N_k, one has
η(u) ∈ PS^k(V_0) \ {∅}.
(b)
(Leaf grounding) The restriction η|_{N_0} : N_0 → V_0 is a bijection (so each leaf represents exactly one base vertex).
(c)
(Recursive nesting along the tree) For every internal node u ∈ N_k with k ≥ 1,
η(u) = { η(v) | v ∈ Ch(u) },
so the label of u is literally the set of its children's labels (hence it lives in the next iterated powerset).
(iii)
The support (flattened base-vertex set) of a tree-vertex u ∈ N_k is defined by
λ(u) := Flat_k(η(u)) ⊆ V_0,
where Flat_k is from Definition 4.
(iv)
For each level k ∈ {0, …, n}, E^(k) is a set of level-k graph edges:
E^(k) ⊆ { {u, v} ⊆ N_k | u ≠ v }.
An edge {u, v} ∈ E^(k) encodes a binary relation within the same abstraction depth k between the hierarchical units represented by u and v.
Remark 1
(Why this matches the n-th powerset depth). In Definition 7, the constraint
η(u) ∈ PS^{depth(u)}(V_0)
is enforced by construction: leaves ℓ satisfy η(ℓ) ∈ V_0 = PS^0(V_0), and each upward step applies one powerset operation via η(u) = { η(v) | v ∈ Ch(u) }. Hence the tree depth is literally the iterated-powerset depth. The flattened support λ(u) = Flat_{depth(u)}(η(u)) recovers the base vertices contained in u.
Remark 2
(Option: level-wise hyperedges instead of edges). If one wants multiway relations at each level, replace each E^(k) by a hyperedge family
E^(k) ⊆ PS(N_k) \ {∅},
so that each hyperedge e ∈ E^(k) relates an arbitrary finite subset of nodes at level k. This yields a level-hypergraph TVG while retaining the same nested labeling η.
A comparison of an n-SuperHyperGraph and a depth-n Tree-Vertex Graph is presented in Table 2.
Specific examples are given below.
Example 1
(A depth-2 TVG on four base vertices). Let
V_0 = {a, b, c, d},  n = 2.
We define TVG^(2) = (V_0, T, η, {E^(k)}_{k=0}^{2}) as follows.
(i)
Rooted tree. Let the node set be
N = {a, b, c, d, u_1, u_2, r}.
Declare r to be the root. Define the parent–child arcs
F = { (r, u_1), (r, u_2), (u_1, a), (u_1, b), (u_2, c), (u_2, d) }.
Then the leaves are N_0 = {a, b, c, d}, the level-1 nodes are N_1 = {u_1, u_2}, and the unique level-2 node is N_2 = {r}.
(ii)
Nested labeling η. We specify η : N → ⋃_{k=0}^{2} PS^k(V_0) \ {∅} by
η(a) = a, η(b) = b, η(c) = c, η(d) = d,
η(u_1) = {η(a), η(b)} = {a, b},  η(u_2) = {η(c), η(d)} = {c, d},
η(r) = {η(u_1), η(u_2)} = { {a, b}, {c, d} }.
By construction:
  • η(x) ∈ PS^0(V_0) = V_0 for each x ∈ N_0,
  • η(u_i) ∈ PS^1(V_0) for u_i ∈ N_1,
  • η(r) ∈ PS^2(V_0) for r ∈ N_2,
and the recursive nesting condition η(u) = { η(v) | v ∈ Ch(u) } holds for the internal nodes.
(iii)
Supports (flattened base-vertex sets). Using λ(u) = Flat_{depth(u)}(η(u)), we obtain
λ(a) = {a}, λ(b) = {b}, λ(c) = {c}, λ(d) = {d},
λ(u_1) = {a, b}, λ(u_2) = {c, d}, λ(r) = {a, b, c, d}.
(iv)
Level edges {E^(k)}_{k=0}^{2}. Define
E^(0) = { {a, c}, {b, d} },  E^(1) = { {u_1, u_2} },  E^(2) = ∅.
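The whole construction of Example 1 can be replayed in a few lines of Python. The sketch below is illustrative: the names u1, u2, r follow the example, and the helper flatten is an assumed transcription of Flat_k. It builds η bottom-up by the recursive nesting rule and then computes the supports:

```python
def flatten(k, x):
    """Transcription of Flat_k: flatten a depth-k nested label to base vertices."""
    return frozenset({x}) if k == 0 else frozenset().union(*(flatten(k - 1, y) for y in x))

children = {"u1": ["a", "b"], "u2": ["c", "d"], "r": ["u1", "u2"]}
depth = {"a": 0, "b": 0, "c": 0, "d": 0, "u1": 1, "u2": 1, "r": 2}

eta = {leaf: leaf for leaf in "abcd"}            # leaf grounding: a bijection N_0 -> V_0
for node in ("u1", "u2", "r"):                   # recursive nesting, processed bottom-up
    eta[node] = frozenset(eta[v] for v in children[node])

lam = {u: flatten(depth[u], eta[u]) for u in depth}
print(sorted(lam["r"]))    # ['a', 'b', 'c', 'd']
```

Note that eta["r"] really is the doubly nested object { {a, b}, {c, d} } ∈ PS^2(V_0), and flattening it recovers all four base vertices.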
Example 2
(A depth-3 TVG on six base vertices with nontrivial middle levels). Let
V_0 = {p_1, p_2, p_3, p_4, p_5, p_6},  n = 3.
We define TVG^(3) = (V_0, T, η, {E^(k)}_{k=0}^{3}) as follows.
(i)
Rooted tree. Let the leaves (depth 0) be
N_0 = {1, 2, 3, 4, 5, 6}.
Let the level-1 nodes be
N_1 = {a_1, a_2, a_3},
the level-2 nodes be
N_2 = {b_1, b_2},
and the root (depth 3) be
N_3 = {r}.
Set N = N_0 ∪ N_1 ∪ N_2 ∪ N_3 and define arcs
F = { (r, b_1), (r, b_2), (b_1, a_1), (b_1, a_2), (b_2, a_3), (a_1, 1), (a_1, 2), (a_2, 3), (a_2, 4), (a_3, 5), (a_3, 6) }.
(ii)
Nested labeling η. Define η on leaves by the bijection
η(i) = p_i  (i = 1, …, 6).
Then define η on internal nodes by the recursive rule:
η(a_1) = {η(1), η(2)} = {p_1, p_2},  η(a_2) = {η(3), η(4)} = {p_3, p_4},  η(a_3) = {η(5), η(6)} = {p_5, p_6},
η(b_1) = {η(a_1), η(a_2)} = { {p_1, p_2}, {p_3, p_4} },  η(b_2) = {η(a_3)} = { {p_5, p_6} },
η(r) = {η(b_1), η(b_2)} = { { {p_1, p_2}, {p_3, p_4} }, { {p_5, p_6} } }.
(iii)
Supports (flattened base-vertex sets). Using λ(u) = Flat_{depth(u)}(η(u)), we obtain
λ(i) = {p_i}  (i = 1, …, 6),
λ(a_1) = {p_1, p_2}, λ(a_2) = {p_3, p_4}, λ(a_3) = {p_5, p_6},
λ(b_1) = {p_1, p_2, p_3, p_4}, λ(b_2) = {p_5, p_6}, λ(r) = V_0.
(iv)
Level edges {E^(k)}_{k=0}^{3}. Define
E^(0) = { {2, 3}, {4, 6} },  E^(1) = { {a_1, a_2} },  E^(2) = { {b_1, b_2} },  E^(3) = ∅.
Notation 1
(Descendants, subtree leaves, and level-k container). Let T = (N, F, r) be a rooted tree whose leaves have depth 0 and whose root has depth n. Write v ⪯ u if v is a descendant of u (possibly v = u), i.e. there is a directed path from u to v. For u ∈ N define the subtree leaf set
L(u) := { ℓ ∈ N_0 | ℓ ⪯ u }.
For each level k ∈ {0, …, n} and each base vertex x ∈ V_0, let ℓ_x ∈ N_0 denote the unique leaf with η(ℓ_x) = x (existence and uniqueness by leaf grounding). Define the level-k container of x by
π_k(x) := the unique node u ∈ N_k such that ℓ_x ∈ L(u).
(The uniqueness of π_k(x) will be proved in Theorem 2.)
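For concreteness, the subtree leaf sets L(u) and the container map π_k can be computed directly. The sketch below is illustrative Python using the tree of Example 2 (leaves 1, …, 6); it also checks the uniqueness just asserted:

```python
# Parent-to-children adjacency of the rooted tree in Example 2.
children = {"r": ["b1", "b2"], "b1": ["a1", "a2"], "b2": ["a3"],
            "a1": [1, 2], "a2": [3, 4], "a3": [5, 6]}

def leaves(u):
    """L(u): the leaves of the subtree rooted at u."""
    kids = children.get(u, [])
    return frozenset({u}) if not kids else frozenset().union(*(leaves(v) for v in kids))

levels = {0: [1, 2, 3, 4, 5, 6], 1: ["a1", "a2", "a3"], 2: ["b1", "b2"], 3: ["r"]}

def container(k, x):
    """pi_k(x): the unique level-k node whose subtree contains leaf x."""
    hits = [u for u in levels[k] if x in leaves(u)]
    assert len(hits) == 1          # uniqueness, as proved in Theorem 2
    return hits[0]

print(container(2, 3))   # b1
```

Here each leaf is identified with itself, so container works at the level of leaves; composing with the leaf-grounding bijection gives π_k on V_0.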
Theorem 1
(Support equals the set of leaves in the subtree). Let TVG^(n) = (V_0, T, η, {E^(k)}_{k=0}^{n}) be a depth-n TVG and let λ(u) = Flat_{depth(u)}(η(u)) ⊆ V_0. Then for every u ∈ N,
λ(u) = { η(ℓ) | ℓ ∈ L(u) } ⊆ V_0.
In particular, |λ(u)| = |L(u)|.
Proof. 
We argue by induction on k = depth(u).
Base case k = 0. Then u ∈ N_0 is a leaf, so L(u) = {u}. Moreover η(u) ∈ PS^0(V_0) = V_0 and by definition λ(u) = Flat_0(η(u)) = {η(u)}. Hence
λ(u) = {η(u)} = { η(ℓ) | ℓ ∈ L(u) }.
Inductive step. Assume the claim holds for all nodes of depth < k, and let u ∈ N_k with k ≥ 1. By recursive nesting,
η(u) = { η(v) | v ∈ Ch(u) }.
By the recursive definition of Flat_k (as in Definition 4),
λ(u) = Flat_k(η(u)) = ⋃_{v ∈ Ch(u)} Flat_{k−1}(η(v)) = ⋃_{v ∈ Ch(u)} λ(v).
By the induction hypothesis, for each child v ∈ Ch(u) we have λ(v) = { η(ℓ) | ℓ ∈ L(v) }, hence
λ(u) = ⋃_{v ∈ Ch(u)} { η(ℓ) | ℓ ∈ L(v) }.
Finally, the leaves under u are exactly the disjoint union of the leaves under its children:
L(u) = ⨆_{v ∈ Ch(u)} L(v),
so the right-hand side equals { η(ℓ) | ℓ ∈ L(u) }. This proves the desired identity for depth k. The cardinality statement |λ(u)| = |L(u)| follows because the map η|_{N_0} : N_0 → V_0 is a bijection, so distinct leaves have distinct base labels. □
Theorem 2
(Level supports form a partition of V_0). Fix k ∈ {0, …, n}. Then the family of supports at level k,
L_k := { λ(u) | u ∈ N_k },
is a partition of V_0, i.e.
V_0 = ⋃_{u ∈ N_k} λ(u)  and  u ≠ v ⟹ λ(u) ∩ λ(v) = ∅  (u, v ∈ N_k).
Equivalently, for every x ∈ V_0 there exists a unique u ∈ N_k with x ∈ λ(u).
Proof. 
Covering. Let x ∈ V_0, and let ℓ_x ∈ N_0 be the unique leaf with η(ℓ_x) = x. Consider the unique root-to-leaf path from r to ℓ_x. Since depths decrease by 1 along each parent–child arc and the root has depth n while ℓ_x has depth 0, this path contains exactly one node u at depth k. Then ℓ_x ∈ L(u). By Theorem 1,
x = η(ℓ_x) ∈ { η(ℓ) | ℓ ∈ L(u) } = λ(u),
so x lies in the union of the level-k supports.
Disjointness and uniqueness. Let u, v ∈ N_k with u ≠ v. If λ(u) ∩ λ(v) ≠ ∅, choose x ∈ λ(u) ∩ λ(v). By Theorem 1, there exist leaves ℓ, ℓ′ ∈ N_0 such that ℓ ∈ L(u), ℓ′ ∈ L(v), and η(ℓ) = x = η(ℓ′). Since η|_{N_0} is injective, ℓ = ℓ′. Thus there is a leaf ℓ that lies in both subtrees rooted at u and v. In a tree, the path from the root to ℓ is unique, hence it cannot pass through two distinct nodes at the same depth. Therefore u = v, a contradiction. Hence λ(u) ∩ λ(v) = ∅.
The equivalent uniqueness statement follows: if x were contained in λ(u) and λ(v) for two distinct level-k nodes, then λ(u) ∩ λ(v) ≠ ∅, forcing u = v. □
Corollary 1
(Support recursion and additivity across children). Let u ∈ N_k with k ≥ 1. Then
λ(u) = ⋃_{v ∈ Ch(u)} λ(v), and the union is disjoint, hence |λ(u)| = Σ_{v ∈ Ch(u)} |λ(v)|.
Proof. 
The identity λ(u) = ⋃_{v ∈ Ch(u)} λ(v) was shown in the inductive step of Theorem 1. The children of u all lie at the same depth k − 1, so by Theorem 2 (applied at level k − 1) their supports are pairwise disjoint. Additivity of cardinalities follows. □
Theorem 3
(Laminarity of supports across all tree-vertices). For any u, v ∈ N, exactly one of the following holds:
λ(u) ∩ λ(v) = ∅,  λ(u) ⊆ λ(v),  λ(v) ⊆ λ(u).
Moreover,
λ(u) ∩ λ(v) ≠ ∅ ⟺ (u ⪯ v) or (v ⪯ u).
Proof. 
If u ⪯ v, then every leaf under u is also a leaf under v, i.e. L(u) ⊆ L(v). Applying Theorem 1 gives
λ(u) = { η(ℓ) | ℓ ∈ L(u) } ⊆ { η(ℓ) | ℓ ∈ L(v) } = λ(v).
The same reasoning applies if v ⪯ u.
Now assume neither u ⪯ v nor v ⪯ u. Then the subtrees rooted at u and v are disjoint: if there were a leaf ℓ belonging to both L(u) and L(v), the unique root-to-ℓ path would contain both u and v, forcing one to be an ancestor of the other, contradicting the assumption. Hence L(u) ∩ L(v) = ∅, and by Theorem 1,
λ(u) ∩ λ(v) = { η(ℓ) | ℓ ∈ L(u) } ∩ { η(ℓ) | ℓ ∈ L(v) } = ∅,
because η|_{N_0} is injective and the leaf sets are disjoint. This establishes the trichotomy and the equivalence. □
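Theorems 2 and 3 are statements about finite families, so they can be verified exhaustively on a small instance. The following Python sketch checks both on the tree shape of Example 2, identifying each leaf directly with its base label p_i (leaf grounding is a bijection, so nothing is lost):

```python
from itertools import combinations

# Tree of Example 2 with leaves identified with their base labels p1..p6.
children = {"r": ["b1", "b2"], "b1": ["a1", "a2"], "b2": ["a3"],
            "a1": ["p1", "p2"], "a2": ["p3", "p4"], "a3": ["p5", "p6"]}

def support(u):
    """lambda(u): base labels of the leaves in u's subtree (Theorem 1)."""
    kids = children.get(u, [])
    return frozenset({u}) if not kids else frozenset().union(*(support(v) for v in kids))

V0 = support("r")

# Theorem 2: at each level, the supports cover V0 and are pairwise disjoint.
for level in [["a1", "a2", "a3"], ["b1", "b2"]]:
    sups = [support(u) for u in level]
    assert frozenset().union(*sups) == V0
    assert all(s.isdisjoint(t) for s, t in combinations(sups, 2))

# Theorem 3: across ALL nodes, any two supports are disjoint or nested.
nodes = list(children) + ["p%d" % i for i in range(1, 7)]
for u, v in combinations(nodes, 2):
    s, t = support(u), support(v)
    assert s.isdisjoint(t) or s <= t or t <= s

print("partition and laminarity hold")
```

Such a brute-force check is no substitute for the proofs, but it makes the laminar-family structure of the supports tangible.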

2.2. n-SuperHyperGraph with Intra-Level and Inter-Level Edges

An n-SuperHyperGraph with intra-level and inter-level edges extends an n-SuperHyperGraph by adding edges among same-level supervertices and cross-level edges linking different nesting depths.
Definition 8
(n-SuperHyperGraph with intra-level and inter-level edges). Let V_0 be a finite, nonempty set and let n ∈ ℕ. An n-SuperHyperGraph with intra-level and inter-level edges is a tuple
SHG_×^(n) = (V_0, V, E, {E^(k)}_{k=0}^{n}, {E^(k→ℓ)}_{0≤k<ℓ≤n}),
where:
(i)
(Graded supervertex set)
V ⊆ ⋃_{k=0}^{n} PS^k(V_0),  V_k := V ∩ PS^k(V_0)  (k = 0, 1, …, n).
Optionally (and typically), one assumes grounding V_0 ⊆ V (equivalently V_0 = V ∩ PS^0(V_0)), and V_k ⊆ PS^k(V_0) \ {∅} for k ≥ 1.
(ii)
(Superhyperedges)
E ⊆ PS(V) \ {∅}.
Each e ∈ E is a nonempty set of supervertices, possibly mixing levels.
(iii)
(Support / flattening) For u ∈ V_k define its support
λ(u) := Flat_k(u) ⊆ V_0,
where Flat_k is the flattening map from Definition 4. (Thus λ(u) is the set of base vertices "contained in" the nested object u.)
(iv)
(Intra-level (graph) edges) For each k ∈ {0, …, n},
E^(k) ⊆ { {u, v} ⊆ V_k | u ≠ v }.
An edge {u, v} ∈ E^(k) encodes a binary relation between two supervertices at the same level k.
(v)
(Inter-level (directed) edges) For each pair 0 ≤ k < ℓ ≤ n,
E^(k→ℓ) ⊆ V_k × V_ℓ.
An ordered pair (u, v) ∈ E^(k→ℓ) is an inter-level edge from level k to level ℓ.
(vi)
(Optional support-compatibility constraints) Depending on semantics, one may require every (u, v) ∈ E^(k→ℓ) to satisfy, for example,
λ(u) ⊆ λ(v)  (upward / abstraction),  or  λ(u) ∩ λ(v) ≠ ∅  (overlap).
Remark 3
(Relation to the classical n-SuperHyperGraph). If one restricts V to a single level V = V_n ⊆ PS^n(V_0) and discards all intra-level edge sets {E^(k)} and inter-level edge sets {E^(k→ℓ)}, then (V, E) is precisely an n-SuperHyperGraph in the sense of Definition 5. The present definition enriches the model by allowing graded supervertices (levels 0 through n) and by equipping them with both intra-level and inter-level graph-type edges in addition to hyperedges.
As a reference, a comparison between an n-SuperHyperGraph and an n-SuperHyperGraph with intra-level and inter-level edges is presented in Table 3.
Example 3
(A depth-2 instance with both hyperedges and cross-level edges). Let V_0 = {a, b, c, d} and n = 2. Define the graded supervertex sets
V_0 = {a, b, c, d},  V_1 = {A, B, C},  V_2 = {U, W},
where
A = {a, b}, B = {c, d}, C = {b, c} ∈ PS^1(V_0) = PS(V_0),
and
U = {A, B}, W = {A, C} ∈ PS^2(V_0) = PS(PS(V_0)).
Set V := V_0 ∪ V_1 ∪ V_2 ⊆ PS^0(V_0) ∪ PS^1(V_0) ∪ PS^2(V_0).
Supports (via flattening). Using λ(u) = Flat_{level(u)}(u), where level(u) = k for u ∈ V_k,
λ(a) = {a}, λ(b) = {b}, λ(c) = {c}, λ(d) = {d},
λ(A) = {a, b}, λ(B) = {c, d}, λ(C) = {b, c},
λ(U) = {a, b, c, d}, λ(W) = {a, b, c}.
Superhyperedges. Let the hyperedge family be
E = { {a, A}, {b, A, C}, {c, C, B}, {A, C, W}, {U, W, B} } ⊆ PS(V) \ {∅}.
These hyperedges allow genuinely multiway relations, and they may mix levels (e.g., {U, W, B} spans levels 2 and 1).
Intra-level edges. Define
E^(0) = { {a, b}, {c, d} },  E^(1) = { {A, C} },  E^(2) = { {U, W} }.
For instance, {A, C} ∈ E^(1) encodes a binary relation between two level-1 group-vertices (perhaps "overlapping communities" or "related clusters").
Inter-level edges. Let
E^(0→1) = { (a, A), (b, A), (b, C), (c, C), (c, B), (d, B) },
E^(1→2) = { (A, U), (B, U), (A, W), (C, W) },  E^(0→2) = { (a, U), (d, U) }.
If one enforces the optional upward constraint λ(u) ⊆ λ(v), then it holds here: for example, (b, C) ∈ E^(0→1) satisfies λ(b) = {b} ⊆ λ(C) = {b, c}, and (C, W) ∈ E^(1→2) satisfies λ(C) = {b, c} ⊆ λ(W) = {a, b, c}. Thus the inter-level edges can be read as membership/abstraction links across levels.
Consequently,
SHG_×^(2) = (V_0, V, E, {E^(0), E^(1), E^(2)}, {E^(0→1), E^(0→2), E^(1→2)})
is a concrete 2-SuperHyperGraph with both intra-level and inter-level edges in the sense of Definition 8.
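The data of Example 3 is small enough to verify mechanically. The Python sketch below is illustrative (the supports are assembled by hand from the level-1 definitions rather than via a general Flat_k routine); it recomputes λ and confirms the upward constraint λ(u) ⊆ λ(v) on every listed inter-level edge:

```python
# Supports of the graded supervertices in Example 3.
lam = {x: frozenset({x}) for x in "abcd"}                                  # level 0
lam.update({"A": frozenset("ab"), "B": frozenset("cd"), "C": frozenset("bc")})  # level 1
lam.update({"U": lam["A"] | lam["B"], "W": lam["A"] | lam["C"]})           # level 2

# All inter-level edges listed in the example (0->1, 1->2, 0->2).
inter = [("a", "A"), ("b", "A"), ("b", "C"), ("c", "C"), ("c", "B"), ("d", "B"),
         ("A", "U"), ("B", "U"), ("A", "W"), ("C", "W"),
         ("a", "U"), ("d", "U")]

# Upward support-compatibility: lambda(u) is a subset of lambda(v).
assert all(lam[u] <= lam[v] for u, v in inter)
print("upward support-compatibility holds")
```

The subset operator on frozensets makes the constraint check a one-liner, which is convenient when experimenting with larger edge sets.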

2.3. Depth-n TVG with Intra-Level and Inter-Level Edges

A depth-n TVG with intra-level and inter-level edges is a depth-n rooted tree whose nodes carry iterated-powerset labels; edges relate nodes within the same level and across levels.
Definition 9
(Depth-n TVG with intra-level and inter-level edges). Let V_0 be a finite, nonempty set and let n ∈ ℕ. A depth-n Tree-Vertex Graph with cross-level edges is a tuple
TVG_×^(n) = (V_0, T, η, {E^(k)}_{k=0}^{n}, {E^(k→ℓ)}_{0≤k<ℓ≤n}),
where:
(i)
T = (N, F, r) is a rooted tree whose leaves have depth 0 and whose root has depth n. Let N_k = { u ∈ N | depth(u) = k }.
(ii)
η : N → ⋃_{k=0}^{n} PS^k(V_0) \ {∅} is a nested labeling such that:
(a)
(Level-typing) If u ∈ N_k, then η(u) ∈ PS^k(V_0) \ {∅}.
(b)
(Leaf grounding) η|_{N_0} : N_0 → V_0 is a bijection.
(c)
(Recursive nesting) For every u ∈ N_k with k ≥ 1,
η(u) = { η(v) | v ∈ Ch(u) }.
(iii)
Define the flattened support λ(u) ⊆ V_0 by λ(u) = Flat_{depth(u)}(η(u)).
(iv)
(Intra-level edges) For each k ∈ {0, …, n},
E^(k) ⊆ { {u, v} ⊆ N_k | u ≠ v }.
(v)
(Inter-level edges / cross edges) For each pair 0 ≤ k < ℓ ≤ n,
E^(k→ℓ) ⊆ N_k × N_ℓ.
An ordered pair (u, v) ∈ E^(k→ℓ) is an inter-level edge from level k to level ℓ.
(vi)
(Support-compatibility constraint, optional) One may additionally require that every inter-level edge (u, v) ∈ E^(k→ℓ) satisfies a support constraint such as either:
λ(u) ⊆ λ(v)  (upward / abstraction edge),  or  λ(v) ⊆ λ(u)  (downward / refinement edge),
or the weaker overlap condition λ(u) ∩ λ(v) ≠ ∅, depending on the intended semantics.
Remark 4
(Undirected cross-edges). If one prefers undirected inter-level edges, replace E^(k→ℓ) ⊆ N_k × N_ℓ by a symmetric set
E^(k,ℓ) ⊆ { {u, v} | u ∈ N_k, v ∈ N_ℓ },
and interpret {u, v} as an undirected relation between levels k and ℓ.
For reference, the comparison between a depth-n TVG and a depth-n TVG with intra-level and inter-level edges is presented in Table 4, and the comparison between a depth-n TVG with intra-level and inter-level edges and an n-SuperHyperGraph with intra-level and inter-level edges is presented in Table 5.
Specific examples are given below.
Example 4
(A depth-2 TVG with both intra-level and inter-level edges (overlap-compatible cross edges)). Let
V_0 = {a, b, c, d},  n = 2.
We define
TVG_×^(2) = (V_0, T, η, {E^(k)}_{k=0}^{2}, {E^(k→ℓ)}_{0≤k<ℓ≤2})
by specifying each component.
(i)
Rooted tree T. Let the node set be
N = {a, b, c, d, u_1, u_2, r}.
Take r as the root and define the parent–child arcs
F = { (r, u_1), (r, u_2), (u_1, a), (u_1, b), (u_2, c), (u_2, d) }.
Thus
N_0 = {a, b, c, d},  N_1 = {u_1, u_2},  N_2 = {r}.
(ii)
Nested labeling η. Define η on the leaves by
η(a) = a, η(b) = b, η(c) = c, η(d) = d,
and on internal nodes by the recursive nesting rule:
η(u_1) = {η(a), η(b)} = {a, b},  η(u_2) = {η(c), η(d)} = {c, d},
η(r) = {η(u_1), η(u_2)} = { {a, b}, {c, d} }.
(iii)
Flattened supports λ. Using λ(u) = Flat_{depth(u)}(η(u)), we obtain
λ(a) = {a}, λ(b) = {b}, λ(c) = {c}, λ(d) = {d},
λ(u_1) = {a, b}, λ(u_2) = {c, d}, λ(r) = {a, b, c, d}.
(iv)
Intra-level edges. Set
E^(0) = { {a, c}, {b, d} },  E^(1) = { {u_1, u_2} },  E^(2) = ∅.
(v)
Inter-level edges. We give cross edges at all pairs of levels:
E^(0→1) = { (a, u_1), (c, u_2) },  E^(0→2) = { (b, r) },  E^(1→2) = { (u_1, r), (u_2, r) }.
(vi)
Compatibility check (overlap condition). Every listed inter-level edge (u, v) satisfies λ(u) ∩ λ(v) ≠ ∅: for instance, (b, r) ∈ E^(0→2) has λ(b) = {b} and λ(r) = {a, b, c, d}, so the intersection is {b}; similarly (a, u_1) and (c, u_2) overlap by construction.
Example 5
(A depth-3 TVG with both intra-level and inter-level edges (mix of upward and downward semantics)). Let
V_0 = {p_1, p_2, p_3, p_4, p_5, p_6},  n = 3.
We define
TVG_×^(3) = (V_0, T, η, {E^(k)}_{k=0}^{3}, {E^(k→ℓ)}_{0≤k<ℓ≤3}).
(i)
Rooted tree T. Let the levels be
N_0 = {1, 2, 3, 4, 5, 6},  N_1 = {a_1, a_2, a_3},  N_2 = {b_1, b_2},  N_3 = {r}.
Let the arc set be
F = { (r, b_1), (r, b_2), (b_1, a_1), (b_1, a_2), (b_2, a_3), (a_1, 1), (a_1, 2), (a_2, 3), (a_2, 4), (a_3, 5), (a_3, 6) }.
(ii)
Nested labeling η. Define η(i) = p_i for i = 1, …, 6. Then recursively:
η(a_1) = {η(1), η(2)} = {p_1, p_2},  η(a_2) = {η(3), η(4)} = {p_3, p_4},  η(a_3) = {η(5), η(6)} = {p_5, p_6},
η(b_1) = {η(a_1), η(a_2)} = { {p_1, p_2}, {p_3, p_4} },
η(b_2) = {η(a_3)} = { {p_5, p_6} },
η(r) = {η(b_1), η(b_2)} = { { {p_1, p_2}, {p_3, p_4} }, { {p_5, p_6} } }.
(iii)
Flattened supports λ.
λ(i) = {p_i}  (i = 1, …, 6),  λ(a_1) = {p_1, p_2}, λ(a_2) = {p_3, p_4}, λ(a_3) = {p_5, p_6},
λ(b_1) = {p_1, p_2, p_3, p_4}, λ(b_2) = {p_5, p_6}, λ(r) = V_0.
(iv)
Intra-level edges. We specify relations within each abstraction depth:
E^(0) = { {2, 3}, {4, 6} },  E^(1) = { {a_1, a_2} },  E^(2) = { {b_1, b_2} },  E^(3) = ∅.
(v)
Inter-level edges (cross edges). We include a mix of upward cross edges (support inclusion λ(u) ⊆ λ(v)) and downward cross edges (support inclusion λ(v) ⊆ λ(u)) by choosing:
E^(0→1) = { (1, a_1), (4, a_2), (6, a_3) },  E^(1→2) = { (a_2, b_1), (a_3, b_2) },  E^(2→3) = { (b_1, r), (b_2, r) },
together with two downward edges from the top:
E^(3→1) = { (r, a_1) },  E^(3→0) = { (r, 5) }.
(Equivalently, one may encode these as E^(1→3) and E^(0→3) with reversed orientation, depending on whether the intended semantics is "refines" or "abstracts.")
(vi)
Compatibility check (inclusion/overlap). For each upward edge (u, v) ∈ E^(0→1) ∪ E^(1→2) ∪ E^(2→3), we have λ(u) ⊆ λ(v), e.g. λ(4) = {p_4} ⊆ λ(a_2) = {p_3, p_4} ⊆ λ(b_1) = {p_1, p_2, p_3, p_4} ⊆ λ(r). For the downward edges (r, a_1) and (r, 5) we have λ(a_1) ⊆ λ(r) and λ(5) ⊆ λ(r), so in particular λ(r) ∩ λ(a_1) ≠ ∅ and λ(r) ∩ λ(5) ≠ ∅.
Notation 2
(Descendants, subtree leaves, and level-k container). Let T = (N, F, r) be a rooted tree whose leaves have depth 0 and whose root has depth n. Write v ⪯ u if v is a descendant of u (possibly v = u). For u ∈ N define the subtree leaf set
L(u) := { ℓ ∈ N_0 | ℓ ⪯ u }.
For each x ∈ V_0, let ℓ_x ∈ N_0 denote the unique leaf with η(ℓ_x) = x. For each k ∈ {0, …, n} define the level-k container map
π_k : V_0 → N_k,  π_k(x) := the unique node u ∈ N_k such that ℓ_x ∈ L(u).
(The existence and uniqueness of π_k is established in Theorem 5.)
Theorem 4
(Support equals the base labels of subtree leaves). Let TVG_×^(n) be as in Definition 9. Then for every u ∈ N,
λ(u) = { η(ℓ) | ℓ ∈ L(u) } ⊆ V_0.
In particular, |λ(u)| = |L(u)|.
Proof. 
Identical to the proof of the corresponding statement for level-only TVGs (Theorem 1): induct on k = depth(u) using the recursive nesting η(u) = { η(v) | v ∈ Ch(u) } and the recursive definition of Flat_k. The presence of the additional edge sets {E^(k)} and {E^(k→ℓ)} does not affect the labeling argument. □
Theorem 5
(Level supports partition V_0). Fix k ∈ {0, …, n}. Then the family
L_k := { λ(u) | u ∈ N_k }
is a partition of V_0:
V_0 = ⋃_{u ∈ N_k} λ(u),  u ≠ v ⟹ λ(u) ∩ λ(v) = ∅  (u, v ∈ N_k).
Equivalently, for every x ∈ V_0 there exists a unique u ∈ N_k with x ∈ λ(u), and thus π_k in Notation 2 is well-defined.
Proof. 
Let x ∈ V_0 and let ℓ_x be its unique leaf. The unique root-to-ℓ_x path contains exactly one node u at depth k. Then ℓ_x ∈ L(u), so by Theorem 4 we have x = η(ℓ_x) ∈ λ(u), proving the covering.
For disjointness, let u, v ∈ N_k with u ≠ v and suppose λ(u) ∩ λ(v) ≠ ∅. Choose x in the intersection. By Theorem 4 there exist leaves ℓ ∈ L(u) and ℓ′ ∈ L(v) with η(ℓ) = x = η(ℓ′). Injectivity of η|_{N_0} implies ℓ = ℓ′. But a root-to-leaf path cannot pass through two distinct nodes at the same depth, a contradiction. Hence the supports are disjoint. □
Theorem 6
(Laminarity of supports across all tree-vertices). For any u, v ∈ N, exactly one of the following holds:
λ(u) ∩ λ(v) = ∅,  λ(u) ⊆ λ(v),  λ(v) ⊆ λ(u).
Moreover,
λ(u) ∩ λ(v) ≠ ∅ ⟺ (u ⪯ v) or (v ⪯ u).
Proof. 
If u ⪯ v, then L(u) ⊆ L(v), hence
λ(u) = { η(ℓ) | ℓ ∈ L(u) } ⊆ { η(ℓ) | ℓ ∈ L(v) } = λ(v)
by Theorem 4. Symmetrically for v ⪯ u.
If neither is an ancestor of the other, the subtrees are disjoint: L(u) ∩ L(v) = ∅. Then, using injectivity of η|_{N_0},
λ(u) ∩ λ(v) = { η(ℓ) | ℓ ∈ L(u) } ∩ { η(ℓ) | ℓ ∈ L(v) } = ∅.
This yields the trichotomy and the equivalence. □
Definition 10
(Cross-edge expansion to a base relation). Fix 0 ≤ k < ℓ ≤ n. For (u, v) ∈ N_k × N_ℓ define its base expansion as the set
Exp(u, v) := λ(u) × λ(v) ⊆ V_0 × V_0.
Given a cross-edge set E^(k→ℓ) ⊆ N_k × N_ℓ, define the induced base relation
R_0^(k→ℓ) := ⋃_{(u,v) ∈ E^(k→ℓ)} Exp(u, v) ⊆ V_0 × V_0.
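The base expansion is a finite union of Cartesian products, so it is easy to compute. The sketch below is illustrative Python reusing the level-1 and level-2 supports of Example 5; it builds R_0^(1→2) from E^(1→2):

```python
from itertools import product

# Supports of the level-1 and level-2 tree-vertices of Example 5.
lam = {"a1": {"p1", "p2"}, "a2": {"p3", "p4"}, "a3": {"p5", "p6"},
       "b1": {"p1", "p2", "p3", "p4"}, "b2": {"p5", "p6"}}

E_12 = [("a2", "b1"), ("a3", "b2")]      # E^(1->2) from Example 5

# R_0^(1->2): union of Exp(u, v) = lambda(u) x lambda(v) over the cross-edges.
R0 = set()
for u, v in E_12:
    R0 |= set(product(lam[u], lam[v]))

print(len(R0))    # 2*4 + 2*2 = 12 induced base pairs
```

Because the level-1 supports are pairwise disjoint (Theorem 5), each induced pair (x, y) determines its originating cross-edge uniquely, which is exactly the lifting statement of Theorem 7.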
Theorem 7
(Uniqueness of lifted endpoints for induced base pairs). Fix 0 ≤ k < ℓ ≤ n and define R_0^(k→ℓ) as in Definition 10. If (x, y) ∈ R_0^(k→ℓ), then the nodes
u = π_k(x) ∈ N_k,  v = π_ℓ(y) ∈ N_ℓ
are uniquely determined and satisfy (u, v) ∈ E^(k→ℓ). Conversely, if (u, v) ∈ E^(k→ℓ) and x ∈ λ(u), y ∈ λ(v), then (x, y) ∈ R_0^(k→ℓ).
Proof. 
Assume (x, y) ∈ R_0^(k→ℓ). Then by definition there exists (u, v) ∈ E^(k→ℓ) such that x ∈ λ(u) and y ∈ λ(v). By Theorem 5 applied at level k and at level ℓ, the level containers are unique:
u = π_k(x),  v = π_ℓ(y).
Hence (π_k(x), π_ℓ(y)) ∈ E^(k→ℓ), and uniqueness is immediate from the partition property.
Conversely, if (u, v) ∈ E^(k→ℓ) and x ∈ λ(u), y ∈ λ(v), then (x, y) ∈ λ(u) × λ(v) = Exp(u, v) ⊆ R_0^(k→ℓ). □
Theorem 8
(Support-compatibility implies monotone containment on base pairs). Fix 0 ≤ k < ℓ ≤ n and suppose the following upward support-compatibility holds:
∀ (u, v) ∈ E^(k→ℓ):  λ(u) ⊆ λ(v).
Then the induced base relation R_0^(k→ℓ) satisfies:
(x, y) ∈ R_0^(k→ℓ) ⟹ x ∈ λ(π_ℓ(y)).
Equivalently, every base pair (x, y) induced by a cross-edge points from a base vertex x that lies inside the (unique) level-ℓ cluster containing y.
Proof. 
Take (x, y) ∈ R_0^(k→ℓ). By Theorem 7, (π_k(x), π_ℓ(y)) ∈ E^(k→ℓ). By the assumed compatibility,
λ(π_k(x)) ⊆ λ(π_ℓ(y)).
Since x ∈ λ(π_k(x)) by definition of π_k, we conclude x ∈ λ(π_ℓ(y)). □
Theorem 9
(No cross-edges between disjoint supports under overlap-compatibility). Fix 0 ≤ k < ℓ ≤ n and assume the overlap-compatibility constraint:
∀ (u, v) ∈ E^(k→ℓ):  λ(u) ∩ λ(v) ≠ ∅.
Then every cross-edge connects comparable nodes in the tree:
(u, v) ∈ E^(k→ℓ) ⟹ u ⪯ v.
In particular, under overlap-compatibility, cross-edges cannot jump between disjoint branches.
Proof. 
Let (u, v) ∈ E^(k→ℓ). By assumption λ(u) ∩ λ(v) ≠ ∅. By Theorem 6, this implies u ⪯ v or v ⪯ u. But depth(u) = k < ℓ = depth(v), so v ⪯ u is impossible (a descendant must have strictly smaller depth). Hence u ⪯ v. □
Corollary 2
(Under overlap-compatibility, cross-edges are determined by ancestor constraints). Assume overlap-compatibility as in Theorem 9. Then for each 0 ≤ k < ℓ ≤ n,
E^(k→ℓ) ⊆ { (u, v) ∈ N_k × N_ℓ | u ⪯ v }.
Moreover, for any (u, v) ∈ E^(k→ℓ) one has λ(u) ⊆ λ(v).
Proof. 
The containment is exactly Theorem 9. If u ⪯ v, then L(u) ⊆ L(v) and thus λ(u) ⊆ λ(v) by Theorem 4. □

3. Conclusions

In this paper, we defined a new class of graphs called Tree-Vertex Graphs. We expect that future work will explore extensions based on Fuzzy Sets [26], Neutrosophic Sets [27,28], and Plithogenic Sets [29,30], as well as applications to methods such as neural networks.

Funding

This study was conducted without any financial support from external organizations or grants.

Institutional Review Board Statement

As this study does not involve experiments with human participants or animals, no ethical approval was required.

Informed Consent Statement

Not applicable.

Data Availability Statement

Since this research is purely theoretical and mathematical, no empirical data or computational analysis was utilized. Researchers are encouraged to expand upon these findings with data-oriented or experimental approaches in future studies.

Public Involvement Statement

This study did not involve any clinical trials.

Use of Artificial Intelligence

No code or software was developed for this study. Generative AI and AI-assisted tools were used only for tasks such as English grammar checking, and never in a way that violates ethical standards.

Acknowledgments

We would like to express our sincere gratitude to everyone who provided valuable insights, support, and encouragement throughout this research. We also extend our thanks to the readers for their interest and to the authors of the referenced works, whose scholarly contributions have greatly influenced this study. Lastly, we are deeply grateful to the publishers and reviewers who facilitated the dissemination of this work.

Conflicts of Interest

The authors declare that they have no conflicts of interest related to the content or publication of this paper.

References

  1. Diestel, R. Graph Theory; Springer, 2024.
  2. Gao, Y.; Zhang, Z.; Lin, H.; Zhao, X.; Du, S.; Zou, C. Hypergraph learning: Methods and practices. IEEE Transactions on Pattern Analysis and Machine Intelligence 2020, 44, 2548–2566.
  3. Feng, Y.; You, H.; Zhang, Z.; Ji, R.; Gao, Y. Hypergraph neural networks. In Proceedings of the AAAI Conference on Artificial Intelligence, 2019; pp. 3558–3565.
  4. Chen, J.; Schwaller, P. Molecular hypergraph neural networks. The Journal of Chemical Physics 2024, 160.
  5. Ji, S.; Feng, Y.; Di, D.; Ying, S.; Gao, Y. Mode Hypergraph Neural Network. IEEE Transactions on Neural Networks and Learning Systems 2025.
  6. Casetti, N.; Nevatia, P.; Chen, J.; Schwaller, P.; Coley, C.W. Comment on "Molecular hypergraph neural networks" [J. Chem. Phys. 160, 144307 (2024)]. The Journal of Chemical Physics 2024, 161.
  7. Du, W.; Zhang, S.; Cai, Z.; Li, X.; Liu, Z.; Fang, J.; Wang, J.; Wang, X.; Wang, Y. Molecular Merged Hypergraph Neural Network for Explainable Solvation Gibbs Free Energy Prediction. Research 2025, 8, 0740.
  8. Bravo, J.C.M.; Piedrahita, C.J.B.; Bravo, M.A.M.; Pilacuan-Bonete, L.M. Integrating SMED and Industry 4.0 to optimize processes with plithogenic n-SuperHyperGraphs. Neutrosophic Sets and Systems 2025, 84, 328–340.
  9. Fujita, T. Multi-SuperHyperGraph Neural Networks: A Generalization of Multi-HyperGraph Neural Networks. Neutrosophic Computing and Machine Learning 2025, 39, 328–347.
  10. Marcos, B.V.S.; Willner, M.F.; Rosa, B.V.C.; Yissel, F.F.R.M.; Roberto, E.R.; Puma, L.D.B.; Fernández, D.M.M. Using plithogenic n-SuperHyperGraphs to assess the degree of relationship between information skills and digital competencies. Neutrosophic Sets and Systems 2025, 84, 513–524.
  11. Amable, N.H.; De Salazar, E.E.V.; Isaac, M.G.M.; Sánchez, O.C.O.; Palma, J.M.S. Representation of motivational dynamics in school environments through Plithogenic n-SuperHyperGraphs with family participation. Neutrosophic Sets and Systems 2025, 92, 570–583.
  12. Berrocal Villegas, S.M.; Montalvo Fritas, W.; Berrocal Villegas, C.R.; Flores Fuentes Rivera, M.Y.; Espejo Rivera, R.; Bautista Puma, L.D.; Macazana Fernández, D.M. Using plithogenic n-SuperHyperGraphs to assess the degree of relationship between information skills and digital competencies. Neutrosophic Sets and Systems 2025, 84, 41.
  13. Roshdy, E.; Khashaba, M.; Ali, M.E.A. Neutrosophic super-hypergraph fusion for proactive cyberattack countermeasures: A soft computing framework. Neutrosophic Sets and Systems 2025, 94, 232–252.
  14. Fujita, T.; Mehmood, A. SuperHyperGraph Attention Networks. Neutrosophic Computing and Machine Learning 2025, 40, 10–27.
  15. Fujita, T.; Smarandache, F. Superhypergraph Neural Networks and Plithogenic Graph Neural Networks: Theoretical Foundations. Advancing Uncertain Combinatorics through Graphization, Hyperization, and Uncertainization: Fuzzy, Neutrosophic, Soft, Rough, and Beyond 2025, 5, 577.
  16. Hamidi, M.; Smarandache, F.; Davneshvar, E. Spectrum of superhypergraphs via flows. Journal of Mathematics 2022, 2022, 9158912.
  17. Fujita, T.; Smarandache, F. Soft Directed n-SuperHyperGraphs with Some Real-World Applications. European Journal of Pure and Applied Mathematics 2025, 18, 6643.
  18. Fujita, T. Directed Acyclic SuperHypergraphs (DASH): A General Framework for Hierarchical Dependency Modeling. Neutrosophic Knowledge 2025, 6, 72–86.
  19. Fujita, T. MetaHyperGraphs, MetaSuperHyperGraphs, and Iterated MetaGraphs: Modeling Graphs of Graphs, Hypergraphs of Hypergraphs, Superhypergraphs of Superhypergraphs, and Beyond. Current Research in Interdisciplinary Studies 2025, 4, 1–23.
  20. Nacaroglu, Y.; Akgunes, N.; Pak, S.; Cangul, I.N. Some graph parameters of power set graphs. Advances & Applications in Discrete Mathematics 2021, 26.
  21. Shalu, M.; Yamini, S.D. Counting maximal independent sets in power set graphs; Indian Institute of Information Technology Design & Manufacturing (IIITD&M): Kancheepuram, India, 2014.
  22. Jech, T. Set Theory: The Third Millennium Edition, Revised and Expanded; Springer, 2003.
  23. Bretto, A. Hypergraph Theory: An Introduction; Mathematical Engineering; Springer: Cham, 2013.
  24. Berge, C. Hypergraphs: Combinatorics of Finite Sets; Elsevier, 1984; Vol. 45.
  25. Smarandache, F. Extension of HyperGraph to n-SuperHyperGraph and to Plithogenic n-SuperHyperGraph, and Extension of HyperAlgebra to n-ary (Classical-/Neutro-/Anti-) HyperAlgebra. Infinite Study 2020.
  26. Zadeh, L.A. Fuzzy sets. Information and Control 1965, 8, 338–353.
  27. Wang, H.; Smarandache, F.; Zhang, Y.; Sunderraman, R. Single valued neutrosophic sets. Infinite Study 2010.
  28. Broumi, S.; Talea, M.; Bakali, A.; Smarandache, F. Single valued neutrosophic graphs. Journal of New Theory 2016, 10, 86–101.
  29. Smarandache, F. Plithogenic set, an extension of crisp, fuzzy, intuitionistic fuzzy, and neutrosophic sets-revisited. Infinite Study 2018.
  30. Azeem, M.; Rashid, H.; Jamil, M.K.; Gütmen, S.; Tirkolaee, E.B. Plithogenic fuzzy graph: A study of fundamental properties and potential applications. Journal of Dynamics and Games 2024.
Table 1. High-level comparison of Graphs, Hypergraphs, and SuperHyperGraphs.

| Object | Vertex objects | Edge objects (what an edge can connect) |
|---|---|---|
| Graph G = (V, E) | Atoms: V is a finite set of vertices. | Edges are pairs of vertices (undirected) or ordered pairs (directed): E ⊆ [V]² (undirected) or E ⊆ V × V (directed). |
| Hypergraph H = (V, E) | Atoms: V is a finite set of vertices. | Hyperedges are nonempty subsets of vertices: E ⊆ PS(V) ∖ {∅}. An edge may connect any number of vertices. |
| n-SuperHyperGraph SHG^(n) = (V, E) | Nested objects: V ⊆ PS^n(V_0), i.e., vertices may be sets-of-sets (iterated powersets). | Superedges are nonempty subsets of the (possibly nested) vertex set: E ⊆ PS(V) ∖ {∅}. Thus edges connect collections of (nested) vertices. |
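The vertex domains in Table 1 can be made concrete in a few lines. The sketch below (plain Python; `powerset` is an illustrative helper, not notation from the paper) builds PS(V_0) and PS^2(V_0) for a two-element base set and extracts the admissible hyperedges PS(V_0) ∖ {∅}:

```python
from itertools import chain, combinations

def powerset(s):
    """PS(s): all subsets of s as frozensets, including the empty set."""
    s = list(s)
    return {frozenset(c) for c in
            chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))}

V0 = {1, 2}
PS1 = powerset(V0)      # PS(V0):   2^2 = 4 subsets
PS2 = powerset(PS1)     # PS^2(V0): 2^4 = 16 subsets
print(len(PS1), len(PS2))   # 4 16

# hyperedges of a hypergraph on V0: the nonempty subsets PS(V0) \ {empty set}
hyperedges = {e for e in PS1 if e}
print(len(hyperedges))      # 3
```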
Table 2. Concise comparison of an n-SuperHyperGraph and a depth-n Tree-Vertex Graph.

| Aspect | n-SuperHyperGraph SHG^(n) = (V, E) | Depth-n Tree-Vertex Graph TVG^(n) = (V_0, T, η, {E^(k)}_{k=0}^{n}) |
|---|---|---|
| Base objects | A finite base set V_0 is given, but vertices are chosen as V ⊆ PS^n(V_0). | A finite base set V_0 is given; leaves of T (level 0) are in bijection with V_0. |
| Vertex representation | Vertices are uniform-depth set-objects: v ∈ PS^n(V_0). | Vertices are tree-vertices u ∈ N, each typed by its depth k and labeled by η(u) ∈ PS^k(V_0). |
| Hierarchy mechanism | Hierarchy is implicit in the iterated powerset level n; there is no canonical parent–child structure among vertices. | Hierarchy is explicit: T = (N, F, r) provides parent–child arcs, and labels satisfy η(u) = { η(v) : v ∈ Ch(u) }. |
| Depth / levels | A single global level n for vertices (although edges are subsets of V). | Multiple levels k = 0, …, n present simultaneously; N_k = { u ∈ N : depth(u) = k }. |
| Flattened support in V_0 | Not intrinsic; one may apply a flattening map on PS^n(V_0) if desired. | Built-in: each tree-vertex has support λ(u) = Flat^{depth(u)}(η(u)) ⊆ V_0. |
| Edges | Hyperedges are nonempty subsets of V: E ⊆ PS(V) ∖ {∅} (multiway relations between supervertices). | Level edges are binary within each level: E^(k) ⊆ { {u, v} ⊆ N_k : u ≠ v }. |
| Granularity of relations | Relations are expressed as set-valued edges (hyperedges) among set-valued vertices. | Relations are expressed as graph edges among hierarchical units at the same abstraction level. |
| Typical modeling intent | Nested (higher-order) vertex objects with hyperedges capturing multiway interactions at the chosen level. | A hierarchical decomposition of V_0 via a tree, with additional same-level relational structure among the resulting clusters at each depth. |
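The defining recursion η(u) = { η(v) : v ∈ Ch(u) } and the support map λ can be sketched directly. All names below (`children`, `leaf_label`, `eta`, `lam`) are illustrative choices for a tiny depth-2 example, not notation fixed by the paper:

```python
children = {"r": ["a", "b"], "a": ["x", "y"], "b": ["z"]}
leaf_label = {"x": 1, "y": 2, "z": 3}   # bijection between leaves (level 0) and V0

def depth(u):
    """Level of u: 0 for leaves, 1 + maximum child depth otherwise."""
    if u in leaf_label:
        return 0
    return 1 + max(depth(c) for c in children[u])

def eta(u):
    """Nested label: eta(u) = {eta(v) : v in Ch(u)} for internal u."""
    if u in leaf_label:
        return leaf_label[u]            # eta restricted to N0 is the leaf bijection
    return frozenset(eta(c) for c in children[u])

def lam(u):
    """Support lam(u): flatten the nested label down to base elements of V0
    (equivalently, Flat^depth(u) applied to eta(u))."""
    e = eta(u)
    if not isinstance(e, frozenset):
        return {e}
    out, stack = set(), [e]
    while stack:
        x = stack.pop()
        if isinstance(x, frozenset):
            stack.extend(x)
        else:
            out.add(x)
    return out

print(sorted(eta("a")))   # [1, 2]
print(sorted(lam("r")))   # [1, 2, 3]
```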
Table 3. Comparison between an n-SuperHyperGraph and an n-SuperHyperGraph with intra-level and inter-level edges.

| Aspect | Classical n-SuperHyperGraph SHG^(n) = (V, E) | n-SHG with intra-/inter-level edges SHG^×(n) (Definition 8) |
|---|---|---|
| Base set | Finite nonempty V_0 | Finite nonempty V_0 |
| Vertex objects | Single-level supervertices: V ⊆ PS^n(V_0) | Graded supervertices: V ⊆ ⋃_{k=0}^{n} PS^k(V_0) with level slices V_k = V ∩ PS^k(V_0) |
| Grounding at level 0 | Not required by definition (often implicit via choosing V ⊆ PS^n(V_0)) | Optional but typical: V_0 ⊆ V (i.e., base vertices appear as level-0 supervertices) |
| Primary relations | Hyperedges only: E ⊆ PS(V) ∖ {∅} | Hyperedges E ⊆ PS(V) ∖ {∅} plus binary edges within and across levels |
| Hyperedges mix levels? | Not applicable (only level-n vertices exist) | Yes: e ∈ E may contain supervertices from different levels (unless restricted) |
| Intra-level edges | None | For each k, undirected graph edges E^(k) ⊆ { {u, v} ⊆ V_k : u ≠ v } |
| Inter-level edges | None | For 0 ≤ k < n, directed cross edges E^(k→) ⊆ V_k × V |
| Support / flattening to V_0 | Not part of the minimal definition (can be added externally) | Built-in via λ(u) = Flat^k(u) for u ∈ V_k, enabling semantic constraints on cross edges |
| Expressivity gain | Encodes higher-order (set-valued) vertices and hyperedges at one fixed nesting depth n | Adds explicit multi-resolution (levels 0 to n) and explicit binary relations both within a level and between different depths |
| Recovery of classical model | By definition | If one restricts to V = V_n and discards {E^(k)}_{k=0}^{n} and {E^(k→)}_{k<n}, then (V, E) reduces to a classical n-SuperHyperGraph (Remark 3). |
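The graded supervertex set and its level slices V_k = V ∩ PS^k(V_0) admit a short sketch. The `level` function below is a hypothetical helper (not from the paper) that recovers the iterated-powerset depth of a nested frozenset object:

```python
def level(x):
    """Iterated-powerset depth: 0 for atoms of V0,
    1 + maximum member level for a (nonempty) frozenset."""
    if not isinstance(x, frozenset):
        return 0
    if not x:
        return 1
    return 1 + max(level(y) for y in x)

v1 = frozenset({1, 2})                               # element of PS^1(V0)
v2 = frozenset({frozenset({1, 2}), frozenset({3})})  # element of PS^2(V0)
V = {1, 2, 3, v1, v2}                                # graded supervertex set

# level slices V_k = V intersected with PS^k(V0)
slices = {}
for v in V:
    slices.setdefault(level(v), set()).add(v)
print(sorted(slices))   # [0, 1, 2]

# a hyperedge may mix supervertices of different levels (unless restricted)
e = frozenset({1, v1, v2})
print(len(e))           # 3
```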
Table 4. Comparison between a depth-n TVG and a depth-n TVG with intra-level and inter-level edges.

| Aspect | Depth-n TVG (level edges only) | Depth-n TVG with intra- and inter-level edges |
|---|---|---|
| Core structure | Rooted tree T = (N, F, r) of fixed depth n with level sets N_k | Same rooted tree backbone T with the same level decomposition N = ⋃_{k=0}^{n} N_k |
| Objects (vertices) | Tree-vertices N (hierarchical units) | Same tree-vertices N |
| Typing / hierarchy encoding | Nested labeling η is level-typed: u ∈ N_k ⇒ η(u) ∈ PS^k(V_0), and η(u) = { η(v) : v ∈ Ch(u) } for k ≥ 1 | Same nested, level-typed labeling η (hierarchy still enforced by the tree and recursion) |
| Grounding at level 0 | η|_{N_0} : N_0 → V_0 is a bijection | Same leaf grounding |
| Flattened support | λ(u) = Flat^{depth(u)}(η(u)) ⊆ V_0 | Same support definition λ(u) |
| Intra-level edges | For each k, an undirected edge set E^(k) ⊆ { {u, v} ⊆ N_k : u ≠ v } | Same intra-level edge sets E^(k) |
| Inter-level (cross) edges | Not present | Present as directed relations across levels: E^(k→) ⊆ N_k × N for 0 ≤ k < n |
| Typical semantics of extra edges | Relations only among units at the same abstraction depth | Relations can connect units of different abstraction depths (e.g., "refines/abstracts/depends-on/explains") |
| Optional constraints | (Usually none beyond the tree/label recursion) | May impose support-compatibility on (u, v) ∈ E^(k→) (e.g., λ(u) ⊆ λ(v), or λ(u) ∩ λ(v) ≠ ∅) |
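The two optional support-compatibility constraints in the last row reduce to one-line predicates. The supports below are hard-coded illustrative values (not taken from the paper), standing in for λ(u) computed via flattening:

```python
lam = {"u1": {1, 2}, "u2": {3}, "v": {1, 2, 3}, "w": {4}}

def containment_ok(u, v):
    """lam(u) is a subset of lam(v): the source's support refines the target's."""
    return lam[u] <= lam[v]

def overlap_ok(u, v):
    """lam(u) and lam(v) intersect: the supports at least overlap."""
    return bool(lam[u] & lam[v])

cross_edges = [("u1", "v"), ("u2", "v"), ("u2", "w")]
for e in cross_edges:
    print(e, containment_ok(*e), overlap_ok(*e))
# ("u2", "w") fails both checks: its supports are disjoint.
```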
Table 5. Comparison between a depth-n TVG with intra-level and inter-level edges and an n-SuperHyperGraph with intra-level and inter-level edges.

| Aspect | Depth-n TVG with intra- and inter-level edges | n-SHG with intra- and inter-level edges |
|---|---|---|
| Underlying "hierarchy carrier" | A rooted tree T fixes the hierarchy: every non-leaf has children and the nesting is along parent–child arcs | No tree is required; hierarchy is encoded by membership in iterated powersets (i.e., by level k in PS^k(V_0)) and by chosen supervertices |
| Vertex objects | Tree-vertices N (each represents a hierarchical unit) | Supervertices V ⊆ ⋃_{k=0}^{n} PS^k(V_0) (chosen nested set-objects) |
| Level decomposition | Levels are N_k = { u ∈ N : depth(u) = k } with fixed depth n | Levels are V_k = V ∩ PS^k(V_0) (graded by iterated powerset depth) |
| How higher-level objects are formed | Rigid recursion: η(u) = { η(v) : v ∈ Ch(u) } (every internal node is exactly the set of its children's labels) | Flexible selection: a supervertex in V_k is any element of PS^k(V_0) included in V (no requirement that it equals the set of "children" of something) |
| Grounding at level 0 | η|_{N_0} : N_0 → V_0 is a bijection (all base vertices appear as leaves) | Typically assumes grounding V_0 ⊆ V; otherwise optional in general formulations |
| Support map | λ(u) = Flat^{depth(u)}(η(u)) ⊆ V_0 | λ(u) = Flat^k(u) for u ∈ V_k (same flattening idea, but applied directly to nested set-objects) |
| Intra-level (graph) edges | E^(k) ⊆ { {u, v} ⊆ N_k : u ≠ v } | E^(k) ⊆ { {u, v} ⊆ V_k : u ≠ v } |
| Inter-level (directed) edges | E^(k→) ⊆ N_k × N | E^(k→) ⊆ V_k × V |
| Higher-arity relations | Not part of the core (unless separately added); relations are primarily graph-type edges | Includes hyperedges E ⊆ PS(V) ∖ {∅} in addition to graph-type edges |
| Modeling emphasis | Tree-shaped, laminar, and recursively generated clusters; edges attach to explicitly organized hierarchical units | General nested set-valued vertices with both hyperedges (multiway) and graph-type edges (binary), without a required tree backbone |
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.