1. Introduction
1.1. Graph, Hypergraph, and Superhypergraph Theory
Graph theory offers a rigorous framework for analyzing discrete relational systems: a set of vertices is connected by edges, and the resulting structure naturally captures pairwise interactions [1,2]. Owing to the clarity and flexibility of this abstraction, graphs have become standard models in many disciplines, including data mining, algorithm design, artificial intelligence, neural computation, quantum information, and chemistry (see, e.g., [3,4]). However, requiring every edge to link exactly two vertices may be inadequate when the underlying phenomenon involves genuine multiway relations or layered (hierarchical) organization. This motivates the parallel development of hypergraph and superhypergraph theories, which enlarge the classical setting in complementary directions.
A hypergraph replaces ordinary edges by hyperedges, where each hyperedge may connect an arbitrary nonempty subset of the vertex set, thus encoding higher-order relations directly [5,6,7]. As a strict generalization of graphs, hypergraph theory has grown into a mature area with substantial structural and algorithmic results and with a wide range of applications (see, e.g., [8,9,10]). Beyond this, a superhypergraph introduces explicit nesting by iterating the powerset operation; this produces multi-level collections of vertex-sets and edge-sets and thereby allows one to represent hierarchical groupings and multi-layer connectivity within a single combinatorial object [11,12,13,14,15]. Accordingly, superhypergraphs extend hypergraphs and are particularly suitable for modeling complex systems with several interacting layers [16,17,18].
Table 1 highlights the principal differences among graphs, hypergraphs, and superhypergraphs. Throughout, n denotes a finite positive integer. We also emphasize that extensive algorithmic literature exists for these models, ranging from classical graph algorithms to modern extensions and adaptations (see, e.g., [19,20]).
Note that graphs, hypergraphs, and superhypergraphs admit natural directed counterparts—Directed Graphs [21,22], Directed HyperGraphs [23,24], and Directed SuperHyperGraphs [25]—and these directed models can be further extended to bidirected versions, namely Bidirected Graphs [26,27,28], Bidirected HyperGraphs [22], and Bidirected SuperHyperGraphs [22]. Such directed and bidirected variants enrich the theory by explicitly incorporating orientation information into the underlying combinatorial structures.
1.2. HyperGraph and SuperHyperGraph for Artificial Intelligence
Research in machine learning and artificial intelligence has become extremely active in recent years and now plays a significant role in modern society. Graphs, HyperGraphs, and SuperHyperGraphs are also used in the field of Artificial Intelligence; examples of related concepts are presented in Table 2.
1.3. Our Contributions
From the above discussion, research based on SuperHyperGraphs is important and is expected to play a major role in fields such as AI. Accordingly, this paper extends several fundamental frameworks, including GCN, GraphRAG, causal graphs, graph embedding, graph-based natural language processing, and graph generation, by incorporating the SuperHyperGraph viewpoint. Some of these parallel notions are summarized in Table 2.
2. Preliminaries
This section introduces the notation and basic terminology used in the sequel. Unless explicitly stated otherwise, all graphs and hypergraphs in this paper are assumed to be finite.
2.1. SuperHyperGraphs
A hypergraph generalizes an ordinary graph by allowing an edge to connect any nonempty subset of the vertex set. Hence, hypergraphs provide a direct language for modeling multiway interactions [5,6,86,87]. By iterating the powerset operation one step further, one obtains nested (higher-order) vertex objects and, consequently, a finite SuperHyperGraph whose vertices and edges may themselves be set-valued at multiple levels [16,18,40,88]. Such hierarchical representations are useful, for instance, in molecular design, complex-network analysis, and related applications [57,89,90]. Throughout, the index n appearing in the iterated powerset and in an n-SuperHyperGraph is always taken to be a nonnegative integer.
Definition 1
(Base set).
A base set S is the underlying universe of discourse: all sets appearing in the constructions below and in the iterated powersets are ultimately built from elements of S.
Definition 2
(Powerset).
(see [91,92,93]) For a set S, the powerset $\mathcal{P}(S)$ of S is
$$\mathcal{P}(S) = \{A : A \subseteq S\}.$$
In particular, $\emptyset \in \mathcal{P}(S)$ and $S \in \mathcal{P}(S)$.
Definition 3
(Hypergraph).
[29,94] A hypergraph is a pair $H = (V, E)$ such that V is a nonempty finite set (the vertices) and $E \subseteq \mathcal{P}(V) \setminus \{\emptyset\}$ is a family of nonempty subsets of V (the hyperedges).
Thus, a hyperedge may involve more than two vertices, capturing genuinely multiway relations.
Definition 4
(
n-th powerset).
[95,96] Let X be a set. Define $\mathcal{P}^0(X) = X$ and, for every $n \ge 0$,
$$\mathcal{P}^{n+1}(X) = \mathcal{P}\big(\mathcal{P}^n(X)\big).$$
When it is convenient to exclude the empty set, we write $\mathcal{P}^n_{*}(X)$ for the corresponding iterated powerset with $\emptyset$ removed at each level.
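As a concrete illustration, the iterated powerset can be computed directly. This is a minimal Python sketch (the function names are ours, not from the paper); note that $|\mathcal{P}^n(X)|$ grows as an exponential tower, so this is feasible only for tiny base sets and small n.

```python
from itertools import chain, combinations

def powerset(xs):
    """All subsets of the collection xs, as hashable frozensets."""
    xs = list(xs)
    return [frozenset(c) for c in
            chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))]

def iterated_powerset(base, n):
    """P^n(base): apply the powerset operation n times; P^0(base) = base."""
    level = set(base)
    for _ in range(n):
        level = set(powerset(level))
    return level

# |P^1({1, 2})| = 4 and |P^2({1, 2})| = 2^4 = 16.
```

For a base set of size 2, one level gives 4 objects and two levels give 16, matching $|\mathcal{P}^{n+1}(X)| = 2^{|\mathcal{P}^n(X)|}$.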
Definition 5
(
n-SuperHyperGraph).
(see [15,97]) Let $V_0$ be a finite, nonempty base set. Define $\mathcal{P}^n(V_0)$ by iterating the powerset as in Definition 4.
For $n \ge 0$, an n-SuperHyperGraph on $V_0$ is a pair $(V, E)$ satisfying
$$V \subseteq \mathcal{P}^n(V_0), \qquad E \subseteq \mathcal{P}(V) \setminus \{\emptyset\}.$$
Elements of V are called n-supervertices, and elements of E are called n-superedges (i.e., each n-superedge is a nonempty subset of V).
3. Review: SuperHyperGraph Neural Network
A HyperGraph Neural Network learns node representations by aggregating features through hyperedges, capturing higher-order interactions among multiple vertices for prediction tasks [5]. A SuperHyperGraph Neural Network learns base-vertex representations by aggregating through nested hyperedges, capturing hierarchical hyperedge-of-hyperedge interactions for complex reasoning tasks [33].
Definition 6
(HyperGraph Neural Network (HGNN)).
[5] Let $G = (V, E, w)$ be a finite undirected hypergraph, where $w(e) > 0$ is the (hyperedge) weight of $e \in E$. Let $X \in \mathbb{R}^{|V| \times d}$ be the input feature matrix whose i-th row is the feature vector of vertex $v_i$.
Define the incidence matrix $H \in \{0, 1\}^{|V| \times |E|}$ by
$$H(v, e) = \begin{cases} 1 & \text{if } v \in e, \\ 0 & \text{otherwise.} \end{cases}$$
Define the diagonal degree matrices $D_v$ and $D_e$ by
$$D_v(v, v) = \sum_{e \in E} w(e)\, H(v, e), \qquad D_e(e, e) = \sum_{v \in V} H(v, e).$$
Let $\Theta^{(\ell)}$ be a learnable weight matrix and let $\sigma$ be a nonlinearity (e.g., ReLU). With $W = \mathrm{diag}(w(e_1), \dots, w(e_{|E|}))$, a single HGNN convolutional layer is the map
$$X^{(\ell+1)} = \sigma\!\left(D_v^{-1/2} H W D_e^{-1} H^{\top} D_v^{-1/2} X^{(\ell)} \Theta^{(\ell)}\right).$$
By stacking L layers, one obtains embeddings $X^{(1)}, \dots, X^{(L)}$.
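The layer above can be sketched in a few lines of NumPy. This is a hedged illustration following the standard spectral HGNN update; `hgnn_layer` and the ReLU default are our choices, not the paper's.

```python
import numpy as np

def hgnn_layer(X, H, w, Theta, sigma=lambda z: np.maximum(z, 0.0)):
    """One HGNN convolution: sigma(Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta).

    X: (|V|, d) vertex features; H: (|V|, |E|) incidence matrix;
    w: (|E|,) positive hyperedge weights; Theta: (d, d') learnable weights.
    """
    W = np.diag(w)
    dv = H @ w                       # vertex degrees d(v) = sum_e w(e) H(v, e)
    de = H.sum(axis=0)               # hyperedge degrees delta(e) = sum_v H(v, e)
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    De_inv = np.diag(1.0 / de)
    P = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    return sigma(P @ X @ Theta)
```

For vertices $\{v_1, v_2, v_3\}$ with hyperedges $\{v_1, v_2, v_3\}$ and $\{v_1, v_2\}$, the incidence matrix is 3×2 and one layer mixes features along both hyperedges.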
Definition 7
(
n-SuperHyperGraph Neural Network (
n-SHGNN)).
Let $(V, E)$ be an n-SuperHyperGraph over a finite base vertex set $V_0$ (so the ultimate objects of interest are the base vertices in $V_0$). Define the expanded hypergraph $H^{\mathrm{exp}} = (V_0, E^{\mathrm{exp}})$
by declaring that a subset $e \subseteq V_0$ belongs to $E^{\mathrm{exp}}$ if and only if there exists a superhyperedge $\tilde{e} \in E$ such that
$$e = \bigcup_{v \in \tilde{e}} \sigma_n(v).$$
(Here each supervertex $v$ is treated as a subset of $V_0$ via the natural “flattening/union” operation $\sigma_n$.)
Let $X \in \mathbb{R}^{|V_0| \times d}$ be the input feature matrix whose i-th row is the feature vector of base vertex $v_i$. Let $m = |E^{\mathrm{exp}}|$ and define the incidence matrix $H \in \{0, 1\}^{|V_0| \times m}$ by $H(v, e) = 1$ if $v \in e$ and 0 otherwise.
Let $w(e) > 0$ be a learnable weight for $e \in E^{\mathrm{exp}}$, and set $W = \mathrm{diag}(w(e_1), \dots, w(e_m))$.
Define diagonal degree matrices $D_v$ and $D_e$ by
$$D_v(v, v) = \sum_{e \in E^{\mathrm{exp}}} w(e)\, H(v, e), \qquad D_e(e, e) = \sum_{v \in V_0} H(v, e).$$
Let $\Theta^{(\ell)}$ be learnable and let $\sigma$ be a nonlinearity. A single n-SHGNN convolutional layer is the map
$$X^{(\ell+1)} = \sigma\!\left(D_v^{-1/2} H W D_e^{-1} H^{\top} D_v^{-1/2} X^{(\ell)} \Theta^{(\ell)}\right),$$
where $X^{(\ell+1)}$ is the updated base-vertex feature matrix.
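The flattening/union step that turns superhyperedges into ordinary hyperedges over base vertices can be sketched as follows (helper names are ours; nested supervertices are modeled as frozensets of frozensets):

```python
def flatten(obj):
    """Union of all atomic base vertices inside a (possibly nested) set object."""
    if isinstance(obj, frozenset):
        out = set()
        for inner in obj:
            out |= flatten(inner)
        return out
    return {obj}

def expanded_hyperedges(superedges):
    """E^exp: each superhyperedge becomes the union of its flattened members."""
    return {frozenset(x for v in e for x in flatten(v)) for e in superedges}
```

For example, a superedge containing the nested groups {1, 2} and {3} flattens to the ordinary hyperedge {1, 2, 3}, on which the incidence matrix above is then built.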
4. Review and Results: HyperGraphRAG and SuperHyperGraphRAG
We examine HyperGraphRAG and SuperHyperGraphRAG. Retrieval-augmented generation combines external knowledge retrieval with neural text generation, grounding model outputs in relevant, up-to-date context for accurate responses [98,99,100,101]. GraphRAG is retrieval-augmented generation using a graph linking context chunks or entities; retrieval seeds similar nodes, expands along edges to gather connected evidence, serializes the subgraph, and then an LLM generates answers [58,59,102,103,104].
Definition 8
(GraphRAG: graph-based retrieval-augmented generation). Let be a finite corpus and let be a finite set of atomic contexts (e.g., chunks, passages, tables, figure captions) extracted from . Let be a set of queries.
A GraphRAG system
is a tuple
where:
-
(i)
is an embedding map.
-
(ii)
is a finite graph whose vertex set V indexes retrievable units (e.g., , or with entities ). Each edge encodes a binary relation between u and v (e.g., co-mention, hyperlink, citation, temporal adjacency, or a knowledge-graph triple).
-
(iii)
is a vertex scoring function; a typical choice is
with the convention that means the embedding of the context (or entity) represented by v.
-
(iv)
is an expansion operator that enlarges a seed set of vertices using the graph structure (e.g., k-hop neighborhood, weighted random walk, or personalized PageRank).
-
(v)
is a selection operator producing the final vertex set to be used as evidence.
-
(vi)
converts a vertex-induced subgraph (and optionally edge labels) into an LLM-readable context (e.g., ordered passages, entity summaries, relation triples).
-
(vii)
is a conditional generator (LLM) that maps to an answer.
Given , GraphRAG computes:
-
(a)
Seeding: choose a seed set of vertices (e.g., the top-k by the scoring function).
-
(b)
Graph expansion: set .
-
(c)
Final selection: set .
-
(d)
Generation: produce the answer by applying the generator (LLM) to q together with the serialized evidence.
Remark 1
(How GraphRAG differs from classical RAG). Classical RAG typically retrieves a set of contexts by similarity alone. GraphRAG additionally uses E to expand and organize evidence, so that retrieved contexts are not only similar to q, but also structurally connected via relations encoded in G.
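Steps (a)–(c) above admit a minimal retrieval sketch (function and parameter names are ours; cosine seeding plus k-hop neighbor expansion stand in for the Score and Expand operators):

```python
import numpy as np

def graph_rag_retrieve(q_emb, node_embs, adj, k=2, hops=1):
    """Seed the top-k vertices by cosine similarity to q, then expand along edges.

    q_emb: (d,) query embedding; node_embs: (n, d) vertex embeddings;
    adj: dict mapping a vertex index to its set of neighbors.
    Returns the final evidence vertex set (seeds plus expanded neighbors).
    """
    sims = node_embs @ q_emb / (
        np.linalg.norm(node_embs, axis=1) * np.linalg.norm(q_emb) + 1e-12)
    evidence = set(np.argsort(-sims)[:k].tolist())   # (a) seeding
    for _ in range(hops):                            # (b) graph expansion
        evidence |= {u for v in evidence for u in adj.get(v, ())}
    return evidence                                  # (c) final selection
```

The returned set would then be serialized and handed to the LLM, per items (vi)–(vii) of the definition.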
HyperGraphRAG is retrieval-augmented generation using a hypergraph, where vertices are context chunks and hyperedges encode multiway relations; retrieval expands through relevant hyperedges, selects evidence, then an LLM generates grounded answers [60].
Definition 9
(HyperGraphRAG: hypergraph-based retrieval-augmented generation).
[60] Let be as in Definition 8. A HyperGraphRAG system
is a tuple
where:
-
(i)
is a hypergraph (Definition 3) whose vertices V index retrievable units, and whose hyperedges encode multiway relations. A hyperedge represents that participate in a shared interaction (e.g., same event, same topic, same table row, same co-author group, or the same multi-entity mention).
-
(ii)
is a vertex scoring function (e.g., cosine similarity as in Definition 8).
-
(iii)
is a hyperedge scoring function; one concrete choice is
where Agg is a fixed aggregator such as max, mean, or a softmax-weighted mean.
-
(iv)
is a hypergraph expansion operator. A canonical expansion is:
for some threshold (possibly query-dependent).
-
(v)
selects the final evidence vertices.
-
(vi)
serializes the induced subhypergraph (optionally including hyperedge descriptions) into a finite text context.
-
(vii)
is an LLM used for generation, as before.
Given , HyperGraphRAG computes:
-
(a)
Seeding: choose a seed set of vertices (e.g., the top-k by the scoring function).
-
(b)
Hyperedge-aware expansion: set .
-
(c)
Final selection: set .
-
(d)
Generation: produce the answer by applying the generator (LLM) to q together with the serialized evidence.
Proposition 1
(GraphRAG is a special case of HyperGraphRAG).
Assume that every hyperedge of $H = (V, E_H)$ has size 2, i.e., $|e| = 2$ for all $e \in E_H$.
Define the graph $G = (V, E)$ by $E = \{\{u, v\} : \{u, v\} \in E_H\}$.
Then any HyperGraphRAG instance on H induces a GraphRAG instance on G that yields the same expanded vertex sets, provided Expand uses the same threshold rule via the edge scores and neighborhood expansion.
Proof. If $|e| = 2$ for all $e \in E_H$, then each hyperedge is of the form $e = \{u, v\}$ with $u \neq v$, so $E_H$ is in bijection with the edge set E of the simple graph G defined above. For any seed set S, hyperedge-aware expansion adds exactly the vertices lying in a size-2 hyperedge that meets S, which is exactly the one-step graph-neighborhood expansion (up to the same score/threshold filter applied to edges). Therefore the expanded vertex sets coincide, and hence the selected substructures and serialized contexts can be chosen identical. □
SuperHyperGraphRAG is retrieval-augmented generation over an n-superhypergraph whose supervertices are nested context-sets, expanded via superedges; it generalizes GraphRAG and HyperGraphRAG by enabling hierarchical multiway retrieval and structured evidence serialization. As a reference, the comparison of GraphRAG, HyperGraphRAG, and SuperHyperGraphRAG is presented in
Table 3.
Definition 10
(Iterated powerset and flattening). Let $C_0$ be a finite nonempty base set of atomic contexts (chunks, passages, entity cards, etc.). Define $\mathcal{P}^0(C_0) = C_0$ and $\mathcal{P}^{k+1}(C_0) = \mathcal{P}(\mathcal{P}^k(C_0))$ for $k \ge 0$.
For each $k \ge 0$, define a map $\sigma_k : \mathcal{P}^k(C_0) \to \mathcal{P}(C_0)$
recursively by
$$\sigma_0(v) = \{v\}, \qquad \sigma_{k+1}(v) = \bigcup_{u \in v} \sigma_k(u).$$
Thus, each n-level object is a nested set of depth n, and $\sigma_n(v)$ is the set of atomic contexts contained in v after unnesting.
Remark 2
(Where the SuperHyperGraph idea enters). In SuperHyperGraphRAG, retrievable units are allowed to be supervertices, so a single retrieved unit can represent a multi-level group of atomic contexts. Edges then connect these supervertices, enabling retrieval over nested (iterated-powerset) structures rather than only flat units.
Definition 11
(SuperHyperGraphRAG). Fix a finite set of atomic contexts and a set of queries. Fix an integer . Let be an n–SuperHyperGraph on in the sense of Definition 5, and assume .
A SuperHyperGraphRAG system of depth n is a tuple
where:
-
(i)
is an embedding map on atomic contexts and queries.
-
(ii)
is a conditional generator (LLM) that maps a pair to an output string.
-
(iii)
The induced embedding of a supervertex is defined by the explicit average
where is from Definition 10.
-
(iv)
The vertex score is the cosine similarity
assuming and .
-
(v)
The superedge score uses a fixed aggregator Agg (already declared in the preamble):
-
(vi)
The expansion operator is
for a threshold function .
-
(vii)
The selection operator returns the final evidence set.
-
(viii)
The serialization operator converts the induced sub-superhypergraph (with optional labels and summaries) into a finite text string.
Given , the SuperHyperGraphRAG answer is computed by:
-
(a)
Seed: choose the initial supervertices (e.g., the top-k by score).
-
(b)
Expand: apply the expansion operator iteratively to the current evidence set.
(c)
Select .
-
(d)
Generation: produce the final answer by applying the generator (LLM) to q together with the serialized evidence.
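Items (iii)–(iv) can be sketched directly: the supervertex embedding is the mean of the atomic-context embeddings in $\sigma_n(v)$, scored against the query by cosine similarity (a hedged sketch; helper names are ours):

```python
import numpy as np

def flatten_atoms(v):
    """sigma_n(v): the atomic contexts inside a nested supervertex."""
    if isinstance(v, frozenset):
        out = set()
        for inner in v:
            out |= flatten_atoms(inner)
        return out
    return {v}

def supervertex_embedding(v, atom_embs):
    """Mean of the atomic-context embeddings over flatten_atoms(v)."""
    return np.mean([atom_embs[a] for a in sorted(flatten_atoms(v))], axis=0)

def score(q_emb, v, atom_embs):
    """Cosine similarity between the query and the supervertex embedding."""
    u = supervertex_embedding(v, atom_embs)
    return float(q_emb @ u / (np.linalg.norm(q_emb) * np.linalg.norm(u) + 1e-12))
```

A supervertex grouping two atomic contexts is thus scored by how close the query is to the centroid of those contexts' embeddings.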
Theorem 1
(HyperGraphRAG is a special case of SuperHyperGraphRAG).
Let be a hypergraph with (atomic contexts as vertices). Fix any HyperGraphRAG instance on H whose expansion rule is
Define a 0–SuperHyperGraph by
Then the depth-0 SuperHyperGraphRAG system built on , with the same
, produces exactly the same expanded sets and the same final generated output as the given HyperGraphRAG system, for every query and every step .
Proof. Since $n = 0$, we have $\mathcal{P}^0(V_H) = V_H$, hence every supervertex is an ordinary vertex. For any $v \in V_H$, Definition 10 gives $\sigma_0(v) = \{v\}$, so the induced supervertex embedding in Definition 11 (the average over $\sigma_0(v)$) is just the embedding of v itself. Hence for all queries q and all $v \in V_H$, the supervertex score equals the vertex score used by the HyperGraphRAG instance (under the shared choice of cosine scoring).
Now the superedge family coincides with $E_H$, and for every hyperedge $e \in E_H$ we have $\bigcup_{v \in e} \sigma_0(v) = e$; with the same aggregator Agg, the edge scores coincide. Consequently, for any evidence set and any expansion step, the one-step expansions coincide. By induction on t, starting from the same seed, we obtain identical iterates for all $t \le T$. Since Select, Serialize, and the generator are assumed identical, the final output strings are identical as well. □
Theorem 2
(GraphRAG is a special case of SuperHyperGraphRAG).
Let be a (finite, undirected) graph on atomic contexts . Assume a GraphRAG instance whose expansion is realized by iterating one-step neighbor expansion T times, i.e.,
Define a depth-0 superhypergraph by
Assume further that the SuperHyperGraphRAG expansion uses (no threshold filtering), so that every incident edge expands the set. Then, for every query and every , the SuperHyperGraphRAG iterates satisfy
and hence the induced evidence substructures (and outputs, if coincide) match the GraphRAG instance.
Proof. Because $E_G$ consists only of 2-element subsets, the induced hypergraph is 2-uniform on $C_0$, and by construction it encodes exactly the same adjacencies as G.
Fix a set $S \subseteq C_0$. Since no threshold filtering is applied, the expansion rule adds every vertex contained in a hyperedge meeting S. We prove that this equals $S \cup N_G(S)$, the one-step neighbor expansion.
First, let x lie in the superhypergraph expansion of S. Then either $x \in S$, which implies $x \in S \cup N_G(S)$, or else there exists a hyperedge e such that $e \cap S \neq \emptyset$ and $x \in e$. Because every hyperedge has the form $\{u, v\}$ with $\{u, v\} \in E_G$, x is either in S or adjacent in G to a vertex in S, so $x \in S \cup N_G(S)$. Thus the expansion is contained in $S \cup N_G(S)$.
Conversely, let $x \in S \cup N_G(S)$. If $x \in S$, membership holds immediately. Otherwise there exists $s \in S$ with $\{s, x\} \in E_G$. Then $e = \{s, x\}$ is a hyperedge meeting S and containing x, so x lies in the expansion. This gives the reverse inclusion.
Combining both inclusions gives the set equality. Starting from the same seed, induction on t yields equal iterates, hence equal evidence sets, for all $t \le T$. If Select, Serialize, and the generator are shared between the two systems, the final outputs coincide as well. □
5. Review and Results: SuperHyperGraph Generation
Graph generation produces new graphs satisfying constraints by learning node–edge patterns, enabling molecule design, network synthesis, and simulation [63,64,65,105,106]. Hypergraph generation produces new hypergraphs by learning multi-vertex interaction patterns, enabling group-relation synthesis in chemistry, text, and systems [107,108,109,110]. Superhypergraph generation produces nested higher-order hypergraphs by learning hyperedge-of-hyperedge patterns, enabling hierarchical interaction synthesis and complex structure design. The comparison of Graph Generation, HyperGraph Generation, and SuperHyperGraph Generation is presented in Table 4.
Definition 12 (Graph Generation (probabilistic / model-based)).
Fix a finite vertex set V and a measurable condition space $\mathcal{C}$ (e.g., labels, text, constraints, statistics). A graph generator
is a triple $(\mathcal{Z}, P_{\mathcal{Z}}, G)$,
where
$\mathcal{Z}$ is a latent (random-seed) space,
$P_{\mathcal{Z}}$ is a probability distribution on $\mathcal{Z}$, and
$G : \mathcal{Z} \times \mathcal{C} \to \mathcal{G}(V)$ is a measurable map into the set $\mathcal{G}(V)$ of graphs on V.
Given $c \in \mathcal{C}$, graph generation conditioned on c is the random output $G(z, c)$ with $z \sim P_{\mathcal{Z}}$.
Equivalently, the generator induces a conditional distribution on $\mathcal{G}(V)$ by pushing $P_{\mathcal{Z}}$ forward through $G(\cdot, c)$.
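As a toy instance of this definition (our construction, not the paper's): take the latent z to be an RNG seed, the condition c to be an edge probability, and G(z, c) an Erdős–Rényi-style sampler.

```python
import random

def generate_graph(n, c, z):
    """G(z, c): sample a simple graph on n vertices; each possible edge is
    included independently with probability c, using latent seed z."""
    rng = random.Random(z)
    V = list(range(n))
    E = {frozenset((u, v))
         for i, u in enumerate(V) for v in V[i + 1:]
         if rng.random() < c}
    return V, E
```

Conditioning on c = 1 forces the complete graph and c = 0 the empty graph, while fixing z makes the output deterministic, matching the measurable-map reading of G.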
Definition 13
(Hypergraph Generation (probabilistic / model-based)).
Fix a finite vertex set V and a measurable condition space $\mathcal{C}$. A hypergraph generator
is a triple $(\mathcal{Z}, P_{\mathcal{Z}}, G_H)$,
where $\mathcal{Z}$ is a latent space, $P_{\mathcal{Z}}$ is a probability distribution on $\mathcal{Z}$, and $G_H : \mathcal{Z} \times \mathcal{C} \to \mathcal{H}(V)$ is a measurable map into the set $\mathcal{H}(V)$ of hypergraphs on V. Given $c \in \mathcal{C}$, hypergraph generation conditioned on c is the random output $G_H(z, c)$ with $z \sim P_{\mathcal{Z}}$,
which induces a conditional distribution on $\mathcal{H}(V)$ in the same way as above.
Definition 14
(Constraint-based (deterministic) Generation).
Let V be a finite vertex set and let Φ be a predicate (constraint) on the set of graphs on V. A constraint-based graph generation procedure
is an algorithm that, given Φ, outputs a graph G on V such that Φ(G) holds.
The enumerative variant outputs the entire solution set $\{G : \Phi(G)\}$.
Replacing graphs by hypergraphs gives the corresponding notions for hypergraphs.
Definition 15
(
n-SuperHypergraph generation (probabilistic / model-based)).
Fix $n \ge 0$, a finite, nonempty base set $V_0$, and a measurable condition space $\mathcal{C}$. Let $\mathcal{SHG}_n(V_0)$
denote the set of all n-SuperHypergraphs on $V_0$.
An n-SuperHypergraph generator
is a triple $(\mathcal{Z}, P_{\mathcal{Z}}, G_{SH})$,
where
-
(i)
$\mathcal{Z}$ is a latent (random-seed) space,
-
(ii)
$P_{\mathcal{Z}}$ is a probability distribution on $\mathcal{Z}$,
-
(iii)
$G_{SH} : \mathcal{Z} \times \mathcal{C} \to \mathcal{SHG}_n(V_0)$ is a measurable map.
Given $c \in \mathcal{C}$, generation conditioned on c is the random output $G_{SH}(z, c)$ with $z \sim P_{\mathcal{Z}}$.
Equivalently, the generator induces a conditional distribution on $\mathcal{SHG}_n(V_0)$ by pushforward.
Definition 16
(Constraint-based (deterministic)
n-SuperHypergraph generation).
Fix $n \ge 0$ and a finite base set $V_0$. Let Ψ be a predicate (constraint) on $\mathcal{SHG}_n(V_0)$. A constraint-based n-SuperHypergraph generation procedure
is an algorithm that outputs $SHG \in \mathcal{SHG}_n(V_0)$ such that Ψ(SHG) holds.
The enumerative variant outputs the entire solution set $\{SHG : \Psi(SHG)\}$.
Theorem 3
(Hypergraph generation is a special case of
n-SuperHypergraph generation).
Let be a finite vertex set, and let be the set of hypergraphs on :
Then is naturally identified with the subset of consisting of those 0-SuperHyperGraphs with . Consequently, every hypergraph generator on is an instance of a 0-SuperHypergraph generator .
Proof. By Definition 4, $\mathcal{P}^0(V) = V$, so a 0-SuperHypergraph on V is a pair $(V', E')$ with $V' \subseteq V$ and $E' \subseteq \mathcal{P}(V') \setminus \{\emptyset\}$. If we impose the additional condition $V' = V$, then the allowable edge families are exactly the families of nonempty subsets of V, i.e., the hyperedge families on V. Hence the correspondence $(V, E) \mapsto (V, E)$ is a bijection onto this subset of $\mathcal{SHG}_0(V)$.
Now let $(\mathcal{Z}, P_{\mathcal{Z}}, G_H)$ be a hypergraph generator. Define a 0-SuperHypergraph generator with the same latent space and distribution by viewing each output hypergraph as the corresponding 0-SuperHypergraph. The output map remains measurable whenever $G_H$ is measurable, and the output distributions coincide. Therefore hypergraph generation is a special case of 0-SuperHypergraph generation. □
Theorem 4
(Graph generation is a special case of
n-SuperHypergraph generation).
Let be a finite vertex set and let be the set of simple graphs on :
Then is naturally identified with the subset of consisting of those 0-SuperHyperGraphs with and . Consequently, every graph generator is an instance of a 0-SuperHypergraph generator.
Proof. As in the proof of Theorem 3, a 0-SuperHypergraph with full vertex set is of the form $(V, E')$ where $E' \subseteq \mathcal{P}(V) \setminus \{\emptyset\}$. If we additionally require $|e| = 2$ for every $e \in E'$, then each edge is a 2-element subset of V, so $(V, E')$ is precisely a simple undirected graph on V. Thus the inclusion of graphs into 0-SuperHypergraphs is a bijection onto this subset.
Given any graph generator, define a 0-SuperHypergraph generator with the same latent space and distribution by viewing the output map in $\mathcal{SHG}_0(V)$. Then the induced output distributions coincide, so graph generation is a special case of 0-SuperHypergraph generation. □
6. Review and Results: Causal SuperHyperGraph
A causal graph is a directed graph that represents direct causal dependencies among variables, usually derived from structural equations and used to define do-interventions and identify causal effects [111,112,113,114,115]. A causal hypergraph extends this idea by using directed hyperedges so that a set of variables can jointly act as a single causal mechanism on a target variable [116,117,118,119]. A causal superhypergraph further extends causal hypergraphs by allowing vertices to be nested set-objects obtained via iterated powersets, enabling hierarchical multiway causal mechanisms across multiple abstraction levels.
Table 5.
Concise comparison: causal graph vs. causal hypergraph vs. causal superhypergraph.
| Aspect | Causal Graph | Causal HyperGraph | Causal SuperHyperGraph |
| Underlying object | Directed graph $G = (V, E)$ | Directed hypergraph $(V, \mathcal{E})$ | Directed n–SuperHyperGraph with $V \subseteq \mathcal{P}^n(V_0)$ |
| Causal primitive | Edge (direct cause) | Hyperedge (multiway relation), optionally oriented | Superedge among supervertices (nested groups), optionally oriented across levels |
| Structural equation form | $X_i = f_i(X_{\mathrm{Pa}(i)}, U_i)$ | $X_i = g_i\big((m_e(X_{T(e)}))_{h(e)=i}, U_i\big)$ | analogous equations for supervertices (nested groups of variables) |
| Independence reading | d-separation on G | Separation via hyperedge-based paths / factorization (model-dependent) | Separation on superpaths and level-wise neighborhoods; supports hierarchical blocking (model-dependent) |
| What it represents | Pairwise directed influence | Multiway (group) influence in one step | Multiway influence with explicit nesting / hierarchy of groups and relations |
| Reduction to simpler model | — | If all hyperedges have size 2 and are oriented, reduces to a causal graph | If n = 0 and supervertices are singletons, reduces to a causal hypergraph; if also edges have size 2, reduces to a causal graph |
Definition 17
(Structural Causal Model (SCM)).
Fix a finite index set of endogenous variables $V = \{1, \dots, m\}$. For each $i \in V$, let $\mathcal{X}_i$ be the state space of $X_i$. Let $U = (U_i)_{i \in V}$ index exogenous noises, with spaces $\mathcal{U}_i$.
A structural causal model
is a quadruple
$$M = (V, U, P_U, F),$$
where
$\mathcal{X} = \prod_{i} \mathcal{X}_i$ and $\mathcal{U} = \prod_{i} \mathcal{U}_i$;
$P_U$ is a probability distribution on $\mathcal{U}$;
-
$F = (f_i)_{i \in V}$ is a family of measurable maps
$$f_i : \Big(\prod_{j \in \mathrm{Pa}(i)} \mathcal{X}_j\Big) \times \mathcal{U}_i \to \mathcal{X}_i$$
with some parent set $\mathrm{Pa}(i) \subseteq V$.
The endogenous variables are determined by the structural equations
$$X_i = f_i\big(X_{\mathrm{Pa}(i)}, U_i\big), \qquad i \in V,$$
where $U \sim P_U$.
Definition 18
(Causal graph induced by an SCM).
Let M be an SCM with endogenous index set V. The causal graph
(also called the functional graph) of M is the directed graph $G(M) = (V, E)$
whose edge set is
$$E = \{(j, i) : j \in \mathrm{Pa}(i)\}.$$
We write $j \to i$ for $(j, i) \in E$ and interpret $j \to i$ as a direct causal dependence
of $X_i$ on $X_j$ in the model M. If $G(M)$ is acyclic, then M is called recursive
(or acyclic).
Definition 19
(Intervention (do-operator)).
Let $A \subseteq V$ and fix values $x_A^{*} \in \prod_{i \in A} \mathcal{X}_i$. The intervention $\mathrm{do}(X_A = x_A^{*})$ produces the intervened SCM
$$M_{\mathrm{do}(X_A = x_A^{*})} = (V, U, P_U, F^{\mathrm{do}}), \qquad f_i^{\mathrm{do}} = \begin{cases} x_i^{*} & i \in A, \\ f_i & i \notin A. \end{cases}$$
The corresponding interventional distribution is the distribution of the unique solution X (when it exists) under the modified equations and $U \sim P_U$.
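A two-variable toy SCM illustrates the do-operator (the mechanisms here are hypothetical, chosen by us): $X_1 = U_1$ and $X_2 = X_1 + U_2$ with standard normal noises; intervening with $\mathrm{do}(X_1 = 2)$ replaces $X_1$'s mechanism by the constant 2.

```python
import random

def sample_scm(do_x1=None, n=20000, seed=0):
    """Sample (X1, X2) from X1 = U1, X2 = X1 + U2; do_x1 fixes X1's value."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u1, u2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x1 = do_x1 if do_x1 is not None else u1   # intervened vs. natural mechanism
        out.append((x1, x1 + u2))
    return out
```

Observationally the mean of $X_2$ is 0, while under $\mathrm{do}(X_1 = 2)$ it shifts to 2: exactly the interventional distribution described above.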
Definition 20
(Directed hypergraph).
A directed hypergraph
is a pair
$$H = (V, \mathcal{E}),$$
where V is a finite node set and $\mathcal{E}$ is a finite family of directed hyperedges. Each $e \in \mathcal{E}$ is an ordered pair
$$e = (T(e), h(e)),$$
where the tail $T(e) \subseteq V$ is nonempty and the head $h(e) \in V$.
Definition 21
(Causal hypergraph model).
Fix state spaces $\mathcal{X}_i$ and noise spaces $\mathcal{U}_i$. A causal hypergraph model
is a tuple
$$M_H = \big(V, \mathcal{E}, P_U, (m_e)_{e \in \mathcal{E}}, (g_i)_{i \in V}\big),$$
where
$\mathcal{X} = \prod_i \mathcal{X}_i$, $\mathcal{U} = \prod_i \mathcal{U}_i$, and $P_U$ is a distribution on $\mathcal{U}$;
$(V, \mathcal{E})$ is a directed hypergraph;
-
$(m_e)_{e \in \mathcal{E}}$ is a family of hyperedge mechanisms
$$m_e : \prod_{j \in T(e)} \mathcal{X}_j \to \mathcal{M}_e$$
for some intermediate spaces $\mathcal{M}_e$;
$(g_i)_{i \in V}$ is a family of aggregation functions combining the outputs of all hyperedges with head i.
The structural equations of $M_H$ are
$$X_i = g_i\big((m_e(X_{T(e)}))_{e : h(e) = i}, U_i\big),$$
where $U \sim P_U$. Intuitively, each hyperedge represents a joint causal mechanism
acting on the tuple $X_{T(e)}$.
Definition 22
(Graph induced by a causal hypergraph).
Let $H = (V, \mathcal{E})$ be a directed hypergraph. Its 2-section causal graph
(edge-expansion) is the directed graph
$$G_2(H) = \big(V, \{(j, i) : \exists e \in \mathcal{E},\ j \in T(e),\ h(e) = i\}\big).$$
Thus every hyperedge is expanded into ordinary directed edges $j \to i$ for all $j \in T(e)$.
Theorem 5
(Causal graphs are a special case of causal hypergraphs).
Let $G = (V, E)$ be a directed graph. Define a directed hypergraph by $\mathcal{E} = \{(\{j\}, i) : (j, i) \in E\}$.
Then: (1) the 2-section causal graph of $(V, \mathcal{E})$ equals G; (2) every SCM on G can be realized as a causal hypergraph model on $(V, \mathcal{E})$ with the same structural equations.
Proof.
(1) By construction, a directed edge $(j, i) \in E$ corresponds to the hyperedge $(\{j\}, i)$. Hence $(j, i)$ is an edge of the 2-section if and only if $(j, i) \in E$, so $G_2((V, \mathcal{E})) = G$.
(2) Let M be an SCM with structural equations $X_i = f_i(X_{\mathrm{Pa}(i)}, U_i)$. Define a causal hypergraph model on $(V, \mathcal{E})$ as follows. For each hyperedge $e = (\{j\}, i)$, set $\mathcal{M}_e = \mathcal{X}_j$ and let $m_e$ be the identity. For each i, list the incoming hyperedges as $e_1, \dots, e_k$, identify their singleton tails with $\mathrm{Pa}(i)$, and set
$$g_i\big((x_j)_{j \in \mathrm{Pa}(i)}, u_i\big) = f_i(x_{\mathrm{Pa}(i)}, u_i).$$
Then the structural equation of the hypergraph model becomes $X_i = f_i(X_{\mathrm{Pa}(i)}, U_i)$, which is exactly the SCM equation. The induced graph statement follows from (1). □
Definition 23
(Granger-causal hypergraph (time-series causal hypergraph)).
Let $\{X_t\}$ be a multivariate time series with components $X^{(1)}, \dots, X^{(m)}$ and fix a maximum lag $p \ge 1$. Form a node set of lagged variables
$$V = \{X^{(j)}_{t-\ell} : 1 \le j \le m,\ 1 \le \ell \le p\}.$$
For a target component $X^{(i)}_t$, let $S \subseteq V$ be a nonempty set of lagged predictors. We say S Granger-causes $X^{(i)}_t$
(relative to the history V) if, in a chosen model class, using S improves the prediction of $X^{(i)}_t$ beyond what is achievable without S (e.g., via likelihood-ratio tests, information criteria, or out-of-sample error reduction).
A Granger-causal hypergraph
is a directed hypergraph whose hyperedges are the pairs $(S, X^{(i)}_t)$ such that S Granger-causes $X^{(i)}_t$,
interpreted as a multi-source temporal influence relation. (Granger causality is predictive/temporal and does not, by itself, assert interventional causality.)
Definition 24
(Directed
n-SuperHyperGraph).
Fix $n \ge 0$ and a finite, nonempty base set $V_0$. A directed
n-SuperHyperGraph on $V_0$
is a pair $(V, E)$
such that $V \subseteq \mathcal{P}^n(V_0)$ and E is a finite family of directed superhyperedges.
Each directed superhyperedge is written as
$$e = (T(e), h(e)),$$
where $T(e) \subseteq V$ is nonempty and $h(e) \in V$. Elements of V are n-supervertices. The pair encodes a multi-source directed relation.
Remark 3
(Use of the SuperHyperGraph idea). Definition 24 uses the SuperHyperGraph viewpoint by allowing vertices to be nested objects (iterated powerset level n). For $n = 0$, one recovers ordinary vertices.
Definition 25
(Causal n-SuperHyperGraph model). Fix and a directed n-SuperHyperGraph on . For each , let be a measurable state space and a measurable noise space. Set and .
A causal
n-SuperHyperGraph model
is a tuple
where:
-
(i)
is a probability distribution on .
-
(ii)
is a family of superhyperedge mechanisms
such that for each there exists a measurable space and a measurable map
-
(iii)
is a family of aggregation functions
where, for each , letting
we require a measurable map
The endogenous variables are determined by the structural equations
where and .
Definition 26
(Intervention (do-operator) on a causal n-SuperHyperGraph).
Let be as in Definition 25. Fix a nonempty index set and values . The intervention produces the intervened model
where is defined by
The interventional distribution is the distribution of the solution under and , when a (measurable) solution exists.
Theorem 6
(Causal hypergraphs are a special case ($n = 0$)).
Let $H = (V_0, \mathcal{E})$ be a directed hypergraph whose hyperedges are pairs $(T(e), h(e))$ with $T(e) \subseteq V_0$ nonempty and $h(e) \in V_0$. Consider any causal hypergraph model on H given by structural equations as in Definition 21.
Then it is a special case of a causal 0-SuperHyperGraph model (Definition 25) by taking $n = 0$ and $V = V_0$.
Proof. Since $n = 0$, we have $\mathcal{P}^0(V_0) = V_0$, hence $(V_0, \mathcal{E})$ satisfies $V \subseteq \mathcal{P}^0(V_0)$, so it is a directed 0-SuperHyperGraph in the sense of Definition 24. Keeping the same state spaces, noise spaces, distribution $P_U$, mechanisms $(m_e)$, and aggregators $(g_i)$, the structural equations in Definition 25 become
$$X_i = g_i\big((m_e(X_{T(e)}))_{e : h(e) = i}, U_i\big),$$
which is exactly the given causal hypergraph model. Therefore the causal hypergraph model is a special case. □
Theorem 7
(Causal graphs (SCMs) are a special case ($n = 0$, singleton tails)).
Let $G = (V, E)$ be a directed graph and let an SCM be given by $X_i = f_i(X_{\mathrm{Pa}(i)}, U_i)$.
Define a directed hypergraph by $\mathcal{E} = \{(\{j\}, i) : (j, i) \in E\}$.
Then the SCM is a special case of a causal 0-SuperHyperGraph model on $(V, \mathcal{E})$.
Proof. By $n = 0$, the pair $(V, \mathcal{E})$ is a directed 0-SuperHyperGraph. We now construct mechanisms and aggregators so that the 0-SuperHyperGraph equations coincide with the SCM equations.
For each edge $(j, i) \in E$, let $e = (\{j\}, i)$, set $\mathcal{M}_e = \mathcal{X}_j$, and let $m_e$ be the identity. For a fixed i, the incoming hyperedges are exactly those with head i and singleton tails $\{j\}$, $j \in \mathrm{Pa}(i)$. Define the aggregator $g_i$ by
$$g_i\big((x_j)_{j \in \mathrm{Pa}(i)}, u_i\big) = f_i(x_{\mathrm{Pa}(i)}, u_i).$$
Then the structural equation in Definition 25 becomes $X_i = f_i(X_{\mathrm{Pa}(i)}, U_i)$, which is exactly the SCM equation. Therefore the causal graph (SCM) is a special case of a causal 0-SuperHyperGraph model. □
7. Review and Results: SuperHyperGraph Convolutional Network
GCN aggregates each node’s features from its neighbors via normalized adjacency, then applies a learnable linear transform and nonlinearity per layer [120,121,122,123,124]. HGCN aggregates vertex features through hyperedges via an incidence-based diffusion operator, then applies learnable transformations to model multiway interactions [69,71,76,125]. SHGCN propagates features on supervertices using super-incidence across nested superedges, enabling hierarchical multiway diffusion and generalizing both GCN and HGCN. Table 6 presents the comparison among GCN, HGCN, and SHGCN.
Definition 27
(Graph and node features).
Let $G = (V, E)$ be a finite (undirected) graph with $|V| = n$. Let $A \in \{0, 1\}^{n \times n}$ be its adjacency matrix, and let $X \in \mathbb{R}^{n \times d}$ be a node-feature matrix whose ith row is the feature vector of vertex $v_i$. Define the self-loop augmented adjacency
$$\tilde{A} = A + I_n$$
and its degree matrix $\tilde{D}$ by
$$\tilde{D}_{ii} = \sum_{j} \tilde{A}_{ij}.$$
Definition 28
(One GCN layer).
Fix an activation function σ applied entrywise. A graph convolutional layer
maps $X^{(\ell)}$ to $X^{(\ell+1)}$ by
$$X^{(\ell+1)} = \sigma\!\big(\tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2} X^{(\ell)} W^{(\ell)}\big),$$
where $W^{(\ell)}$ is a learnable weight matrix. A GCN
is a composition of such layers (optionally followed by task-specific readout).
Remark 4
(Interpretation). The operator $\tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2}$ performs degree-normalized neighborhood averaging (including self-loops), so each layer mixes information along edges and then applies a learnable linear map.
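The GCN layer of Definition 28 in NumPy (a minimal sketch; ReLU chosen as σ):

```python
import numpy as np

def gcn_layer(X, A, W, sigma=lambda z: np.maximum(z, 0.0)):
    """One GCN layer: sigma(D^{-1/2} (A + I) D^{-1/2} X W)."""
    A_tilde = A + np.eye(A.shape[0])                       # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_tilde.sum(axis=1)))
    return sigma(D_inv_sqrt @ A_tilde @ D_inv_sqrt @ X @ W)
```

On the single edge $v_1$–$v_2$ with identity features and identity weights, both output rows become (0.5, 0.5): each vertex averages itself and its neighbor.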
Definition 29
(Weighted hypergraph and incidence matrix).
A (finite) weighted hypergraph is a triple $H = (V, E, w)$ where V is a finite vertex set, E is a finite family of nonempty subsets of V (hyperedges), and $w : E \to (0, \infty)$ assigns positive weights. The incidence matrix
$H \in \{0, 1\}^{|V| \times |E|}$ is defined by
$$H(v, e) = \begin{cases} 1 & \text{if } v \in e, \\ 0 & \text{otherwise.} \end{cases}$$
Let W be diagonal with $W_{ee} = w(e)$. Define the vertex-degree matrix $D_v$ and the hyperedge-degree matrix $D_e$ by
$$D_v(v, v) = \sum_{e \in E} w(e)\, H(v, e), \qquad D_e(e, e) = \sum_{v \in V} H(v, e).$$
Definition 30
(One HGCN layer (spectral/message-passing form)).
Let $X^{(\ell)}$ be vertex features at layer ℓ. A hypergraph convolutional layer
maps $X^{(\ell)}$ to $X^{(\ell+1)}$ by
$$X^{(\ell+1)} = \sigma\!\big(D_v^{-1/2} H W D_e^{-1} H^{\top} D_v^{-1/2} X^{(\ell)} \Theta^{(\ell)}\big),$$
where $\Theta^{(\ell)}$ is learnable and σ is applied entrywise. An HGCN
is a composition of such hypergraph convolutional layers (optionally with readout).
Remark 5
(Two-stage aggregation: vertex→hyperedge→vertex). The matrix product $H^{\top} D_v^{-1/2} X^{(\ell)}$ aggregates vertex features to hyperedges, and multiplication by H then propagates hyperedge information back to vertices. Thus one layer performs a normalized diffusion through hyperedges, capturing multiway interactions.
Remark 6
(Row-normalized variant).
A commonly used simplification replaces the symmetric normalization by
$$X^{(\ell+1)} = \sigma\!\big(D_v^{-1} H W D_e^{-1} H^{\top} X^{(\ell)} \Theta^{(\ell)}\big),$$
which still implements the vertex→hyperedge→vertex propagation with degree control.
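The row-normalized variant of Remark 6 is convenient because its propagation matrix is row-stochastic: each vertex receives a convex combination of hyperedge averages, so constant features pass through the propagation step unchanged (a sketch with our function name):

```python
import numpy as np

def hgcn_layer_row_norm(X, H, w, Theta, sigma=lambda z: np.maximum(z, 0.0)):
    """Row-normalized HGCN layer: sigma(Dv^{-1} H W De^{-1} H^T X Theta)."""
    P = (np.diag(1.0 / (H @ w)) @ H @ np.diag(w)
         @ np.diag(1.0 / H.sum(axis=0)) @ H.T)   # row-stochastic propagation
    return sigma(P @ X @ Theta)
```

With all-ones input features and identity weights, the output is again all ones, confirming the row-stochastic property.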
Definition 31
(Iterated powersets and flattening).
Let $V_0$ be a finite nonempty set. Define
$$\mathcal{P}^0(V_0) = V_0, \qquad \mathcal{P}^{k+1}(V_0) = \mathcal{P}\big(\mathcal{P}^k(V_0)\big).$$
Define the flattening maps $\sigma_k : \mathcal{P}^k(V_0) \to \mathcal{P}(V_0)$ recursively by
$$\sigma_0(v) = \{v\}, \qquad \sigma_{k+1}(v) = \bigcup_{u \in v} \sigma_k(u).$$
Definition 32
(
n-SuperHyperGraph and super-incidence).
Fix $n \ge 0$ and a base set $V_0$. An n-SuperHyperGraph
is a pair $(V, E)$ with $V \subseteq \mathcal{P}^n(V_0)$ and E a family of nonempty subsets of V.
Write $N = |V|$ and $M = |E|$.
The super-incidence matrix
$H \in \{0, 1\}^{N \times M}$ is defined by
$$H(v, e) = \begin{cases} 1 & \text{if } v \in e, \\ 0 & \text{otherwise.} \end{cases}$$
Let $W \in \mathbb{R}^{M \times M}$ be a diagonal matrix of positive superedge weights.
Define degree matrices $D_v$ and $D_e$ by
$$D_v(v, v) = \sum_{e \in E} W_{ee}\, H(v, e), \qquad D_e(e, e) = \sum_{v \in V} H(v, e).$$
Two canonical propagation operators on supervertices are:
(1) Superhypergraph diffusion operator
(incidence-based):
$$P_{\mathrm{inc}} = D_v^{-1/2} H W D_e^{-1} H^{\top} D_v^{-1/2}.$$
(2) Supervertex 2-section (clique-expansion) operator: Let $A_2$ be the adjacency matrix of the 2-section graph on V, i.e., for $u \neq v$, $A_2(u, v) = 1$ iff u and v lie in a common superedge. Define $\tilde{A}_2 = A_2 + I_N$ and $\tilde{D}_2$ by $(\tilde{D}_2)_{vv} = \sum_u \tilde{A}_2(v, u)$. Then set
$$P_{2\text{-sec}} = \tilde{D}_2^{-1/2} \tilde{A}_2 \tilde{D}_2^{-1/2}.$$
Definition 33
(Lifting base features to
n-supervertices).
Let $(V, E)$ be on base set $V_0$. Let $X_0 \in \mathbb{R}^{|V_0| \times d}$ be base vertex features, where row i corresponds to $v_i \in V_0$. Define the support
of a supervertex $v \in V$ by $\mathrm{supp}(v) = \sigma_n(v) \subseteq V_0$.
Define lifted supervertex features by mean aggregation: for $v \in V$,
$$X(v) = \frac{1}{|\mathrm{supp}(v)|} \sum_{b \in \mathrm{supp}(v)} X_0(b).$$
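Mean-aggregation lifting in code (helper names are ours; supervertices are modeled as nested frozensets over base-vertex indices):

```python
import numpy as np

def support(v):
    """sigma_n(v): base-vertex indices inside a nested supervertex."""
    if isinstance(v, frozenset):
        s = set()
        for inner in v:
            s |= support(inner)
        return s
    return {v}

def lift_features(supervertices, X0):
    """Row v of the result is the mean of the base features over support(v)."""
    return np.stack([X0[sorted(support(v))].mean(axis=0) for v in supervertices])
```

A supervertex grouping base vertices 0 and 1 thus receives the average of their feature rows, which is then fed to the convolutional layers below.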
Definition 34
(One n-SuperHyperGraph convolutional layer). Let and let be a chosen propagation operator on V (e.g., or from Definition 32). Let be supervertex features at layer ℓ.
Ann-SuperHyperGraph convolutional layer
is the map
where is learnable and σ is applied entrywise.
Definition 35
(n-SuperHyperGraph Convolutional Network (n-SHGCN)). Let be an n-SuperHyperGraph on base set . Given base features , form lifted features by Definition 33. Fix a propagation operator on V.
AnL-layer n-SHGCN
is a sequence
where each step is an n-SuperHyperGraph convolutional layer (Definition 34). Optionally, a readout map is applied for downstream tasks.
Theorem 8
(n-SHGCN generalizes HGCN and GCN). Fix .
-
(i)
(HGCN as a special case) Let $H = (V, E, w)$ be a weighted hypergraph. Consider the 0-SuperHyperGraph defined by the same vertex set and edge family.
Choose $P = P_{\mathrm{inc}}$. Then a one-layer n-SHGCN update on it coincides exactly with a one-layer HGCN update on H.
-
(ii)
(GCN as a special case) Let $G = (V, E_G)$ be a (simple) graph. Form the 2-uniform hypergraph $H_G = (V, \{\{u, v\} : \{u, v\} \in E_G\})$, and view it as a 0-SuperHyperGraph. Choose $P = P_{2\text{-sec}}$. Then a one-layer n-SHGCN update on it coincides exactly with a one-layer GCN update on G.
Proof.
(i) Let $H = (V, E, w)$ be given. Define the 0-SuperHyperGraph by the same vertex set and edge family. Since $n = 0$, the flattening satisfies $\sigma_0(v) = \{v\}$ for $v \in V$, hence Definition 33 gives lifted features equal to the base features (no change of features).
The super-incidence matrix (Definition 32) is precisely the usual hypergraph incidence matrix, because its entries satisfy $H(v, e) = 1$ iff $v \in e$. The degree matrices $D_v$ and $D_e$ also match the standard hypergraph degrees.
Therefore the n-SHGCN diffusion operator $P_{\mathrm{inc}} = D_v^{-1/2} H W D_e^{-1} H^{\top} D_v^{-1/2}$ is exactly the HGCN propagation matrix. Substituting into the layer update shows that the one-layer n-SHGCN update coincides with the one-layer HGCN update.
(ii) Let $G = (V, E)$ be a simple graph with unit edge weights, let $H_G = (V, E_G)$ with $E_G = \{\{u, v\} : \{u, v\} \in E\}$, and choose $P = P_{2\text{-}\mathrm{sec}}$.
We show that the 2-section adjacency matrix $A^{(2)}$ equals the usual adjacency matrix A of G. For distinct $u, v \in V$,
$A^{(2)}(u, v) = \sum_{e \in E_G:\, u, v \in e} w(e) = \mathbf{1}\big[\{u, v\} \in E\big] = A(u, v).$
Thus $A^{(2)} = A$. Consequently,
$D^{(2)} = D,$
where $D$ is the degree matrix of G. Hence
$P_{2\text{-}\mathrm{sec}} = D^{-1/2} A\, D^{-1/2},$
which is exactly the standard GCN propagation matrix. Substituting into the layer update yields
$X^{(\ell+1)} = \sigma\big(D^{-1/2} A D^{-1/2}\, X^{(\ell)}\, \Theta^{(\ell)}\big),$
so the one-layer n-SHGCN update coincides with the one-layer GCN update.
This proves both claims. □
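The reduction in part (ii) can also be checked numerically on a small example. Here the 2-section operator of a triangle graph, viewed as a 2-uniform hypergraph with unit weights, is compared against $D^{-1/2} A D^{-1/2}$; the helper name and the normalization are our assumptions, following Definition 32.

```python
import numpy as np

def two_section_operator(H, w):
    """2-section (clique-expansion) operator of Definition 32."""
    A2 = H @ np.diag(w) @ H.T
    np.fill_diagonal(A2, 0.0)
    d2 = A2.sum(axis=1)
    D = np.diag(1.0 / np.sqrt(d2))
    return D @ A2 @ D

# A triangle graph encoded as a 2-uniform hypergraph with unit weights:
H = np.array([[1., 1., 0.],
              [1., 0., 1.],
              [0., 1., 1.]])   # columns = edges {0,1}, {0,2}, {1,2}
w = np.ones(3)

A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])   # adjacency matrix of the same graph
Dg = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
gcn_propagation = Dg @ A @ Dg  # D^{-1/2} A D^{-1/2}

# Theorem 8(ii): the two operators coincide on 2-uniform hypergraphs.
assert np.allclose(two_section_operator(H, w), gcn_propagation)
```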
8. SuperHyperGraph Embedding
Graph embedding maps nodes or graphs into low-dimensional vectors, preserving adjacency and structural similarity for efficient prediction and retrieval tasks [
126,
127,
128,
129]. Hypergraph embedding maps vertices and hyperedges into vectors, preserving higher-order relations among multiple vertices for robust classification and link prediction [
61,
130,
131,
132]. Superhypergraph embedding maps nested hyperedges into vectors, capturing hyperedge-of-hyperedge structure to support hierarchical reasoning, representation learning, and prediction tasks efficiently. For reference,
Table 7 presents a comparison of Graph Embedding, HyperGraph Embedding, and SuperHyperGraph Embedding.
Definition 36
(Graph embedding in a vector space).
[128,133,134] Let $G = (V, E, w)$ be a (possibly weighted) simple graph with vertex set V, edge set E, and weight function w. Fix an integer $d \geq 1$.
A d-dimensional graph embedding is a map
$\varphi : V \to \mathbb{R}^d,$
together with a prescribed graph proximity (or relation) function
$p : V \times V \to \mathbb{R}$
and a prescribed embedding similarity function
$s : \mathbb{R}^d \times \mathbb{R}^d \to \mathbb{R},$
such that $s(\varphi(u), \varphi(v))$ approximates $p(u, v)$ (or preserves its ordering) for relevant pairs $(u, v)$.
Formally, one common way to specify this requirement is by defining φ as a minimizer of
$\sum_{(u, v) \in \mathcal{P}} L\big(s(\varphi(u), \varphi(v)),\, p(u, v)\big) + \lambda\, R(\varphi),$
where $\mathcal{P} \subseteq V \times V$ is the set of pairs used for learning/evaluation, L is a loss function, R is a regularizer, and $\lambda \geq 0$.
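A minimal end-to-end sketch of this optimization, under our own concrete choices (inner-product similarity, squared loss, Frobenius regularizer, plain gradient descent):

```python
import numpy as np

def fit_graph_embedding(n, pairs, p, d=2, steps=800, lr=0.05, lam=1e-3, seed=0):
    """Minimize the objective of Definition 36 with squared loss
    L(s, p) = (s - p)^2, similarity s(x, y) = <x, y>, and regularizer
    R = ||Phi||_F^2. All of these concrete choices are illustrative."""
    rng = np.random.default_rng(seed)
    Phi = 0.1 * rng.standard_normal((n, d))
    for _ in range(steps):
        G = 2.0 * lam * Phi                # gradient of the regularizer
        for (u, v), puv in zip(pairs, p):
            r = Phi[u] @ Phi[v] - puv      # residual of the similarity fit
            G[u] += 2.0 * r * Phi[v]
            G[v] += 2.0 * r * Phi[u]
        Phi -= lr * G
    return Phi
```

On a toy instance with a single pair of proximity 1, the learned inner product approaches the target up to the small bias introduced by the regularizer.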
Definition 37
(Hypergraph and incidence representation). A (weighted) hypergraph is a triple $H = (V, E, W)$ where $V = \{v_1, \dots, v_N\}$ is a vertex set, $E = \{e_1, \dots, e_M\}$ is a set of hyperedges with $e_j \subseteq V$ and $e_j \neq \emptyset$, and $W = \mathrm{diag}\big(w(e_1), \dots, w(e_M)\big)$ is a positive diagonal matrix of hyperedge weights.
The vertex–hyperedge membership is represented by the incidence matrix $H \in \{0, 1\}^{N \times M}$ defined by
$H(v, e) = 1$ if $v \in e$, and $H(v, e) = 0$ otherwise.
Define diagonal degree matrices $D_v$ and $D_e$ by
$d(v) = \sum_{e \in E} w(e)\, H(v, e), \qquad d(e) = \sum_{v \in V} H(v, e).$
A normalized hypergraph operator and Laplacian can be defined by
$\Theta = D_v^{-1/2}\, H\, W\, D_e^{-1}\, H^{\top}\, D_v^{-1/2}, \qquad \Delta = I - \Theta.$
Definition 38
(Hypergraph embedding in a vector space). Let $H = (V, E, W)$ be a (weighted) hypergraph with $|V| = N$ and fix $d \geq 1$.
A d-dimensional (vertex) hypergraph embedding is a map
$\psi : V \to \mathbb{R}^d,$
together with a prescribed hyperedge proximity functional, such that the embedding preserves the higher-order relations encoded by hyperedges.
A standard formal specification is to define ψ as a minimizer of
$\sum_{e \in E} w(e)\, L_e\big(\{\psi(v) : v \in e\}\big) + \lambda\, R(\psi),$
where $L_e$ is a loss that enforces the desired within-hyperedge geometry (e.g., small dispersion, large n-wise similarity, or consistency with observed hyperedge events), and R is a regularizer.
One concrete (spectral) instance uses the normalized hypergraph Laplacian Δ: choose $U \in \mathbb{R}^{N \times d}$ whose columns are eigenvectors corresponding to the d smallest nontrivial eigenvalues of Δ, and set $\psi(v_i) = U(i, \cdot)$.
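The spectral instance can be sketched directly; we assume the standard normalization $\Delta = I - D_v^{-1/2} H W D_e^{-1} H^{\top} D_v^{-1/2}$ of Definition 37 and a connected hypergraph, so the smallest eigenvalue is the trivial one and is skipped.

```python
import numpy as np

def spectral_hypergraph_embedding(H, w, d):
    """Spectral instance of Definition 38: embed vertices with the
    eigenvectors of the normalized hypergraph Laplacian belonging to
    the d smallest nontrivial eigenvalues."""
    n = H.shape[0]
    d_v = H @ w
    d_e = H.sum(axis=0)
    Dv = np.diag(1.0 / np.sqrt(d_v))
    Theta = Dv @ H @ np.diag(w) @ np.diag(1.0 / d_e) @ H.T @ Dv
    Delta = np.eye(n) - Theta
    vals, vecs = np.linalg.eigh(Delta)    # eigenvalues in ascending order
    return vecs[:, 1:d + 1]               # skip the trivial eigenvector
```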
Definition 39
(Hypergraph embedding for downstream tasks as a representation matrix).
In many learning settings, a hypergraph embedding is presented as a representation matrix
$Z \in \mathbb{R}^{N \times d},$
where each row $Z(i, \cdot)$ represents the embedding of a vertex $v_i$, and the representation is used for downstream tasks (e.g., node classification, graph classification) via a predictor built on Z.
Definition 40
(Nesting map and flattening map). Let $V_0$ be a finite, nonempty set, and let $P^0(V_0) = V_0$ and $P^{k+1}(V_0) = P(P^k(V_0))$ for $k \geq 0$.
Define the n-fold nesting map $\xi_n : V_0 \to P^n(V_0)$ by
$\xi_0(x) = x, \qquad \xi_{k+1}(x) = \{\xi_k(x)\}.$
Define the n-level flattening map $\ell_n : P^n(V_0) \to P(V_0)$ recursively as follows:
$\ell_0(x) = \{x\}, \qquad \ell_{k+1}(S) = \bigcup_{T \in S} \ell_k(T).$
Lemma 1
(Flattening a nested singleton).
For every $x \in V_0$ and every $n \geq 0$,
$\ell_n(\xi_n(x)) = \{x\}.$
Proof. We prove the claim by induction on n.
If $n = 0$, then $\xi_0(x) = x$ and $\ell_0(x) = \{x\}$ by definition, hence $\ell_0(\xi_0(x)) = \{x\}$.
Assume $\ell_n(\xi_n(x)) = \{x\}$ holds for some $n \geq 0$. Then
$\xi_{n+1}(x) = \{\xi_n(x)\}.$
Applying the recursive definition of $\ell_{n+1}$ gives
$\ell_{n+1}(\xi_{n+1}(x)) = \bigcup_{T \in \{\xi_n(x)\}} \ell_n(T) = \ell_n(\xi_n(x)) = \{x\}.$
Thus the statement holds for $n + 1$. This completes the induction. □
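The two maps and Lemma 1 are easy to make executable; representing iterated-powerset elements as nested frozensets is our own encoding choice.

```python
def nest(x, n):
    """n-fold nesting map xi_n: x -> {...{x}...} with n singleton layers."""
    for _ in range(n):
        x = frozenset([x])
    return x

def flatten(s, n):
    """n-level flattening map ell_n down to a set of base elements,
    with the conventions ell_0(x) = {x} and
    ell_{k+1}(S) = union of ell_k over the members of S."""
    if n == 0:
        return frozenset([s])
    out = frozenset()
    for t in s:
        out |= flatten(t, n - 1)
    return out
```

Running `flatten(nest(x, n), n)` for any depth n returns the singleton `{x}`, which is exactly the statement of Lemma 1.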
Definition 41
(n-lift of a hypergraph). Let $H = (V_0, E_0)$ be a hypergraph, i.e., $E_0 \subseteq P(V_0) \setminus \{\emptyset\}$. Fix $n \geq 1$.
Define the n-lifted vertex set and hyperedge family by
$V_n = \{\xi_n(v) : v \in V_0\}, \qquad E_n = \{\tilde{e} : e \in E_0\},$
where for each hyperedge $e \in E_0$ we set
$\tilde{e} = \{\xi_n(v) : v \in e\}.$
Then $\widetilde{H}_n = (V_n, E_n)$ is an n-SuperHyperGraph in the sense that $V_n \subseteq P^n(V_0)$ and each superedge $\tilde{e} \in E_n$ is a set of n-supervertices.
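A sketch of the n-lift on the same nested-frozenset encoding (the helper names are ours):

```python
def nest(x, n):
    """n-fold singleton nesting xi_n."""
    for _ in range(n):
        x = frozenset([x])
    return x

def n_lift(V0, E0, n):
    """Definition 41 (illustrative reading): lift each base vertex to
    xi_n(v) and each hyperedge to the set of its lifted members."""
    Vn = {nest(v, n) for v in V0}
    En = {frozenset(nest(v, n) for v in e) for e in E0}
    return Vn, En
```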
Definition 42
(n-SuperHypergraph embedding). Let $\widetilde{H} = (V, E)$ be an n-SuperHyperGraph on a base set $V_0$, so $V \subseteq P^n(V_0)$ and each superedge $e \in E$ is a set of supervertices. Fix an integer $d \geq 1$.
An n-SuperHypergraph embedding in $\mathbb{R}^d$ is a pair of maps
$\psi : V \to \mathbb{R}^d, \qquad \psi_0 : V_0 \to \mathbb{R}^d,$
together with (i) an aggregation rule AGG for finite multisets of vectors and (ii) a family of hyperedge-coherence losses $(L_e)_{e \in E}$, where each $L_e$ maps the tuple $(\psi(v))_{v \in e}$ to a nonnegative real number.
A concrete (canonical) choice that is widely used is:
$\mathrm{AGG}\{x_1, \dots, x_k\} = \frac{1}{k} \sum_{i=1}^{k} x_i,$
and for each $e \in E$,
$L_e(\psi) = \sum_{v \in e} \big\|\psi(v) - \mu_e\big\|^2, \qquad \mu_e = \frac{1}{|e|} \sum_{v \in e} \psi(v).$
In addition, to reflect the SuperHyperGraph nesting, we enforce a hierarchical consistency term via $\psi_0$: for each $v \in V$, define its base support $\mathrm{supp}(v) = \ell_n(v) \subseteq V_0$ and set
$C(v) = \big\|\psi(v) - \mathrm{AGG}\{\psi_0(u) : u \in \mathrm{supp}(v)\}\big\|^2.$
One standard way to specify that $(\psi, \psi_0)$ is an embedding is to require it to minimize an objective of the form
$\sum_{e \in E} w(e)\, L_e(\psi) + \beta \sum_{v \in V} C(v) + \lambda\, R(\psi, \psi_0),$
where $w(e) > 0$ are hyperedge weights, $\beta, \lambda \geq 0$, and R is a regularizer.
Theorem 9
(SuperHypergraph embeddings generalize hypergraph and graph embeddings). Fix $n \geq 1$ and $d \geq 1$.
(i) (Hypergraph case) Let $H = (V_0, E_0)$ be a hypergraph and let $\psi : V_0 \to \mathbb{R}^d$ be any vertex embedding of H (e.g., specified by minimizing a hyperedge-coherence objective over $E_0$). Then the n-lift $\widetilde{H}_n$ admits an n-SuperHypergraph embedding $(\tilde{\psi}, \tilde{\psi}_0)$ such that
$\tilde{\psi}(\xi_n(v)) = \psi(v) \quad \text{for all } v \in V_0,$
and for the canonical loss in Definition 42, the hyperedge terms on H and on $\widetilde{H}_n$ coincide under this identification.
(ii) (Graph case) Let $G = (V_0, E)$ be a (simple) graph and regard it as the 2-uniform hypergraph $H_G$ with $E_G = \{\{u, v\} : \{u, v\} \in E\}$. Then every graph embedding $\varphi : V_0 \to \mathbb{R}^d$ induces an n-SuperHypergraph embedding of the n-lift of $H_G$ via the same rule $\tilde{\varphi}(\xi_n(v)) = \varphi(v)$. Thus graph embeddings are contained as a special case of n-SuperHypergraph embeddings.
Proof.
(i) Consider the lifted n-SuperHyperGraph $\widetilde{H}_n = (V_n, E_n)$ from Definition 41. Define
$\tilde{\psi}(\xi_n(v)) = \psi(v) \quad (v \in V_0), \qquad \tilde{\psi}_0 = \psi.$
This is well-defined because each element of $V_n$ is uniquely of the form $\xi_n(v)$.
Let $e \in E_0$ be a hyperedge, and let $\tilde{e} = \{\xi_n(v) : v \in e\}$ be its lifted hyperedge. Using the canonical coherence loss from Definition 42, we compute explicitly:
$L_{\tilde{e}}(\tilde{\psi}) = \sum_{\tilde{v} \in \tilde{e}} \big\|\tilde{\psi}(\tilde{v}) - \mu_{\tilde{e}}\big\|^2 = \sum_{v \in e} \big\|\psi(v) - \mu_e\big\|^2,$
where
$\mu_e = \frac{1}{|e|} \sum_{v \in e} \psi(v)$
is the corresponding mean for the hyperedge e under ψ. Therefore,
$L_{\tilde{e}}(\tilde{\psi}) = L_e(\psi).$
Hence the sum of hyperedge-coherence terms is preserved by the lift.
Moreover, the hierarchical consistency term is compatible with the lift. For any $v \in V_0$, Lemma 1 yields
$\mathrm{supp}(\xi_n(v)) = \ell_n(\xi_n(v)) = \{v\}.$
Therefore,
$C(\xi_n(v)) = \big\|\tilde{\psi}(\xi_n(v)) - \mathrm{AGG}\{\tilde{\psi}_0(u) : u \in \{v\}\}\big\|^2 = \|\psi(v) - \psi(v)\|^2 = 0,$
so the hierarchical term vanishes on lifted vertices. This shows that $(\tilde{\psi}, \tilde{\psi}_0)$ is an n-SuperHypergraph embedding of $\widetilde{H}_n$ and that the lifted model recovers the original hypergraph embedding behavior.
(ii) A graph is a special case of a hypergraph obtained by taking only 2-element hyperedges. Applying part (i) to $H_G$ produces an n-SuperHypergraph embedding of its n-lift with $\tilde{\psi}(\xi_n(v)) = \varphi(v)$. Thus graph embeddings are included as a special case. □
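The loss-preservation step in part (i) can be verified numerically; `nest` and the mean-deviation loss below follow the canonical choices of Definition 42 (all names are ours):

```python
import numpy as np

def nest(x, n):
    """n-fold singleton nesting xi_n."""
    for _ in range(n):
        x = frozenset([x])
    return x

def coherence_loss(edges, emb):
    """Canonical hyperedge-coherence objective: squared deviations of
    member embeddings from the hyperedge mean, summed over hyperedges."""
    total = 0.0
    for e in edges:
        pts = np.array([emb[v] for v in e])
        total += ((pts - pts.mean(axis=0)) ** 2).sum()
    return total

# A base hypergraph embedding and its n-lifted counterpart.
psi = {'a': np.array([0., 0.]), 'b': np.array([2., 0.]), 'c': np.array([1., 1.])}
E0 = [frozenset({'a', 'b'}), frozenset({'a', 'b', 'c'})]
n = 3
E_lift = [frozenset(nest(v, n) for v in e) for e in E0]
psi_lift = {nest(v, n): psi[v] for v in psi}

# Theorem 9(i): the hyperedge-coherence terms coincide under the lift.
assert abs(coherence_loss(E0, psi) - coherence_loss(E_lift, psi_lift)) < 1e-12
```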
9. Results and Reviews: SuperHyperGraph-Based Natural Language Processing
Natural language processing (NLP) studies learning algorithms that map text to structured representations, meanings, or decisions using corpora, representations, and task-specific objective functions [
135,
136,
137,
138,
139]. Related notions such as Fuzzy Natural Language [
140,
141,
142] and Natural HyperLanguage [
143,
144] are also known in the literature. Graph-based NLP encodes text as a graph of tokens, entities, or relations and learns predictors on vertices and edges via message passing and attention mechanisms [
145,
146,
147,
148,
149]. Hypergraph-based NLP encodes text as a hypergraph whose hyperedges connect multiple items simultaneously, learning representations that capture higher-order, multiway linguistic interactions. Superhypergraph-based NLP encodes text as an
n-level superhypergraph through iterated powerset constructions, enabling nested linguistic groupings and learning predictors that generalize both graph and hypergraph pipelines. The comparison of Graph-Based NLP, HyperGraph-Based NLP, and SuperHyperGraph-Based NLP is presented in
Table 8.
Definition 43
(Graph-Based Natural Language Processing (Graph-NLP)).
Fix a finite alphabet Σ, let $\mathcal{T} \subseteq \Sigma^{*}$ be a set of text objects (e.g., tokens, sentences, documents, dialogue threads), and let $\mathcal{Y}$ be an output space (e.g., labels, rankings, structured outputs).
A graph construction (encoder) is a mapping
$\Gamma : \mathcal{T} \to \mathcal{G},$
where $\mathcal{G}$ is a class of attributed graphs, and for each $T \in \mathcal{T}$,
$\Gamma(T) = (V, E, a_V, a_E, w),$
with
$V$ a finite vertex set (representing linguistic units such as tokens, entities, sentences, posts),
$E$ an edge set (representing relations such as adjacency, dependency, coreference, reply-to, semantic links),
$a_V$ and $a_E$ vertex/edge attributes (discrete or continuous), and
w optional weights on vertices/edges.
A graph-based NLP model family $\mathcal{F}$ is a set of measurable maps
$f : \mathcal{G} \to \mathcal{Y},$
where each $f \in \mathcal{F}$ is allowed to use the graph structure (e.g., paths, neighborhoods, walks, cuts, centrality, or message passing) in computing the output.
Given a distribution $\mathcal{D}$ over pairs $(T, y) \in \mathcal{T} \times \mathcal{Y}$ and a loss $\ell$, Graph-NLP is the learning/optimization problem
$\min_{f \in \mathcal{F}} \; \mathbb{E}_{(T, y) \sim \mathcal{D}}\big[\ell\big(f(\Gamma(T)), y\big)\big].$
Any pipeline $(\Gamma, \mathcal{F})$ that is designed and evaluated for NLP tasks (e.g., classification, extraction, retrieval, summarization, entailment) is called a graph-based NLP method.
Definition 44
(HyperGraph-Based Natural Language Processing (HyperGraph-NLP)).
Fix a finite alphabet Σ, a set of text objects $\mathcal{T}$, and an output space $\mathcal{Y}$.
A hypergraph construction (encoder) is a mapping
$\Gamma_{\mathrm{hyp}} : \mathcal{T} \to \mathcal{H},$
where $\mathcal{H}$ is a class of finite attributed hypergraphs. For each $T \in \mathcal{T}$,
$\Gamma_{\mathrm{hyp}}(T) = (V, E, a_V, a_E, w),$
where $V$ is a finite vertex set, $E \subseteq P(V) \setminus \{\emptyset\}$ is a family of hyperedges, $a_V$ and $a_E$ are attribute maps, and w denotes optional weights (on vertices and/or hyperedges).
A hypergraph-based NLP model family $\mathcal{F}_{\mathrm{hyp}}$ is a set of (measurable) maps
$f : \mathcal{H} \to \mathcal{Y}.$
Given a distribution $\mathcal{D}$ on $\mathcal{T} \times \mathcal{Y}$ and a loss $\ell$, HyperGraph-NLP is the learning problem
$\min_{f \in \mathcal{F}_{\mathrm{hyp}}} \; \mathbb{E}_{(T, y) \sim \mathcal{D}}\big[\ell\big(f(\Gamma_{\mathrm{hyp}}(T)), y\big)\big].$
Definition 45
(SuperHyperGraph-Based Natural Language Processing (SuperHyperGraph-NLP)).
Fix Σ, $\mathcal{T}$, and $\mathcal{Y}$ as above and fix an integer $n \geq 1$.
An n–SuperHyperGraph construction (encoder) is a mapping
$\Gamma_n : \mathcal{T} \to \mathcal{SH}_n,$
where $\mathcal{SH}_n$ is a class of finite attributed n–SuperHyperGraphs. For each $T \in \mathcal{T}$,
$\Gamma_n(T) = (V, E, a_V, a_E, w),$
such that $V \subseteq P^n(V_0)$ and each element of $E$ is a set of elements of $V$, for some finite base set $V_0$.
Elements of $V$ are n–supervertices and elements of $E$ are n–superedges. The maps $a_V$ and $a_E$ assign attributes, and w denotes optional weights.
A superhypergraph-based NLP model family is
$\mathcal{F}_n$, a set of (measurable) maps $f : \mathcal{SH}_n \to \mathcal{Y}.$
Given $\mathcal{D}$ and $\ell$ as above, SuperHyperGraph-NLP is
$\min_{f \in \mathcal{F}_n} \; \mathbb{E}_{(T, y) \sim \mathcal{D}}\big[\ell\big(f(\Gamma_n(T)), y\big)\big].$
Definition 46
(Graph-to-hypergraph inclusion).
Let $G = (V, E)$ be a (simple, undirected) graph with $E \subseteq \binom{V}{2}$. Define the associated 2-uniform hypergraph
$\iota(G) = \big(V, \{\{u, v\} : \{u, v\} \in E\}\big).$
Theorem 10
(HyperGraph-NLP generalizes Graph-NLP). Every instance of Graph-Based Natural Language Processing is a special case of HyperGraph-Based Natural Language Processing.
Proof. Consider any Graph-NLP pipeline consisting of a graph encoder $\Gamma : \mathcal{T} \to \mathcal{G}$ and a model family $\mathcal{F}$, where $\mathcal{G}$ is a class of finite attributed graphs.
Define a hypergraph encoder by composition:
$\Gamma_{\mathrm{hyp}} = \iota \circ \Gamma,$
where the class $\mathcal{H}$ contains (at least) all 2-uniform attributed hypergraphs obtained from graphs in $\mathcal{G}$. Since ι is injective on this subclass, define a hypergraph model family on it by
$\mathcal{F}_{\mathrm{hyp}} = \{f \circ \iota^{-1} : f \in \mathcal{F}\}.$
Then for every $T \in \mathcal{T}$,
$(f \circ \iota^{-1})\big(\Gamma_{\mathrm{hyp}}(T)\big) = f\big(\iota^{-1}(\iota(\Gamma(T)))\big) = f\big(\Gamma(T)\big).$
Taking expectation over $\mathcal{D}$ shows that the optimization objective of the constructed HyperGraph-NLP instance is identical to the original Graph-NLP objective. Hence Graph-NLP is a special case of HyperGraph-NLP. □
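The inclusion and the induced model family from the proof can be written out directly; `f` here is a stand-in for an arbitrary graph-based predictor, and all names are illustrative:

```python
def graph_to_2_uniform(V, E):
    """Definition 46: view each undirected edge {u, v} as a 2-element
    hyperedge; the vertex set is unchanged."""
    return V, {frozenset(e) for e in E}

def lift_model(f):
    """Proof of Theorem 10: turn a graph model into a hypergraph model
    on 2-uniform hypergraphs by reading hyperedges back as edges."""
    def f_hyp(V, hyperedges):
        return f(V, {tuple(sorted(e)) for e in hyperedges})
    return f_hyp

# The lifted pipeline computes the same output as the original one.
V = {0, 1, 2}
E = {(0, 1), (1, 2)}
f = lambda V, E: len(E)  # a toy graph "model": number of relations
assert lift_model(f)(*graph_to_2_uniform(V, E)) == f(V, E)
```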
Definition 47
(Hypergraph-to-superhypergraph inclusion at level $n = 1$).
Let $H = (V, E)$ be a hypergraph with $E \subseteq P(V) \setminus \{\emptyset\}$. Define
$\iota_1(H) = (V, E),$
viewed as a 1–SuperHyperGraph: every hyperedge is already an element of $P(V)$.
Definition 48
(Singleton lifting of graphs/hypergraphs to level n).
For $x \in V$, define $\xi_n(x)$ recursively by
$\xi_0(x) = x, \qquad \xi_{k+1}(x) = \{\xi_k(x)\}.$
Given a hypergraph $H = (V, E)$, define its n-lift
$\Lambda_n(H) = \Big(\{\xi_n(v) : v \in V\},\; \big\{\{\xi_n(v) : v \in e\} : e \in E\big\}\Big).$
(Attributes and weights are transported along $\xi_n$ in the obvious way.)
Theorem 11
(SuperHyperGraph-NLP generalizes Graph-NLP and HyperGraph-NLP). For every $n \geq 1$, every instance of HyperGraph-NLP is a special case of SuperHyperGraph-NLP at level n. Consequently, every instance of Graph-NLP is also a special case of SuperHyperGraph-NLP.
Proof. Fix $n \geq 1$.
Step 1 (HyperGraph-NLP ⇒ SuperHyperGraph-NLP). Take any HyperGraph-NLP pipeline with encoder $\Gamma_{\mathrm{hyp}}$ and model family $\mathcal{F}_{\mathrm{hyp}}$. Define a superhypergraph encoder by
$\Gamma_n = \Lambda_n \circ \Gamma_{\mathrm{hyp}}.$
Since $\Lambda_n$ is injective, define a superhypergraph model family on the image of $\Gamma_n$ by
$\mathcal{F}_n = \{f \circ \Lambda_n^{-1} : f \in \mathcal{F}_{\mathrm{hyp}}\}.$
Then for every $T \in \mathcal{T}$,
$(f \circ \Lambda_n^{-1})\big(\Gamma_n(T)\big) = f\big(\Gamma_{\mathrm{hyp}}(T)\big).$
Thus the expected risk objective is preserved, and HyperGraph-NLP is a special case of SuperHyperGraph-NLP.
Step 2 (Graph-NLP ⇒ SuperHyperGraph-NLP). By Theorem 10, Graph-NLP is a special case of HyperGraph-NLP via the inclusion ι into 2-uniform hypergraphs. Composing with Step 1 yields a SuperHyperGraph-NLP instance (at level n) with identical objective value for all $T \in \mathcal{T}$. Therefore SuperHyperGraph-NLP generalizes both Graph-NLP and HyperGraph-NLP. □
10. Conclusion
This paper extended several core frameworks—including GCN, GraphRAG, causal graphs, graph embedding, and graph generation—by integrating the SuperHyperGraph perspective. Future work will include computational experiments and quantitative evaluation on real-world datasets. We also hope that further research will explore extensions of the concepts presented here using Fuzzy Sets [
150], Rough Sets [
151,
152], Neutrosophic Sets [
79,
153,
154], Uncertain Sets [
78,
155], and Plithogenic Sets [
156,
157].
Funding
This study was conducted without any financial support from external organizations or grants.
Acknowledgments
We would like to express our sincere gratitude to everyone who provided valuable insights, support, and encouragement throughout this research. We also extend our thanks to the readers for their interest and to the authors of the referenced works, whose scholarly contributions have greatly influenced this study. Lastly, we are deeply grateful to the publishers and reviewers who facilitated the dissemination of this work.
Data Availability Statement
Since this research is purely theoretical and mathematical, no empirical data or computational analysis was utilized. Researchers are encouraged to expand upon these findings with data-oriented or experimental approaches in future studies.
Ethical Statement
As this study does not involve experiments with human participants or animals, no ethical
approval was required.
Conflicts of Interest
The authors declare that they have no conflicts of interest related to the content or publication of this paper.
Code Availability
No code or software was developed for this study.
Clinical Trial
This study did not involve any clinical trials.
Consent to Participate
Not applicable.
Use of Generative AI and AI-Assisted Tools
I use generative AI and AI-assisted tools for tasks such as English
grammar checking, and I do not employ them in any way that violates ethical standards.
Disclaimer:
This work presents theoretical ideas and frameworks that have not yet been empirically validated.
Readers are encouraged to explore practical applications and further refine these concepts. Although care has been
taken to ensure accuracy and appropriate citations, any errors or oversights are unintentional. The perspectives
and interpretations expressed herein are solely those of the authors and do not necessarily reflect the viewpoints
of their affiliated institutions.
References
- Diestel, R. Graph Theory; Springer, 2024. [Google Scholar]
- Gross, J.L.; Yellen, J.; Anderson, M. Graph theory and its applications; Chapman and Hall/CRC, 2018. [Google Scholar]
- Wagner, S.; Wang, H. Introduction to chemical graph theory; Chapman and Hall/CRC, 2018. [Google Scholar]
- Zhang, Y. Graph Neural Network-Based User Preference Model for Social Network Access Control. Informatica 2025, 49. [Google Scholar] [CrossRef]
- Feng, Y.; You, H.; Zhang, Z.; Ji, R.; Gao, Y. Hypergraph neural networks. In Proceedings of the AAAI Conference on Artificial Intelligence, 2019, Vol. 33, 3558–3565. [Google Scholar] [CrossRef]
- Gao, Y.; Zhang, Z.; Lin, H.; Zhao, X.; Du, S.; Zou, C. Hypergraph learning: Methods and practices. IEEE Transactions on Pattern Analysis and Machine Intelligence 2020, 44, 2548–2566. [Google Scholar] [CrossRef] [PubMed]
- Feng, Y.; Han, J.; Ying, S.; Gao, Y. Hypergraph isomorphism computation. IEEE Transactions on Pattern Analysis and Machine Intelligence 2024. [Google Scholar] [CrossRef] [PubMed]
- Banerjee, A.; Mishra, R.; Parui, S. On the Spectra of Threshold Hypergraphs. arXiv 2022, arXiv:2207.02528. [Google Scholar]
- Chiarelli, N.; Milanič, M. Total domishold graphs: a generalization of threshold graphs, with connections to threshold hypergraphs. Discrete Applied Mathematics 2014, 179, 1–12. [Google Scholar] [CrossRef]
- Keevash, P. Hypergraph Turán problems. Surveys in Combinatorics 2011, 392, 83–140. [Google Scholar]
- Ramos, E.L.H.; Ayala, L.R.A.; Macas, K.A.S. Study of Factors that Influence a Victim’s Refusal to Testify for Sexual Reasons Due to External Influence Using Plithogenic n-SuperHyperGraphs. Operational Research Journal 2025, 46, 328–337. [Google Scholar]
- Ghods, M.; Rostami, Z.; Smarandache, F. Introduction to Neutrosophic Restricted SuperHyperGraphs and Neutrosophic Restricted SuperHyperTrees and several of their properties. Neutrosophic Sets and Systems 2022, 50, 480–487. [Google Scholar]
- Fujita, T.; Smarandache, F. Neutrosophic Soft n-Super-HyperGraphs with Real-World Applications. European Journal of Pure and Applied Mathematics 2025, 18, 6621. [Google Scholar] [CrossRef]
- Alqahtani, M. Intuitionistic Fuzzy Quasi-Supergraph Integration for Social Network Decision Making. International Journal of Analysis and Applications 2025, 23, 137–137. [Google Scholar] [CrossRef]
- Smarandache, F. Extension of HyperGraph to n-SuperHyperGraph and to Plithogenic n-SuperHyperGraph, and Extension of HyperAlgebra to n-ary (Classical-/Neutro-/Anti-) HyperAlgebra; Infinite Study, 2020. [Google Scholar]
- Hamidi, M.; Taghinezhad, M. Application of Superhypergraphs-Based Domination Number in Real World; Infinite Study, 2023. [Google Scholar]
- Hamidi, M.; Smarandache, F.; Davneshvar, E. Spectrum of superhypergraphs via flows. Journal of Mathematics 2022, 2022, 9158912. [Google Scholar] [CrossRef]
- Bravo, J.C.M.; Piedrahita, C.J.B.; Bravo, M.A.M.; Pilacuan-Bonete, L.M. Integrating SMED and Industry 4.0 to optimize processes with plithogenic n-SuperHyperGraphs. Neutrosophic Sets and Systems 2025, 84, 328–340. [Google Scholar]
- Tarjan, R.E. Depth-First Search and Linear Graph Algorithms. SIAM J. Comput. 1972, 1, 146–160. [Google Scholar] [CrossRef]
- Mehlhorn, K. Graph Algorithms and NP-Completeness; Springer, 1984. [Google Scholar]
- Holland, P.W.; Leinhardt, S. An exponential family of probability distributions for directed graphs. Journal of the american Statistical association 1981, 76, 33–50. [Google Scholar] [CrossRef]
- Fujita, T. Extensions of multidirected graphs: Fuzzy, neutrosophic, plithogenic, rough, soft, hypergraph, and superhypergraph variants. International journal of topology 2025, 2, 11. [Google Scholar] [CrossRef]
- George, B.; Jose, J.; Thumbakara, R.K.; et al. Introducing soft directed hypergraphs: a fusion of soft set theory and directed hypergraphs. Utilitas Mathematica 2024, 121. [Google Scholar] [CrossRef]
- George, B.; Jose, J.; Thumbakara, R.K. Soft Directed Hypergraphs and Their AND & OR Operations. Mathematical Forum 2022, 30. [Google Scholar]
- Fujita, T.; Smarandache, F. A Concise Study of Some Superhypergraph Classes. Neutrosophic Sets and Systems 2024, 77, 548–593. [Google Scholar]
- Wei, E.L.; Tang, W.L.; Ye, D. Nowhere-zero 15-flow in 3-edge-connected bidirected graphs. Acta Mathematica Sinica, English Series 2014, 30, 649–660. [Google Scholar] [CrossRef]
- Xu, R.; Zhang, C.Q. On flows in bidirected graphs. Discrete mathematics 2005, 299, 335–343. [Google Scholar] [CrossRef]
- Wei, E.; Tang, W.; Wang, X. Flows in 3-edge-connected bidirected graphs. Frontiers of Mathematics in China 2011, 6, 339–348. [Google Scholar] [CrossRef]
- Berge, C. Hypergraphs: combinatorics of finite sets; Elsevier, 1984; Vol. 45. [Google Scholar]
- Khan, B.; Wu, J.; Yang, J.; Ma, X. Heterogeneous hypergraph neural network for social recommendation using attention network. ACM Transactions on Recommender Systems 2025, 3, 1–22. [Google Scholar] [CrossRef]
- Hu, B.; Huang, W.; Zheng, T.; Song, M.; Li, Y. A general heterogeneous hypergraph neural network for node classification. In Proceedings of the 2023 International Joint Conference on Neural Networks (IJCNN). IEEE, 2023, pp. 1–7.
- Jin, H.; Zhou, K.; Yin, J.; You, L.; Zhou, Z. Multi-Granular Attention based Heterogeneous Hypergraph Neural Network. arXiv 2025, arXiv:2505.04340. [Google Scholar] [CrossRef]
- Fujita, T.; Smarandache, F. Superhypergraph Neural Networks and Plithogenic Graph Neural Networks: Theoretical Foundations. Advancing Uncertain Combinatorics through Graphization, Hyperization, and Uncertainization: Fuzzy, Neutrosophic, Soft, Rough, and Beyond 2025, 5, 577. [Google Scholar]
- Yi, O.; Guo, B.; Tang, X.; He, X.; Xiong, J.; Yu, Z. Learning Cross-Domain Representation with Multi-Graph Neural Network. ArXiv 2019, abs/1905.10095.
- Lee, L.H.; Lu, Y. Multiple embeddings enhanced multi-graph neural networks for Chinese healthcare named entity recognition. IEEE Journal of Biomedical and Health Informatics 2021, 25, 2801–2810. [Google Scholar] [CrossRef] [PubMed]
- Song, Y.; Ye, H.; Li, M.; Cao, F. Deep multi-graph neural networks with attention fusion for recommendation. Expert Systems with Applications 2022, 191, 116240. [Google Scholar] [CrossRef]
- Wientjes, M. Hyperlink prediction with Modular Directed Hypergraph Neural Networks. Master’s thesis, Eindhoven University of Technology, 2021. [Google Scholar]
- Yang, G.; Jin, T.; Dou, L. Heterogeneous directed hypergraph neural network over abstract syntax tree (ast) for code classification. arXiv 2023, arXiv:2305.04228. [Google Scholar]
- Tran, L.H.; Tran, L.H. Directed hypergraph neural network. ArXiv 2020, abs/2008.03626.
- Fujita, T. Multi-SuperHyperGraph Neural Networks: A Generalization of Multi-HyperGraph Neural Networks. Neutrosophic Computing and Machine Learning 2025, 39, 328–347. [Google Scholar]
- Zass, R.; Shashua, A. Probabilistic graph and hypergraph matching. In Proceedings of the 2008 IEEE Conference on Computer Vision and Pattern Recognition. IEEE, 2008, pp. 1–8.
- Nenadov, R. Probabilistic hypergraph containers. Israel Journal of Mathematics 2024, 261, 879–897. [Google Scholar] [CrossRef]
- Fujita, T. Review of Probabilistic HyperGraph and Probabilistic SuperHyperGraph. Spectrum of Operational Research 2026, 3, 319–338. [Google Scholar] [CrossRef]
- Acid, S.; de Campos, L.M.; Castellano, F.J.G. Learning Bayesian Network Classifiers: Searching in a Space of Partially Directed Acyclic Graphs. Machine Learning 2005, 59, 213–235. [Google Scholar] [CrossRef]
- Tariq, R.; Aadil, F.; Malik, M.F.; Ejaz, S.; Khan, M.U.; Khan, M.F. Directed Acyclic Graph Based Task Scheduling Algorithm for Heterogeneous Systems. In Proceedings of the Intelligent Systems with Applications, 2018. [Google Scholar]
- Fujita, T. Directed Acyclic SuperHypergraphs (DASH): A General Framework for Hierarchical Dependency Modeling. Neutrosophic Knowledge 2025, 6, 72–86. [Google Scholar]
- Venhorst, J.; Núnez, S.; Kruse, C.G. Design of a high fragment efficiency library by molecular graph theory. ACS Medicinal Chemistry Letters 2010, 1, 499–503. [Google Scholar] [CrossRef]
- Alhamoud, K.; Ghunaim, Y.; Alshehri, A.S.; Li, G.; Ghanem, B.; You, F. Leveraging 2D molecular graph pretraining for improved 3D conformer generation with graph neural networks. Computers & Chemical Engineering 2024, 183, 108622. [Google Scholar] [CrossRef]
- Rahman, A.; Poirel, C.L.; Badger, D.J.; Murali, T. Reverse engineering molecular hypergraphs. In Proceedings of the ACM Conference on Bioinformatics, Computational Biology and Biomedicine, 2012; pp. 68–75. [Google Scholar]
- Chen, J.; Schwaller, P. Molecular hypergraph neural networks. The Journal of Chemical Physics 2024, 160. [Google Scholar] [CrossRef]
- Fujita, T. Molecular Fuzzy graphs, hypergraphs, and superhypergraphs. Journal of Intelligent Decision and Computational Modelling 2025, 1, 158–171. [Google Scholar]
- Fujita, T. An Introduction and Reexamination of Molecular Hypergraph and Molecular N-SuperHypergraph. Asian Journal of Physical and Chemical Sciences 2025, 13, 1–38. [Google Scholar] [CrossRef]
- Schmidt, J.; Pettersson, L.; Verdozzi, C.; Botti, S.; Marques, M.A. Crystal graph attention networks for the prediction of stable materials. Science advances 2021, 7, eabi7948. [Google Scholar] [CrossRef]
- Wang, X.; He, X.; Cao, Y.; Liu, M.; Chua, T.S. KGAT: Knowledge graph attention network for recommendation. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2019; pp. 950–958. [Google Scholar]
- Ding, K.; Wang, J.; Li, J.; Li, D.; Liu, H. Be more with less: Hypergraph attention networks for inductive text classification. arXiv 2020, arXiv:2011.00387. [CrossRef]
- Wang, J.; Ding, K.; Zhu, Z.; Caverlee, J. Session-based recommendation with hypergraph attention networks. In Proceedings of the 2021 SIAM International Conference on Data Mining (SDM); SIAM, 2021; pp. 82–90. [Google Scholar]
- Fujita, T.; Mehmood, A. SuperHyperGraph Attention Networks. Neutrosophic Computing and Machine Learning 2025, 40, 10–27. [Google Scholar]
- Han, H.; Wang, Y.; Shomer, H.; Guo, K.; Ding, J.; Lei, Y.; Halappanavar, M.; Rossi, R.A.; Mukherjee, S.; Tang, X.; et al. Retrieval-augmented generation with graphs (graphrag). arXiv 2024, arXiv:2501.00309. [Google Scholar] [CrossRef]
- Han, H.; Ma, L.; Shomer, H.; Wang, Y.; Lei, Y.; Guo, K.; Hua, Z.; Long, B.; Liu, H.; Aggarwal, C.C.; et al. RAG vs. GraphRAG: A systematic evaluation and key insights. arXiv 2025, arXiv:2502.11371. [CrossRef]
- Luo, H.; Chen, G.; Zheng, Y.; Wu, X.; Guo, Y.; Lin, Q.; Feng, Y.; Kuang, Z.; Song, M.; Zhu, Y.; et al. HyperGraphRAG: Retrieval-Augmented Generation via Hypergraph-Structured Knowledge Representation. arXiv 2025, arXiv:2503.21322.
- Wang, X.; Huang, W.; Zhao, B.; Li, S. Scientific collaborator recommendation via hypergraph embedding. Information Processing & Management 2026, 63, 104423. [Google Scholar]
- Dong, X.; Ding, H.; Gao, D.; Wang, J.; Zheng, G. Joint Node–Hyperedge Contribution Hypergraph Embedding for Scraper Conveyor Sprocket Bearings Fault Diagnosis. Available at SSRN 5390382.
- Zhu, Y.; Du, Y.; Wang, Y.; Xu, Y.; Zhang, J.; Liu, Q.; Wu, S. A survey on deep graph generation: Methods and applications. In Proceedings of the Learning on Graphs Conference. PMLR, 2022; pp. 47–1. [Google Scholar]
- Ou, X.; Boyer, W.F.; McQueen, M.A. A scalable approach to attack graph generation. In Proceedings of the 13th ACM Conference on Computer and Communications Security, 2006; pp. 336–345. [Google Scholar]
- Yang, J.; Lu, J.; Lee, S.; Batra, D.; Parikh, D. Graph R-CNN for scene graph generation. In Proceedings of the European Conference on Computer Vision (ECCV), 2018; pp. 670–685. [Google Scholar]
- Zhao, L.; Song, Y.; Zhang, C.; Liu, Y.; Wang, P.; Lin, T.; Deng, M.; Li, H. T-GCN: A temporal graph convolutional network for traffic prediction. IEEE transactions on intelligent transportation systems 2019, 21, 3848–3858. [Google Scholar] [CrossRef]
- Guo, S.; Lin, Y.; Feng, N.; Song, C.; Wan, H. Attention based spatial-temporal graph convolutional networks for traffic flow forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, 2019, Vol. 33, 922–929. [Google Scholar] [CrossRef]
- Zhang, S.; Tong, H.; Xu, J.; Maciejewski, R. Graph convolutional networks: a comprehensive review. Computational Social Networks 2019, 6. [Google Scholar] [CrossRef]
- Bing, R.; Yuan, G.; Zhang, Y.; Wang, S.; Li, B.; Zhou, Y. An Efficient Multi-View Heterogeneous Hypergraph Convolutional Network for Heterogeneous Information Network Representation Learning. IEEE Transactions on Big Data 2024. [Google Scholar] [CrossRef]
- Wang, J.; Li, H.; Qu, G.; Cecil, K.M.; Dillman, J.R.; Parikh, N.A.; He, L. Dynamic weighted hypergraph convolutional network for brain functional connectome analysis. Medical image analysis 2023, 87, 102828. [Google Scholar] [CrossRef]
- Nong, L.; Peng, J.; Zhang, W.; Lin, J.; Qiu, H.; Wang, J. Adaptive multi-hypergraph convolutional networks for 3D object classification. IEEE Transactions on Multimedia 2022, 25, 4842–4855. [Google Scholar] [CrossRef]
- Wang, Y.; Sun, Y.; Liu, Z.; Sarma, S.E.; Bronstein, M.M.; Solomon, J.M. Dynamic graph cnn for learning on point clouds. ACM Transactions on Graphics (tog) 2019, 38, 1–12. [Google Scholar] [CrossRef]
- Beck, F.; Burch, M.; Diehl, S.; Weiskopf, D. The State of the Art in Visualizing Dynamic Graphs. EuroVis (STARs) 2014. [Google Scholar]
- Wang, S.; Wang, K.; Li, X. Dynamic Hypergraph Neural Networks for Flotation Condition Recognition Based on Group-View Composite Features. IEEE Transactions on Instrumentation and Measurement 2025. [Google Scholar] [CrossRef]
- Zhang, Z.; Lin, H.; Gao, Y.; BNRist, K. Dynamic hypergraph structure learning. In Proceedings of the IJCAI, 2018; pp. 3162–3169. [Google Scholar]
- Yin, N.; Feng, F.; Luo, Z.; Zhang, X.; Wang, W.; Luo, X.; Chen, C.; Hua, X.S. Dynamic hypergraph convolutional network. In Proceedings of the 2022 IEEE 38th International Conference on Data Engineering (ICDE); IEEE, 2022; pp. 1621–1634. [Google Scholar]
- Fujita, T. SuperHyperGraph Neural Network and Dynamic SuperHyperGraph Neural Network. International Journal of Complexity in Applied Science and Technology (IJCAST) 2025. Accepted.
- Fujita, T.; Smarandache, F. A Unified Framework for U-Structures and Functorial Structure: Managing Super, Hyper, SuperHyper, Tree, and Forest Uncertain Over/Under/Off Models. Neutrosophic Sets and Systems 2025, 91, 337–380. [Google Scholar]
- Broumi, S.; Talea, M.; Bakali, A.; Smarandache, F. Single valued neutrosophic graphs. Journal of New theory 2016, 10, 86–101. [Google Scholar]
- Rosenfeld, A. Fuzzy graphs. In Fuzzy sets and their applications to cognitive and decision processes; Elsevier, 1975; pp. 77–95. [Google Scholar]
- Mordeson, J.N.; Nair, P.S. Fuzzy graphs and fuzzy hypergraphs; Vol. 46, Physica, 2012.
- Fujita, T. HyperWeighted Graph, SuperHyperWeighted Graph, and MultiWeighted Graph. Pure Mathematics for Theoretical Computer Science 2025, 5, 21–33. [Google Scholar]
- Mathew, S.; Sunitha, M. Cycle connectivity in weighted graphs. Proyecciones (Antofagasta) 2011, 30, 1–17. [Google Scholar] [CrossRef]
- Banerjee, S.; Mukherjee, A.; Panigrahi, P.K. Quantum blockchain using weighted hypergraph states. Physical Review Research 2020, 2, 013322. [Google Scholar] [CrossRef]
- Fujita, T. Modeling Complex Hierarchical Systems with Weighted and Signed Superhypergraphs: Foundations and Applications. Open Journal of Discrete Applied Mathematics (ODAM) 2025, 8, 20–39.
- Jose, B.K.; Tuza, Z. Hypergraph domination and strong independence. Applicable Analysis and Discrete Mathematics 2009, 3, 347–358.
- Feng, S.; Heath, E.; Jefferson, B.; Joslyn, C.; Kvinge, H.; Mitchell, H.D.; Praggastis, B.; Eisfeld, A.J.; Sims, A.C.; Thackray, L.B.; et al. Hypergraph models of biological networks to identify genes critical to pathogenic viral response. BMC Bioinformatics 2021, 22, 287.
- Fujita, T. Advancing Uncertain Combinatorics through Graphization, Hyperization, and Uncertainization: Fuzzy, Neutrosophic, Soft, Rough, and Beyond; Biblio Publishing, 2025.
- Marcos, B.V.S.; Willner, M.F.; Rosa, B.V.C.; Yissel, F.F.R.M.; Roberto, E.R.; Puma, L.D.B.; Fernández, D.M.M. Using plithogenic n-SuperHyperGraphs to assess the degree of relationship between information skills and digital competencies. Neutrosophic Sets and Systems 2025, 84, 513–524.
- Amable, N.H.; De Salazar, E.E.V.; Isaac, M.G.M.; Sánchez, O.C.O.; Palma, J.M.S. Representation of motivational dynamics in school environments through Plithogenic n-SuperHyperGraphs with family participation. Neutrosophic Sets and Systems 2025, 92, 570–583.
- Nacaroglu, Y.; Akgunes, N.; Pak, S.; Cangul, I.N. Some graph parameters of power set graphs. Advances & Applications in Discrete Mathematics 2021, 26.
- Shalu, M.; Yamini, S.D. Counting maximal independent sets in power set graphs. Indian Institute of Information Technology Design & Manufacturing (IIITD&M) Kancheepuram, India, 2014.
- Jech, T. Set Theory: The Third Millennium Edition, Revised and Expanded; Springer, 2003.
- Bretto, A. Hypergraph Theory: An Introduction; Mathematical Engineering; Springer: Cham, 2013.
- Smarandache, F. Foundation of SuperHyperStructure & Neutrosophic SuperHyperStructure. Neutrosophic Sets and Systems 2024, 63, 21.
- Khali, H.E.; Güngör, G.D.; Zaina, M.A.N. Neutrosophic SuperHyper Bi-Topological Spaces: Original Notions and New Insights. Neutrosophic Sets and Systems 2022, 51, 3.
- Smarandache, F. Introduction to the n-SuperHyperGraph: The Most General Form of Graph Today; Infinite Study, 2022.
- Yu, H.; Gan, A.; Zhang, K.; Tong, S.; Liu, Q.; Liu, Z. Evaluation of retrieval-augmented generation: A survey. In Proceedings of the CCF Conference on Big Data; Springer, 2024; pp. 102–120.
- Salemi, A.; Zamani, H. Evaluating retrieval quality in retrieval-augmented generation. In Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, 2024; pp. 2395–2400.
- Wang, X.; Wang, Z.; Gao, X.; Zhang, F.; Wu, Y.; Xu, Z.; Shi, T.; Wang, Z.; Li, S.; Qian, Q.; et al. Searching for best practices in retrieval-augmented generation. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, 2024; pp. 17716–17736.
- Jiang, Z.; Xu, F.F.; Gao, L.; Sun, Z.; Liu, Q.; Dwivedi-Yu, J.; Yang, Y.; Callan, J.; Neubig, G. Active retrieval augmented generation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, 2023; pp. 7969–7992.
- Peng, B.; Zhu, Y.; Liu, Y.; Bo, X.; Shi, H.; Hong, C.; Zhang, Y.; Tang, S. Graph retrieval-augmented generation: A survey. ACM Transactions on Information Systems 2024.
- Edge, D.; Trinh, H.; Cheng, N.; Bradley, J.; Chao, A.; Mody, A.; Truitt, S.; Metropolitansky, D.; Ness, R.O.; Larson, J. From local to global: A graph RAG approach to query-focused summarization. arXiv 2024, arXiv:2404.16130.
- Wu, J.; Zhu, J.; Qi, Y.; Chen, J.; Xu, M.; Menolascina, F.; Grau, V. Medical graph RAG: Towards safe medical large language model via graph retrieval-augmented generation. arXiv 2024, arXiv:2408.04187.
- Cordeiro, D.; Mounié, G.; Perarnau, S.; Trystram, D.; Vincent, J.M.; Wagner, F. Random graph generation for scheduling simulations. In Proceedings of the 3rd International ICST Conference on Simulation Tools and Techniques (SIMUTools 2010); ICST, 2010; p. 10.
- Liao, R.; Li, Y.; Song, Y.; Wang, S.; Hamilton, W.; Duvenaud, D.K.; Urtasun, R.; Zemel, R. Efficient graph generation with graph recurrent attention networks. Advances in Neural Information Processing Systems 2019, 32.
- Kavvadias, D.; Stavropoulos, E. An efficient algorithm for the transversal hypergraph generation. Journal of Graph Algorithms and Applications 2005, 9, 239–264.
- Lin, Z.; Yan, Q.; Liu, W.; Wang, S.; Wang, M.; Tan, Y.; Yang, C. Automatic hypergraph generation for enhancing recommendation with sparse optimization. IEEE Transactions on Multimedia 2023, 26, 5680–5693.
- Gailhard, D.; Tartaglione, E.; Naviner, L.; Giraldo, J.H. HYGENE: A diffusion-based hypergraph generation method. In Proceedings of the AAAI Conference on Artificial Intelligence, 2025; Vol. 39, pp. 16682–16690.
- Wen, W.; Yu, T. HyperPLR: Hypergraph Generation through Projection, Learning, and Reconstruction. In Proceedings of the Thirteenth International Conference on Learning Representations, 2025.
- Spirtes, P.; Glymour, C. An algorithm for fast recovery of sparse causal graphs. Social Science Computer Review 1991, 9, 62–72.
- Helmert, M. A Planning Heuristic Based on Causal Graph Analysis. In Proceedings of the ICAPS, 2004; Vol. 16, pp. 161–170.
- Jiralerspong, T.; Chen, X.; More, Y.; Shah, V.; Bengio, Y. Efficient causal graph discovery using large language models. arXiv 2024, arXiv:2402.01207.
- Helmert, M.; Geffner, H. Unifying the Causal Graph and Additive Heuristics. In Proceedings of the ICAPS, 2008; pp. 140–147.
- Xu, C.; Huang, H.; Yoo, S. Scalable causal graph learning through a deep neural network. In Proceedings of the 28th ACM International Conference on Information and Knowledge Management, 2019; pp. 1853–1862.
- Li, F.; Li, X.; Wen, S.; Bao, J. Multi-modal causal hypergraph reasoning for enhancing collaborative diagnosis of equipment composite failures. Advanced Engineering Informatics 2025, 68, 103611.
- Harit, A.; Sun, Z.; Yu, J. From News to Returns: A Granger-Causal Hypergraph Transformer on the Sphere. In Proceedings of the 6th ACM International Conference on AI in Finance, 2025; pp. 674–682.
- Gôlo, M.P.S.; Marcacini, R.M. One-class edge classification through heterogeneous hypergraph for causal discovery. Scientific Reports 2025.
- Sun, Z.; Harit, A.; Lio, P. Actionable Interpretability via Causal Hypergraphs: Unravelling Batch Size Effects in Deep Learning. arXiv 2025, arXiv:2506.17826.
- Yang, H.; Gu, Y.; Zhu, J.; Hu, K.; Zhang, X. PGCN-TCA: Pseudo graph convolutional network with temporal and channel-wise attention for skeleton-based action recognition. IEEE Access 2020, 8, 10040–10047.
- Lee, K.; Rhee, W. DDP-GCN: Multi-graph convolutional network for spatiotemporal traffic forecasting. Transportation Research Part C: Emerging Technologies 2022, 134, 103466.
- Zhu, D.; Zhang, Z.; Cui, P.; Zhu, W. Robust Graph Convolutional Networks Against Adversarial Attacks. In Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2019.
- Xia, T.; Lin, J.; Li, Y.; Feng, J.; Hui, P.; Sun, F.; Guo, D.; Jin, D. 3DGCN: 3-dimensional dynamic graph convolutional network for citywide crowd flow prediction. ACM Transactions on Knowledge Discovery from Data (TKDD) 2021, 15, 1–21.
- Liu, T.; Hu, Y.; Wang, B.; Sun, Y.; Gao, J.; Yin, B. Hierarchical graph convolutional networks for structured long document classification. IEEE Transactions on Neural Networks and Learning Systems 2022, 34, 8071–8085.
- Heilman, A.J.; Gong, W.; Yan, Q. Crystal Hypergraph Convolutional Networks. arXiv 2024, arXiv:2411.12616.
- Wang, Q.; Mao, Z.; Wang, B.; Guo, L. Knowledge graph embedding: A survey of approaches and applications. IEEE Transactions on Knowledge and Data Engineering 2017, 29, 2724–2743.
- Ou, M.; Cui, P.; Pei, J.; Zhang, Z.; Zhu, W. Asymmetric Transitivity Preserving Graph Embedding. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2016.
- Cai, H.; Zheng, V.W.; Chang, K.C.C. A comprehensive survey of graph embedding: Problems, techniques, and applications. IEEE Transactions on Knowledge and Data Engineering 2018, 30, 1616–1637.
- Wang, Z.; Zhang, J.; Feng, J.; Chen, Z. Knowledge graph embedding by translating on hyperplanes. In Proceedings of the AAAI Conference on Artificial Intelligence, 2014; Vol. 28.
- Hayat, M.K.; Xue, S.; Wu, J.; Yang, J. Heterogeneous hypergraph embedding for node classification in dynamic networks. IEEE Transactions on Artificial Intelligence 2024.
- Dong, X.; Ding, H.; Gao, D.; Wang, J.; Zheng, G. Joint Node–Hyperedge Contribution Hypergraph Embedding for Scraper Conveyor Sprocket Bearings Fault Diagnosis. Available at SSRN 5390382.
- Fatemi, B.; Taslakian, P.; Vazquez, D.; Poole, D. Knowledge hypergraph embedding meets relational algebra. Journal of Machine Learning Research 2023, 24, 1–34.
- Goyal, P.; Ferrara, E. Graph embedding techniques, applications, and performance: A survey. Knowledge-Based Systems 2018, 151, 78–94.
- Xu, M. Understanding graph embedding methods and their applications. SIAM Review 2021, 63, 825–853.
- Chowdhary, K. Natural language processing. Fundamentals of Artificial Intelligence 2020, 603–649.
- Bird, S.; Klein, E.; Loper, E. Natural Language Processing with Python: Analyzing Text with the Natural Language Toolkit; O'Reilly Media, Inc., 2009.
- Manning, C.D. Foundations of Statistical Natural Language Processing; The MIT Press, 1999.
- Collobert, R.; Weston, J.; Bottou, L.; Karlen, M.; Kavukcuoglu, K.; Kuksa, P. Natural language processing (almost) from scratch. Journal of Machine Learning Research 2011, 12, 2493–2537.
- Manning, C.D.; Surdeanu, M.; Bauer, J.; Finkel, J.R.; Bethard, S.; McClosky, D. The Stanford CoreNLP natural language processing toolkit. In Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics: System Demonstrations, 2014; pp. 55–60.
- Adel, N. Fuzzy natural language similarity measures through computing with words. PhD thesis, Manchester Metropolitan University, 2022.
- Glöckner, I.; Knoll, A. A formal theory of fuzzy natural language quantification and its role in granular computing. In Granular Computing: An Emerging Paradigm; Springer, 2001; pp. 215–256.
- Xie, J.; Zhao, L. Research on resume screening and career matching model based on fuzzy natural language processing. Journal of Computational Methods in Sciences and Engineering 2025.
- Fujita, T. Natural n-Superhyper Plithogenic Language. Advancing Uncertain Combinatorics through Graphization, Hyperization, and Uncertainization: Fuzzy, Neutrosophic, Soft, Rough, and Beyond 2025, 294.
- Fujita, T. Theoretical interpretations of large uncertain and hyper language models: Advancing natural uncertain and hyper language processing. Advancing Uncertain Combinatorics through Graphization, Hyperization, and Uncertainization: Fuzzy, Neutrosophic, Soft, Rough, and Beyond 2025, 245.
- Mihalcea, R.; Radev, D. Graph-Based Natural Language Processing and Information Retrieval; Cambridge University Press, 2011.
- Agarwal, V.; Joglekar, S.; Young, A.P.; Sastry, N. GraphNLI: A graph-based natural language inference model for polarity prediction in online debates. In Proceedings of the ACM Web Conference 2022; pp. 2729–2737.
- Li, W.; Peng, R.; Wang, Y.; Yan, Z. Knowledge graph based natural language generation with adapted pointer-generator networks. Neurocomputing 2020, 382, 174–187.
- Mihalcea, R.; Radev, D. Graph-Based Natural Language Processing and Information Retrieval. 2012.
- Dumitriu, A.; Molony, C.; Daluwatte, C. Graph-based natural language processing for the pharmaceutical industry. In Provenance in Data Science: From Data Models to Context-Aware Knowledge Graphs; Springer, 2020; pp. 75–110.
- Zadeh, L.A. Fuzzy sets. Information and Control 1965, 8, 338–353.
- Pawlak, Z. Rough sets. International Journal of Computer & Information Sciences 1982, 11, 341–356.
- Pawlak, Z. Rough sets and intelligent data analysis. Information Sciences 2002, 147, 1–12.
- Hussain, S.S.; Durga, N.; Hossein, R.; Ganesh, G. New Concepts on Quadripartitioned Single-Valued Neutrosophic Graph with Real-Life Application. International Journal of Fuzzy Systems 2022, 24, 1515–1529.
- Smarandache, F. A unifying field in Logics: Neutrosophic Logic. In Philosophy; American Research Press, 1999; pp. 1–141.
- Fujita, T.; Smarandache, F. A Dynamic Survey of Fuzzy, Intuitionistic Fuzzy, Neutrosophic, Plithogenic, and Extensional Sets; Neutrosophic Science International Association (NSIA), 2025.
- Smarandache, F. Plithogenic set, an extension of crisp, fuzzy, intuitionistic fuzzy, and neutrosophic sets: revisited; Infinite Study, 2018.
- Smarandache, F. Extension of soft set to hypersoft set, and then to plithogenic hypersoft set. Neutrosophic Sets and Systems 2018, 22, 168–170.
Table 1. Comparison of Graph, Hypergraph, and Superhypergraph.

| Concept | Notation | Edge Type | How it extends the model |
| --- | --- | --- | --- |
| Graph | $G=(V,E)$ | Edge $\{u,v\}$, $u\neq v$ | Uses pairwise edges; each edge joins exactly two distinct vertices. |
| Hypergraph | $H=(V,E)$, $E\subseteq\mathcal{P}(V)\setminus\{\emptyset\}$ | Hyperedge $e\subseteq V$, $e\neq\emptyset$ | Uses hyperedges; each hyperedge may join any nonempty subset of vertices. |
| Superhypergraph | $SHG=(V,E)$, $V,E\subseteq\mathcal{P}^n(V_0)$ | Superedge (nested set of sets) | Introduces nested objects via an n-fold iterated powerset, enabling hierarchical (multi-level) connectivity. |
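The three notions in Table 1 differ only in how deep the powerset iteration goes. This can be made concrete with a minimal Python sketch; the helper names `powerset` and `iterated_powerset` are ours, for illustration only:

```python
from itertools import chain, combinations

def powerset(s):
    """P(s): the set of all subsets of s, each as a frozenset."""
    items = list(s)
    return {frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))}

def iterated_powerset(base, n):
    """P^n(base): apply the powerset operation n times (n = 0 gives base itself)."""
    level = set(base)
    for _ in range(n):
        level = powerset(level)
    return level

V0 = {"a", "b", "c"}
# A hyperedge lives one level up: it is a nonempty element of P(V0).
hyperedge = frozenset({"a", "b", "c"})
assert hyperedge in iterated_powerset(V0, 1)
# A supervertex of a 2-SuperHyperGraph lives in P^2(V0): a set of vertex sets.
supervertex = frozenset({frozenset({"a"}), frozenset({"b", "c"})})
assert supervertex in iterated_powerset(V0, 2)
```

The membership checks mirror the table: graphs pick edges from 2-element subsets, hypergraphs from $\mathcal{P}(V)$, and superhypergraphs from the $n$-fold iterate.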
Table 2. Parallel terminology list: Graph vs. HyperGraph vs. SuperHyperGraph.

| Category | Graph | HyperGraph | SuperHyperGraph |
| --- | --- | --- | --- |
| Structure | Graph [1] | HyperGraph [29] | SuperHyperGraph [15,25] |
| Neural-network family | GNN (Graph Neural Network) | HGNN (HyperGraph Neural Network) [5,30,31,32] | SHGNN (SuperHyperGraph Neural Network) [33] |
| Directed / multi-edge variants (neural) | Directed GNN (DGNN); MultiGraph Neural Network (MGNN) [34,35,36] | Directed HyperGraph Neural Network [37,38,39]; MultiHyperGraph Neural Network [40] | Directed SuperHyperGraph Neural Network; Multi-SuperHyperGraph Neural Network [40] |
| Uncertainty modeling | Probabilistic Graph [41] | Probabilistic HyperGraph [42,43] | Probabilistic SuperHyperGraph [43] |
| DAG modeling | Directed Acyclic Graph [44,45] | Directed Acyclic HyperGraph [46] | Directed Acyclic SuperHyperGraph [46] |
| Molecular modeling | Molecular Graph [47,48] | Molecular HyperGraph [49,50] | Molecular SuperHyperGraph [51,52] |
| Attention models | Graph Attention Network (GAT) [53,54] | HyperGraph Attention Network (HGAT) [55,56] | SuperHyperGraph Attention Network (SHGAT) [57] |
| RAG (retrieval-augmented generation) | GraphRAG [58,59] | HyperGraphRAG [60] | SuperHyperGraphRAG |
| Representation learning | Graph Embedding [61] | HyperGraph Embedding [61,62] | SuperHyperGraph Embedding |
| Generative modeling | Graph Generation [63,64] | HyperGraph Generation [65] | SuperHyperGraph Generation |
| Convolutional models | Graph Convolutional Network (GCN) [66,67,68] | HyperGraph Convolutional Network (HGCN) [69,70,71] | SuperHyperGraph Convolutional Network (SHGCN) |
| Dynamics (temporal) | Dynamic Graph [72,73] | Dynamic HyperGraph [74,75,76] | Dynamic SuperHyperGraph [77] |
| Uncertain (Fuzzy/Neutrosophic/Plithogenic) [78] | Uncertain Graph [79,80] | Uncertain HyperGraph [81] | Uncertain SuperHyperGraph [15] |
| Weights / attributes | Weighted Graph [82,83] | Weighted HyperGraph [70,84] | Weighted SuperHyperGraph [85] |
Table 3. Concise comparison of GraphRAG, HyperGraphRAG, and SuperHyperGraphRAG.

| Concept | Index structure | Unit of retrieval | What the structure helps capture |
| --- | --- | --- | --- |
| GraphRAG | Graph $G=(V,E)$ | Relevant subgraph (nodes/edges) around the query entities | Pairwise relations (who-relates-to-whom), paths, neighborhoods, and evidence chains |
| HyperGraphRAG | Hypergraph $H=(V,E)$ | Relevant sub-hypergraph (vertices and hyperedges matched to the query) | Higher-order relations among multiple entities at once (group facts, co-occurrence, multi-actor events) |
| SuperHyperGraphRAG | Superhypergraph with nested hyperrelations | Relevant nested substructure (hyperedges and hyperedges-of-hyperedges) | Multi-level higher-order structure (hierarchical groupings, topic → claim → evidence layers, nested interactions) |
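The "unit of retrieval" column can be made concrete with a toy retriever over a hypergraph index. This is a minimal sketch under our own assumptions: the function names and the overlap-count scoring rule are illustrative, not the API of any actual RAG system:

```python
def retrieve_hyperedges(hyperedges, query, k=3):
    """Toy HyperGraphRAG step: rank hyperedges by how many query
    entities they contain, keep the top-k that actually intersect."""
    ranked = sorted(hyperedges, key=lambda e: len(e & query), reverse=True)
    return [e for e in ranked[:k] if e & query]

def retrieve_nested(superedges, query, k=3):
    """Toy SuperHyperGraphRAG step: keep superedges (sets of hyperedges)
    in which at least one member hyperedge matches the query."""
    members = {e for se in superedges for e in se}
    hits = retrieve_hyperedges(members, query, k)
    return [se for se in superedges if any(e in hits for e in se)]

facts = [
    frozenset({"aspirin", "fever", "adult-dose"}),   # a group fact (3 entities)
    frozenset({"ibuprofen", "inflammation"}),
    frozenset({"fever", "hydration"}),
]
topics = [frozenset(facts[:2]), frozenset(facts[2:])]  # one level of nesting

assert retrieve_hyperedges(facts, {"fever"}, k=2) == [facts[0], facts[2]]
assert len(retrieve_nested(topics, {"fever"})) == 2
```

The graph case would instead return edges on a shortest path between query entities; the nested case additionally surfaces which topic-level grouping each matched fact belongs to.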
Table 4. Concise comparison of Graph Generation, HyperGraph Generation, and SuperHyperGraph Generation.

| Concept | Generated object | Core relations captured | Typical outputs / uses |
| --- | --- | --- | --- |
| Graph Generation | A graph $G=(V,E)$ | Pairwise relations (edges) between two vertices | Synthetic networks; molecule graphs; link/structure completion |
| HyperGraph Generation | A hypergraph with $E\subseteq\mathcal{P}(V)\setminus\{\emptyset\}$ | Higher-order relations (hyperedges) among multiple vertices | Group interactions; co-author/topic groups; multi-entity chemical interactions |
| SuperHyperGraph Generation | A superhypergraph with nested hyperrelations (hyperedge-of-hyperedge structures) | Multi-level higher-order relations (nested, hierarchical interactions) | Hierarchical group systems; layered knowledge structures; nested interaction synthesis |
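As a baseline for the generation tasks in Table 4, an Erdős–Rényi-style sampler already illustrates the two structural steps: sample random hyperedges, then group them into superedges for one extra level of nesting. This is a minimal sketch with our own function names, not any published generation method:

```python
import random

def generate_hypergraph(vertices, m, min_size=2, max_size=4, seed=0):
    """Sample m distinct hyperedges, each a uniform random subset of the
    vertex set with between min_size and max_size vertices."""
    rng = random.Random(seed)
    V = sorted(vertices)
    edges = set()
    while len(edges) < m:
        size = rng.randint(min_size, min(max_size, len(V)))
        edges.add(frozenset(rng.sample(V, size)))
    return V, edges

def lift_to_superhypergraph(edges, groups, seed=0):
    """Lift to one extra level: randomly partition the hyperedges into
    `groups` superedges (each superedge is a set of hyperedges)."""
    rng = random.Random(seed)
    buckets = [[] for _ in range(groups)]
    for e in sorted(edges):  # sorted for a deterministic assignment order
        buckets[rng.randrange(groups)].append(e)
    return [frozenset(b) for b in buckets if b]

V, E = generate_hypergraph({"a", "b", "c", "d", "e"}, m=4)
super_edges = lift_to_superhypergraph(E, groups=2)
assert all(e <= set(V) for e in E) and len(E) == 4
```

Learned generators (diffusion- or RNN-based, as cited in the table) replace the uniform sampling with a trained distribution, but the output objects have the same shape.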
Table 6. Concise comparison: GCN vs. HGCN vs. SHGCN.

| Aspect | GCN | HGCN | SHGCN |
| --- | --- | --- | --- |
| Underlying structure | Graph $G=(V,E)$ | Hypergraph $H=(V,E)$, $E\subseteq\mathcal{P}(V)\setminus\{\emptyset\}$ | $n$-SuperHyperGraph with $V,E\subseteq\mathcal{P}^n(V_0)$ |
| Adjacency / incidence | Adjacency matrix $A$ | Incidence matrix $\mathbf{H}$ | Super-incidence between supervertices and superedges |
| Propagation operator | $\hat{A}=A+I$, $\tilde{D}^{-1/2}\hat{A}\tilde{D}^{-1/2}$ | $D_v^{-1/2}\mathbf{H}W_eD_e^{-1}\mathbf{H}^{\top}D_v^{-1/2}$ | Normalized super-incidence aggregation |
| One layer (typical) | $X^{(l+1)}=\sigma(\tilde{D}^{-1/2}\hat{A}\tilde{D}^{-1/2}X^{(l)}\Theta^{(l)})$ | $X^{(l+1)}=\sigma(D_v^{-1/2}\mathbf{H}W_eD_e^{-1}\mathbf{H}^{\top}D_v^{-1/2}X^{(l)}\Theta^{(l)})$ | The same form, with $\mathbf{H}$ replaced by the super-incidence matrix |
| Interaction modeled | Pairwise message passing along edges | Multiway aggregation through hyperedges | Multiway aggregation through superedges among nested (multi-level) supervertices |
| Specialization / reduction | — | If every hyperedge has size 2, becomes a graph model and aligns with GCN-style propagation | If $n=0$, supervertices are singletons and SHGCN reduces to an HGCN on a hypergraph; if hyperedges are size 2, further reduces to GCN |
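The GCN and HGCN propagation rules compared in Table 6 fit in a few lines of NumPy. This sketch follows the standard Kipf–Welling GCN and HGNN-style hypergraph normalizations; the SHGCN analogue is omitted, since it depends on the super-incidence construction defined in the text:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One GCN layer: sigma(D^{-1/2} (A + I) D^{-1/2} X W), sigma = ReLU."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = np.diag(A_hat.sum(1) ** -0.5)     # D^{-1/2} of A_hat
    return np.maximum(0.0, d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W)

def hgnn_layer(H, X, W, edge_w=None):
    """One hypergraph conv layer:
    sigma(Dv^{-1/2} H W_e De^{-1} H^T Dv^{-1/2} X W)."""
    n, m = H.shape
    w = np.ones(m) if edge_w is None else edge_w
    De_inv = np.diag(1.0 / H.sum(0))               # hyperedge degrees
    Dv = np.diag(((H * w).sum(1)) ** -0.5)         # weighted vertex degrees
    return np.maximum(0.0, Dv @ H @ np.diag(w) @ De_inv @ H.T @ Dv @ X @ W)

A = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)  # triangle graph
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], float)   # 4 vertices, 2 hyperedges
out_g = gcn_layer(A, np.eye(3), np.eye(3))   # shape (3, 3)
out_h = hgnn_layer(H, np.eye(4), np.eye(4))  # shape (4, 4)
```

The reduction row of the table is visible here: if every column of `H` has exactly two ones, `H @ De_inv @ H.T` acts like a normalized adjacency with self-loops, recovering GCN-style propagation.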
Table 7. Concise comparison: Graph Embedding vs. HyperGraph Embedding vs. SuperHyperGraph Embedding.

| Aspect | Graph Embedding | HyperGraph Embedding | SuperHyperGraph Embedding |
| --- | --- | --- | --- |
| Underlying structure | Graph $G=(V,E)$ | Hypergraph $H=(V,E)$ | $n$-SuperHyperGraph with $V,E\subseteq\mathcal{P}^n(V_0)$ |
| Object to embed | Vertex (and/or edge, whole-graph) | Vertex and/or hyperedge | Supervertex and/or superedge (possibly multi-level) |
| Embedding map (typical) | $f:V\to\mathbb{R}^d$ | $f:V\to\mathbb{R}^d$, $g:E\to\mathbb{R}^d$ | $f:V\to\mathbb{R}^d$, $g:E\to\mathbb{R}^d$ (with level-aware features) |
| Main signal used | Pairwise adjacency / random-walk proximity / neighborhoods | Incidence (vertex–hyperedge) and multiway co-occurrence | Incidence between nested supervertices and superedges; cross-level co-membership |
| Typical objective (example) | Preserve neighborhood similarity | Preserve incidence | Preserve super-incidence |
| Reduction / specialization | — | If every hyperedge has size 2, reduces to a graph embedding setting | If $n=0$, supervertices are singletons and reduces to hypergraph embedding; if moreover every hyperedge has size 2, reduces to graph embedding |
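The "preserve incidence" objective in Table 7 has a simple closed-form instance: factor the vertex–hyperedge incidence matrix by truncated SVD, so that inner products of vertex and hyperedge embeddings approximate incidence. This is an illustrative choice of objective, not a method from the cited literature:

```python
import numpy as np

def incidence_embedding(H, d):
    """Embed vertices and hyperedges so that inner products approximate
    the incidence matrix: H ~= Z_v @ Z_e.T via rank-d truncated SVD."""
    U, S, Vt = np.linalg.svd(H, full_matrices=False)
    root = np.sqrt(S[:d])
    Z_v = U[:, :d] * root     # one row per vertex
    Z_e = Vt[:d].T * root     # one row per hyperedge
    return Z_v, Z_e

H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], float)  # 4 vertices, 2 hyperedges
Zv, Ze = incidence_embedding(H, d=2)
# rank(H) = 2, so the rank-2 factorization reconstructs H exactly.
assert np.allclose(Zv @ Ze.T, H)
```

Neural hypergraph embedding methods optimize comparable reconstruction or co-occurrence objectives by gradient descent; for the superhypergraph case the same factorization would apply level by level to each super-incidence matrix.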
Table 8. Concise comparison: Graph-Based NLP vs. HyperGraph-Based NLP vs. SuperHyperGraph-Based NLP.

| Category | Graph-Based NLP | HyperGraph-Based NLP | SuperHyperGraph-Based NLP |
| --- | --- | --- | --- |
| Encoder | GNN | HGNN | SHGNN |
| Structure | $G=(V,E)$, pairwise edges | $H=(V,E)$, hyperedges $e\subseteq V$ | $V,E\subseteq\mathcal{P}^n(V_0)$, nested levels |
| Relations | Pairwise | Multiway (hyperedges) | Nested multiway (multi-level) |
| Typical units | Tokens, entities, dependency edges | Phrases, co-reference sets, events | Nested discourse units (token ⊂ phrase ⊂ sentence ⊂ doc) |
| Learner | GCN / GAT | HGCN / HGAT | SHGCN / SHGAT |
| Benefit | Local syntax/semantics | Higher-order semantics | Hierarchical, multi-resolution semantics |
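The token ⊂ sentence ⊂ document nesting in Table 8 is itself a low-order superhypergraph over the token set. A minimal sketch (sentence splitting on "." is a deliberate simplification, and the function name is ours):

```python
def nested_text_levels(doc):
    """Build token -> sentence -> document nesting: each sentence is a set
    of tokens (a hyperedge over tokens), and the document is a set of
    sentence-sets (a superedge over those hyperedges)."""
    sentences = [s.split() for s in doc.split(".") if s.strip()]
    tokens = {t for s in sentences for t in s}          # level 0: vertices
    sentence_sets = [frozenset(s) for s in sentences]   # level 1: hyperedges
    document = frozenset(sentence_sets)                 # level 2: superedge
    return tokens, sentence_sets, document

doc = "graphs model pairs. hypergraphs model groups. superhypergraphs nest groups."
tokens, sents, document = nested_text_levels(doc)
assert len(sents) == 3 and len(document) == 3
assert all(s <= tokens for s in sents)   # every hyperedge lies over the vertex set
```

An SHGNN-style encoder would then aggregate token features within each sentence-set and sentence features within the document-level superedge, giving the multi-resolution semantics the table describes.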
Disclaimer/Publisher's Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).