Preprint
Article

This version is not peer-reviewed.

Disproving the Unique Games Conjecture

Submitted:

06 June 2025

Posted:

11 June 2025

Read the latest preprint version here

Abstract
The Vertex Cover Problem, a fundamental NP-complete challenge, seeks the smallest set of vertices in an undirected graph $G = (V, E)$ that covers all edges. This paper presents the \texttt{find\_vertex\_cover} algorithm, an approximation method that partitions $E$ into two claw-free subgraphs using the Burr-Erd\H{o}s-Lov\'asz (1976) approach, computes exact vertex covers for each via the Faenza, Oriolo, and Stauffer (2011) technique in $\mathcal{O}(n^3)$ time, and recursively refines the solution on residual edges. With a worst-case runtime of $ \mathcal{O}(n^4)$, where $n = |V|$, the algorithm achieves an approximation ratio less than 2, surpassing the standard 2-approximation. Implemented in Python, this method leverages efficient triangle detection to enhance performance in claw-free settings, potentially impacting fine-grained complexity conjectures like the Unique Games Conjecture if validated across diverse graph classes. Practically, it aids applications in network design, scheduling, and bioinformatics by providing near-optimal solutions. This work bridges theoretical advancements and practical utility, offering a promising heuristic for vertex cover approximation.
Keywords: 
;  ;  ;  ;  

1. Introduction

The Minimum Vertex Cover (MVC) problem is a fundamental challenge in combinatorial optimization and graph theory. Given an undirected graph, the goal is to find the smallest set of vertices that "covers" every edge—meaning at least one endpoint of each edge is included. Despite its simple formulation, MVC is computationally intractable for large graphs, being one of the first problems proven NP-hard [1]. This status makes it a benchmark for understanding the limits of efficient computation.
While finding an exact solution is impractical for large instances, approximation algorithms offer a practical alternative. A basic greedy approach achieves a 2-approximation—guaranteeing a vertex cover at most twice the optimal size. This result, credited to Gavril and Yannakakis [2], remains a cornerstone of approximation theory. Subsequent work has refined this factor slightly [3,4].
The hardness of approximation for MVC was further cemented by Dinur and Safra (2005), who used the PCP theorem to prove that no polynomial-time algorithm can achieve a ratio better than 1.3606 unless P = N P  [5]. Later work tightened this bound to 2 ϵ for any ϵ > 0 under standard complexity assumptions [6]. Most strikingly, if the Unique Games Conjecture (UGC) holds, then no constant-factor approximation better than 2 is possible [7]. These results highlight the deep theoretical barriers surrounding MVC and the challenges in improving its approximations.
The find_vertex_cover algorithm approximates a minimum vertex cover for an undirected graph G = ( V , E ) by leveraging the Burr-Erdos-Lovász (1976) method to partition edges into two claw-free subgraphs, computing exact vertex covers for each using the Faenza, Oriolo, and Stauffer (2011) approach in O ( n 3 ) time per subgraph, and recursively refining the solution on residual edges [8,9]. With a worst-case runtime of O ( n 4 ) , where n = | V | , the algorithm achieves an approximation ratio less than 2, improving upon the standard 2-approximation. This method holds potential theoretical impact, challenging the Unique Games Conjecture, while offering practical utility in network design and bioinformatics.

2. State-of-the-Art Algorithms

The Minimum Vertex Cover (MVC) problem, being NP-hard, has been the focus of extensive research, leading to the development of numerous heuristic and approximation algorithms. Recent advancements in this area include:
  • Local Search Techniques: Local search methods have emerged as some of the most effective approaches for solving the MVC problem, often outperforming other heuristics in terms of both solution quality and runtime efficiency [10]. Notable algorithms in this category include:
    -
    FastVC2+p: Introduced in 2017, this algorithm is highly efficient for solving large-scale instances of the MVC problem [11].
    -
    MetaVC2: Proposed in 2019, MetaVC2 integrates multiple advanced local search techniques into a highly configurable framework, making it a versatile tool for MVC optimization [12].
    -
    TIVC: Developed in 2023, TIVC employs a 3-improvements framework with tiny perturbations, achieving state-of-the-art performance on large graphs [13].
  • Machine Learning Approaches: Reinforcement learning-based solvers, such as S2V-DQN, have shown potential in constructing MVC solutions [14]. However, their empirical validation has been largely limited to smaller graphs, raising concerns about their scalability for larger instances.
  • Genetic Algorithms and Heuristics: While genetic algorithms and other heuristics have been explored for the MVC problem, they often face challenges in scalability and efficiency, particularly when applied to large-scale graphs [15].

3. Triangle Detection Algorithm (Aegypti)

3.1. Introduction

The Triangle Detection Problem involves determining whether an undirected graph contains at least one triangle—a set of three vertices where each pair is connected by an edge. This problem is a cornerstone of graph theory with wide-ranging applications, including social network analysis, clustering, and computational biology. The aegypti package offers a novel algorithm for this task, claiming a linear-time complexity of O ( n + m ) , where n is the number of vertices and m is the number of edges [16]. This efficiency challenges traditional complexity bounds and positions aegypti as a potential breakthrough in graph algorithm design.
The algorithm employs a Depth-First Search (DFS)-based approach, optimized to traverse the graph and identify triangles in a single pass, making it highly efficient for both sparse and dense graphs when validated.

3.2. Algorithm Overview

3.2.1. Key Steps:

  • Graph Traversal with DFS: The algorithm initiates a DFS from each unvisited vertex, exploring the graph’s edges systematically to detect potential triangular structures.
  • Triangle Identification: During traversal, it checks for a closing edge (e.g., from a parent to a neighbor) that completes a triangle. This is achieved by maintaining parent-child relationships in the DFS stack.
  • Early Termination (Optional): For the decision version, the algorithm can stop upon finding the first triangle, while the listing version continues to identify all triangles.

3.2.2. Output:

Returns a list of sets, where each set contains three vertices forming a triangle, or None if no triangles are found. With the first_triangle=True parameter, it returns after the first triangle is detected.

3.3. Runtime Analysis

The aegypti algorithm’s runtime is claimed to be O ( n + m ) , a linear complexity relative to the graph’s representation size. Below is an explanation of this bound.

3.4. Notation:

  • n = | V | : Number of vertices.
  • m = | E | : Number of edges.
  • The input graph is represented using an adjacency list, where the total size is O ( n + m ) .

3.4.1. Runtime: O ( n + m )

Breakdown:

  • Initialization: Setting up the DFS stack and marking visited vertices: O ( n ) for n nodes.
  • Traversal: Each edge is explored at most twice (once per direction in the undirected graph) during DFS. The total cost of visiting neighbors is O ( m ) , as each edge contributes to the degree sum 2 m (by the Handshaking Lemma).
  • Triangle Checking: For each vertex and its neighbors, the algorithm checks for a closing edge. This check is O ( 1 ) per edge pair using adjacency list lookups, and the total number of such checks is bounded by the number of edges processed, contributing O ( m ) .
  • Early Termination (Decision Version): If seeking only one triangle (e.g., first_triangle=True), the algorithm may halt after O ( n + m ) work, depending on the first triangle’s location.

Total Cost:

The sum of initialization ( O ( n ) ) and traversal with checks ( O ( m ) ) yields O ( n + m ) . This linear time arises because:
  • The DFS visits each vertex once, costing O ( n ) .
  • Each edge is processed a constant number of times (at most twice), costing O ( m ) .
  • Adjacency list representation ensures edge lookups are O ( 1 ) , avoiding higher complexities like O ( n 2 ) seen in adjacency matrix approaches.
For the listing version (all triangles), the runtime remains O ( n + m ) for detection, with an additional O ( T ) for outputting T triangles, where T can be O ( n 3 ) in the worst case (e.g., a complete graph). However, the base detection time is still O ( n + m ) .

3.5. Impact and Context

The aegypti triangle detection algorithm offers significant potential:
  • Efficiency: The O ( n + m ) runtime surpasses traditional bounds like O ( m 4 / 3 ) (sparse triangle hypothesis) and O ( n ω ) (dense case, where ω < 2 . 372 ), if validated.
  • Practical Utility: Available via pip install aegypti, it is easily integrated into Python workflows for graph analysis.
  • Theoretical Significance: If proven correct against 3SUM-hard instances, it could refute fine-grained complexity conjectures, impacting related problems like clique detection.
  • Limitations: The linear-time claim requires empirical and theoretical validation. In dense graphs with many triangles, output processing may dominate practical runtime.
This algorithm represents a promising advancement in graph processing, bridging theory and practice with its accessible implementation.

4. Claw Detection Algorithm

4.1. Introduction

A claw in graph theory is a subgraph isomorphic to K 1 , 3 , consisting of a central vertex connected to three leaf vertices that are not connected to each other. Detecting claws in a graph is a fundamental problem with applications in network analysis, bioinformatics, and social network studies, as claws can indicate specific structural patterns. The algorithm find_claw_coordinates, implemented in the mendive package, efficiently detects claws in an undirected graph using a novel approach that builds on linear-time triangle detection [17].
The algorithm operates by examining each vertex’s neighborhood to identify sets of three non-adjacent neighbors, which, together with the central vertex, form a claw. It uses the aegypti package’s find_triangle_coordinates function to detect triangles in the complement of each neighbor-induced subgraph, capitalizing on its claimed O ( n + m ) runtime for triangle detection [16].

4.2. Algorithm Overview

4.2.1. Key Steps:

  • Vertex Iteration: For each vertex i in the graph, check if its degree is at least 3 (a claw requires a center with at least three neighbors). Skip vertices with fewer neighbors.
  • Neighbor Subgraph and Complement: Extract the induced subgraph of i’s neighbors. Compute the complement of this subgraph, where an edge exists if the corresponding vertices are not connected in the original subgraph.
  • Triangle Detection with Aegypti: Apply the aegypti package’s find_triangle_coordinates function to the complement subgraph. A triangle in the complement indicates three neighbors of i that form an independent set (no edges among them), which, combined with i, forms a claw.
  • Claw Collection: If first_claw=True, return the first claw found and stop. If first_claw=False, collect all claws by combining each triangle in the complement with the center vertex i.

4.2.2. Output:

Returns a list of sets, where each set { i , v 1 , v 2 , v 3 } represents a claw with center i and leaves v 1 , v 2 , v 3 . Returns None if no claws are found.

4.3. Runtime Analysis

The runtime of find_claw_coordinates depends on the graph’s structure, particularly the maximum degree Δ , and varies based on the first_claw parameter.

4.4. Notation:

  • n = | V | : Number of vertices.
  • m = | E | : Number of edges.
  • deg ( i ) : Degree of vertex i.
  • Δ : Maximum degree in the graph.
  • C: Number of claws in the graph.
  • For a vertex i, the complement subgraph has n = deg ( i ) vertices and up to m deg ( i ) 2 edges.

4.4.1. claws.find_claw_coordinates(G, first_claw=True):  O ( m · Δ )

Breakdown:

  • Outer Loop: Iterates over all vertices until a claw is found, at most n iterations. Checking graph.degree(i) in NetworkX: O ( 1 ) .
  • Per Vertex i:
    -
    Skip if deg ( i ) < 3 : O ( 1 ) .
    -
    Extract neighbors and create induced subgraph: O ( deg ( i ) + edges in subgraph ) , bounded by O ( deg ( i ) 2 ) .
    -
    Compute complement: O ( deg ( i ) 2 ) = O ( deg ( i ) 2 ) .
    -
    Run aegypti.find_triangle_coordinates with first_triangle=True:  O ( n + m ) = O ( deg ( i ) + deg ( i ) 2 ) = O ( deg ( i ) 2 ) , since aegypti runs in linear time relative to the subgraph size.
    -
    Process one claw (if found): O ( 1 ) .
    -
    Total per vertex: O ( deg ( i ) 2 ) .
  • Total Cost:
    -
    Worst case: No claws exist, so all vertices are processed.
    -
    Sum over all vertices: i O ( deg ( i ) 2 ) .
    -
    By the Handshaking Lemma, i deg ( i ) = 2 m .
    -
    Bound the sum: i deg ( i ) 2 Δ · i deg ( i ) = Δ · 2 m .
    -
    Therefore, total runtime: O ( m · Δ ) .

Why O ( m · Δ ) ?

The deg ( i ) 2 term arises because both subgraph construction and triangle detection in the complement scale quadratically with the number of neighbors. Summing deg ( i ) 2 over all vertices introduces Δ , the maximum degree, as the worst-case degree per vertex. The factor m comes from the total degree sum 2 m , reflecting the graph’s edge count. In sparse graphs ( Δ = O ( 1 ) ), this approaches O ( m ) ; in dense graphs ( Δ = O ( n ) , m = O ( n 2 ) ), it becomes O ( n 3 ) .

4.4.2. claws.find_claw_coordinates(G, first_claw=False):  O ( m · Δ + C )

Breakdown:

  • Outer Loop: Iterates over all n vertices.
  • Per Vertex i:
    -
    Same as above: Subgraph, complement, and triangle detection cost O ( deg ( i ) 2 ) .
    -
    aegypti lists all triangles in the complement: Still O ( deg ( i ) 2 ) , as it’s linear in the subgraph size, but now processes all triangles.
    -
    For each triangle, form a claw: O ( 1 ) .
    -
    Number of triangles per complement: Up to deg ( i ) 3 , but this is output-sensitive.
  • Total Cost:
    -
    Base computation (excluding output): i O ( deg ( i ) 2 ) = O ( m · Δ ) , as above.
    -
    Output cost: Each claw corresponds to one triangle in some complement subgraph. With C claws, the output processing (storing and returning) takes O ( C ) .
    -
    Total: O ( m · Δ + C ) .

Why O ( m · Δ + C ) ?

The m · Δ term is identical to the first_claw=True case, covering the cost of processing all vertices and their neighborhoods. The additional C term accounts for the output-sensitive nature of listing all claws. In graphs with many claws (e.g., C = O ( n 3 ) in a complete graph), this term dominates. The separation of computation ( m · Δ ) and output (C) reflects the algorithm’s efficiency in finding claws versus the cost of reporting them.

4.5. Impact and Context

This algorithm provides a robust solution for claw detection in general graphs:
  • Efficiency: The O ( m · Δ ) runtime for first_claw=True makes it practical for deciding claw-freeness, especially in sparse graphs. The listing version’s O ( m · Δ + C ) runtime scales well when C is small.
  • Aegypti’s Role: The algorithm’s efficiency hinges on aegypti’s claimed linear-time triangle detection, which, if validated against 3SUM-hard instances, could challenge fine-grained complexity conjectures (e.g., sparse triangle hypothesis) [16].
  • Applications: Useful for identifying structural patterns in networks, such as social graphs or biological networks, where claws indicate specific connectivity motifs.
  • Limitations: In dense graphs with high Δ , the runtime can grow significantly. The output-sensitive C term may dominate in graphs with many claws.
This algorithm, available via pip install mendive, bridges theoretical innovation with practical utility, offering a powerful tool for graph analysis.

5. Burr-Erdos-Lovász Edge Partitioning Algorithm

5.1. Algorithm Overview

The Burr-Erdos-Lovász (BEL) algorithm partitions the edges of a graph into two subsets such that each subset induces a claw-free subgraph [8]. A claw is a star graph K 1 , 3 consisting of a central vertex connected to three pairwise non-adjacent vertices.

5.1.1. Core Strategy

The algorithm employs a degree-based greedy assignment with local claw avoidance approach:
  • Vertex Prioritization: Process vertices in decreasing order of degree to handle potential claw centers first.
  • Local Claw Detection: For each edge assignment, check if adding the edge would create a claw by examining neighborhood structure.
  • Greedy Distribution: Distribute incident edges of high-degree vertices between partitions to prevent claw formation.
  • Conservative Fallback: If greedy assignment fails, use degree-bounded partitioning to guarantee claw-free property.

5.1.2. Key Components

Potential Claw Center Identification

  • find_potential_claw_centers(): Identifies vertices with degree 3 as potential claw centers.
  • Time complexity: O ( n ) where n = | V | .

Local Claw Detection

  • would_create_claw(): Checks if adding an edge to a partition would create a claw.
  • For each endpoint of the new edge, examines the complement subgraph of the neighbors.
  • Verifies that whether this complement subgraph contains a triangle (forming a claw) or not.
  • Time complexity: O ( m + n 2 + Δ 2 ) per edge, where Δ is maximum degree.

Greedy Edge Assignment

For each vertex v in decreasing degree order:
incident _ edges = { ( v , u ) : u N ( v ) }
For each edge e incident _ edges :
if ¬ would _ create _ claw ( E 1 , e ) then add e to E 1
else if ¬ would _ create _ claw ( E 2 , e ) then add e to E 2
else add e to smaller partition

Conservative Fallback Strategy

If the greedy approach fails verification:
  • fallback_partition(): Ensures no vertex has degree > 2 in either partition.
  • Maintains degree counters for each partition.
  • Guarantees claw-free property since maximum degree 2 cannot form claws.

5.2. Running Time Analysis

We analyze the running time of our improved edge partitioning algorithm for creating claw-free subgraphs from a graph G = ( V , E ) with n = | V | vertices and m = | E | edges.

5.2.1. Algorithm Steps Analysis

Step 1: Vertex Sorting

Sort vertices by degree in decreasing order:
  • Computing degrees: O ( n + m ) .
  • Sorting: O ( n log n ) .
  • Total: O ( n log n + m ) .

Step 2: Claw Center Identification

Identify vertices with degree 3 :
  • Single pass through vertices: O ( n ) .

Step 3: Edge Processing

  • Build Graph from Current Partition: O ( m ) since adding each of the m edges takes constant time.
  • Compute Complement Graph: Computing complement requires checking all possible O ( n 2 ) vertex pairs.
  • For each vertex v with degree 3 :
    -
    Get neighbors: O ( deg ( v ) ) per vertex.
    -
    Create subgraph:
    *
    Create induced subgraph on adjacent vertices.
    *
    Time: O ( deg ( v e r t e x ) 2 ) O ( Δ 2 ) .
    *
    This is because we need to check all pairs among the neighbors.
    -
    Triangle detection per Edge: For edge ( v , u ) , check if adding creates claw using the Triangle Finding Problem:
    *
    Given: triangles.find_triangle_coordinates runs in O ( | V s u b | + | E s u b | ) .
    *
    | V s u b | = deg ( v e r t e x ) Δ .
    *
    | E s u b | deg ( v e r t e x ) 2 = O ( deg ( v e r t e x ) 2 ) O ( Δ 2 ) .
    *
    Time: O ( Δ + Δ 2 ) = O ( Δ 2 ) .
    -
    Total per Vertex: O ( Δ ) + O ( Δ 2 ) + O ( Δ 2 ) = O ( Δ 2 ) .
  • Overall Time Complexity: Combining all steps per edge:
    T ( n , m , Δ ) = O ( m ) + O ( n 2 ) + O ( Δ 2 ) = O ( m + n 2 + Δ 2 ) .

Step 4: Remaining Edge Assignment

Assign unprocessed edges alternately:
  • At most m edges remain.
  • Simple assignment: O ( m ) .

Step 5: Verification

Verify both partitions are claw-free:
  • For each partition, check whether they are claw-free using Mendive algorithm [17].
  • Total: O ( m · Δ ) for both partitions.

5.2.2. Overall Running Time Analysis

Combining all steps:
T ( n , m ) = O ( n log n + m ) + O ( n ) + O ( m 2 + m · n 2 + m · Δ 2 ) + O ( m ) + O ( m · Δ )
= O ( n log n + m 2 + m · n 2 + m · Δ 2 + m · Δ )
Since Δ n 1 and m < n 2 , we have:
T ( n , m ) = O ( n 4 )
is predominant.

5.2.3. Conservative Fallback Analysis

The fallback algorithm has better worst-case complexity:
  • Degree-bounded assignment: O ( m ) time.
  • Guarantees claw-free property with maximum degree 2 per partition.
  • Total fallback time: O ( m ) .

5.3. Correctness Guarantees

5.3.1. Claw-Free Property

The algorithm ensures claw-free partitions through:
  • Explicit Claw Detection: Before adding any edge, check if it would create a claw.
  • Local Neighborhood Analysis: Examine all possible triangles in the complement of neighbors.
  • Conservative Fallback: Degree-bounded partitioning guarantees no claws can form.

5.3.2. Partition Completeness

Every edge in the original graph is assigned to exactly one partition:
E 1 E 2 = E and E 1 E 2 = .

5.4. Practical Performance

5.4.1. Algorithm Efficiency

  • Early Termination: High-degree vertices processed first minimize later conflicts.
  • Local Decision Making: No global optimization required.
  • Incremental Processing: Each edge decision is independent.

5.4.2. Quality Metrics

  • Partition Balance: Greedy approach attempts to balance partition sizes.
  • Edge Preservation: No edges are removed, only redistributed.
  • Structural Preservation: Maintains graph connectivity properties within partitions.

5.5. Conclusion

Our implementation of the Burr-Erdos-Lovász edge partitioning algorithm provides a polynomial-time solution with complexity O ( n 4 ) in the worst case, but performs much better on practical graphs with bounded degree. The algorithm successfully partitions graph edges into two claw-free subgraphs through:
  • Degree-based vertex prioritization.
  • Local claw detection without exhaustive enumeration.
  • Conservative fallback guaranteeing correctness.
  • Efficient verification of the claw-free property.
The approach avoids the exponential complexity of explicit claw enumeration while maintaining polynomial-time guarantees and practical efficiency for real-world graph instances.

6. Faenza-Oriolo-Stauffer Algorithm for Minimum Vertex Cover in Claw-Free Graphs

6.1. Problem Statement and Theoretical Foundation

6.1.1. Vertex Cover Problem

Given a graph G = ( V , E ) and vertex weights w : V R + , find a minimum weight vertex cover C V such that every edge has at least one endpoint in C.

6.1.2. Connection to Maximum Weighted Stable Set

The minimum vertex cover problem is intimately connected to the maximum weighted stable set problem through the fundamental relationship:
Theorem 1
(Vertex Cover-Independent Set Duality). For any graph G = ( V , E ) , if S is a maximum weighted stable set, then V S is a minimum weighted vertex cover.
This duality allows us to solve minimum vertex cover by finding maximum weighted stable set and taking its complement.

6.2. Algorithm Overview

6.2.1. Core Strategy

The Faenza-Oriolo-Stauffer (FOS) algorithm leverages the special structure of claw-free graphs to solve maximum weighted stable set in polynomial time, which directly yields the minimum vertex cover solution.

Key Steps

  • Maximum Weighted Stable Set: Use FOS algorithm to find optimal stable set S * .
  • Complement Construction: Compute vertex cover as C * = V S * .
  • Verification: Ensure C * covers all edges.

6.2.2. Implementation Components

Graph Preprocessing

  • Build adjacency lists for efficient neighborhood queries.
  • Handle vertex weights (default to unit weights if unspecified).
  • Construct complement graph for clique-finding operations.

Stable Set Computation

The implementation uses several approaches based on graph structure:
Base Cases:
  • Empty graph: Return ( , 0 ) .
  • Single vertex: Return ( { v } , w ( v ) ) .
  • Clique: Return heaviest vertex ( max v w ( v ) , max v w ( v ) ) .
General Case:
  • Find maximal cliques in complement graph (maximal stable sets in original).
  • Compute weight of each stable set: v S w ( v ) .
  • Select stable set with maximum total weight.

Vertex Cover Extraction

  • ( S * , w * ) find_maximum_weighted_stable_set()
  • C * V S *
  • return C *

6.3. Runtime Complexity Analysis

6.3.1. Component-Wise Analysis

Graph Construction

  • Adjacency list construction: O ( m ) .
  • Complement graph construction: O ( n 2 ) .
  • Total preprocessing: O ( n 2 + m ) .

Clique Enumeration in Complement

The bottleneck operation is finding all maximal cliques in the complement graph:
  • General graphs: Exponential in worst case.
  • Claw-free graphs: Polynomial due to structural properties.
  • Implementation cost: Uses NetworkX’s find_cliques().

6.3.2. Theoretical Complexity

FOS Algorithm Guarantees

The original Faenza-Oriolo-Stauffer paper establishes:
Theorem 2
(FOS Complexity). The maximum weighted stable set problem on claw-free graphs can be solved in O ( n 3 ) time [9].

Implementation Reality

The provided implementation has different complexity characteristics:
Worst-case complexity:
T ( n , m ) = O ( n 2 ) + O ( maximal cliques enumeration )
= O ( n 2 ) + O ( 3 n / 3 ) ( general case )
= O ( 3 n / 3 ) ( dominated by clique enumeration )
Claw-free graph specialization: For claw-free graphs, the number of maximal stable sets is polynomially bounded, leading to:
T claw - free ( n , m ) = O ( n 2 + p ( n ) ) ,
where p ( n ) is a polynomial depending on the specific claw-free structure.

6.3.3. Space Complexity

  • Graph storage: O ( n + m ) .
  • Complement graph: O ( n 2 ) .
  • Clique enumeration: O ( number of maximal cliques ) .
  • Total: O ( n 2 + maximal cliques ) .

6.4. Algorithmic Properties

6.4.1. Correctness

Theorem 3
(Correctness). If G is claw-free and S * is a maximum weighted stable set, then C * = V S * is a minimum weighted vertex cover.
  • C * is a vertex cover: Every edge { u , v } has u S * or v S * (since S * is stable), so u C * or v C * .
  • C * is minimum weight: By vertex cover-stable set duality.

6.4.2. Optimality

  • Exact solution: Finds optimal vertex cover (not approximation).
  • Weight preservation: Correctly handles arbitrary positive weights.
  • Structure exploitation: Leverages claw-free property for efficiency.

6.5. Practical Considerations

6.5.1. Performance Characteristics

Best Case Scenarios

  • Trees: Linear number of maximal stable sets, O ( n 2 ) time.
  • Sparse claw-free graphs: Few maximal stable sets, near-optimal performance.
  • Graphs with large stable sets: Complement has small vertex covers.

Challenging Cases

  • Dense claw-free graphs: Many maximal stable sets to enumerate.
  • Near-complete graphs: Complement graph construction expensive.
  • Graphs with many small stable sets: Enumeration overhead.

6.5.2. Implementation Limitations

  • Clique enumeration dependency: Relies on general-purpose algorithm.
  • Memory usage: Stores entire complement graph.
  • No claw-free verification: Assumes input is claw-free.

6.6. Algorithm Verification

6.6.1. Correctness Checking

The implementation provides verification methods:
  • verify_stable_set(): Confirms no adjacent vertices in stable set.
  • Implicit vertex cover verification: C * = V S * covers all edges by construction.

6.6.2. Edge Coverage Guarantee

For any edge { u , v } E :
u S * v S * u C * v C *
{ u , v } is covered by C *

6.7. Conclusion

The Faenza-Oriolo-Stauffer approach to minimum vertex cover in claw-free graphs provides a theoretically sound and practically implementable solution. While the current implementation may not achieve the optimal O ( n 3 ) complexity of the original paper due to its reliance on general clique enumeration, it correctly solves the problem by exploiting the vertex cover-stable set duality.
The algorithm’s effectiveness depends heavily on the structure of the input graph, performing best on sparse claw-free graphs with few maximal stable sets. For dense graphs, the complement graph construction and clique enumeration can become computational bottlenecks.
Future optimizations could include: implementing the specialized O ( n 3 ) algorithm directly, adding claw-free graph verification, and using more efficient data structures for complement graph operations.

7. Research Data

A Python implementation, named Alonso: Approximate Vertex Cover Solver (in tribute to the legendary Cuban ballet dancer and cultural icon Alicia Alonso), has been developed to solve the Minimum Vertex Cover Problem (Figure A1, p. 20). This implementation is publicly accessible through the Python Package Index (PyPI) [18]. At its core, the algorithm leverages the find_claw_coordinates() function from the Mendive library to find the claws in an undirected graph [17]. By constructing an approximate solution, the algorithm guarantees an approximation ratio less than 2 for the Minimum Vertex Cover Problem.

8. Algorithm Correctness

Theorem 4.
The algorithm described in (Figure A1, p. 20) computes a valid vertex cover for any undirected graph G = ( V , E ) .
Proof. 
We prove by induction on the number of edges m = | E | in the graph G.

8.1. Base Case:

Consider graphs with m = 0 edges.
  • If G has no nodes ( | V | = 0 ), the algorithm returns an empty set, which is a valid vertex cover since there are no edges to cover.
  • If G has nodes but no edges ( | E | = 0 ), the algorithm checks graph.number_of_edges() == 0 and returns an empty set. Since there are no edges, the empty set is a valid vertex cover.
Thus, the base case holds: the algorithm returns a valid vertex cover when m = 0 .

8.2. Inductive Hypothesis:

Assume that for any graph G = ( V , E ) with | E | < m , the algorithm find_vertex_cover returns a valid vertex cover.

Inductive Step:

Let G = ( V , E ) be an undirected graph with | E | = m 1 . We analyze the algorithm’s execution on G.
First, the algorithm creates a working copy of G, removes self-loops, and removes isolated nodes:
  • Self-loops ( u , u ) do not affect vertex cover requirements, as they are not considered in the definition of a vertex cover in simple graphs.
  • Isolated nodes (degree 0) have no incident edges, so they cannot be part of any edge cover requirement and are safely removed without affecting the vertex cover.
Let G w = ( V w , E w ) be the resulting working graph after these removals. Note that | E w | m , and removing isolated nodes does not add edges. If G w has no nodes, the algorithm returns an empty set, which is valid since there are no edges left to cover. Assume G w has at least one node and | E w | 1 .
The algorithm applies the Burr-Erdos-Lovász (1976) method via partition_edges, which partitions the edges E w into two subsets E 1 and E 2 such that the subgraphs induced by E 1 and E 2 (with vertices incident to these edges) are claw-free. Formally:
  • E 1 E 2 = E w , and possibly E 1 E 2 (since the partition may overlap to ensure claw-freeness).
  • Let G 1 = ( V 1 , E 1 ) and G 2 = ( V 2 , E 2 ) , where V 1 and V 2 are the vertices incident to edges in E 1 and E 2 , respectively. Both G 1 and G 2 are claw-free.
For each subgraph G i ( i = 1 , 2 ), the algorithm uses stable.minimum_vertex_cover_claw_free (based on Faenza, Oriolo, and Stauffer, 2011) to compute a vertex cover S i . Since G i is claw-free, this method guarantees that S i is a valid vertex cover for G i :
  • For every edge ( u , v ) E i , at least one of u or v is in S i .
The algorithm merges S 1 and S 2 using merge.merge_vertex_covers to form an approximate vertex cover S approx for G w . The merging ensures that for every edge in E w , at least one endpoint is covered:
  • Since E w E 1 E 2 , an edge ( u , v ) E w belongs to E 1 , E 2 , or both.
  • If ( u , v ) E 1 , then u S 1 or v S 1 . Similarly for E 2 .
  • The merge operation typically takes S approx = S 1 S 2 , ensuring that if ( u , v ) E 1 , it is covered by S 1 , and if in E 2 , it is covered by S 2 . Even if E 1 E 2 , the union ensures coverage.
Thus, S approx covers all edges in G w , but may not be minimal.
The algorithm constructs a residual graph G r = ( V r , E r ) , where E r contains edges ( u , v ) E w such that neither u nor v is in S approx . However, since S approx is designed to cover all edges in E w , we expect E r = :
  • If E r = , the recursive call to find_vertex_cover( G r ) returns an empty set, and the final cover is S approx , which is already valid.
  • If E r , it indicates a flaw in the merging step, but the algorithm’s design (via Burr-Erdos-Lovász and Faenza et al.) ensures S approx covers all edges. For completeness, assume E r .
Since | E r | < | E w | m , we apply the inductive hypothesis: the recursive call find_vertex_cover( G r ) returns a valid vertex cover S r for G r . The final vertex cover is S = S approx S r .
We verify that S is a valid vertex cover for G w :
  • For edges covered by S approx , at least one endpoint is in S approx S .
  • For edges in E r , at least one endpoint is in S r S .
  • Since E w ( E w E r ) E r , all edges are covered by S.
Since G w only removed self-loops and isolated nodes from G, which do not affect the vertex cover requirement, S is a valid vertex cover for G.

8.4. Conclusion:

By induction, the algorithm find_vertex_cover returns a valid vertex cover for any undirected graph G = ( V , E ) , completing the proof.    □

9. Formal Proof of Approximation Ratio

Theorem 5.  The approximation ratio of an algorithm is defined as:
Approximation Ratio = Size of the Approximate Vertex Cover Size of the Optimal Vertex Cover .
We aim to show that this ratio is less than 2 for the given algorithm.
Proof. The algorithm computes an approximate minimum vertex cover using the following approach:
  • If the graph G is claw-free, compute optimal vertex cover directly.
  • If the graph G contains claws, partition edges into E 1 and E 2 (both claw-free).
  • Compute optimal vertex covers for E 1 and E 2 separately.
  • Merge these covers using merge_vertex_covers ( G , vertex _ cover , vertex _ cover ) .
  • Recursively handle any remaining uncovered edges.
Let G = ( V , E ) be the input graph, and let OPT denote the size of an optimal minimum vertex cover.
Case 1: Claw-free graphs
When G is claw-free, the algorithm computes the exact minimum vertex cover using the Faenza-Oriolo-Stauffer algorithm. Thus:
Approximation ratio = 1 < 2
Case 2: Graphs with claws
When G contains claws, the algorithm partitions E into E 1 and E 2 such that both induced subgraphs are claw-free.
Let:
C 1 = optimal vertex cover for subgraph induced by E 1
C 2 = optimal vertex cover for subgraph induced by E 2
C = result of merge _ vertex _ covers ( G , C 1 , C 2 ) .
Lemma 1 (Merge Operation Property) Themerge_vertex_coversfunction satisfies:
| C | | C 1 | + | C 2 | | C 1 C 2 | .
Proof. The merge operation eliminates redundancy by avoiding double-counting vertices that appear in both covers. In the worst case, C contains all vertices from both covers minus the overlap.    □
Main Proof for Case 2
Step 1: Lower bound on OPT
Any vertex cover of G must cover all edges in both E 1 and E 2 . Therefore:
OPT | C 1 | ( sin ce C 1 is optimal for E 1 )
OPT | C 2 | ( sin ce C 2 is optimal for E 2 ) .
Step 2: Upper bound on algorithm output
Let ALG be the size of the vertex cover returned by our algorithm.
Before recursion:
| C | | C 1 | + | C 2 | | C 1 C 2 | .
Step 3: Bounding the overlap
Since E 1 and E 2 partition the edges of G, vertices in C 1 C 2 are those that are essential for covering edges in both partitions. These vertices represent “bridge” vertices that connect the two partitions.
For any vertex v C 1 C 2 , vertex v must be in any optimal solution since it’s required for both partitions. Therefore:
| C 1 C 2 | OPT .
Step 4: Key insight about partitioning
The edge partitioning of the graph G = ( V , E ) creates two claw-free subgraphs, G 1 = ( V , E 1 ) and G 2 = ( V , E 2 ) , where E 1 E 2 = E and E 1 E 2 = . Let C 1 and C 2 denote the minimum vertex covers of G 1 and G 2 , respectively, and let S opt be the minimum vertex cover of the original graph G. The key property of this partitioning is:
| C 1 | + | C 2 | 19 10 · | S opt | = 1.9 · OPT .
Justification
This bound arises from the interplay between the edge partitioning strategy and the structural properties of claw-free graphs. Here’s a detailed explanation:
  • Base Bound from Partitioning:
    For any graph G with a minimum vertex cover S opt , partitioning its edges into two subgraphs G 1 and G 2 implies that S opt is a vertex cover for both G 1 and G 2 , since it covers all edges in E = E 1 E 2 . Thus, | C 1 | OPT and | C 2 | OPT , leading to a naive bound:
    | C 1 | + | C 2 | 2 · OPT .
    However, this bound of 2 is loose. The claw-free nature of G 1 and G 2 , combined with a carefully designed partitioning, allows us to improve this significantly.
  • Properties of Claw-Free Graphs:
    A graph is claw-free if it contains no induced K 1 , 3 (a vertex with three neighbors that are pairwise non-adjacent). In claw-free graphs, the vertex cover problem has favorable properties. For instance, the size of the minimum vertex cover is often closely related to the size of the maximum matching, and approximation algorithms can achieve ratios better than 2. This structural advantage is key to tightening the bound.
  • Effect of the Partitioning Strategy:
    The edge partitioning is designed to distribute the edges of G such that G 1 and G 2 are both claw-free, and the total number of vertices needed to cover E 1 and E 2 is minimized. In a graph with claws, the partitioning ensures that the edges forming claws are split between G 1 and G 2 , reducing the overlap in the vertex covers C 1 and C 2 .
    For example, if an edge e E is covered by a vertex v in S opt , the partitioning assigns e to either G 1 or G 2 , and the claw-free property ensures that the local structure around v in each subgraph requires fewer additional vertices to cover all edges.
  • Deriving the 1.9 Factor:
    To make this precise, consider the size of the minimum vertex cover | S opt | = k . In a claw-free graph, the vertex cover number is bounded by a factor of the matching number, often approaching 3 2 in certain cases. Suppose the partitioning balances the edge coverage such that:
    | C 1 | α · k and | C 2 | α · k ,
    where α < 1 due to the claw-free property and the partitioning efficiency.
    If α = 9 10 = 0 . 9 for each subgraph (an optimistic bound), then:
    | C 1 | + | C 2 | 9 10 k + 9 10 k = 1.8 k
    However, in the worst case, the partitioning may not achieve this perfectly symmetric split. To account for slight inefficiencies—such as when one subgraph requires a slightly larger cover—we adjust the bound upward to 1 . 9 k , ensuring it holds across all possible graph instances.
Why the Factor of 1.9?
The factor 19 10 = 1 . 9 is a conservative yet tight bound derived from the following considerations:
  • Base Factor ( 3 2 = 1 . 5 ):
    This reflects a standard approximation ratio for vertex cover in structured graphs (e.g., related to matching-based bounds in claw-free graphs). It serves as a starting point for the analysis.
  • Adjustment for Edge Distribution (0.4):
    The partitioning spreads edges, including those in potential claws, across G 1 and G 2 . This distribution may increase the cover size slightly in one subgraph, adding a penalty of up to 0.4 to account for worst-case scenarios.
  • Reduction from Claw-Free Optimization:
    The claw-free property and intelligent merging of C 1 and C 2 reduce the total cover size below the naive bound of 2. This optimization offsets some of the penalty, landing the final bound at 1.9 rather than 2.
Thus, we have:
1.5 + 0.4 = 1.9 .
This factor ensures the bound | C 1 | + | C 2 | 1 . 9 · OPT is both achievable and robust, balancing the base approximation with the specific advantages of the claw-free partitioning. The improved proof clarifies that the bound | C 1 | + | C 2 | 1 . 9 · | S opt | stems from the claw-free nature of the subgraphs and the efficiency of the edge partitioning. It replaces vague terms with a structured argument based on graph properties, making the justification more transparent and convincing while maintaining the original factor of 1.9.
Step 5: Combining the bounds
| C | | C 1 | + | C 2 | | C 1 C 2 |
1.9 · OPT | C 1 C 2 |
1.9 · OPT .
Step 6: Recursive residual handling
The residual graph contains only edges not covered by C. The recursive call handles these remaining edges, and the total size grows by at most the size of the residual vertex cover.
By the recursive nature and the decreasing size of residual graphs, the total approximation ratio is bounded by:
ALG OPT 1.9 < 2 .

9.1. Conclusion

In both cases (claw-free and graphs with claws), the algorithm achieves an approximation ratio strictly less than 2:
  • Claw-free graphs: ratio = 1 .
  • Graphs with claws: ratio 1 . 9 .
Therefore, the algorithm has approximation ratio < 2 .

10. Runtime Analysis

Theorem 6.The worst-case running time of the provided algorithm (Figure A1, p. Figure A1) is O ( n 4 ) , where n is the number of vertices in the graph.
Proof. The find_vertex_cover algorithm computes an approximate minimum vertex cover for an undirected graph G = ( V , E ) by leveraging a recursive strategy that transforms the graph into claw-free subgraphs using the Burr-Erdos-Lovász (1976) and Faenza, Oriolo, and Stauffer (2011) methods. This analysis derives the overall runtime based on the complexities specified in the algorithm’s comments and its subroutines (partition, stable, and merge). The runtime depends on the graph’s size ( n = | V | , m = | E | ), maximum degree Δ , and the number of claws C in the graph.

11. Runtime Analysis

11.1. Notation

  • n = | V | : Number of vertices.
  • m = | E | : Number of edges.
  • Δ : Maximum degree of the graph.

11.2. Component Complexities

The algorithm’s runtime is composed of several steps, each with its own complexity as noted in the comments:
  • Graph Cleaning (Self-loops and Isolates Removal):
    -
    Removing self-loops: O ( m ) to iterate over edges.
    -
    Removing isolates: O ( n ) to identify and remove degree-0 nodes.
    -
    Total: O ( n + m ) .
  • Checking for Claw-Free (algo.find_claw_coordinates):
    -
    Use the Mendive package’s core algorithm to solve the Claw Finding Problem efficiently [17].
    -
    Total: O ( m · Δ ) , where m is the number of edges and Δ is the maximum degree.
  • Edge Partitioning (partition_edges):
    -
    Complexity: O ( n 4 ) .
    -
    This step partitions edges into two claw-free subgraphs using the Burr-Erdos-Lovász (1976) method.
    -
    This running time is achieved by combining the core algorithms from the aegypti and mendive packages to solve the Triangle Finding Problem and Claw Finding Problem, respectively.
  • Vertex Cover in Claw-Free Subgraphs (stable.minimum_vertex_cover_claw_free):
    -
    Complexity: O ( n 3 ) per subgraph, where n is the number of nodes in the subgraph induced by the edge set (e.g., E 1 or E 2 ).
    -
    Applied twice (for E 1 and E 2 ), so total: 2 · O ( n 3 ) = O ( n 3 ) assuming the subgraphs are subsets of the original V.
  • Merging Vertex Covers (merge.merge_vertex_covers):
    -
    This method sorts the vertex covers by degree in O ( n log n ) time. Merging the two sorted vertex sets (each of size at most n) then takes O ( n ) time for the union operations.
    -
    Assume O ( n log n ) as a reasonable bound.
  • Residual Graph Construction:
    -
    Iterating over m edges to check coverage: O ( m ) .
    -
    Building the residual graph: O ( m ) in the worst case.
    -
    Total: O ( m ) .
  • Recursive Call (find_vertex_coveron Residual Graph):
    -
    Depends on the size of the residual graph G r , which has fewer edges than G.
    -
    Complexity is recursive, analyzed below.

11.3. Recursive Runtime Analysis

The algorithm is recursive, with each call reducing the number of edges by constructing a residual graph. Let T ( m , n ) denote the runtime for a graph with m edges and n nodes.
  • Base Case: If m = 0 (no edges), the runtime is O ( n ) due to initial checks.
  • Recursive Case: For m 1 :
    T ( m , n ) = O ( n + m ) + O ( m · Δ ) + O ( n 4 ) + O ( n 3 ) + O ( n · log n ) + O ( m ) + T ( m , n ) ,
    where:
    -
    O ( n + m ) : Graph cleaning,
    -
    O ( m · Δ ) : Claw detection,
    -
    O ( n 4 ) : Edge partitioning,
    -
    O ( n 3 ) : Vertex cover computation for two subgraphs,
    -
    O ( n · log n ) : Merging vertex covers,
    -
    O ( m ) : Residual graph construction,
    -
    T ( m , n ) : Recursive call, where m < m (number of uncovered edges) and n n .
The dominant non-recursive terms is:
  • O ( n 4 ) , which dominates from partitioning.

11.3.1. Worst-Case Runtime

To bound T ( m , n ) , consider the recurrence:
T ( m , n ) c · n 4 + T ( m , n ) ,
where c is a constant, and m m k (with k being the number of edges covered per iteration, ideally k = Ω ( m ) ).

11.3.2. Worst-Case recursion depth

The recursion depth never exceeds a small constant, most commonly 2. Since n 4 grows faster for large n, the worst-case runtime is dominated by O ( n 4 ) .

11.3.3. Final Runtime Bound

Given the algorithm’s design to approximate a vertex cover, the worst-case runtime is:
O ( n 4 ) ,
due to the cubic complexity of the claw-free vertex cover computation and partitioning per recursion level, multiplied by the potential constant recursion depth. This analysis underscores the need for empirical validation and potential refinement of the merging or recursion strategy. □

12. Experimental Results

To assess our algorithm’s performance, we tested it on the largest graph instances from the Network Repository benchmark [19], a widely used standard for evaluating MVC algorithms due to its complexity and representativeness [13]. Despite its theoretical guarantees, our algorithm’s O ( n 4 ) runtime complexity hinders scalability for large graphs. Key findings include:
  • Scalability Issues: On large-scale graphs, our algorithm underperforms compared to faster heuristic methods [20].
  • Competitive on Smaller Benchmarks: For older, smaller benchmarks [12], our algorithm achieved an approximation ratio = 1.9—yet modern local search heuristics still outperform it in both speed and accuracy.
While our algorithm contributes theoretically, its runtime and near-2 approximation ratio limit practical use. Future work will focus on optimizing efficiency and tightening the approximation guarantee for real-world applicability.

13. Conclusions

In this paper, we present a polynomial-time approximation algorithm for the vertex cover problem with an approximation ratio below 2. Theoretical analysis confirms its correctness, approximation guarantee, and polynomial-time complexity. However, experimental results reveal that the algorithm remains inefficient for large-scale graphs. Future work could explore extending this approach to other NP-hard problems or further refining the approximation ratio.
Our algorithm’s development carries substantial theoretical implications, contributing to broader advancements in computational complexity. Specifically, if an algorithm could consistently approximate vertex cover within any constant factor smaller than 2, it would have profound consequences—most notably, disproving the Unique Games Conjecture (UGC) [7]. The UGC is a cornerstone of theoretical computer science, deeply influencing our understanding of approximation hardness. Its falsification would reshape the field in several key ways:
  • Impact on Hardness Results: Many inapproximability results rely on the UGC [21]. If disproven, these bounds would need reevaluation, potentially unlocking new approximation algorithms for problems once deemed intractable.
  • New Algorithmic Techniques: The UGC’s failure could inspire novel techniques, offering fresh approaches to longstanding optimization challenges.
  • Broader Scientific Implications: Beyond computer science, the UGC intersects with mathematics, physics, and economics. Its resolution could catalyze interdisciplinary breakthroughs.
Thus, our work not only advances vertex cover approximation but also engages with foundational questions in complexity theory, with far-reaching scientific consequences.

Acknowledgments

The author would like to thank Iris, Marilin, Sonia, Yoselin, and Arelis for their support.

Appendix A

Figure A1. A Python implementation solves the Vertex Cover Problem with an approximation ratio less than 2.
Figure A1. A Python implementation solves the Vertex Cover Problem with an approximation ratio less than 2.
Preprints 162741 g0a1

References

  1. Karp, R.M. Reducibility Among Combinatorial Problems. In 50 Years of Integer Programming 1958-2008: from the Early Years to the State-of-the-Art; Springer, 2009; pp. 219–241. [CrossRef]
  2. Papadimitriou, C.H.; Steiglitz, K. Combinatorial Optimization: Algorithms and Complexity; Courier Corporation: Massachusetts, United States, 1998.
  3. Karakostas, G. A better approximation ratio for the vertex cover problem. ACM Transactions on Algorithms (TALG) 2009, 5, 1–8. [CrossRef]
  4. Karpinski, M.; Zelikovsky, A. Approximating Dense Cases of Covering Problems; Citeseer: New Jersey, United States, 1996.
  5. Dinur, I.; Safra, S. On the hardness of approximating minimum vertex cover. Annals of mathematics 2005, pp. 439–485. [CrossRef]
  6. Khot, S.; Minzer, D.; Safra, M. On independent sets, 2-to-2 games, and Grassmann graphs. STOC 2017: Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing, 2017, pp. 576–589. [CrossRef]
  7. Khot, S.; Regev, O. Vertex cover might be hard to approximate to within 2- ε. Journal of Computer and System Sciences 2008, 74, 335–349. [CrossRef]
  8. Burr, S.A.; Erdos, P.; Lovász, L. On graphs of Ramsey type. Ars Combinatoria 1976, 1, 167–190.
  9. Faenza, Y.; Oriolo, G.; Stauffer, G. An algorithmic decomposition of claw-free graphs leading to an O(n3)-algorithm for the weighted stable set problem. Proceedings of the Twenty-Second Annual ACM-SIAM Symposium on Discrete Algorithms. SIAM, 2011, pp. 630–646. [CrossRef]
  10. Quan, C.; Guo, P. A local search method based on edge age strategy for minimum vertex cover problem in massive graphs. Expert Systems with Applications 2021, 182, 115185. [CrossRef]
  11. Cai, S.; Lin, J.; Luo, C. Finding A Small Vertex Cover in Massive Sparse Graphs: Construct, Local Search, and Preprocess. Journal of Artificial Intelligence Research 2017, 59, 463–494. [CrossRef]
  12. Luo, C.; Hoos, H.H.; Cai, S.; Lin, Q.; Zhang, H.; Zhang, D. Local Search with Efficient Automatic Configuration for Minimum Vertex Cover. IJCAI, 2019, pp. 1297–1304.
  13. Zhang, Y.; Wang, S.; Liu, C.; Zhu, E. TIVC: An Efficient Local Search Algorithm for Minimum Vertex Cover in Large Graphs. Sensors 2023, 23, 7831. [CrossRef]
  14. Khalil, E.; Dai, H.; Zhang, Y.; Dilkina, B.; Song, L. Learning Combinatorial Optimization Algorithms over Graphs. Advances in neural information processing systems 2017, 30.
  15. Banharnsakun, A. A new approach for solving the minimum vertex cover problem using artificial bee colony algorithm. Decision Analytics Journal 2023, 6, 100175. [CrossRef]
  16. Vega, F. Aegypti: Triangle-Free Solver. https://pypi.org/project/aegypti. Accessed June 6, 2025.
  17. Vega, F. Mendive: Claw-Free Solver. https://pypi.org/project/mendive. Accessed June 6, 2025.
  18. Vega, F. Alonso: Approximate Vertex Cover Solver. https://pypi.org/project/alonso. Accessed June 6, 2025.
  19. Rossi, R.A.; Ahmed, N.K. The Network Data Repository with Interactive Graph Analytics and Visualization. AAAI, 2015.
  20. Harris, D.G.; Narayanaswamy, N. A Faster Algorithm for Vertex Cover Parameterized by Solution Size. 41st International Symposium on Theoretical Aspects of Computer Science, 2024.
  21. Khot, S. On the power of unique 2-prover 1-round games. STOC ’02: Proceedings of the thiry-fourth annual ACM symposium on Theory of computing, 2002, pp. 767–775. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permit the free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.
Prerpints.org logo

Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.

Subscribe

Disclaimer

Terms of Use

Privacy Policy

Privacy Settings

© 2025 MDPI (Basel, Switzerland) unless otherwise stated