Computer Science and Mathematics


Article
Computer Science and Mathematics
Computer Networks and Communications

Richard Murdoch Montgomery

Abstract:

This article presents the Differential Entangled Topology (DET) model, a sophisticated mathematical framework designed to simulate the complex, dynamic, and evolving nature of consciousness. By conceptualizing consciousness as an intricate network of entangled points distributed on the surface of a unit sphere, the model captures the fluid and unpredictable interactions that characterize conscious experience. Each point represents a neural region or conscious element, with its position undergoing differential, random shifts that simulate the process of entanglement. These shifts, governed by a specified entanglement factor, reflect the ongoing reconfiguration of consciousness over time. The system's progression is quantitatively analyzed through topological entropy, providing a measure of the system's complexity and the degree of unpredictability inherent in the entanglement process. The model's evolution is visualized in a 3D space, where bright points are projected onto the sphere's surface, and the background is shaded in a soft rosaceous hue, symbolizing the fundus of the system. This framework offers a novel mathematical perspective on the interconnected, non-linear nature of consciousness, providing insight into its continuous transformation and the underlying dynamics that drive cognitive states.
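
A rough numerical sketch of the dynamics this abstract describes: points on a unit sphere perturbed by random shifts scaled by an entanglement factor, with a simple occupancy entropy standing in for the topological entropy measure. All parameter values and the entropy estimator are illustrative assumptions, not the author's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_points, n_steps, entanglement_factor = 200, 100, 0.05

# Random initial points on the surface of a unit sphere.
points = rng.normal(size=(n_points, 3))
points /= np.linalg.norm(points, axis=1, keepdims=True)

for _ in range(n_steps):
    # Differential random shift scaled by the entanglement factor,
    # then renormalization back onto the sphere's surface.
    points += entanglement_factor * rng.normal(size=points.shape)
    points /= np.linalg.norm(points, axis=1, keepdims=True)

# Crude complexity proxy: differential entropy of the z-coordinate occupancy.
density, _ = np.histogram(points[:, 2], bins=20, range=(-1, 1), density=True)
density = density[density > 0]
entropy = -np.sum(density * np.log(density)) * (2 / 20)  # bin width 0.1
print(f"occupancy entropy ~ {entropy:.3f}")
```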

Brief Report
Computer Science and Mathematics
Applied Mathematics

Shahid Mubasshar

Abstract: The propagation of membrane tension perturbations is a potentially critical mechanism in the mechanical signal transduction across the surfaces of living cells. Tethered proteins in the cell membrane play a crucial role in the propagation of the membrane tension. Intact cell membranes in eukaryotic cells possess unique characteristics, such as transmembrane proteins bound to the underlying cortex, rendering them immobile over timescales of minutes to hours. These immobile obstacles significantly alter the dynamics of lipid flow. While existing simplified lipid flow models provide fundamental insights into membrane tension dynamics, they fall short in accounting for complex phenomena like vesicle crumpling. To address this, we propose a more sophisticated model of lipid bilayers, solving the Stokes equations in a two-dimensional framework with embedded obstacles. We employ the finite element method and the FEniCS library to solve the weak form of the Stokes equations, providing a more accurate representation of membrane behaviour under physiological conditions.
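
A minimal legacy-FEniCS sketch of the weak-form Stokes solve the abstract describes, on a plain unit square with Taylor-Hood elements. The geometry, the boundary data, and the absence of the embedded immobile obstacles are simplifying assumptions.

```python
from dolfin import *

mesh = UnitSquareMesh(32, 32)

# Taylor-Hood mixed element: quadratic velocity, linear pressure.
P2 = VectorElement("Lagrange", mesh.ufl_cell(), 2)
P1 = FiniteElement("Lagrange", mesh.ufl_cell(), 1)
W = FunctionSpace(mesh, MixedElement([P2, P1]))

(u, p) = TrialFunctions(W)
(v, q) = TestFunctions(W)
f = Constant((0.0, 0.0))

# Weak form of the Stokes equations.
a = inner(grad(u), grad(v)) * dx - div(v) * p * dx + q * div(u) * dx
L = inner(f, v) * dx

# Parabolic inflow on the left, no-slip walls, natural outflow on the right.
inflow = DirichletBC(W.sub(0),
                     Expression(("4.0*x[1]*(1.0 - x[1])", "0.0"), degree=2),
                     "near(x[0], 0.0)")
noslip = DirichletBC(W.sub(0), Constant((0.0, 0.0)),
                     "near(x[1], 0.0) || near(x[1], 1.0)")

w = Function(W)
solve(a == L, w, [inflow, noslip])
u_h, p_h = w.split()  # velocity and pressure fields
```

Immobile obstacles of the kind the paper models could then be represented as additional interior no-slip boundaries on holes meshed out of the domain.
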
Review
Computer Science and Mathematics
Computer Science

Leonardo Juan Ramirez Lopez,

Genesis Gabriela Morillo Ledezma

Abstract: In higher education, especially in programming-intensive fields like computer science, safeguarding students' source code is crucial to prevent theft that could impact learning and future careers. Traditional storage solutions like Google Drive are vulnerable to hacking and alterations, highlighting the need for stronger protection. This work explores digital technologies that enhance source code security, with a focus on Blockchain and NFTs. Due to Blockchain's decentralized and immutable nature, NFTs can be used to control code ownership, improving security and traceability and preventing unauthorized access. This approach effectively addresses existing gaps in protecting academic intellectual property. However, as Bennett et al. highlight, while these technologies have significant potential, challenges remain in large-scale implementation and user acceptance. Despite these hurdles, integrating Blockchain and NFTs presents a promising opportunity to enhance academic integrity. A successful adoption in educational settings may require a more inclusive and innovative strategy.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Samuel Chukwuemeka Egere

Abstract: In order to anticipate traffic, modern traffic systems now employ non-parametric techniques like machine learning. The traffic management system's execution depends on the sensory data sent to it. A minor infrastructure breakdown, such as a power outage, can significantly alter traffic patterns in the area. When this failure happens at a busy intersection, it increases the time that cars idle, creates a traffic bottleneck, and may even result in a crash. Such incidents make the weak points of the existing system clear. A self-organizing traffic system called the Virtual Traffic Light can be used to solve these problems. In this system, the vehicles at the intersection collaboratively manage the traffic flow using data from every vehicle present. Federated machine learning can be adopted when the vehicles collaborate to predict traffic because data privacy is at the center of its operation. In this paper, we worked with traffic data from Austin, Texas, and focused on key metrics such as execution time and prediction accuracy across multiple federated prediction models. Among the models evaluated, our results suggest that the Stochastic Gradient Descent Regressor and the Random Forest Regressor are good choices for traffic prediction in our proposed Virtual Traffic Light system.
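
A minimal FedAvg-style sketch of the setup the abstract describes, assuming each vehicle holds a local shard of traffic data and shares only model coefficients. The synthetic data, the feature meanings, and the single averaging round are illustrative assumptions; the paper additionally evaluates a Random Forest Regressor and measures execution time.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))  # e.g. time of day, lane counts, speeds
y = X @ np.array([1.5, -2.0, 0.5, 1.0]) + rng.normal(scale=0.1, size=300)

# Three "vehicles", each training locally on its own data shard.
local_models = []
for idx in np.array_split(np.arange(300), 3):
    m = SGDRegressor(max_iter=1000, tol=1e-3, random_state=0)
    m.fit(X[idx], y[idx])
    local_models.append(m)

# Federated averaging of the linear coefficients (one communication round);
# raw traffic data never leaves the vehicles, preserving privacy.
global_model = SGDRegressor()
global_model.coef_ = np.mean([m.coef_ for m in local_models], axis=0)
global_model.intercept_ = np.mean([m.intercept_ for m in local_models], axis=0)
print("averaged coefficients:", np.round(global_model.coef_, 2))
```
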
Article
Computer Science and Mathematics
Algebra and Number Theory

Weicun Zhang

Abstract:

The Riemann Hypothesis (RH) is proved based on a new expression of the completed zeta function $\xi(s)$, obtained by pairing the conjugate zeros $\rho_i$ and $\bar{\rho}_i$ in the Hadamard product, with consideration of the multiplicity of zeros. That is,
$$\xi(s) = \xi(0)\prod_{\rho}\Big(1-\frac{s}{\rho}\Big) = \xi(0)\prod_{i=1}^{\infty}\Big(1-\frac{s}{\rho_i}\Big)\Big(1-\frac{s}{\bar{\rho}_i}\Big) = \xi(0)\prod_{i=1}^{\infty}\Big(\frac{\beta_i^2}{\alpha_i^2+\beta_i^2}+\frac{(s-\alpha_i)^2}{\alpha_i^2+\beta_i^2}\Big)^{m_i},$$
where $\xi(0)=\tfrac{1}{2}$, $\rho_i=\alpha_i+j\beta_i$, and $\bar{\rho}_i=\alpha_i-j\beta_i$, with $0<\alpha_i<1$ and $\beta_i\neq 0$ real numbers; $m_i\geq 1$ is the multiplicity of $\rho_i$, and $0<\beta_1\leq\beta_2\leq\cdots$. Then, according to the functional equation $\xi(s)=\xi(1-s)$, we have
$$\prod_{i=1}^{\infty}\Big(1+\frac{(s-\alpha_i)^2}{\beta_i^2}\Big)^{m_i}=\prod_{i=1}^{\infty}\Big(1+\frac{(1-s-\alpha_i)^2}{\beta_i^2}\Big)^{m_i}.$$
Owing to the divisibility contained in the above equation and the uniqueness of $m_i$, each polynomial factor can only divide (and thereby equal) the corresponding factor on the opposite side of the equation. Thus, we obtain
$$\Big(1+\frac{(s-\alpha_i)^2}{\beta_i^2}\Big)^{m_i}=\Big(1+\frac{(1-s-\alpha_i)^2}{\beta_i^2}\Big)^{m_i},\quad i=1,2,3,\ldots,$$
which is further equivalent to
$$\alpha_i=\tfrac{1}{2},\quad 0<\beta_1<\beta_2<\beta_3<\cdots,\quad i=1,2,3,\ldots$$
Thus, we conclude that the Riemann Hypothesis is true.

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Jinsick Kim,

Byeongsoo Koo,

Moonju Nam,

Kukjin Jang,

Jooyeoun Lee,

Myoungsug Chung,

Yungseo Song

Abstract:

This study examines the security implications of generative artificial intelligence (GAI), focusing on models such as ChatGPT. As GAI technologies are increasingly integrated into industries like healthcare, education, and media, concerns are growing regarding security vulnerabilities, ethical challenges, and the potential for misuse. To address these concerns, this research analyzes 1,047 peer-reviewed academic articles from the SCOPUS database using scientometric methods, including term frequency-inverse document frequency (TF-IDF) analysis, keyword centrality analysis, and latent Dirichlet allocation (LDA) topic modeling. The results highlight significant contributions from countries such as the United States, China, and India, with leading institutions like the Chinese Academy of Sciences and the National University of Singapore driving research on GAI security. In the keyword centrality analysis, "ChatGPT" emerged as a highly central term, reflecting its prominence in the research discourse. However, despite its frequent mention, "ChatGPT" showed lower proximity centrality than terms like "model" and "AI." This suggests that while ChatGPT is broadly associated with other key themes, it has a less direct connection to specific research subfields. Topic modeling identified six major themes, including AI and security in education, language models, data processing, and risk management. The analysis emphasizes the need for robust security frameworks to address technical vulnerabilities, ensure ethical responsibility, and manage risks in the safe deployment of AI systems. These frameworks must incorporate not only technical solutions but also ethical accountability, regulatory compliance, and continuous risk management. This study underscores the importance of interdisciplinary research that integrates technical, legal, and ethical perspectives to ensure the responsible and secure deployment of GAI technologies.
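
A minimal sketch of the scientometric pipeline the abstract names: TF-IDF term weighting plus LDA topic modeling with scikit-learn. The toy corpus and preprocessing are assumptions standing in for the 1,047 SCOPUS abstracts; only the six-topic setting mirrors the study.

```python
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "ChatGPT security vulnerabilities in education",
    "large language model risk management",
    "generative AI data processing and privacy",
]  # toy stand-in for the SCOPUS abstracts

# TF-IDF highlights discriminative terms across the corpus.
tfidf = TfidfVectorizer(stop_words="english")
weights = tfidf.fit_transform(docs)
top_term = tfidf.get_feature_names_out()[weights.toarray().sum(axis=0).argmax()]
print("highest-weight term:", top_term)

# LDA conventionally runs on raw term counts, not TF-IDF weights.
counts = CountVectorizer(stop_words="english").fit_transform(docs)
lda = LatentDirichletAllocation(n_components=6, random_state=0)
topic_mix = lda.fit_transform(counts)  # per-document topic proportions
print(topic_mix.shape)                 # (n_docs, n_topics)
```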

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Aziida Nanyonga,

Hassan Wasswa,

Keith Joiner,

Ugur Turhan,

Graham Wild

Abstract:

Despite its recent success in various industries, artificial intelligence (AI) has not received full acceptance; hence, it has not been fully deployed by the aviation industry. This is partly attributed to, among other factors, the fact that AI models work as black boxes, with no clear explanation of how outputs are generated from the input samples. Aviation is an extremely sensitive application field, and this opaqueness makes it hard for a human user in the aviation industry to trust such a model. The work in this study examines the classification performance of various AI algorithms. It then applies the SHAP (SHapley Additive exPlanations) framework to generate and visualize global model explanations, to understand which features each model learns for its decision boundary and how much each feature contributes to the final model output. We also deployed a variational autoencoder to handle the imbalanced class distribution of the ATSB (Australian Transport Safety Bureau) dataset. We recorded competitive classification performance in accuracy, precision, recall, and F1-score for a three-class supervised classification problem.
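
A minimal sketch of the SHAP step the abstract describes, assuming a tree-based classifier; the synthetic data, the features, and the model choice are placeholders for the ATSB data and the various AI algorithms the paper compares.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                  # stand-in incident features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # the paper uses three classes

model = RandomForestClassifier(random_state=0).fit(X, y)

# Global explanations: SHAP values per feature across the whole dataset,
# visualized as a summary (beeswarm/bar) plot.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
shap.summary_plot(shap_values, X, show=False)
```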

Article
Computer Science and Mathematics
Probability and Statistics

Ayuba Jack Alhassan,

S. Ejaz Ahmed,

Dursun Aydin,

Ersin Yilmaz

Abstract: This study presents a comprehensive evaluation of six penalty estimation strategies for partially linear regression models (PLRMs), focusing on their performance in the presence of multicollinearity and their ability to handle both parametric and nonparametric components. The methods under consideration include Ridge regression, Lasso, Adaptive Lasso (aLasso), smoothly clipped absolute deviation (SCAD), ElasticNet, and minimax concave penalty (MCP). In addition to these established methods, we also incorporate Stein-type shrinkage estimation techniques, namely the standard and positive shrinkage estimators, and assess their effectiveness in this context. To estimate the PLRMs, we considered a kernel smoothing technique grounded in penalized least squares. Our investigation involves a theoretical analysis of the estimators' asymptotic properties and a detailed simulation study designed to compare their performance under a variety of conditions, including different sample sizes, numbers of predictors, and levels of multicollinearity. The simulation results reveal that aLasso and the shrinkage estimators, particularly the positive shrinkage estimator, consistently outperform the other methods in terms of Mean Squared Error (MSE) relative efficiencies (RE), especially when the sample size is small and multicollinearity is high. Furthermore, we present a real data analysis using the Hitters dataset to demonstrate the applicability of these methods in a practical setting. The results of the real data analysis align with the simulation findings, highlighting the superior predictive accuracy of aLasso and the shrinkage estimators in the presence of multicollinearity. The findings of this study offer valuable insights into the strengths and limitations of these penalty and shrinkage strategies, guiding their application in future research and practice involving semiparametric regression.
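
A minimal sketch of the kind of multicollinearity comparison the abstract describes. SCAD, MCP, and the Stein-type shrinkage estimators have no scikit-learn implementation, so only Ridge, Lasso, and ElasticNet appear here; the correlated design, the penalty levels, and the coefficient-MSE criterion are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet

rng = np.random.default_rng(0)
n, p = 50, 10
base = rng.normal(size=(n, 1))
X = 0.95 * base + 0.05 * rng.normal(size=(n, p))  # highly collinear columns
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]                       # sparse true coefficients
y = X @ beta + rng.normal(scale=0.5, size=n)

for name, est in [("Ridge", Ridge(alpha=1.0)),
                  ("Lasso", Lasso(alpha=0.1)),
                  ("ElasticNet", ElasticNet(alpha=0.1, l1_ratio=0.5))]:
    est.fit(X, y)
    mse = np.mean((est.coef_ - beta) ** 2)        # estimation error
    print(f"{name}: coefficient MSE = {mse:.4f}")
```
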
Technical Note
Computer Science and Mathematics
Probability and Statistics

Hening Huang

Abstract:

This technical note investigates a p-value paradox that emerges in the conventional proportion test. The paradox is defined as the phenomenon where "decisions made on the same effect size from data of different sample sizes may be inconsistent." It is illustrated with two examples from clinical trial research. We argue that this p-value paradox stems from the use (or misuse) of p-values to compare two proportions and make decisions. We propose replacing the conventional proportion test and its p-value with estimation statistics that include both the observed effect size and a reliability measure known as the signal content index (SCI).
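
A minimal demonstration of the paradox the note describes: the same observed effect size (a 10-percentage-point difference in proportions) yields opposite decisions at alpha = 0.05 purely because the sample sizes differ. The counts are illustrative, not the clinical-trial data from the note.

```python
from statsmodels.stats.proportion import proportions_ztest

for n in (50, 500):  # per-arm sample size
    successes = [int(0.50 * n), int(0.40 * n)]  # 50% vs. 40% in both cases
    stat, pval = proportions_ztest(successes, [n, n])
    decision = "reject H0" if pval < 0.05 else "fail to reject H0"
    print(f"n = {n} per arm: p = {pval:.4f} -> {decision}")
```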

Essay
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Fengyu Li,

Xianyong Meng,

Ke Zhu,

Jun Yan,

Lining Liu,

Pingzeng Liu

Abstract: In order to ensure price stability for niche agricultural products and enhance farmers' income, this study investigates the pattern of ginger price fluctuations and its main influencing factors. By combining seasonal-trend decomposition (STL), a long short-term memory network (LSTM), an attention mechanism (ATT), and a Kolmogorov-Arnold network (KAN), a combined STL-LSTM-ATT-KAN prediction model is developed, and the model parameters are fine-tuned using a multi-population adaptive particle swarm optimization algorithm (AMP-PSO). Based on an in-depth analysis of actual ginger price data from the past decade, the STL-LSTM-ATT-KAN model demonstrated excellent prediction accuracy: its mean absolute error (MAE) was 0.111, mean squared error (MSE) was 0.021, root mean squared error (RMSE) was 0.146, and coefficient of determination (R²) was 0.998. This study provides the ginger industry, agricultural traders, farmers, and policy makers with digitalized and intelligent aids, which are important for improving market monitoring, risk control, and competitiveness, and for guaranteeing the stability of supply and prices.
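
A minimal sketch of the STL stage of the pipeline the abstract names, using statsmodels on a synthetic monthly price series; the library choice, the seasonal period, and the series itself are assumptions standing in for the decade of ginger price data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(0)
t = pd.date_range("2014-01", periods=120, freq="MS")
prices = (10 + 0.02 * np.arange(120)                    # slow upward trend
          + 2 * np.sin(2 * np.pi * np.arange(120) / 12)  # yearly seasonality
          + rng.normal(scale=0.5, size=120))             # noise
series = pd.Series(prices, index=t)

res = STL(series, period=12).fit()
# res.trend, res.seasonal, and res.resid would feed the downstream
# LSTM-ATT-KAN stages of the combined model.
print(res.seasonal.head())
```
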
Article
Computer Science and Mathematics
Computer Networks and Communications

Barnty William,

Clement Adebayo

Abstract: This study investigates the role of Information Systems (IS) and technology in optimizing planning functions within Nigeria’s telecom companies. The purpose of the research is to explore how the adoption of IS, such as Enterprise Resource Planning (ERP), Customer Relationship Management (CRM), and Business Intelligence (BI), can enhance strategic planning capabilities in the telecom sector. Using a mixed-methods approach, the study combines quantitative surveys with qualitative interviews to gather insights from industry professionals. Key findings reveal that IS adoption significantly improves decision-making (85%), resource allocation (78%), and market analysis (71%). However, challenges such as poor infrastructure (76%), skills gaps (64%), and high implementation costs (58%) hinder the full potential of these technologies. The study concludes that while IS plays a crucial role in optimizing planning functions, addressing infrastructure deficiencies and investing in workforce development are essential for maximizing its effectiveness. The research provides recommendations for telecom companies and policymakers to enhance infrastructure, support employee training, and explore emerging technologies to further improve strategic planning and maintain a competitive advantage in the rapidly evolving telecom industry.
Article
Computer Science and Mathematics
Analysis

Radu Precup,

Andrei Stan

Abstract:

In this paper, we extend the concept of b-metric spaces to the vectorial case, where the distance is vector-valued, and the constant in the triangle inequality axiom is replaced by a matrix. For such spaces, we establish results analogous to those in the b-metric setting: fixed-point theorems, stability results, and a variant of Ekeland’s variational principle. As a consequence, we also derive a variant of Caristi’s fixed-point theorem.
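
A plausible formalization of the axioms the abstract sketches, under the assumption that the matrix constant enters the triangle inequality componentwise (the paper's exact condition may differ): a map $d : X \times X \to \mathbb{R}^m_{+}$ together with a matrix $C \in \mathbb{R}^{m \times m}_{+}$ makes $(X, d)$ a vector b-metric space if, for all $x, y, z \in X$,
$$d(x,y) = 0 \iff x = y, \qquad d(x,y) = d(y,x), \qquad d(x,z) \le C\,[\,d(x,y) + d(y,z)\,],$$
with the last inequality understood componentwise. Taking $m = 1$ and $C = (c)$ with $c \geq 1$ recovers the classical b-metric setting.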

Article
Computer Science and Mathematics
Robotics

Lachlan Chumbley,

Benjamin Meyer,

Akansel Cosgun

Abstract: We introduce the Australian Supermarket Object Set, a dataset comprising 50 readily available supermarket items with high-quality 3D textured meshes. Designed for both real-world and simulated environments, this dataset supports benchmarking in robotic manipulation and computer vision tasks. This paper details the dataset’s construction process, object categorization, and its potential to enhance reproducibility and advance research in robotics and computer vision.
Article
Computer Science and Mathematics
Computer Science

Yutaka Yoshida,

Emi Yuda,

Kiyoko Yokoyama

Abstract: The ability to respond swiftly and accurately to visual stimuli is critical for safe driving. Traditional Psychomotor Vigilance Tests (PVTs) primarily assess response time (RT) using finger inputs, but these do not directly evaluate the foot responses essential for vehicle control. This study introduces a novel Foot Psychomotor Vigilance Test (Foot PVT) designed to measure foot RTs in response to simulated traffic lights. The Foot PVT integrates a traffic light display interface with a three-pedal system, simulating basic driving conditions. RTs are recorded for three colors (blue, yellow, red) displayed in a randomized order, and response accuracy is evaluated based on pedal input. The system also measures correction times for errors, offering insights into a driver's ability to recover from mistakes. Initial validation with six participants (25±3 years old) demonstrated consistent RTs comparable with existing studies. By addressing the gap between visual stimuli and foot responses, the Foot PVT provides a unique tool for assessing psychomotor performance in the context of driving. This application holds potential for screening driver capabilities, particularly among elderly individuals and novice drivers, and for contributing to the development of more comprehensive traffic safety systems.
Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Muhammad Faisal,

Ery Muchyar Hasiri,

Darniati Darniati,

Titik Khawa Abd Rahman,

Billy Eden William Asrul,

Hamdan Gani,

Respaty Namruddin,

Najirah Umar,

Nurul Aini,

Sri Wahyuni

+2 authors
Abstract: In response to the global prioritization of environmental protection, nations increasingly focus on transitioning to low-carbon economies as a central strategy for achieving sustainable socio-economic growth. This study introduces a novel framework designed to evaluate, cluster, and map urban carbon policies by employing an integrated methodology that hybridizes Multi-Objective Optimization (MOO), Multi-Criteria Decision-Making (MCDM), and Particle Swarm Optimization - Self-Organizing Map (PSOM) techniques. Furthermore, a machine learning model utilizing the Random Forest algorithm is developed to enhance predictive capabilities, incorporating adaptive weighting and dynamic clustering for expert-informed analysis of sustainability strategies. Applying this hybrid approach to the test dataset reveals that the predictive results can effectively guide city-level carbon trading strategies, thereby supporting government initiatives grounded in technology-based policy-making. The findings demonstrate that the proposed framework can robustly analyze and visualize complex datasets, contributing to the mapping of carbon trading policies across cities. Future research directions include exploring the integration of Internet of Things (IoT) systems to further refine carbon trading strategies, thus advancing the development of smart cities and promoting a more adaptive and sustainable urban environment.
Review
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Chenguang Lu

Abstract: Does semantic communication require a semantic information theory parallel to Shannon's information theory, or can Shannon's work be generalized for semantic communication? This paper advocates for the latter and introduces the semantic information G theory (with "G" denoting generalization). The core approach involves replacing the distortion constraint with the semantic constraint, achieved by utilizing a set of truth functions as a semantic channel. These truth functions enable the expression of semantic distortion, semantic information measures, and semantic information loss. Notably, the maximum semantic information criterion is shown to be equivalent to the maximum likelihood criterion and parallels the Regularized Least Squares criterion. The G theory is compatible with machine learning methodologies, offering enhanced capabilities for handling latent variables, often addressed through Variational Bayes. This paper systematically presents the generalization of Shannon's information theory into the G theory and its wide-ranging applications. The applications involve semantic communication, machine learning, constraint control, Bayesian confirmation, portfolio theory, and information value. Furthermore, insights from statistical physics are discussed: Shannon information is equated to free energy, semantic information to the free energy of local equilibrium systems, and information efficiency to the efficiency of free energy in performing work. The paper also proposes refining Friston's minimum free energy principle into the maximum information efficiency principle. Lastly, it discusses the limitations of the G theory in representing the semantics of complex data.
Article
Computer Science and Mathematics
Mathematical and Computational Biology

José Alberto Rodrigues

Abstract: Persistent homology is a powerful tool in topological data analysis that captures the multi-scale topological features of data. In this work, we provide a mathematical introduction to persistent homology and demonstrate its application to protein-protein interaction (PPI) networks. We combine persistent homology with algebraic connectivity, a graph-theoretic measure of network robustness, to analyze the topology and stability of PPI networks. An example is provided to illustrate the methodology and its potential applications in systems biology.
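
A minimal sketch combining the two measures the abstract names: algebraic connectivity via networkx and a persistence diagram via gudhi on a toy graph standing in for a PPI network. The library choices and the shortest-path filtration are assumptions; the paper does not name its software stack.

```python
import networkx as nx
import gudhi

G = nx.karate_club_graph()  # toy stand-in for a PPI network

# Algebraic connectivity: second-smallest Laplacian eigenvalue (robustness).
print("algebraic connectivity:", nx.algebraic_connectivity(G))

# Persistent homology of a Vietoris-Rips complex built on shortest-path
# distances between nodes (proteins).
D = nx.floyd_warshall_numpy(G)
rips = gudhi.RipsComplex(distance_matrix=D, max_edge_length=4.0)
st = rips.create_simplex_tree(max_dimension=2)
diagram = st.persistence()  # list of (dimension, (birth, death)) pairs
print(diagram[:5])
```
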
Article
Computer Science and Mathematics
Computer Science

Qian Yu,

Yuchen Yin,

Shicheng Zhou,

Huailing Mu,

Zhuohuan Hu

Abstract: Financial fraud in listed companies has grown increasingly complex, challenging traditional detection methods. This paper introduces the Transformer-style Convolutional Neural Network (CNN-Transformer), a unified architecture that integrates CNNs' capability for localized feature extraction with Transformers' strength in capturing long-range contextual dependencies. The CNN-Transformer employs large-kernel convolutions and a self-attention mechanism to enhance feature interaction while maintaining computational efficiency via a linear-complexity Hadamard product approach. Evaluated on a dataset from China's A-share market, the CNN-Transformer outperforms models like ResNet, MLP, and standalone CNNs or Transformers in accuracy, precision, recall, and AUC. The model's capacity to simultaneously extract localized features and model comprehensive contextual relationships enables robust detection of sophisticated fraudulent patterns. This research underscores the potential of CNN-Transformer hybrids in financial fraud detection, offering a scalable and interpretable AI solution. Future work may focus on integrating unstructured data, improving transparency, and optimizing efficiency for broader applications.
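
A minimal PyTorch sketch of the hybrid idea the abstract describes: a large-kernel convolution for local features followed by self-attention for long-range context. The layer sizes and the standard softmax attention are assumptions; the paper's linear-complexity Hadamard-product attention is not reproduced here.

```python
import torch
import torch.nn as nn

class ConvAttentionBlock(nn.Module):
    def __init__(self, channels: int, heads: int = 4):
        super().__init__()
        # Large-kernel convolution for localized feature extraction.
        self.conv = nn.Conv1d(channels, channels, kernel_size=7, padding=3)
        # Self-attention for long-range contextual dependencies.
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)

    def forward(self, x):                                     # (batch, seq, ch)
        local = self.conv(x.transpose(1, 2)).transpose(1, 2)  # local features
        ctx, _ = self.attn(local, local, local)               # global context
        return self.norm(local + ctx)                         # residual + norm

x = torch.randn(8, 32, 64)  # e.g. 8 firms, 32 reporting periods, 64 features
print(ConvAttentionBlock(64)(x).shape)  # torch.Size([8, 32, 64])
```
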
Article
Computer Science and Mathematics
Computer Networks and Communications

Filip Jerkovic,

Nurul I. Sarkar,

Jahan Ali

Abstract:

Homomorphic Encryption (HE) introduces new dimensions of security and privacy within federated learning (FL) and Internet of Things (IoT) frameworks, allowing user privacy to be preserved when handling data for FL in Smart Grid (SG) technologies. In this paper, we propose a novel SG IoT framework to predict energy consumption while preserving user privacy in a smart grid system. The proposed framework integrates FL, edge computing, and HE principles to provide a robust and secure way to conduct machine learning workloads end-to-end. In the proposed framework, edge devices are connected to each other using P2P networking, and the data exchanged between peers is encrypted using CKKS fully homomorphic encryption. The results obtained show that the system can predict energy consumption while preserving user privacy in SG scenarios. The findings provide insights into the SG IoT framework that can help network researchers and engineers contribute further towards developing a next-generation SG IoT system.
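
A minimal sketch of the CKKS step in a framework like the one described above, using the TenSEAL library (an assumption; the paper does not bind itself to a specific CKKS implementation). Encrypted meter readings from two peers are summed without ever being decrypted in transit.

```python
import tenseal as ts

# CKKS context: polynomial degree and modulus chain set the precision budget.
context = ts.context(ts.SCHEME_TYPE.CKKS, poly_modulus_degree=8192,
                     coeff_mod_bit_sizes=[60, 40, 40, 60])
context.global_scale = 2 ** 40
context.generate_galois_keys()

# Two peers encrypt their local energy readings (kWh) before exchange.
peer_a = ts.ckks_vector(context, [1.2, 0.8, 1.5])
peer_b = ts.ckks_vector(context, [0.9, 1.1, 1.0])

aggregate = peer_a + peer_b  # homomorphic addition, still encrypted
print(aggregate.decrypt())   # approximately [2.1, 1.9, 2.5]
```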

Article
Computer Science and Mathematics
Artificial Intelligence and Machine Learning

Iosif Iulian Petrila

Abstract: The neural information organizing and processing principles are presented in a general transdisciplinary axiomatic form, highlighting the fundamental neural informational characteristics and processes as a guiding informational synthesis useful in various fields, from neuroscience to the design of neural processing units and the implementation of artificial intelligence software. The proposed principles highlight the fundamental characteristics of neural information: function, memorization, nondeterminism, fragmentation, aggregation, nonlinearization, geometrization, parallelization, adaptation, and objectivation. The principles are formulated to facilitate transdisciplinary use, even though they were synthesized, through generalization and abstraction, from the informational organization specific to neural computing implementations and in correlation with biological neuronal systems viewed from an informational perspective.

